Atmospheric Layers, The Biosphere, The Boundary Layer, Microclimate and Inadequate IPCC Models

Guest essay by Dr. Tim Ball

During a university presentation I said the climate models do not include the Milankovitch Effect. A person challenged me, saying he worked on climate models and it was included. My mistake was forgetting to say I was talking about Intergovernmental Panel on Climate Change (IPCC) models. Here is a summary of that and other missing pieces.

An IPCC climate modeller told me the time scale was not appropriate for including the Milankovitch Effect. But IPCC models project 50 years and more, and the Milankovitch variables change every year, so when or where does a variable become important? What if one is omitted as inconsequential but becomes important, even critical, as thresholds change? A measure of the problem is given by the IPCC’s own comment, which underscores the subjectivity:

The differences between parameterizations are an important reason why climate model results differ.

What other variables, mechanisms or regions are omitted from the IPCC models? We know the models don’t work because their predictions (projections) are wrong. Is it because of omitted variables or incorrect representation of atmospheric structure? Both, and these limitations apply to all climate models. Of particular importance is the layer of air in contact with the land and water surface within the Biosphere. I love the Wikipedia definition, which says

“It can also be termed the zone of life on Earth, a closed system (apart from solar and cosmic radiation and heat from the interior of the Earth), and largely self-regulating.”

This is like saying I am out of debt except for the house, the furniture and the car. The IPCC Report says,

“Nevertheless, models still show significant errors. Although these are generally greater at smaller scales, important large-scale problems also remain.” (My emphasis)

Generally the Biosphere lies within the Boundary Layer, usually defined as the zone of turbulent flow below 1000 m. Climate below 2 m is very different, yet it is ignored because it lies below the standard Stevenson Screen weather station. Rudolf Geiger wrote about some of the differences in his classic 1957 book Climate Near The Ground, now updated by Aron and Todhunter. Dynamics and constituents in this microclimate layer are very different from those in the rest of the atmosphere, yet they are excluded from climate models. In a 2012 WUWT article, Eschenbach identified the time-lag situation. His postscript says:

“And yes, I’m sure that there are folks out there who knew this all along … but I didn’t, which is why I’m discussing it.”

Much more is bypassed because of the thirty-year hiatus caused by IPCC machinations and the focus on human causes of climate change.

Other near-surface measurements, such as CO2, are taken above 2 meters.

“Air samples at Mauna Loa are collected continuously from air intakes at the top of four 7-m towers and one 27-m tower.”

How does that help us understand energy flows in the atmosphere? CO2 and all other greenhouse gas (GHG) levels are higher in the first 2 m, and the incoming solar radiation and outgoing long-wave radiation have to pass through that layer. At what level do greenhouse gases and aerosols start functioning?
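One way to see why an enriched bottom layer matters is the Beer-Lambert law, which ties the transmittance of a layer to the absorber concentration along the path. This is a minimal sketch with made-up numbers: the absorption coefficient and the assumed 20% near-surface enrichment are illustrative placeholders, not measured values.

```python
import math

def transmittance(absorption_coeff, concentration, path_length_m):
    """Beer-Lambert law: fraction of radiation that passes through a layer."""
    return math.exp(-absorption_coeff * concentration * path_length_m)

# Illustrative numbers only. Roughly 400 ppm CO2 at surface pressure is
# about 0.017 mol/m^3; the coefficient k is a made-up placeholder, since
# real greenhouse-gas absorption is strongly wavelength-dependent.
k = 0.05                         # hypothetical effective absorption coefficient, m^2/mol
c_aloft = 0.017                  # mol/m^3
c_near_surface = 1.2 * c_aloft   # assume a 20% enrichment in the first 2 m

print(transmittance(k, c_aloft, 2.0))         # a 2 m layer aloft
print(transmittance(k, c_near_surface, 2.0))  # the enriched bottom 2 m
```

However small each 2 m slab's effect, the bottom one is the first that both incoming and outgoing radiation must cross, which is the point at issue.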

Fundamental Problem

Models are three-dimensional mathematical representations of the atmosphere and oceans. Figure 1 shows the depiction at the IPCC web site.

[Figure 1. Schematic of a three-dimensional climate model grid, from the IPCC web site.]

The horizontal grid rectangles are measured in latitude and longitude and determine the resolution of the picture you can draw. Pictures, whether printed or displayed on a screen, are made up of individual dots, which on computer screens are called pixels. The more dots, the greater the clarity of the image. Similarly, the more and smaller the rectangles in the climate model, the better the picture. The trouble is, it doesn’t matter how many or how small the grid cells are: with no data, the picture remains a blur.
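To put rough numbers on the pixel analogy, this sketch computes the physical size of one latitude-longitude grid cell on a spherical Earth. The 2.5 degree spacing is illustrative of the general CMIP3-era order of magnitude, not any particular model’s resolution.

```python
import math

EARTH_RADIUS_KM = 6371.0

def cell_size_km(lat_deg, dlat_deg, dlon_deg):
    """North-south and east-west extent of one lat-lon grid cell."""
    ns = math.radians(dlat_deg) * EARTH_RADIUS_KM
    ew = math.radians(dlon_deg) * EARTH_RADIUS_KM * math.cos(math.radians(lat_deg))
    return ns, ew

for lat in (0, 45, 70):
    ns, ew = cell_size_km(lat, 2.5, 2.5)
    print(f"lat {lat:3d}: one cell is about {ns:.0f} km x {ew:.0f} km")
```

A single cell at the equator covers roughly 278 km on a side, an area within which a handful of stations, or none at all, must represent everything that happens.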

There is virtually no weather data for some 85 percent of the world’s surface: virtually none for the 70 percent that is ocean, and, of the remaining 30 percent that is land, very few stations for the approximately 19 percent mountains, 20 percent desert, 20 percent boreal forest, 20 percent grasslands, and 6 percent tropical rain forest.

It’s worse in the vertical, with virtually no data in space or time and constantly changing, very complex conditions. Again the illusion exists that increasing the number of layers built into the model creates better results. There are virtually no weather measurements aloft, and even fewer measurements of atmospheric composition, such as changing aerosol types and volumes.

General Atmospheric Aerosol Levels

Hubert Lamb considered the levels and nature of aerosols in the overall atmosphere with his 1970 work on a Dust Veil Index (DVI). How did changes in aerosol quantities and types result in changes in atmospheric opacity? The IPCC people knew of the issue because Michael Mann worked on the DVI.

Tisdale suggests it is another example of Mann’s “adjustment” of the record for a predetermined result. Generally, when anyone associated with the IPCC is working on an issue, it is not for enlightenment but because it undermines their hypothesis. The aerosol issue is much larger than the impact of volcanic aerosols, but even there the IPCC models are inadequate. We know from Pinatubo and all other major eruptions that a significant factor is the amount of dust injected into the stratosphere. The IPCC models don’t appear to include the stratosphere, as they state,

Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.

There appears to be some effort to get better measures of aerosols throughout the atmosphere.

The global Aerosol Model Intercomparison project, AeroCom, has also been initiated in order to improve understanding of uncertainties of model estimates, and to reduce them (Kinne et al., 2003).

Notice this addresses only the uncertainty of model estimates. Even if more accurate estimates are derived, they face the problems of the physics applied to how these aerosols interact with radiation. Pierre-Marie Robitaille explains some of the problems in a video.

Only crude estimates of the volume and nature of aerosols in the atmosphere exist at any level. Two major sources of condensation nuclei, prior to Svensmark’s addition of a third from cosmic radiation, were clay particles, particularly the smallest, kaolinite, with its hygroscopic characteristics, and salt particles from the ocean. The latter occur in large quantities over the oceans: we had to wash down the planes after long, low-level anti-submarine patrols to remove the salt encrustations.

I was involved in measurements of aerosols as part of heat island studies for the City of Winnipeg in the late 1960s and 1970s. We had air samplers throughout the city, with high-volume samplers at ground level and on the roof (60 m) of the university. Aerosol amounts and types varied considerably from hour to hour and on every other time scale. What was especially significant was the decrease in particle size with altitude, as gravity and precipitation reduced the percentage of larger particles (see the settling-velocity sketch below). The importance of this for the input of IR to the surface and the escape of long-wave radiation is dramatic, as is the direct heating of the atmosphere by the insolation the particles absorb and the long-wave radiation they emit. The most critical portion is the layer beneath the Stevenson Screen, as we found and Geiger identified.

In this layer physical quantities such as flow velocity, temperature, moisture, etc., display rapid fluctuations (turbulence) and vertical mixing is strong.
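The decrease in particle size with altitude follows from gravitational settling, and a minimal Stokes’ law sketch shows the scale of the effect. The kaolinite density and air viscosity here are textbook approximations, not values from the Winnipeg study:

```python
G = 9.81           # gravitational acceleration, m/s^2
MU_AIR = 1.8e-5    # dynamic viscosity of air, Pa*s
RHO_CLAY = 2650.0  # approximate density of kaolinite, kg/m^3
RHO_AIR = 1.2      # density of air, kg/m^3

def settling_velocity(radius_m):
    """Terminal fall speed of a small sphere in still air (Stokes regime)."""
    return (2.0 / 9.0) * radius_m**2 * G * (RHO_CLAY - RHO_AIR) / MU_AIR

for r_um in (0.1, 1.0, 10.0):
    v_mm_s = settling_velocity(r_um * 1e-6) * 1000.0
    print(f"radius {r_um:4.1f} um -> settles at about {v_mm_s:.4f} mm/s")
```

Fall speed scales with the square of the radius, so a tenfold larger particle falls a hundred times faster; that is why the large particles stay near the ground while the fine fraction dominates aloft.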

In a desperate understatement the IPCC tells us

Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.

 

Lower Layers of the Atmosphere

I was also involved in studies of the lower layers, with weather instruments placed every 61 m up a 305 m tower just outside Winnipeg. Figure 2 shows a similar tower set up in Germany.

[Figure 2. Instrumented meteorological tower in Germany, similar to the Winnipeg tower.]

On many days we found a remarkable number of temperature layers and more inversions than expected. These layers change in surprisingly short times, on hourly, daily, and all other time scales. This matched what I learned by taking and studying temperature profiles in the ocean. We obtained temperature profiles remotely using a sonobuoy: dropped from the aircraft, it released a thermometer on impact that measured to 100 m and transmitted data back to the aircraft. From the resulting bathythermographs, indicating how temperature varied with depth, we determined sound transmission layers.
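As an illustration of what finding “temperature layers” in a discrete profile involves, here is a minimal sketch that scans readings from instruments at fixed heights for inversions, i.e. spans where temperature increases with height. The readings are hypothetical; the actual tower data are not reproduced here.

```python
def find_inversions(heights_m, temps_c):
    """Return (bottom, top) height pairs where temperature rises with height."""
    return [
        (heights_m[i], heights_m[i + 1])
        for i in range(len(temps_c) - 1)
        if temps_c[i + 1] > temps_c[i]
    ]

# Hypothetical readings from instruments every 61 m up a 305 m tower.
heights = [0, 61, 122, 183, 244, 305]
temps = [4.0, 6.5, 5.8, 5.2, 5.6, 4.9]

print(find_inversions(heights, temps))  # -> [(0, 61), (183, 244)]
```

Even this toy profile holds two inversion layers; with real data the picture changed from hour to hour.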

Structure and dynamics of ocean and atmospheric layers within 100 meters of the surface are extremely complicated yet critical to movement of energy, especially vertically.

Characteristics of IPCC models are available in a table titled CMIP3 Climate Model Documentation, References, and Links. Models vary considerably; for example, the Max Planck model says,

Boundary layer; Surface fluxes are computed from bulk relationships with transfer coefficients according to Monin-Obukhov similarity theory. Transpiration is limited by stomatal resistance and bare soil evaporation by the availability of soil water. Eddy viscosity and diffusivity are parameterized in terms of turbulent kinetic energy and length scales involving the mixing length and stability functions for momentum and heat respectively (Brinkop and Roeckner, 1995).
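The “bulk relationships” mentioned in that description reduce to formulas of the following kind. This is a generic sketch, not the ECHAM code: in the real model the transfer coefficient C_H comes out of the Monin-Obukhov stability functions, whereas here it is frozen at a typical neutral-conditions value, which is precisely the kind of parameterized quantity at issue.

```python
RHO_AIR = 1.2    # air density, kg/m^3
CP_AIR = 1004.0  # specific heat of air at constant pressure, J/(kg*K)

def sensible_heat_flux(c_h, wind_speed, t_surface, t_air):
    """Bulk formula H = rho * cp * C_H * U * (T_s - T_a), in W/m^2.

    In a real model c_h varies with stability (Monin-Obukhov theory);
    here it is a fixed, assumed value.
    """
    return RHO_AIR * CP_AIR * c_h * wind_speed * (t_surface - t_air)

# 5 m/s wind, surface 2 K warmer than the air at measurement height
print(sensible_heat_flux(1.2e-3, 5.0, 17.0, 15.0))  # ~14.5 W/m^2
```

Everything hinges on the transfer coefficient, and the near-surface observations needed to constrain it are exactly what this article argues are missing.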

What are they using to create the parameterized values? Most of the references in the reports are to 1990s material. The number of vertical layers below 850 hPa varies: NCAR has 4, Max Planck 5, and NASA GISS 2. The Max Planck model has the most, but a layer roughly every 300 m is still inadequate.

The top two meters of the Earth’s surface and the bottom two meters of the atmosphere are the most critical layers. They form an interface critical to understanding weather and climate. Aerosols, gas levels, energy exchanges, and evaporation, among other factors, are far greater there than in the rest of the atmosphere, so the effects on insolation and long-wave energy are significantly different. Too bad they are the least measured and least understood of all the layers, and omitted from the IPCC models.

Comments
Michael Gordon
May 19, 2014 9:31 am

Slightly OT but brings back fond memories…
“We had to wash down the planes after long anti-submarine low level patrols to remove the salt encrustations.”
Yes indeed. I’ve been on a few BT (Bathythermograph) and Ice Edge Recon flights in a Navy P3.

May 19, 2014 9:45 am

Regarding temperatures measured at sea: when I was serving on a weather-observing cargo vessel, the Stevenson screen was always put on the weather bridge wing. In ballast it would have been about 60 feet above the sea, and loaded, about 47 feet. No one ever thought about whether this made a difference. Similarly, the sea temperature was taken from the main engine cooling-water intake, which was 15 or 28 feet below the water line depending on whether the ship was loaded or not.

richardscourtney
May 19, 2014 9:53 am

Tim Ball:
Thanks for your clear, concise and interesting article.
I especially liked your conclusion which says

The top two meters of the Earth’s surface and the bottom two meters of the atmosphere are the most critical layers. They form an interface critical to understanding weather and climate. Aerosols, gas levels, energy exchanges, and evaporation, among other factors, are far greater there than in the rest of the atmosphere, so the effects on insolation and long-wave energy are significantly different. Too bad they are the least measured and least understood of all the layers, and omitted from the IPCC models.

YES!!!
And this is also true of carbon cycle models and not only climate models.

Richard

Doug Proctor
May 19, 2014 10:48 am

None of these points matters if the observational evidence of change matches the modeled expectations. The only reason we keep disputing the CAGW narrative is that we do not see the foretold events.
A huge peculiarity of the IPCC climate “science” is that after more than 27 years they have not whittled away at their 102-odd “scenarios”. Nowhere have I seen the idea pushed that Scenario “18” [hypothetical] can suddenly switch to Scenario “63”, yet both 18 and 63 continue to be promoted with equal zeal. Although observations align with only 3% or less of the models being used, there is no movement to declare ANY of the models invalidated. Somehow processes that produce a 0.7C rise by 2100 are compatible with models that produce a 5C rise by 2100.
The unwashed citizenry tremble before the outstretched arms of the White Coats: this seems to be the goal not just of the White Coats but of the governors who sponsor them. The comparison to sixteenth-century religious conflict is easy. Unfortunately the Cathars still exist, apparently, but now they are called Skeptics (or “Denialists”). Any student of history can predict what will happen to the Skeptics if Greenpeace and the Feds are given enough leeway.

May 19, 2014 11:18 am

Michael Gordon says:
May 19, 2014 at 9:31 am
Slightly OT but brings back fond memories…
=======
Yes indeed. Over 2,500 hrs in P-3s of various types. BT drops from islands around the Atlantic/Med. I have on occasion noted that lots of ocean temperature data should be available from the thousands of flights the Navy did around the world. The purples (flight summaries) always included that info if the BT was successful. It is archived somewhere, I would think.

Tom O
May 19, 2014 11:31 am

So many interesting points in here, and basically they all support my basic premise that if you don’t know all the factors involved in anything, you can’t “model it” successfully, and its ability to be predictive past the first minute after the computer run gets worse by the hour into the future. If you are “programming in” only the factors that you believe are relevant, then you are building a program that will only support what you believe, not what actually exists. Structure in, structure out. Basically, then, what they are doing with their climate models is like trying to apply “mass psychology” to how an individual will react. Generalities can’t yield specifics.

PiperPaul
May 19, 2014 12:44 pm

“Generally, when any one associated with the IPCC is working on an issue it is not for enlightenment but because it undermines their hypothesis.”
“Underlines”?

BioBob
May 19, 2014 2:13 pm

Very interesting. Thanks.
Too bad we have wasted 30 years of ‘climate’ funding on so much pseudoscience while ignoring the need for real science investigations as you have described.

cnxtim
May 19, 2014 2:44 pm

To begin, it defies nature and reason to suggest all warming is bad for all the world – simply absurd.
It is also patently obvious the IPCC prunes anything that does not aid the CAGW mantra. Yet even so, for the last 17 years they have failed to prove the existence of CAGW due to CO2 emissions.
Cut the funds now – and stick to your guns, Tony Abbott, one country leader who has been steadfast in his acceptance of the rights and facts of an alternative stance.

Keith Minto
May 19, 2014 3:00 pm

Climate below 2 m is very different yet ignored because it is below the standard Stevenson Screen weather station

conflicts with

The World Meteorological Organization (WMO) agreed standard for the height of the thermometers is between 1.25 m (4 ft 1 in) and 2 m (6 ft 7 in) above the ground.

I guess it has to be used by a person of average height.
But a good article, Dr Ball, about the complexity and site problems in temperature measurement.
Roger Pielke also wrote about this in 2012 http://pielkeclimatesci.files.wordpress.com/2013/02/r-371.pdf

David Riser
May 19, 2014 5:18 pm

Part of the issue with US meteorological standards is the lack of a single standard. In the US the standard sensor height is about 5 ft, 5 ft ± 1 ft, or 1.5 m, depending on which NOAA document you are referencing; sometimes these are referenced in the same document in different places. Interestingly, it is common for instruments to be at some random height that is supposed to be recorded in the metadata for the station. Regardless, a NOAA discussion about this includes a graphic on issues with sensor height, found here:
http://www.nws.noaa.gov/om/csd/pds/PCU6/IC6_2/tutorial1/Factors_exposure.htm
The relevant bit says: “Figure 15 depicts the importance of adherence to the standard height of 5 feet above the ground for the placement of the temperature sensor (data from Thornthwaite, 1948). Differences of only a few feet in the sensor height can make a one degree F difference in extremes for the day. This difference is significant in relation to the magnitude of the tenths-of-a-degree climate signal researchers look for. Therefore, it is important we adhere to the standard sensor height (1.5 meters above the ground) to maximize detection of the climate change signal over time.”
One other issue is the simple fact that instrumentation for temperature is required to have an accuracy of ±0.6 C with a resolution of 0.1 C, so… sarc: “this is how we know we have definitely had 0.8 C warming over the last 150 years” /sarc
Just some interesting silliness that demonstrates how little science is embedded in the temperature-observing bit of climate research.
v/r,
David Riser

Mark Luhman
May 19, 2014 7:24 pm

The problem models have, and always will have, even if we knew all the variables and had the computer power to calculate them, is divergence. Every minute the model runs, the greater its divergence from reality, even if it tracked temperature directly. The reason I say this is that there is no way to predict which way each of the millions upon millions of changes will occur, independent of one another or dependent on others, and we have no idea which is which. If a model ever predicts the future climate, all it would be is a happy accident; the next run may diverge wildly from the actual climate. Models are good for what-if games, poor at predicting the future. Models and modelers will never solve the divergence problem, since there are too many random events in climate; divergence will always remain.
Think of it as modeling a coin flip. Yes, we can build a model to simulate coin flipping, but we cannot predict for any flip whether it will be heads or tails, whether there will be a string of heads or tails, how long the string will last, or how often strings will occur. If you cannot model that, how on earth can you think a model will be able to model climate? We will never be able to predict next year’s weather, whether it will be warmer or colder than last year’s, wetter or drier. The same is true for the next decade, the decade after, and any decade to come; as for the next century, forget it. The same goes for climate: it is fairly easy to predict the future may be warmer, because that is the direction it is going; the trick is always how much and when, and even more so whether it will trend the other way for a time.
Let’s start using models for what they are really for: modeling known variables and deducing how much changing one of them affects something. As far as CO2 goes, maybe we could determine how much a doubling will affect the climate in a known and measured way; at this point the climate sensitivity has been stated as anywhere from less than 0.5 to as much as 5 degrees. How do you model that? Let us get some good numbers on what CO2 does before we run off and do something stupid; the world has had millions die because some do-gooder’s half-baked idea was implemented prematurely. Like: vaccines are bad, X-raying your feet for shoe fit was good, asbestos was benign, bloodletting was good, cigarettes were healthy, germs were not a problem and doctors did not need to wash their hands.

tty
May 20, 2014 12:23 am

“We had to wash down the planes after long anti-submarine low level patrols to remove the salt encrustations.”
This effect also varies geographically. When the Swedish Air Force had fighter wings based on the West Coast (F9 and F10), the aircraft from those wings were always much more corroded, because they flew over the saline North Sea, than aircraft from other wings that flew mostly over the brackish Baltic.

Berényi Péter
May 20, 2014 3:06 am

In a naked one-dimensional radiative transfer model with no convection or evaporation, only a semi-transparent atmosphere with different optical depths in the shortwave and longwave bands of the electromagnetic spectrum, there is a huge temperature discontinuity right at the surface (~20 K).
In reality this discontinuity all but disappears as soon as non-radiative heat exchange processes are introduced, but the manner in which that happens is of course heavily dependent on the very properties of the interface.
If these properties are neither resolved in the model nor measured in the actual climate system, the result can’t be anything but pure guesswork.
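For anyone who wants to reproduce the order of magnitude of that discontinuity, here is a minimal sketch of a gray two-stream atmosphere in pure radiative equilibrium; the emission temperature and total optical depth are illustrative choices, not fitted values.

```python
# Gray two-stream radiative equilibrium: sigma*T^4(tau) = (OLR/2)*(1 + tau),
# with the ground itself at sigma*Tg^4 = (OLR/2)*(2 + tau_total).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)
T_E = 255.0      # effective emission temperature, K (illustrative)
TAU = 1.25       # total longwave optical depth (illustrative)

olr = SIGMA * T_E**4
t_air_sfc = ((olr / (2.0 * SIGMA)) * (1.0 + TAU)) ** 0.25  # air touching the ground
t_ground = ((olr / (2.0 * SIGMA)) * (2.0 + TAU)) ** 0.25   # the ground itself

print(f"air at surface: {t_air_sfc:.1f} K, ground: {t_ground:.1f} K, "
      f"jump: {t_ground - t_air_sfc:.1f} K")
```

With these numbers the jump is about 25 K, the same order as the ~20 K quoted above; it is the non-radiative surface fluxes discussed in the article that erase it in reality.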