Scientific consensus revisited

Guest post by Juraj Vanovcan

“You can fool all the people some of the time, and some of the people all the time, but you cannot fool all the people all the time.” – Abraham Lincoln

––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––

As of today, climate models are the last realm where a rising trace concentration of carbon dioxide, a gas vital to the biosphere, causes catastrophic warming with all its accompanying effects. It must not be forgotten, though, that these very models are the basis for regulations, taxes and restrictions imposed by policy-makers, that developed countries are destroying their own energy base in environmental madness, and that the media churn out apocalyptic prophecies of the nearing end of the world in medieval style. There is some general awareness that the models are not perfect yet, but supposedly it is only a question of time or hardware power before they get the temperature rise in the year 2100 right; the question is how much warming, not whether there will be any. However, considering our experience with the fundamental claims of orthodox climatology, it might be of great interest to look closely at these crystal balls with powerful silicon hearts and to assess the credibility of their projections for the year 2100 in the simplest way – by comparing their outputs with present observations.

***

The virtual reality of climate models first appears in the opening chapter of the IPCC AR4 from 2007. Under wildly rising curves of various colors for various “emission scenarios” (from the usual “lights and heating on” to “now all of you, hold your breath for the next 100 years”) there is a black line of climate model output for the 20th century, crawling along the x-axis.

Fig. 1 Climate model ensemble mean for 20th century and different emission scenarios until 2100 (Source: IPCC AR4, WG I: The Physical Science Basis)
Climate models strive to simulate natural variability, from the upper atmosphere to the deep currents of the ocean abyss, incorporate the estimated intensity of solar cycles, and try to estimate changes in cloud cover, in vegetation, and in the amplifying effects of changing snow and ice cover. On this presumably natural background, “anthropogenic” forcing is piled: from Chinese coal ash to carbon dioxide, from anticipated changes in the ozone layer through traces of methane to the condensation trails crossing the sky.

According to their authors, it is not possible to explain the recent (post-1970) warming without the “anthropogenic forcing”, as a figure from the last IPCC report presents.

Fig. 2 Comparison between global mean surface temperature anomalies (°C) from observations (black) and AOGCM simulations forced with (a) both anthropogenic and natural forcings and (b) natural forcings only. (Source: IPCC AR4, WGI, Figure 9.5.)
Let’s have a closer look now. The model ensemble mean (red) in the top chart resembles an exponentially rising curve, similar to that of CO2 in the atmosphere. Its rise is especially pronounced after 1970, with a few temporary dips caused by the volcanic eruptions of the 1980s and 1990s. The instrumental record (black) looks similar at first glance, but there are discrepancies.

Fig. 3 CMIP3 model ensemble mean against instrumental record, period 1900-2012. (Source: KNMI Climate Explorer, CRU)
It is obvious that the model cannot replicate the warming between 1910 and 1945, when global mean temperature rose by a hefty 0.7 deg C in a mere 35 years (the model suggests only 0.1 deg C). This was followed by a 30-year period when temperature stagnated or decreased, while the models stubbornly exhibit a continuous rise. The only period where model and observation agree is 1975-2002; since then, a divergence occurs again.
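The trend arithmetic behind such comparisons is simple enough to check for oneself. A minimal sketch (the series below is a synthetic placeholder; in practice one loads the annual means exported from KNMI Climate Explorer):

```python
import numpy as np

def trend_per_decade(years, series, start, end):
    """Least-squares linear trend in deg C per decade over [start, end]."""
    m = (years >= start) & (years <= end)
    return 10.0 * np.polyfit(years[m], series[m], 1)[0]

# Placeholder series; replace with e.g. np.loadtxt("hadcrut3_annual.txt")
# and the corresponding CMIP3 ensemble-mean export (filenames assumed).
years = np.arange(1900, 2013)
rng = np.random.default_rng(0)
obs = 0.007 * (years - 1900) + 0.1 * rng.standard_normal(years.size)

for a, b in [(1910, 1945), (1945, 1975), (1975, 2002), (2002, 2012)]:
    print(f"{a}-{b}: {trend_per_decade(years, obs, a, b):+.2f} deg C/decade")
```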

Using global data is, however, questionable for several reasons. First, the instrumental record (here represented by the HadCRUT3 global surface dataset) is skewed after 1945 by an artificial adjustment related to uncertainties in sea surface temperature sampling methods (the bucket versus engine-inlet issue), so instead of a continuous decrease over 1945-1975 it shows a step decrease followed by stagnation. Second, part of the modern warming has been caused by the urban heat island effect and by the improper siting of meteorological stations, which inflate the temperature trend upwards. Third, a global average is a virtual number which can hide anything: a rise in the salaries of a company’s management may outweigh the stagnating or even falling incomes of the other employees, and though the average salary in the company has risen, the rise was definitely not “global”. Let us now compare the model projections with observations for selected individual areas.
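One technical note before we do: an honest regional mean from a gridded dataset must weight the grid cells by the cosine of latitude, otherwise the polar cells are over-counted. A minimal sketch of the idea (the SST field and 1-degree grid below are assumed placeholders; a real analysis would read e.g. Reynolds OIv2 from a NetCDF file):

```python
import numpy as np

# Synthetic 1-degree gridded SST field (latitude x longitude).
lats = np.arange(-89.5, 90.0, 1.0)
lons = np.arange(0.5, 360.0, 1.0)
sst = 28.0 - 0.25 * np.abs(lats)[:, None] + np.zeros(lons.size)[None, :]

def band_mean(field, lat_min, lat_max):
    """Cosine-of-latitude weighted mean over a latitude band."""
    band = (lats >= lat_min) & (lats <= lat_max)
    w = np.cos(np.deg2rad(lats[band]))[:, None] * np.ones((1, lons.size))
    return np.average(field[band, :], weights=w)

print(band_mean(sst, -90.0, -60.0))  # polar ocean south of 60S
print(band_mean(sst, -20.0, 20.0))   # tropical band
```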

***

A 12,000 km long strip of the tropical Pacific, from the Peruvian coast to the Solomon Islands, is the realm of the ENSO phenomenon, where recurring changes in the trade winds trigger El Niño and La Niña events. These affect precipitation and global temperature and alter weather patterns worldwide. Short-term climate models are poor at predicting ENSO phases even a few months ahead, and climatologists are not united on whether and how future warming would shape their frequency, intensity or type. What is predicted, however, is a steady rise of the tropical Pacific surface temperature by 0.2-0.3 deg C/decade.

Fig. 4 Comparison of CMIP3 model ensemble mean against observations in the NINO 3, NINO 3.4 and NINO 4 regions for the 1981-2012 period. (Source: KNMI Climate Explorer, SST dataset Reynolds OIv2)
We will not compare the interannual variability, which is considerable in that region and which none of the models is able to predict. In direct disagreement with the climate models, however, is the overall surface temperature trend, which in reality has been slightly negative during the last 30 years. Granted, the tropical Pacific is just a small part of the planet’s surface, so let us repeat the test on a third of the world’s oceans: the Eastern Pacific, with a surface area of 85 million km2.

Fig. 5 CMIP3 model ensemble mean against the sea surface temperature measured by satellite, East Pacific, period 1981-2012. (Source: KNMI Climate Explorer, SST Reynolds OIv2)
The jagged curve again shows the ENSO pattern, but the overall temperature trend of this major part of the Pacific is, contrary to the climate model, basically flat. And if a third of the oceans shows no warming in three decades, it makes no sense to claim the warming has been “global”, exactly as an employee with a frozen wage will not agree that he enjoys a global salary rise. But it gets worse.

Based on the “greenhouse effect” theory, its intensification should manifest itself most strongly in the polar areas, where the cold air contains only a little water vapor. There, a rise in carbon dioxide, the second “greenhouse gas” after water vapor, should block outgoing long-wave radiation from the surface, shifting the radiation balance and increasing the surface temperature. Yet despite a rise in carbon dioxide of 50 ppm over the period and a model-calculated warming of 0.6 deg C, the observations are exactly opposite: in 1979-2012 the lower troposphere above the Antarctic and the surrounding ocean cooled by half a degree.

Fig. 6 CMIP3 model ensemble mean against the lower troposphere temperature measured by satellite, Antarctic, period 1979-2012. (Source: KNMI Climate Explorer, UAH MSU v5.4)
This surprising finding is, of course, at odds with the common knowledge formed by the media and by serious-looking scientists, who keep telling us about the imminent danger to penguins, breaking icebergs and collapsing ice sheets. For those who object to comparing a modeled surface temperature with the measured lower troposphere, the following chart shows the sea surface temperature of the surrounding polar ocean and the Antarctic sea ice extent for the same period.

Fig. 7 Sea surface temperature south of 60S latitude and sea ice extent in Antarctic for period 1979-2012. (Source: Reynolds SST OIv2, NOAA)
Without any regard to the modeled increase of the “greenhouse effect”, during the last 30 years of “human-caused global warming” the Antarctic has been getting colder, and so has the surrounding ocean. Sea ice extent has accordingly been increasing and has now even reached its satellite-era maximum! This fact has somehow slipped under the radar of environmental newspapermen, busy as they are with their regular September hysteria over the sea ice minimum at the opposite pole. The Antarctic is the poor employee whose income decreased and who cannot be persuaded that he enjoys the company’s “global” salary increase.

Of course, there are other areas where warming during the last few decades really did occur. Sea ice extent in the Arctic has been measured by satellites since 1979, and record summer minima were set in 2007 and now in 2012. It is a pity that similar measurements were not possible in the 1930s; according to meteorological records, the Arctic was just as “warm” then as it is today. As is becoming the rule, climate conditions north of 60N latitude are totally different from the model outputs for the same area.

Fig. 8 Modeled climate in Arctic and observed sea surface temperature north of 60N for 1900-2012. Periods of warming and cooling are differentiated by colors. (Source: KNMI Climate Explorer, HadSST2)
Contrary to the “state-of-the-art” global circulation climate models powered by ever more computing power, the Arctic warmed by itself in the first half of the 20th century by a considerable 1.5 deg C. Exactly this period (sometimes also the middle of the 19th century) is recognized as the end of the Little Ice Age. After a temperature plateau lasting until 1960 the Arctic cooled (a situation non-existent in the climate models) and then warmed back to the levels of the late 1930s, back then a natural state but today a sure sign of incoming catastrophe, forcing us to act now.

The Arctic, in contrast to the Antarctic with its huge mass of ice cover several kilometers thick, is mostly an ocean covered with just a few meters of sea ice, and if its extent is lower in September, that does… nothing. Melted sea ice does not even raise the sea level, and within a few weeks the ocean refreezes at a rate of two to three million km2 per month until the March maximum. Environmental correspondents will by then, however, be fully occupied with some breaking chunk of ice in the summer Antarctic, on which those amiable black-and-white penguins walk around and jump into the cold water.

Masses of warmer water enter the Arctic from the North Atlantic and the Pacific, and following them makes more sense than relying on the fiery sword of back-radiation from Kiehl-Trenberth diagrams, if we want to gain some real understanding. Here again the climate model outputs are so desperately different from the observations that one wonders whether the climatologists are not embarrassed by themselves.

Fig. 9 Comparison of climate models versus sea surface temperature observations for the North Atlantic, period 1900 – 2012. (Source: KNMI Climate Explorer, HadSST2)
The North Atlantic shows distinctive multidecadal climate variability, with alternating periods of roughly linear warming and cooling. The obsessive urge of climatologists to put a linear trend through the record and victoriously claim “but it is warming” is, however, as naïve as claiming that the function y = sin(x) is rising because its linear trend over the range [–π/2 : 21π/2] is positive.
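The sine analogy is easy to verify numerically (a throwaway sketch):

```python
import numpy as np

# Fit a least-squares line to y = sin(x) over [-pi/2, 21*pi/2]: the slope
# comes out small but positive, although sin(x) is plainly not "rising".
x = np.linspace(-np.pi / 2.0, 21.0 * np.pi / 2.0, 10001)
slope = np.polyfit(x, np.sin(x), 1)[0]
print(f"slope = {slope:+.5f}")  # ~ +0.0006
```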

The CMIP3 model ensemble mean, on the basis of which the fourth IPCC report predicts warming of 3-7 deg C by 2100, and because of which we have to keep a cardboard box full of classic light bulbs at home, is equally unable to simulate the surface temperatures of the North Pacific.

Fig. 10 Comparison of climate models versus sea surface temperature observations for the North Pacific, period 1900 – 2012. (Source: KNMI Climate Explorer, HadSST2)
According to the climate model (the one with which 97% of climatologists agree, though the real observations do not), the North Pacific should today be 1 deg C warmer than in 1940. In reality, the sea surface temperature of the North Pacific is exactly the same, and a cooling trend began just 7 years ago. It is not rocket science to expect the Arctic, fed by ocean currents from both cooling oceans, to cool soon.

The same model-reality discrepancies can be found at surface stations. As an example, here is the temperature record of the Debrecen meteorological station in Hungary since 1860, compared to the model ensemble mean for the corresponding area. The two records bear no resemblance to each other, yet the climatologists insist that the post-1970 warming must be anthropogenic, because their models say so.

Fig. 11 Comparison of climate models versus meteorological observations for Debrecen, Hungary, period 1860/1900 – 2012. (Source: KNMI Climate Explorer)
***

Comparing the climate model outputs against real observations for the 20th century, the following conclusions can be drawn:

a) the so-called “global warming” of 1975-2002 was not global at all; it was an average over local areas of which some warmed, some cooled and some remained stable

b) the models, driven by the virtual fiction of an “increased greenhouse effect”, expect uniform and increasing warming over the whole surface, strongest in the polar areas, which is in direct disagreement with observations

c) climate models cannot replicate natural climate fluctuations (the massive warming in the first part of the 20th century, followed by cooling)

d) the only parameter which matters in the models is carbon dioxide

e) the northern hemisphere (both oceans and land) exhibits distinctive multi-decadal variability, a cycle with a period of about 60 years, which bears no relation to carbon dioxide levels

f) areas of the world’s oceans of considerable size show no warming at all, despite the claims about “global warming”

g) the south polar areas have been cooling for the last three decades, despite steadily increasing carbon dioxide; this is confirmed by measurements of the lower troposphere, by sea surface temperature and by the increase in sea ice extent

h) “global temperature” has steadily decreased since 2002, in direct contrast with the climate models; judging by past observations, a cooling period of at least 30 years is to be expected.

Fig. 12 Comparison of global mean temperature from the CMIP3 model ensemble mean and from meteorological and satellite datasets for the 2001 – 2012 period. (Source: KNMI Climate Explorer, HadCRUT3, RSS v3.3)
It is obvious that the climatologists are plain wrong. The models are run on ever faster supercomputers, but the only difference is that the same wrong curves are calculated more quickly. The models cannot reproduce natural fluctuations and simply copy the CO2 curve. Individual surface areas show poor correlation with carbon dioxide levels, unless we selectively find a limited correlation and pass it off as causation, which is standard practice in climatology today.

It is difficult to say what the reason is: whether the models are merely oversensitive to “greenhouse gases”, or whether there is a flaw in their very physical basis. Let us dig deeper here.

There is one basic assumption, never proved or measured but generally accepted, that the “presence of greenhouse gases warms the surface by 33 K”. The number 33 is the difference between a hypothetical Earth without “greenhouse gases” (but calculated with the same albedo, which is largely made by clouds – a paradox that nobody minds) and our planet as we know it. But start with the Moon, which receives the same solar insolation, and ask two simple questions: why is the Moon’s surface so warm (+110 deg C) during the day, against 20 deg C on Earth? And why is the shadowed side of the Moon so cold (-150 deg C) compared to our night-time temperature of 10 deg C?
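For reference, the 255 K behind the “number 33” is a one-line Stefan-Boltzmann calculation, and the result is quite sensitive to the assumed albedo; a minimal sketch (taking a solar constant of about 1361 W/m2):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0       # solar constant at Earth's distance, W/m^2 (assumed)

def t_eff(albedo):
    """Effective radiating temperature of a uniform sphere."""
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

print(t_eff(0.30))  # ~255 K; 288 K observed minus 255 K gives the "33 K"
print(t_eff(0.12))  # with a Moon-like albedo: ~270 K, i.e. some 15 K more
```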

There is an easy answer to the first question: the daytime temperature on Earth is reduced by clouds (condensed water vapor, by far the strongest “greenhouse gas”), by the reflection of sunlight from ice and snow (a “greenhouse gas” in solid form), by evaporative cooling of the surface (the phase change of water into a “greenhouse gas”) and by convective cooling, where warmed air continually rises and is replaced by cooler air. The atmosphere – consisting of 99% nitrogen, oxygen and argon – together with the main “greenhouse gas” is the sole reason we do not need to wear space suits in the daytime like the Apollo 11 crew on the Moon.

On the other hand, our night is far warmer than the Moon’s. If anyone believes that our night is warmed by a steady flow of hundreds of watts of radiation from “greenhouse gases”, warming us by an incredible 160 deg C (this is exactly what the radiation diagrams show), then shielding oneself from that flow should mean freezing instantly… or not?

The basic problem with the virtual concept of the “greenhouse effect” is that it reduces the energy transfers between the surface and the atmosphere to hypothetical radiation arrows. It rests on a single, nonsensical assumption: that the surface is warmed above its theoretical temperature ONLY by radiation which has been captured and re-radiated by molecules of “greenhouse gases”. However, the atmosphere is 99.9% nitrogen, oxygen and argon, which, obeying the Boltzmann law, MUST also emit infrared radiation, since their temperature is above absolute zero. It does not matter whether they gained their energy by absorbing the outgoing long-wave radiation, as water vapor and carbon dioxide do, or by contact with the sunlight-warmed surface. Good luck, then, trying to recognize the radiation of that one anthropogenic CO2 molecule among the twenty thousand other atmospheric molecules it is mixed with. And if, despite physics, nitrogen, oxygen and argon do not radiate at night and still have a temperature of 10 deg C, then they simply retain the daily heat, and our night is warmer because of the mere presence of an atmosphere. Infrared radiation then becomes a secondary sign of temperature: the atmosphere radiates because it is warm; it is not warm because it radiates, just as a river does not flow because it powers water mills. The whole Babylon tower of modern climatology seems to be built on flawed assumptions, and the total inconsistency between models and reality confirms it.

***

Other questions then appear: if the models fail so blatantly at simulating the 20th century, how reliable are their projections for the year 2100? Is it possible at all that these very models are the basis for central economic planning and redistribution? How many more years must the difference between the models and reality grow, as it has since 2002, before somebody admits the models are wrong? How long will this circus be paid for out of the taxes of ordinary citizens? When will the scientists, media and politicians apologize for three lost decades of the pseudoscientific hype called “anthropogenic global warming”? Who will be held responsible for the wasted billions, the corruption with carbon credits, the devastated environment and whole generations dumbed down by environmentalist propaganda? These are legitimate questions, and the citizens of the developed countries have the right to ask them and to expect answers.

***

“Future generations will wonder in bemused amazement that the early 21st century’s developed world went into hysterical panic over a globally averaged temperature increase of a few tenths of a degree, and, on the basis of gross exaggerations of highly uncertain computer projections combined into implausible chains of inference, proceeded to contemplate a roll-back of the industrial age.” – Richard Lindzen, MIT

45 Comments
October 22, 2012 7:54 am

Thanks Anthony!

October 22, 2012 8:08 am

Excellent presentation, Mr. Vanovcan.
Thank you.

David Ball
October 22, 2012 8:13 am

Would you trust a computer to drive your car autonomously? With you and your family in the car of course. They are attempting the same thing on a societal scale. Perhaps one day, but not today.

rgbatduke
October 22, 2012 8:26 am

A good top article in its analysis of the data and comparison of observation with model prediction. Not so good when it discusses the “greenhouse effect”. The point is that there could well be (and almost certainly is) a strong greenhouse effect from carbon dioxide when its concentration increases from nothing at all up to the point where the atmosphere is optically opaque over a short path length (which it is). What is at issue is the marginal increase in greenhouse warming that results from an increase in concentration once it is already thoroughly saturated, in an atmosphere that (as you do point out) has multiple modes for the reflection, absorption, transport, and loss of heat, with complex (and largely unknown) feedbacks, driven by a surprisingly variable star whose microstate appears to be strongly correlated with global temperature even though nobody quite understands why or how, as the “direct” effects of simple variability around the long-term mean insolation appear to be amplified compared to what we might expect from simple energy models (and then, it isn’t clear that the long-term mean itself is a constant).
So I like the conclusion that the predictive models suck at both prediction and hindcasting (because they do) and that a sober look at the 100+ year data is not terribly convincing as far as CO_2-driven CAGW is concerned (because it isn’t). It detracts from the quality of the argument when you pick on greenhouse warming and assert that it has nothing to do with the Earth’s mean temperature difference relative to e.g. the Moon. That’s simply unnecessary and probably false. The greenhouse effect doesn’t have to be nonexistent for CAGW to be false — it just has to be saturated and coupled to a variety of feedback mechanisms that are no worse than neutral to slightly positive. I think those mechanisms are rather more likely to turn out to be globally slightly negative, even though locally they may well be positive (UHI, for example).
I firmly believe in local warming. Temperatures in any city are far warmer than they are in the surrounding countryside. So are temperatures recorded at most airports. Temperatures on or above a cultivated field are almost certainly significantly different from what they would be if the field were the original old-growth forest that was cut down to make room for it. The Sahara is hotter and drier than it would be if humans hadn’t introduced goatherding 10,000 or so years ago and the goats hadn’t eaten all of the scrub brush that bound up the moisture and moderated its temperature. Whole strips of the US with excess population, cultivation, towns, airports and cities built alongside our network of roadways are almost certainly a degree or more warmer than the forested countryside in the centers of the civilized network. Sadly, most of our temperature measurements come from the connected part of the network, relatively few from the hard-to-reach-but-vastly-greater areas in between.
Which is why the satellite measurements and Argo measurements are, and will continue to be, so important. They are far less biased, far more difficult to artificially bias, and truly global in scope. In another fifty or sixty years, we’ll have enough data and science to (perhaps) resolve the problem of predicting global climate. In the meantime, it’s ok to just say “We don’t know”, or in the words of the eight-ball, “Answer cloudy, try again later”.
rgb

October 22, 2012 8:30 am

David Ball says:
October 22, 2012 at 8:13 am
Would you trust a computer to drive your car autonomously? With you and your family in the car of course. They are attempting the same thing on a societal scale. Perhaps one day, but not today.
David – The answer is YES….I would. In fact you have probably flown on an airplane which has TAKEN OFF, CRUISED and LANDED under computer control.
However, as an ENGINEER I can assure you that the computational difficulty (while fairly high) is MUCH MUCH MUCH less than that of modeling the Earth surface/atmosphere/Sun interface and the very stochastic system we call “weather”, which leads to CLIMATE.
Thus, I would not make the argument against computer usage in this manner.
The presentation here is the best way to go about it. I.e., we have DATA now, fairly reliable, and we have large-scale/long-term “computer models”. Let’s back-fit: run them against the data we have, use them to predict futures we know (i.e. from a back-dated starting point) and see how good they are, predictively. The answer seems to be VERY POOR.
THAT is the reason we can’t trust those models or use them for “policy” decisions.
Max H.

October 22, 2012 8:46 am

Yes, the 33 K assumption is laughable; the surface is predominantly cooled by non-radiative fluxes (evaporation and convection) and only secondarily by radiation to the atmosphere, with a small radiative flux going directly to space.
http://science-edu.larc.nasa.gov/EDDOCS/images/Erb/components2.gif
What I find interesting is that, of the total terrestrial cooling flux (70% of incoming solar), only 9% is direct surface radiation; the rest (91%) is atmospheric radiation (including clouds).

David Ball
October 22, 2012 9:02 am

Max Hugoson says:
October 22, 2012 at 8:30 am
You are correct regarding the plane, which is why I used the car analogy. Has an autonomous (this is an important word, as the plane is still constantly monitored by the crew) ground vehicle been developed? The shortcoming is the number of variables that must be accurately dealt with, as with computer modelling of the climate.

richardscourtney
October 22, 2012 9:09 am

Juraj Vanovcan:
You provide a good summary. Thank you. I write to provide a comment.
You say

d) the only parameter which matters in the models is carbon dioxide

And

It is difficult to say what the reason is: whether the models are merely oversensitive to “greenhouse gases”, or whether there is a flaw in their very physical basis.

There is a demonstrated major flaw in their very physical basis, and it is revealed by the fact that carbon dioxide is NOT the only parameter which matters in the models: aerosols are also important.
None of the models – not one of them – could match the change in mean global temperature over the past century if it did not utilise a unique value of assumed cooling from aerosols. So, inputting actual values of the cooling effect (such as the determination by Penner et al.
http://www.pnas.org/content/early/2011/07/25/1018526108.full.pdf?with-ds=yes )
would make every climate model produce a mismatch between the global warming it hindcasts and the observed global warming for the twentieth century.
This mismatch would occur because all the global climate models and energy balance models are known to provide indications which are based on
1.
the assumed degree of forcings resulting from human activity that produce warming
and
2.
the assumed degree of anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature.
More than a decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.
The input of assumed anthropogenic aerosol cooling is needed because the model ‘ran hot’; i.e. it showed an amount and a rate of global warming which was greater than was observed over the twentieth century. This failure of the model was compensated by the input of assumed anthropogenic aerosol cooling.
And my paper demonstrated that the assumption of aerosol effects being responsible for the model’s failure was incorrect.
(ref. Courtney RS An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre Energy & Environment, Volume 10, Number 5, pp. 491-502, September 1999).
More recently, in 2007, Kiehl published a paper that assessed 9 GCMs and two energy balance models.
(ref. Kiehl JT, Twentieth century climate model response and climate sensitivity. GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).
Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model. This is because they all ‘run hot’ but they each ‘run hot’ to a different degree.
He says in his paper:

One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.
The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy. Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change–too rosy a picture?) available at http://www.nature.com/reports/climatechange, 2007
recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the ‘‘widely circulated analysis’’ referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.

And, importantly, Kiehl’s paper says:

These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.

And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.
Thanks to Bill Illis, Kiehl’s Figure 2 can be seen at
http://img36.imageshack.us/img36/8167/kiehl2007figure2.png
Please note that the Figure is for 9 GCMs and 2 energy balance models, and its title is:
”Figure 2. Total anthropogenic forcing (W m^-2) versus aerosol forcing (W m^-2) from nine fully coupled climate models and two energy balance models used to simulate the 20th century.”
It shows that
(a) each model uses a different value for “Total anthropogenic forcing” that is in the range 0.80 W/m^2 to 2.02 W/m^2
but
(b) each model is forced to agree with the rate of past warming by using a different value for “Aerosol forcing” that is in the range -1.42 W/m^2 to -0.60 W/m^2.
In other words, the models use values of “Total anthropogenic forcing” that differ by a factor of more than 2.5, and they are ‘adjusted’ by using values of assumed “Aerosol forcing” that differ by a factor of 2.4.
So, each climate model emulates a different climate system. Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth. And the fact that they each ‘run hot’ unless fiddled by use of a completely arbitrary ‘aerosol cooling’ strongly suggests that none of them emulates the climate system of the real Earth.
Richard
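A toy numerical illustration of the compensation Richard describes (a sketch; the sensitivity and forcing values below are invented for illustration, not taken from Kiehl’s Figure 2):

```python
# Zero-dimensional toy: dT ~ lambda * F_net. Models with quite different
# sensitivities reproduce the same 20th-century warming if each pairs its
# GHG forcing with a suitably chosen (negative) aerosol forcing.
models = [
    # (label, sensitivity K per W/m^2, GHG forcing W/m^2, aerosol W/m^2)
    ("low sensitivity",  0.4, 2.6, -0.60),
    ("mid sensitivity",  0.6, 2.6, -1.27),
    ("high sensitivity", 0.8, 2.6, -1.60),
]

for name, lam, ghg, aer in models:
    f_net = ghg + aer
    print(f"{name}: F_net = {f_net:+.2f} W/m^2 -> dT = {lam * f_net:.2f} K")
    # all three print dT ~ 0.80 K despite a factor-of-two sensitivity spread
```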

October 22, 2012 9:17 am

Thanks Juraj; Excellent article!
Thanks Anthony, WUWT is the best science blog, no doubt.

ConfusedPhoton
October 22, 2012 9:20 am

I doubt Juraj Vanovcan will be receiving a Christmas card from Kevin Trenberth this year!
Such a travesty!

October 22, 2012 9:22 am

Thanks Richard, excellent comment!

October 22, 2012 9:29 am

An interesting piece, but it appears that in Fig. 7 (Sea surface temperature south of 60S latitude and sea ice extent in Antarctic for period 1979-2012; Source: Reynolds SST OIv2, NOAA) you have used 90-60N SSTs versus the southern SIE.

kim
October 22, 2012 9:58 am

Years ago I said that the modelers were trying to keep their toys on circular tracks, on the ceiling.
==============

Editor
October 22, 2012 10:06 am

Juraj says:

However, the atmosphere is 99.9% nitrogen, oxygen and argon, which, obeying the Boltzmann law, MUST also emit infrared radiation, since their temperature is above absolute zero.

Oh, please, learn some physics before uncapping your electronic pen. There is nothing in the Stefan-Boltzmann law that says that all things MUST emit infrared. Instead, we find that according to Kirchhoff’s law, the emission of radiation at a given wavelength is equal to the absorption of radiation at that same wavelength.
And in the real world, most diatomic gases (oxygen, nitrogen) and monoatomic gases like argon neither absorb nor emit any significant amount of radiation in the thermal infrared spectrum.
In other words, you don’t have a clue what you are talking about with respect to infrared emissions, which makes all of your claims very suspect. I hate it when that happens, because then I can’t believe anything you say, and I have to go through each and every word to find out where you might be right.
And that’s generally far too large a chore to undertake with someone clueless enough to claim that atmospheric nitrogen and oxygen emit thermal radiation. Come back when you’ve done your homework.
w.

richard telford
October 22, 2012 11:20 am

Why would you expect the multimodel mean to match the single realisation of observed climate, especially at regional scales? This is like expecting each model to give the same result.

October 22, 2012 11:22 am

Quoting from the article…
“There is an easy answer to the first question: the daytime temperature on Earth is reduced by clouds (condensed water vapor, by far the strongest “greenhouse gas”), by the reflection of sunlight from ice and snow (a “greenhouse gas” in solid form), by evaporative cooling of the surface (the phase change of water into a “greenhouse gas”) and by convective cooling, where warmed air continually rises and is replaced by cooler air. The atmosphere – consisting of 99% nitrogen, oxygen and argon – together with the main “greenhouse gas” is the sole reason we do not need to wear space suits in the daytime like the Apollo 11 crew on the Moon.”
And, not wishing to labour the point too much, you omit:
1) that greenhouse gases prevent about half the direct solar radiation in the near infrared from reaching the Earth’s surface in the first place;
2) that while non-GHGs do radiate, they are by no means as effective as GHGs, which can convert thermal energy to radiant energy. The latter is the only way energy can leave the planet. It is that simple. Without GHGs the planet would be much warmer. Non-GHGs and a blackbody Earth can only radiate more if their temperature is increased; therefore, remove GHGs from the atmosphere and the temperature would sky-rocket.
At least you are getting there J.
Get Cool everyone

beng
October 22, 2012 11:35 am

***
rgbatduke says:
October 22, 2012 at 8:26 am
The Sahara is hotter and drier than it would be if humans hadn’t introduced goatherding 10,000 or so years ago and the goats hadn’t eaten all of the scrub brush that bound up the moisture and moderated its temperature.
***
I think the proxy evidence suggests the Sahara climate varies w/precession (~20 kyr cycles). When max summer insolation occurs (like 10 kyrs ago), the monsoon expands north to varying extents & transforms the Sahara into savannah/tropical-dry forests w/rivers & lakes. Eventually summer insolation wanes, the monsoon slowly decreases & the desert expands.

October 22, 2012 11:39 am

Re: Dave Wendt: KNMI Climate Explorer uses -60N/-90N latitude, which is actually 60S/90S. It is correct.
Re: Willis Eschenbach: here I quote a NASA article:
“In solids, the molecules and atoms are vibrating continuously. In a gas, the molecules are really zooming around, continuously bumping into each other. Whatever the amount of molecular motion occurring in matter, the speed is related to the temperature. The hotter the material, the faster its molecules are vibrating or moving.
Electromagnetic radiation is produced whenever electric charges accelerate – that is, when they change either the speed or direction of their movement. In a hot object, the molecules are continuously vibrating (if a solid) or bumping into each other (if a liquid or gas), sending each other off in different directions and at different speeds. Each of these collisions produces electromagnetic radiation at frequencies all across the electromagnetic spectrum.
Any matter that is heated above absolute zero generates electromagnetic energy. The intensity of the emission and the distribution of frequencies on the electromagnetic spectrum depend upon the temperature of the emitting matter.”
Anyway, I have all bases covered. If 99.9% of the atmospheric mass does not emit any significant radiation and is still warm, then it works as a real blanket and warms the surface. Nobody has yet demonstrated, or even attempted to demonstrate, how much of our night-time surface warmth comes from recycled IR and how much from solar energy simply retained by the air. That is 1.3 kg/m3 of mass which somehow slipped out of the radiation diagrams.
But my main claim stands unchanged: the assumption that the surface is warmed above its theoretical temperature ONLY by radiation captured and re-radiated by molecules of “greenhouse gases” is basically wrong.

Vince Causey
October 22, 2012 11:58 am

Juraj V
“Electromagnetic radiation is produced whenever electric charges accelerate – that is, when they change either the speed or direction of their movement. ”
Thanks for that, Juraj. I must admit I was confused before. As I understand Kirchhoff’s law, a molecule will only emit radiation if it can absorb it – just as Willis says. But then I was wondering: if that is the case, what would happen to the temperature of a ball of inert gas (e.g. N2) suspended in space? If Kirchhoff’s law says the molecules cannot emit radiation, then how would the gas lose heat? It would remain at a fixed temperature for ever. This cannot be right.
But – if all moving molecules will emit radiation as they accelerate, then the gas will cool down over time in accordance with our common sense understanding.
Thank you for clearing up this point.

JJ
October 22, 2012 12:43 pm

rgbatduke says:
A good top article in its analysis of the data and comparison of observation with model prediction. Not so good when it discusses the “greenhouse effect”.

I concur.

JJ
October 22, 2012 1:02 pm

David Ball says:
You are correct regarding the plane, which is why I used the car analogy. Has an autonomous (this is an important word, as the plane is still constantly monitored by the crew) ground vehicle been developed?

Yes. Google runs a fleet of them. They claim more than 300,000 road miles without an accident.
Modeling the climate of the planet is the most complex task humans have ever undertaken, and we haven’t been at it very long. That, along with political interference, is why we still suck at it. We will likely still suck at it long after we are all gone. We may find the problem intractable.

Ike
October 22, 2012 1:15 pm

The various disagreements in the comments regarding the ‘greenhouse gas’ effects and the AGW hypothesis overlook a couple of unavoidable facts. First, mankind’s production of CO2 is approximately 4% (.04) of the total CO2 from all sources. Second, the effect of CO2 as a ‘greenhouse gas’ accounts for approximately 4% (.04) of the overall effect of all those gases. What is 4% of 4%? A very small number: .0016, if my multiplication is correct. For the hypothesis to be valid, it must be shown that an increase of .16% (.0016) in the ‘greenhouse gas’ effect will cause the planet to heat up catastrophically. It may be true that some other, as yet unestablished, mechanism will cause the planet to heat to catastrophic levels; but in the absence of proof – not arguments to authority or claims of consensus, nor ad hominem attacks on those who oppose the AGW hypothesis, but observations that prove or at least tend to prove the existence of such a mechanism – we must say the hypothesis fails. Indeed, there is not only no proof of the AGW hypothesis in the form of observational evidence, but disproof of it in the actual data. Please note that, unlike professional scientists, I do not believe that the output of a computer model constitutes data. Politicians conveniently believe such things because it provides an excuse for the expansion of their power; I do not have the need for such ego support. Alas, politicians make the laws and rules which are enforced by ‘men with guns’, as someone once wrote, so the ignorant and thoughtless opinions of politicians are what we must obey, to our detriment.

Billy Liar
October 22, 2012 1:44 pm

Who will be held responsible for the wasted billions, the corruption with carbon credits, the devastated environment and whole generations dumbed down by environmentalist propaganda?
I’ll have a stab at it. James Hansen?

Rosco
October 22, 2012 3:19 pm

I find lots of things interesting in the “greenhouse effect” theory.
Nitrogen, oxygen and argon don’t emit radiation? I am amazed at this – I thought everything with a temperature above absolute zero emitted radiation – am I wrong, or is it really the case that 99% of the atmosphere is stone cold?
“Trapping” radiation is, I assume, equivalent to really good insulation. Has there ever been any experimental evidence that really well insulated materials heat themselves up? It is hard to beat a vacuum flask, but even there liquids eventually cool down.
And don’t infra-red photographs show IR passing straight through windows in houses, for example, even though the “impermeability” of glass to IR is supposed to be the evidence for how glass greenhouses heat up?
If the solar constant is truly constant and CO2 and water vapour “trap” radiation causing warming, then we have the interesting paradox of something heating itself up by radiating less, BUT only things that are actually cooling radiate less – this is clearly a consequence of the Stefan-Boltzmann equation – I don’t see how any other interpretation is even logical.
If such a paradox can exist in the atmosphere – hell, I don’t know for sure one way or the other – then the basic presumption of climate science, radiative equilibrium, never existed in the first place and all the maths is shot to hell anyway.
I’m convinced the maths of climate science is completely wrong because it is based on averages.
I’m convinced the basic interpretation of the physics is wrong because it duplicates energy by considering radiation as some sort of phenomenon separate from other established and quantified properties such as thermal conductivity.
It is not possible to measure a thermal conductivity or specific heat capacity that does not already account for radiation effects – to deny this is plain silly unless you can show how it could be done.
The final point I want to make was made by Alan Siddons in his analysis of planetary atmospheres in the solar system.
At sea level on Earth, i.e. at 1 bar of atmospheric pressure, the “greenhouse effect” is said to raise Earth’s temperature from the “blackbody” effective temperature of ~255 K to the surface temperature of ~288 K – approximately 13% above the theoretical blackbody temperature.
Jupiter receives a mere 50.5 W/sq metre of solar insolation and has a theoretical “blackbody” effective temperature of 110 K. At 1 bar of pressure in its atmosphere Jupiter has a temperature of 165 K – approximately 50% above the theoretical “blackbody” temperature.
With an atmosphere almost entirely hydrogen and helium and no surface to heat, there can be no “greenhouse effect”, yet Jupiter’s atmospheric system manages an increase in temperature above the theoretical radiative calculation which completely swamps Earth’s.
Alan Siddons sums up his brief analysis thus –
“But by whatever means atmospheric pressure due to gravity generates
heat by itself, the evidence clearly indicates that it does.”
Doesn’t this cast significant doubt on the “settled science” hypothesis?
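For what it is worth, both quoted effective temperatures follow from the same one-line Stefan-Boltzmann formula; a quick check (the Bond albedos of roughly 0.31 for Earth and 0.34 for Jupiter are assumed inputs, not from the comment above):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def t_eff(insolation, albedo):
    """Effective blackbody temperature for a given insolation and albedo."""
    return (insolation * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

print(t_eff(1361.0, 0.31))  # Earth:   ~254 K (vs ~288 K at the surface)
print(t_eff(50.5, 0.34))    # Jupiter: ~110 K (vs ~165 K at the 1 bar level)
```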

Rosco
October 22, 2012 3:34 pm

Just one other thing on this “99% of the atmosphere doesn’t emit IR” – and I don’t pretend to know – but doesn’t that mean, as has been asserted before, that a large component of the Earth’s ability to radiate to space comes almost exclusively from trace gases? And if true, which I do not claim to know, shouldn’t increasing concentrations of GHGs equally increase this ability, offsetting any back-radiative effects?
Again, such a scenario is in agreement with the laws of physics quoted, but it is totally ignored in the hypothesis.

Rosco
October 22, 2012 3:40 pm

And – nitrogen, oxygen and argon DO actually have absorption and emission spectra, albeit narrow – but at 99% of the atmosphere I find it agregious to dismiss this as negligible in favour of 4 molecules in 10,000 – the majority of which sit mainly at our feet due to the fact that they are predominantly heavier than air and are unlikely ever to be “well mixed”.

Rosco
October 22, 2012 3:41 pm

Wish I could spell – can’t even blame big fingers – “a” and “e” aren’t even next to each other.

Rosco
October 22, 2012 4:06 pm

I really should turn this thing off BUT –
Science has established significant numbers of the physical and chemical properties of materials. For example, we can calculate the bending moment of structural components. We can calculate the amount of energy in coal and the expected electrical output of power plants.
We have used this knowledge to create the modern world, with marvels that allow me to sit here in Australia typing stuff that will appear to anyone who cares to view it halfway around the globe.
Then along come some people with an unproven, unverifiable and probably designedly undisprovable theory that is at odds, at least according to some equally qualified individuals, with all the proven engineering and science which delivered the modern world.
To make matters worse they demand we sacrifice everything because they KNOW we are foolishly or greedily killing everything !
As I said – I don’t know, and sometimes I can almost convince myself that it is possible – especially if one thinks about how the Moon’s slow period of rotation MUST explain its really cold night temperatures.
Shouldn’t slowing down the cooling of the Earth during a night allow the next day to be hotter?
Maybe – but I believe it is the Sun that heats, not back-radiation. And that is only because Earth’s surfaces never approach the theoretical blackbody temperature the solar radiation could induce, which MUST mean the oceans and atmosphere act to cool the surface – of that I am certain. The daytime lunar temperatures prove that to my satisfaction!

Rosco
October 22, 2012 4:31 pm

My last rant – I promise.
I find it difficult to reconcile the divide-by-four thingie. Don’t think me simple – I can see the geometry as well as anyone.
I find this statement from Wikipedia disturbing because of the confidence with which it is made, YET the author has absolutely no evidence for it – IN FACT what we do actually KNOW is the opposite of the claim.
“If an ideal thermally conductive blackbody was the same distance from the Sun as the Earth is, it would have a temperature of about 5.3 °C.”
How the hell does this turkey know this is true?
It is merely theoretical speculation and does not fit observable facts.
How does this crap apply on a planetary scale, where instantaneous equilibrium of heat distribution is as pie-in-the-sky as this whole BS navel gazing?
After all, this “blackbody” would be permanently subject to a radiation “field” of ~1368 W/sq metre over half of its sphere – if indeed it were a sphere – what if it were a cube – would its equilibrium factor be 1/6 instead of 1/4?
And what about the fact that it is permanently irradiated by 1368 W/sq metre and absorbs all of this until thermal equilibrium – shouldn’t the factor of 1 in 4 then become 1 in 2, since we have half a sphere illuminated and half not? Or some horrendously complicated calculation involving integrating over the surface “N” to “S” and “E” to “W”, considering the rate of rotation – presumably this blackbody would rotate, as its orbit is unlikely to be stable if not.
I still have trouble with this one, and I think calculations of an effective temperature based on such spurious geometrical considerations don’t mean a damn thing!

David Ball
October 22, 2012 4:45 pm

JJ says:
October 22, 2012 at 1:02 pm
My main point was: “Would you put your family in one of those?”

Rosco
October 22, 2012 4:52 pm

OK – I lied.
BUT – I don’t need to be a practitioner publishing in the field, or indeed even a herpetologist, to spot a snake-oil salesman when I see one.
Climate scientists have yet to convince me that they are any different from the practitioners of this “black” art in the past.

Henry Clark
October 22, 2012 6:55 pm

The instrumental record (black) looks similar at first glance, but there are discrepancies.
There are far more discrepancies when it is compared to the real temperature history instead of HadCRUT3 (from the CRU of Climategate), which was part of the recent revisionism rewriting inconvenient history. The far more severe temperature drop which actually led to the global cooling scare of the 1970s can be seen in articles of the time such as the following:
http://2.bp.blogspot.com/-o30PNIBahS0/T2KTNlu3RsI/AAAAAAAAAkY/cItxzMamChk/s1600/newsweek-global-cooling.jpg
http://img240.imagevenue.com/img.php?image=40530_DSCN1557_nat_geog_1976_1200x900_122_75lo.JPG
Without the combination of (1) using fudged temperature data and (2) making up imaginary values for past aerosol forcings until the models semi-fit the past (thereby collapsing in future prediction), those models would be blatantly, ludicrously far from any correlation – more like the CO2-versus-temperature relationship in recent history seen in non-revisionist data like http://www.appinsys.com/globalwarming/GW_Part6_SolarEvidence_files/image024.gif (in contrast to http://www.appinsys.com/globalwarming/GW_Part6_SolarEvidence_files/image023.gif) and, for 200 to 11000 years ago, in http://wattsupwiththat.files.wordpress.com/2012/04/gisp220temperaturesince1070020bp20with20co220from20epica20domec1.gif
Computer models are used to impress the naive public because they superficially sound sophisticated, but GIGO (garbage in, garbage out) applies when they are created to fit an agenda.
Sea ice extent in the Arctic has been measured by satellites since 1979, and record summer minima were set in 2007 and now in 2012. It is a pity that similar measurements were not possible in the 1930s; according to meteorological records, the Arctic was just as “warm” then as it is today.
In actual fact:
Arctic temperature over a century (warmer in 1930s than 1990s):
http://earthobservatory.nasa.gov/Features/ArcticIce/Images/arctic_temp_trends_rt.gif
Arctic ice extent as an annual average without cherry-picking a single month post-storm, comparable in recent years to mid 1990s:
http://www.webcitation.org/6AKKakUIo
Arctic ice extent over a century (once again, recent years are nothing special compared to the 1930s in non-fudged data, unlike e.g. Cryosphere Today propaganda):
http://nwpi.krc.karelia.ru/e/climas/Ice/Ice_no_sat/XX_Arctic.htm
The real primary driver of global temperatures over the past several centuries, which the CAGW movement hides by fudging the data but which is blatant in non-fudged data:
http://s8.postimage.org/nz4ionvit/suntemp0.jpg
More recent demonstrations, covering the past several decades:
http://s10.postimage.org/z7wcdc56x/suntemp.jpg
and
http://s18.postimage.org/n9nm5glc7/solar_GCRvswatervapor.jpg
————-
I don’t think I’ll try submitting the preceding in a more formal and refined form, as I don’t think it would be accepted; but really, if all of the preceding graphs were shown at once together, they would make an extraordinarily demonstrative WUWT article. It is good that this article’s text mentions the issues with HadCRUT3, but, unfortunately, as is all too typical even of skeptic articles, the plots leave something to be desired. Don’t just say the Arctic was warm in the 1930s; show how much it was, as my plot link does. A picture is worth a thousand words, if displayed inline rather than only to the few who click on a text link. Don’t just say HadCRUT3 was adjusted; show graphically what the history was before it was rewritten.

JJ
October 22, 2012 8:02 pm

David Ball says:
My main point was: “Would you put your family in one of those?”

And my main point is that there are already plenty of people who would give their left nut for a ten-minute ride in one of those cars, more people who would give an unequivocal “Yes” to that question, and many more who think it won’t be long before they can order one with the leather seats. Legislation specifically permitting their use on public roads has already been passed in three states (in most states they aren’t illegal, they just aren’t specifically permitted). It is simply a bad analogy to use if you are trying to convey that trust in something is outlandish.
The technology for autonomous automobiles is… well, first of all, a technology, not a science. One of the propaganda techniques used by the warmists is to recast ‘global warming’ as a technology problem, rather than a question of scientific knowledge and the ability to predict the future. We are good at technology problems. When we throw resources at them, we tend to solve them pretty quickly. And when we have solved them, we know it.
Scientific knowledge comes more slowly, and is susceptible to overstated claims and fraud. Predicting the future is notoriously difficult. People need to face that.

Mike McMillan
October 22, 2012 10:12 pm

Max Hugoson says: October 22, 2012 at 8:30 am
David Ball says:October 22, 2012 at 8:13 am
Would you trust a computer to drive your car autonomously? With you and your family in the car of course. They are attempting the same thing on a societal scale. Perhaps one day, but not today.
David – The answer is YES….I would. In fact you have probably flown on an airplane which has TAKEN OFF, CRUISED and LANDED under computer control.

We usually cruise under computer control, sometimes land under computer control, but we never take off that way.

MikeB
October 23, 2012 3:06 am

Juraj,
All objects above absolute zero emit electromagnetic radiation. Well, yes – I made exactly the same point just a couple of days ago, commenting on the Daily Mail article about global warming having stopped for the last 16 years. However, it is not quite true! Few things are as simple as that, and the statement needs expansion. This is a source of many misunderstandings and was one of the first difficulties I had when I started to look at explanations of the greenhouse effect. I made some queries on the Science of Doom site at that time but was unable to get a satisfactory answer. Whilst the statement is true for solids and liquids, it does not apply in the same way to ‘thin’ gases. These gases do not absorb and emit over a broad continuous spectrum. Instead, they absorb and emit radiation only at discrete wavelengths or in narrow bands. This is not ‘opinion’, you understand. It is demonstrated by empirical measurements. There are many graphs available on the internet showing the absorption bands of various gases.
The main constituents of our atmosphere, oxygen and nitrogen, do NOT absorb in the spectral region where the Earth radiates most strongly, although nitrogen has a weak absorption band between 4 and 5 microns and oxygen a weak absorption band between 6 and 7 microns. If the atmosphere consisted of just these gases, the Earth’s outgoing radiation would escape to space with little attenuation. The Earth would then adopt a surface temperature at which the outgoing radiation balanced the incoming solar radiation. This would result in an ‘effective’ surface temperature of about -18 deg. Celsius.
Some gases in the atmosphere, however, such as CO2, do absorb radiation within the Earth’s emission spectrum. Some of this radiation is subsequently re-emitted back to the surface and represents an ADDITIONAL warming flux at the planet’s surface. At higher altitudes, gases like CO2 help to cool the atmosphere by radiating directly to space. But we live on the surface, and the surface temperature is higher because of the presence of the radiatively active gases that we call ‘greenhouse gases’.
It is essential to understand this mechanism before you can speak for or against the greenhouse effect.

John Marshall
October 23, 2012 4:00 am

Totally agree JV. Many thanks.

John Marshall
October 23, 2012 4:08 am

Mike B: your atmosphere of only O2 and N2 would, or might, radiate more heat, BUT more heat would also get to the surface, because no energy would be absorbed on its way down. I think you would end up with the same temperature range as today. The GHG theory may not be required at all.

MikeB
October 23, 2012 4:20 am

Rosco says: October 22, 2012 at 4:31 pm
“If an ideal thermally conductive blackbody was the same distance from the Sun as the Earth is, it would have a temperature of about 5.3 °C.”
Rosco, this is such an amazingly simple calculation that I don’t believe you can’t do it yourself. You provide all the numbers required, except for the Stefan-Boltzmann equation, which I am sure you know.
The problem with me doing it for you is that you are obviously in ‘reject mode’, refusing to believe anything, however obvious, because you don’t like the answer.
By the way, I haven’t read the Wikipedia article but from my calculation I can tell you that it’s a sphere.
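For reference, the calculation in question really is a one-liner (a sketch, assuming zero albedo, a uniform temperature and a solar constant of about 1366 W/m2):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1366.0       # solar constant at 1 AU, W/m^2 (assumed value)

# An ideal thermally conductive blackbody intercepts S over its cross
# section (a disc) but radiates from its full sphere, hence the factor 4.
T = (S / (4.0 * SIGMA)) ** 0.25
print(f"{T:.1f} K = {T - 273.15:.1f} deg C")  # ~278.6 K, i.e. ~5.4 deg C
```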

David Ball
October 23, 2012 6:52 am

JJ says:
October 22, 2012 at 8:02 pm
“Scientific knowledge comes more slowly, and is susceptible to overstated claims and fraud. Predicting the future is notoriously difficult. People need to face that.”
So,…… we are in agreement then?

cdquarles
October 23, 2012 8:08 am

Gravitational fractionation of gases in our atmosphere does not happen until you reach the top of the atmosphere, where the relative velocities allow the lighter gases (which move faster than the heavier ones at the same temperature and pressure) to escape faster. All gases are miscible.
The adiabatic lapse rate is determined by the gravitational work done on a ‘parcel’ of air. The proportion of nitrogen is the same all the way to the top, as it is for oxygen (but keep in mind that there are fluxes of oxygen caused by biological organisms, and where there are pressure differences, diffusion will act to even them out, in addition to advection and convection).
Water vapor pressure is not the same all the way to the top. Water undergoes phase changes. Evaporation adds it near the surface (more over the oceans, less over fresh water bodies, even less over dry land and the least over ice, where sublimation can occur). Condensation removes it, giving us either the most variable aerosol (clouds) or clouds plus precipitation. Carbon dioxide does not undergo such a phase change, but it too has major sources and sinks (biological and non-biological). Once in the atmosphere, carbon dioxide mixes readily, but it is subject to partitioning where it contacts liquid water aerosols or the ocean (pure solvation partitions roughly 50:1 into liquid water versus air via Henry’s law). In the ocean, carbon dioxide is subject to buffering reactions as well as to fluxes from biological organisms.
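As a side note, the dry adiabatic lapse rate mentioned above follows directly from gravity and the heat capacity of air; a one-line check with standard values:

```python
g = 9.81      # gravitational acceleration, m/s^2
c_p = 1004.0  # specific heat of dry air at constant pressure, J/(kg K)

# Dry adiabatic lapse rate Gamma = g / c_p, converted from K/m to K/km.
print(f"{1000.0 * g / c_p:.1f} K/km")  # ~9.8 K/km
```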

October 23, 2012 8:42 am

MikeB, thanks for your comment. However, that -18°C number is definitely wrong, because it is calculated for a case without “greenhouse gases” but with the present Earth albedo of 0.3, which is created by clouds (condensed “greenhouse gas”), so the whole mental exercise is off. With the albedo of the Moon, the theoretical temperature would be some 15 K higher.
Anyway, the Martian atmosphere contains the equivalent of some 6,000 ppm of CO2 (in Earth surface-pressure terms), and its theoretical and actual temperatures are the same – 210 K – so greenhouse gases by themselves are obviously not enough.
http://nssdc.gsfc.nasa.gov/planetary/factsheet/marsfact.html
The point is, Mars has no real atmospheric mass which would hold the heat – the heat accumulated by contact of the bulk atmosphere with the Sun-warmed surface, whether the gas is IR-active or not.

MikeB
October 23, 2012 10:51 am

I must concede a point to you. I did assume an albedo which is largely dependent on cloud cover, and in the imaginary scenario I postulated this would not, of course, be possible. I think I got something wrong once before – but that was so long ago I can’t really remember. How about minus 5 deg C with just an oxygen and nitrogen atmosphere?
I don’t understand your point about the Moon – is there one? And Mars has such a thin atmosphere that it is not appropriate to make simple comparisons – the same goes for Venus, where there are so many other factors at play.
But note the main point: oxygen and nitrogen do not absorb in the long-wave infrared and therefore do not emit anything at those wavelengths.

October 23, 2012 9:34 pm

The scientific consensus among computer scientists is that no combination of computer hardware and software will ever be able to accurately model or predict a future state of the climate.
Why are these people considered the experts in climate prediction? They are not even remotely qualified: they are (supposedly) climate scientists, not computer scientists. They are fraudulently misrepresenting themselves as well as their multi-billion-dollar toy models.

David Cage
October 24, 2012 12:36 am

David Ball says:
October 22, 2012 at 8:13 am
Would you trust a computer to drive your car autonomously? With you and your family in the car of course.
I trust the fly-by-wire system and the instrument landing system, but both are based on equations that have been proved right in all known tests to within a smaller error than any human can manage. Contrast this with climate science, where there is no consensus on which figures to use, and where, if you take the error between the best and worst model estimates and compare it with reality, the model errors are almost an order of magnitude greater than any change.
Having spent most of my life on computer models, I trust them implicitly where they have earned trust; where they are oversimplified facile junk, they should be treated as such.