By Christopher Monckton of Brenchley
Joel Shore, who has been questioning my climate-sensitivity calculations just as a good skeptic should, has kindly provided at my request a reference to a paper by Dr. Andrew Lacis and others at the Goddard Institute for Space Studies. The paper is offered in support of his assertion that CO2 exercises about 75% of the radiative forcing from all greenhouse gases: water vapor, the most significant greenhouse gas because of its high concentration in the atmosphere, condenses out rapidly, while the non-condensing gases, such as CO2, linger for years.
Dr. Lacis writes in a commentary on his paper: “While the non-condensing greenhouse gases account for only 25% of the total greenhouse effect, it is these non-condensing GHGs that actually control the strength of the terrestrial greenhouse effect, since the water vapor and cloud feedback contributions are not self-sustaining and, as such, only provide amplification.”
Dr. Lacis’ argument, then, is that the radiative forcing from water vapor should be treated as a feedback, because if all greenhouse gases were removed from the atmosphere most of the water vapor now in the atmosphere would condense or precipitate out within ten years, and within 50 years global temperatures would be some 21 K colder than the present.
I have many concerns about this paper, which – for instance – takes no account of the fact that evaporation from the surface occurs at thrice the rate imagined by computer models (Wentz et al., 2007). So there would be a good deal more water vapor in the atmosphere even without greenhouse gases than the models assume.
The paper also says the atmospheric residence time of CO2 is “measured in thousands of years”. Even the IPCC, prone to exaggeration as it is, puts the residence time at 50-200 years. On notice I can cite three dozen papers dating back to Revelle in the 1950s that find the CO2 residence time to be just seven years, though Professor Lindzen says that for various reasons 40 years is a good central estimate.
Furthermore, it is questionable whether the nakedly political paragraph with which the paper ends should have been included in what is supposed to be an impartial scientific analysis. To assert without evidence that beyond 300-350 ppmv CO2 concentration “dangerous anthropogenic interference in the climate system would exceed the 25% risk tolerance for impending degradation of land and ocean ecosystems, sea-level rise [at just 2 inches per century over the past eight years, according to Envisat], and inevitable disruption of socioeconomic and food-producing infrastructure” is not merely unsupported and accordingly unscientific: it is rankly political.
One realizes that many of the scientists at GISS belong to a particular political faction, and that at least one of them used to make regular and substantial donations to Al Gore’s re-election campaigns, but learned journals are not the place for über-Left politics.
My chief concern, though, is that the central argument in the paper is in effect a petitio principii – a circular and accordingly invalid argument in which one of the premises – that feedbacks are strongly net-positive, greatly amplifying the warming triggered by a radiative forcing – is also the conclusion.
The paper turns out to be based not on measurement, observation and the application of established theory to the results but – you guessed it – on playing with a notorious computer model of the climate: GISS ModelE. The model, in effect, assumes very large net-positive feedbacks for which there is precious little reliable empirical or theoretical evidence.
At the time when Dr. Lacis’ paper was written, ModelE contained “flux adjustments” (in plain English, fudge-factors) amounting to some 50 Watts per square meter, many times the magnitude of the rather small forcing that we are capable of exerting on the climate.
Dr. Lacis says ModelE is rooted in well-understood physical processes. If that were so, one would not expect such large fudge-factors (mentioned and quantified in the model’s operating manual) to be necessary.
Also, one would expect the predictive capacity of this and other models to be a great deal more successful than it has proven to be. As the formidable Dr. John Christy of NASA has written recently, in the satellite era (most of which in any event coincides with the natural warming phase of the Pacific Decadal Oscillation) temperatures have been rising at a rate between a quarter and a half of the rate that models such as ModelE have been predicting.
It will be helpful to introduce a little elementary climatological physics at this point – nothing too difficult (otherwise I wouldn’t understand it). I propose to apply the IPCC/GISS central estimates of forcing, feedbacks, and warming to what has actually been observed or inferred in the period since 1750.
Let us start with the forcings. Dr. Blasing and his colleagues at the Carbon Dioxide Information Analysis Center have recently determined that total greenhouse-gas forcings since 1750 are 3.1 Watts per square meter.
From this value, using the IPCC’s table of forcings, we must deduct 35%, or 1.1 Watts per square meter, to allow for negative anthropogenic forcings, notably the particles of soot that act as tiny parasols sheltering us from the Sun. Net anthropogenic forcings since 1750, therefore, are 2 Watts per square meter.
We multiply 2 Watts per square meter by the pre-feedback climate-sensitivity parameter 0.313 Kelvin per Watt per square meter, so as to obtain warming of 0.6 K before any feedbacks have operated.
Next, we apply the IPCC’s implicit centennial-scale feedback factor 1.6 (not the equilibrium factor 2.8, because equilibrium is thousands of years off: Solomon et al., 2009).
Accordingly, after all feedbacks over the period have operated, a central estimate of the warming predicted by ModelE and other models favored by the IPCC is 1.0 K.
We verify that the centennial-scale feedback factor 1.6, implicit rather than explicit (like so much else) in the IPCC’s reports, is appropriate by noting that 1 K of warming divided by 2 Watts per square meter of original forcing is 0.5 Kelvin per Watt per square meter, which is indeed the transient-sensitivity parameter for centennial-scale analyses that is implicit (again, not explicit: it’s almost as though They don’t want us to check stuff) in each of the IPCC’s six CO2 emissions scenarios and also in their mean.
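For readers who like to check the arithmetic, the chain above can be sketched in a few lines of Python. This is only an illustration using the parameter values quoted in this article (Blasing's 3.1 W/m², the 35% negative-forcing offset, the 0.313 pre-feedback parameter and the 1.6 centennial feedback factor), not an output of any model:

```python
# All parameter values as quoted in the text above.
total_ghg_forcing = 3.1            # W/m^2, all greenhouse gases since 1750 (Blasing et al.)
negative_forcing_fraction = 0.35   # offset for negative anthropogenic forcings

net_forcing = total_ghg_forcing * (1 - negative_forcing_fraction)   # ~2.0 W/m^2

planck_parameter = 0.313           # K per W/m^2, pre-feedback climate-sensitivity parameter
pre_feedback_warming = net_forcing * planck_parameter               # ~0.6 K

feedback_factor = 1.6              # implicit centennial-scale feedback factor
warming = pre_feedback_warming * feedback_factor                    # ~1.0 K

# Consistency check: implied transient-sensitivity parameter, ~0.5 K per W/m^2
transient_sensitivity = warming / net_forcing
```

Running the numbers gives roughly 2.0 W/m² of net forcing, 0.6 K before feedbacks, 1.0 K after, and a transient-sensitivity parameter of 0.5 K per W/m², matching the figures in the text.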
Dr. Lacis’ paper is saying, in effect, that 80% of the forcing from all greenhouse gases is attributable to CO2. The IPCC’s current implicit central estimate, again in all six scenarios and in their mean, is in the same ballpark, at 70%.
However, the IPCC’s own forcing function for CO2, namely 5.35 times the natural logarithm of the ratio of 390 ppmv to 280 ppmv (respectively the perturbed and unperturbed concentrations of CO2 over the period of study), gives a forcing of 1.8 Watts per square meter.
Multiply this by the IPCC’s transient-sensitivity factor 0.5 and one gets 0.9 K – which, however, is the whole of the actual warming that has occurred since 1750. What about the 20-30% of warming contributed by the other greenhouse gases? That is an indication that the CO2 forcing may have been somewhat exaggerated.
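The CO2-only calculation is equally short. A sketch, again using only the values quoted above:

```python
import math

# IPCC forcing function for CO2: dF = 5.35 * ln(C / C0), in W/m^2
c_now, c_1750 = 390.0, 280.0        # ppmv, perturbed and unperturbed concentrations
co2_forcing = 5.35 * math.log(c_now / c_1750)       # ~1.8 W/m^2

transient_sensitivity = 0.5         # K per W/m^2, IPCC centennial-scale value
co2_warming = co2_forcing * transient_sensitivity   # ~0.9 K
```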
The IPCC, in its 2007 report, says no more than that between half and all of the warming observed since 1950 (and, in effect, since 1750) is attributable to us. Therefore, 0.45-0.9 K of observed warming is attributable to us. Even taking the higher value, if we use the IPCC/GISS parameter values and methods CO2 accounts not for 70-80% of observed warming over the period but for all of it.
In response to points like this, the usual, tired deus ex machina winched creakingly onstage by the IPCC’s perhaps too-unquestioning adherents is that the missing warming is playing hide-and-seek with us, lurking furtively at the bottom of the oceans waiting to pounce. However, elementary thermodynamic considerations indicate that such notions must be nonsense.
None of this tells us how big feedbacks really are – merely what the IPCC imagines them to be. Unless one posits very high net-positive feedbacks, one cannot create a climate problem. Indeed, even with the unrealistically high feedbacks imagined by the IPCC, there is not a climate problem at all, as I shall now demonstrate.
Though the IPCC at last makes explicit its estimate of the equilibrium climate sensitivity parameter (albeit that it is in a confused footnote on page 631 of the 2007 report), it is not explicit about the transient-sensitivity parameter – and it is the latter, not the former, that will be policy-relevant over the next few centuries.
So, even though we have reason to suspect there is a not insignificant exaggeration of predicted warming inherent in the IPCC’s predictions (or “projections”, as it coyly calls them), and a still greater exaggeration in GISS ModelE, let us apply their central estimates – without argument at this stage – to what is foreseeable this century.
The IPCC tells us that each of the six emissions scenarios is of equal validity. That means we may legitimately average them. Let us do so. Then the CO2 concentration in 2100 will be 712 ppmv compared with 392 ppmv today. So the CO2 forcing will be 5.35 ln(712/392), or 3.2 Watts per square meter, which we divide by 0.75 (the average of the GISS and IPCC estimates of the proportion of total greenhouse forcings represented by CO2) to allow for the other greenhouse gases, making 4.25 Watts per square meter.
We reduce this value by about 35% to allow for negative forcings from our soot-parasols etc., giving 2.75 Watts per square meter of net anthropogenic forcings between now and 2100.
Next, multiply by the centennial-scale transient-sensitivity parameter 0.5 Kelvin per Watt per square meter. This gives us a reasonable central estimate of the warming to be expected by 2100 if we follow the IPCC’s and GISS’ methods and values every step of the way. And the warming we should expect this century if we do things their way? Well, it’s not quite 1.4 K.
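Following the same recipe for the century ahead, a sketch using the scenario-mean concentration of 712 ppmv and the other parameter values given in the text:

```python
import math

co2_2100, co2_now = 712.0, 392.0    # ppmv: IPCC six-scenario mean for 2100 vs. today
co2_forcing = 5.35 * math.log(co2_2100 / co2_now)   # ~3.2 W/m^2

co2_share = 0.75                    # CO2's assumed share of total greenhouse forcing
total_ghg_forcing = co2_forcing / co2_share         # ~4.25 W/m^2, all greenhouse gases

net_forcing = total_ghg_forcing * (1 - 0.35)        # ~2.77 W/m^2 after negative forcings
warming_2100 = net_forcing * 0.5                    # ~1.4 K at 0.5 K per W/m^2
```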
Now we go back to that discrepancy we noted before. The IPCC says that between half and all of the warming since 1950 was our fault, and its methods and parameter values seem to give an exaggeration of some 20-30% even if we assume that all of the warming since 1950 was down to us, and a very much greater exaggeration if only half of the warming was ours.
Allowing for this exaggeration knocks back this century’s anthropogenic warming to not much more than 1 K – about a third of the 3-4 K that we normally hear so much about.
Note how artfully this tripling of the true rate of warming has been achieved, by a series of little exaggerations which, when taken together, amount to a whopper. And it is quite difficult to spot the exaggerations, not only because most of them are not all that great but also because so few of the necessary parameter values to allow anyone to spot what is going on are explicitly stated in the IPCC’s reports.
The Stern Report in 2006 took the IPCC’s central estimate of 3 K warming over the 21st century and said that the cost of not preventing that warming would be 3% of 21st-century GDP. But GDP tends to grow at 3% a year, so, even if the IPCC were right about 3 K of warming, all we’d lose over the whole century, even on Stern’s much-exaggerated costings (he has been roundly criticized for them even in the journal of which he is an editor, World Economics), would be the equivalent of the GDP growth that might be expected to occur in the year 2100 alone. That is all.
To make matters worse, Stern used an artificially low discount rate for inter-generational cost comparison which his office told me at the time was 0.1%. When he was taken apart in the peer-reviewed economic journals for using so low a discount rate, he said the economists who had criticized him were “confused”, and that he had really used 1.4%. William Nordhaus, who has written many reviewed articles critical of Stern, says that it is quite impossible to verify or to replicate any of Stern’s work because so little of the methodology is explicit and available. And how often have we heard that before? It is almost as if They don’t want us to check stuff.
The absolute minimum commercially-appropriate discount rate is equivalent to the minimum real rate of return on capital – i.e. 5%. Let us oblige Stern by assuming that he had used a 1.4% discount rate and not the 0.1% that his office told me of.
Even if the IPCC is right to try to maintain – contrary to the analysis above, indicating 1 K manmade warming this century – that we shall see 3 K warming by 2100 (progress in the first one-ninth of the century: 0 K), the cost of doing nothing about it, discounted at 5% rather than 1.4%, comes down from Stern’s 3% to just 0.5% of global 21st-century GDP.
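The sensitivity of such cost estimates to the discount rate is easy to see in a toy calculation. This is only an illustration of compound discounting over a century, not a reproduction of Stern's model: a single unit of cost falling due around 2100 weighs roughly thirty times less in present-value terms at 5% than at 1.4%.

```python
def present_value_factor(rate, years):
    # Value today of 1 unit of cost incurred `years` from now,
    # discounted at a constant annual rate `rate` (compound discounting).
    return 1.0 / (1.0 + rate) ** years

pv_low = present_value_factor(0.014, 100)   # ~0.25 at Stern's claimed 1.4%
pv_high = present_value_factor(0.05, 100)   # ~0.0076 at a commercial 5%

ratio = pv_low / pv_high                    # ~33x: the discount rate dominates the result
```

Real cost streams are spread over the century rather than concentrated at its end, so the overall effect is smaller, but the direction and rough magnitude of the dependence on the discount rate are as this sketch shows.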
No surprise, then, that the cost of forestalling 3 K of warming would be at least an order of magnitude greater than the cost of the climate-related damage that might arise if we just did nothing and adapted, as our species does so well.
But if the warming we cause turns out to be just 1 K by 2100, then on most analyses that gentle warming will be not merely harmless but also beneficial. There will be no net cost at all. Far from it: there will be a net economic benefit.
And that, in a nutshell, is why governments should shut down the UNFCCC and the IPCC, cut climate funding by at least nine-tenths, de-fund all but two or three computer models of the climate, and get back to addressing the real problems of the world – such as the impending energy shortage in Britain and the US because the climate-extremists and their artful nonsense have fatally delayed the building of new coal-fired and nuclear-fired power stations that are now urgently needed.
Time to get back down to Earth and use our fossil fuels, shale gas and all, to give electricity to the billions that don’t have it: for that is the fastest way to lift them out of poverty and, in so doing, painlessly to stabilize the world’s population. That would bring real environmental benefits.
And now you know why building many more power stations won’t hurt the climate, and why – even if there were a real risk of 3 K warming this century – it would be many times more cost-effective to adapt to it than to try to stop it.
As they say at Lloyd’s of London, “If the cost of the premium exceeds the cost of the risk, don’t insure.” And even that apophthegm presupposes that there is a risk – which in this instance there isn’t.
The Viscount Monckton of Brenchley
===========================================================
Part 1 of Sense and Sensitivity can be found here
Werner Brozek says:
January 19, 2012 at 10:10 am
“Myrrh says:
January 19, 2012 at 3:18 am
Then you should know what you’re saying is bullshit pure and simple”
I just have five questions for you. And be sure that any answer you give to any of the questions cannot be contradicted by anything you said on this thread.
If you put a radio under a pillow, you can still hear it. Why can weak radio waves go through a pillow and why can strong x-rays go through a pillow but visible light, that is in between in energy, cannot penetrate a pillow? etc.
Werner, I’ve pointed out where you are wrong: you’re following the AGW fiction-meme fisics, and what else I know or don’t know is not relevant here; it’s a distraction.
But as I said, size is important…
R. Gates said on January 16, 2012 at 8:23 am
Much to my surprise, I find myself in complete agreement with R. Gates. Think I’ll take a Becks and have a lie down.
Brad Tittle said on January 16, 2012 at 9:31 am
Antarctica. The cold record for Earth is -89.6C. CO2 freezing point is -78.5C according to my Chambers Science and Technology Dictionary.
Terry Oldberg said on January 16, 2012 at 9:31 pm
You would seem to be dismissing the possibility of a reductio ad absurdum argument.
cohenite said on January 16, 2012 at 4:53 pm
I suspect that everyone who grows crops knows this.
wayne said on January 16, 2012 at 5:17 pm
I’m sure he’ll do better next time 😉
I did not mean to start an argument. I was looking for the mechanism for a non greenhouse gas to lose energy. I thought that the only real answer was conducting it to the earth to be released through long wave radiation, and that seems to be the case.
I was worried that there might be phase changes for our non greenhouse gases, but it seems those occur at much lower temperatures than we have between ground and top of atmosphere?
This was really a question to answer an earlier post which had the elevator question, and it seems to me that with a non-greenhouse-gas atmosphere what would happen is that the atmosphere would in fact increase the absolute temperature of the surface beyond that given by the Stefan-Boltzmann equation.
The sun heats the earth surface. The Earth surface heats the air contacting it which then rises as it becomes less dense than the colder air. The Earth surface warms more air which also rises. This would go on until the Earth surface and the Air temperature at equilibrium are the same. Convection would move the heat around the earth and through the atmosphere making the Earth surface away from the sun significantly warmer than it would otherwise be. Since the only mechanism for cooling the atmosphere is direct contact with the surface of the earth, and with the depth of the atmosphere being as deep as it is, would that not help keep the dark side only a few degrees cooler than the light side?
thepompousgit (Jan. 19, 2012 at 4:22 pm):
As the equilibrium temperature is not an observable, claims such as Monckton’s and the IPCC’s regarding the magnitude of the equilibrium climate sensitivity cannot be falsified, thus lying outside science. How does reductio ad absurdum fit in with this?
Werner, I do feel very strongly about this, that a whole generation of children have now been taught nonsense physics about the world and particularly about carbon dioxide, but I was appalled to find when I began exploring that it was being depicted as a poison; governments began putting it on their toxic lists, ads on TV of a young child being read a bedtime story on its danger. The same guy from whom I learned the physics they were giving to ‘prove’ what they said about carbon dioxide being well-mixed and unable to separate out from the atmosphere went to some effort to show how too much carbon dioxide in the body ‘was a poison’. This was someone educated to PhD standard in physics, taught the subject, and was, as you are, far more knowledgeable than I about science, but on this point his teaching was not only wrong, it was itself dangerous, toxic. That children are being educated to think carbon dioxide an evil when it is the building block of life on Earth (we’re not called carbon life forms on a whim) was diabolical. There was a lot of that ‘in the air’ around the time, and then of course the 10:10 video came out; how could it not, or something like it, after a long campaign to demonise carbon dioxide?
I’ve told the story a few times here of how I learned the ‘physics’ from him, recently here: http://wattsupwiththat.com/2012/01/12/earths-baseline-black-body-model-a-damn-hard-problem/#comment-864575
While exploring this I came to the conclusion that these ‘memes’ about carbon dioxide and the other fictional physics produced to support AGW, were deliberate, it takes someone who knows the science very well to be able to do such subtle tweaks with it, to give the property of one thing to another, to use laws out of context, this is a planned alternative physics. That’s how they can take the water cycle out of the picture completely, and, to make sure the majority don’t notice, all the arguments are about radiation… And so whenever someone with applied science knowledge in the subject tries to show that they are breaking the second law, they think he’s presenting a false physics! As a result of this, these memes have taken on a life of their own, they are so much in the background that ‘everyone’ thinks they are the real physics about it – ‘carbon dioxide accumulates for hundreds and thousands of years in the atmosphere’ is believed because it has been repeated so many times, and the children a few decades ago who were first indoctrinated with this are now grown up and if involved in any way in the ‘climate change’ subject, naturally take it for granted.
Many of those are now scientists of one kind or another, but if not actually specialising in the use of carbon dioxide, where they would have to learn that this is impossible to be able to do their job, why would they know it was fiction? Latour makes this point too, in deconstructing a back-radiation ‘proof’ –
“Echoing the analysis of another climatologist, Dr. Tim Ball, Latour insists that the apparent errors in atmospheric physics made by climatologists are because they work in a ‘generalist’ field of science, unlike most ‘hard’ sciences such as physics, chemistry, biology, engineering and medicine where detailed and in-depth specialization is essential so that products and services actually work.” http://johnosullivan.livejournal.com/43659.html
And I’ve told the story before of one such climatologist who set out to prove that methane didn’t separate out, and failed, and couldn’t understand why, because he simply couldn’t believe it was happening; he thought there must be another source coming into the mine which they couldn’t find which was ‘somehow’ stopping the methane from quickly diffusing and thoroughly mixing with the rest of the air. It’s the basic physics of properties and processes that have been corrupted here. If teachers sometime in the future go missing, none of these would be able to devise the many applications we now have built on basic knowledge of the difference between light and heat.
Anyway, all that as background, to apologise for my tone earlier, sorry.
Myrrh (Jan. 20, 2012 at 2:30 am):
IMO, the blame for the phenomenon you describe can be pinned on academia, for allowing science students to receive degrees while knowing nothing about the inductive branch of logic. In consequence of their ignorance, in selecting the inferences that are made by their models theorists use intuitive rules of thumb called “heuristics.” In each case in which a particular heuristic selects a particular inference for being made, a different heuristic selects a different inference. In this way, the method of heuristics violates the law of non-contradiction. Using the violated law as a false premise to an argument, one can extract from the evidence whatever conclusion one wishes. In climatology, this kind of argument has generated the false conclusion that there is not the need for a model to be testable in order for it to be scientific.
Myrrh – you are absolutely right about the demonisation of CO2. These same people also manage to make a warming planet sound bad for us. But you can’t fool all the people all the time, and ordinary people have cottoned on in droves. Eventually, our leaders will have to follow, and there are signs of that already. But it has been a long hard fight and it’s not over yet. The final irony is that a cooling planet may help us to win, and that is not a pleasant prospect.
I have been reading much on the AGW debate, and trying very hard to arrive at logical conclusions. To date, I find that much of the evidence, and assessment of it, supports many aspects of the inadequacy of the AGW proponents’ science. It is indeed very shaky.
Yet, as others point out, education systems are being contaminated along with many government organizations and the press.
Yesterday, I followed a link to the ‘Science of Doom’ post “CO2 – An Insignificant Trace Gas? Part One” http://scienceofdoom.com/2009/11/28/co2-an-insignificant-trace-gas-part-one/. It is followed by further parts.
(Introduced by R Gates. and commented upon by Pompousgit, who found it convincing).
I did not find it convincing!
In this linked post, an extensive and supposedly comprehensive scientific explanation claims to prove why CO2 has the capability of being a strong greenhouse gas even though it is present only in trace quantities and has limited absorption bandwidths.
He presents a lot of data on absorption wavelengths, including graphs, and claims that they show CO2’s effectiveness to be significant.
I fail to see the logic or adequacy of his “proof”. But I can see why even scientific people could be taken in. I hope to examine his article with a clear enough mind to pinpoint where his argument specifically becomes false. It is in his presentation of the absorption graphs, his under-estimation of H2O quantity and his over-estimation of CO2 absorption, but one needs to be more precise and scientific. Perhaps someone else will have better skill in achieving this.
In the meantime, it is that type of “proof” that is used to convince the gullible and the less open-minded.
The proof of the inadequacy of CO2 to “drive” global warming is quite effectively provided in “http://www.geocraft.com/WVFossils/greenhouse_data.html”
Perhaps a reader will comment if they can pick the fallacy in the “doom” post better than I?
My sentiments entirely. I carried out spreadsheet calculations and estimated LWIR absorptions for both CO2 and H2O. Using emissivities calculated for both gases, I estimated extinction distances for the appropriate wavebands and convinced myself that CO2 could not possibly cause global warming. I am also convinced that the massive expenditure committed to reducing CO2 emissions is completely unnecessary. I have read nothing in the warmist literature to change that view; not even from Joel Shore. I am sure Willis Eschenbach will come up with a great post on this soon, to show how shaky AGW is.
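For what it is worth, the kind of extinction-distance estimate the commenter describes rests on the Beer-Lambert law. A minimal sketch follows; the absorption coefficient below is a hypothetical placeholder for illustration, not a measured value for any real CO2 band:

```python
import math

def transmitted_fraction(k_per_m, path_m):
    # Beer-Lambert law: I(L) = I0 * exp(-k * L)
    return math.exp(-k_per_m * path_m)

def extinction_distance(k_per_m):
    # Path length over which intensity falls to 1/e of its initial value.
    return 1.0 / k_per_m

k_hypothetical = 0.05                        # per metre; placeholder, NOT a real band value
d = extinction_distance(k_hypothetical)      # 20 m for this placeholder coefficient
```

Whether such a back-of-the-envelope number settles anything about the greenhouse effect is, of course, exactly what the thread is disputing; the sketch only shows the formula being invoked.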
Addendum to previous comment: an attempt to activate the referenced link:
http://www.geocraft.com/WVFossils/greenhouse_data.html
“Myrrh says:
January 20, 2012 at 2:30 am
Werner, I do feel very strongly about this, that a whole generation of children have now been taught nonsense physics about the world and particularly about carbon dioxide….
Anyway, all that as background, to apologise for my tone earlier, sorry.”
I fully accept your apology.
Did you see the article by Joe Bastardi at:
http://wattsupwiththat.com/2012/01/19/global-temps-in-a-crash-as-agw-proponents-crash-the-economy/
I loved the article but he made a mistake regarding the mixing of CO2. He apparently had the same idea you did. A number of people pointed it out, including myself. As far as I know, all of these people are totally on his side so to speak and hated to see him make this mistake. Unfortunately, it really distracted from an excellent article. For a representative sample of comments on this issue, see the four below.

I do NOT agree with CAGW, but basic science is basic science. And yes, we do teach the gas laws and we realise the limitations of the ideal gas law and point them out to students. And unless you have extremely high pressures or extremely low temperatures, the gas laws work very well. To teach things at the high school level, we have to make all kinds of simplifications which make very little difference in most cases.

Now just because warmists say that CO2 is well mixed, that does not mean they are wrong on this point. I disagree with them on many things, but the mixing of CO2 is not one of them. About 80 to 90% of what you say is good, but when you throw in things like “carbon dioxide is heavier than air. One and a half times heavier. Therefore, it will always displace air in the atmosphere to come down to the ground unless work is done on it, and so also will not readily rise up into the atmosphere.”, you lose credibility and people may question everything else you say. For more on this topic of CO2 mixing, see:
http://wattsupwiththat.com/2012/01/19/global-temps-in-a-crash-as-agw-proponents-crash-the-economy/#comment-869343
http://wattsupwiththat.com/2012/01/19/global-temps-in-a-crash-as-agw-proponents-crash-the-economy/#comment-869356
http://wattsupwiththat.com/2012/01/19/global-temps-in-a-crash-as-agw-proponents-crash-the-economy/#comment-869370
http://wattsupwiththat.com/2012/01/19/global-temps-in-a-crash-as-agw-proponents-crash-the-economy/#comment-870037
This theory of David Evans, a mathematician and engineer with six university degrees, including a PhD in electrical engineering from Stanford University, has been posted in a few places, maybe also WUWT. It is one of the sources supporting my belief that “CAGW is a fraud”.
It’s a simple explanation, but is it correct?
“Carbon dioxide is a greenhouse gas, and other things being equal, the more carbon dioxide in the air, the warmer the planet. Every bit of carbon dioxide that we emit warms the planet. But the issue is not whether carbon dioxide warms the planet, but how much.
Most scientists, on both sides, also agree on how much a given increase in the level of carbon dioxide raises the planet’s temperature, if just the extra carbon dioxide is considered. These calculations come from laboratory experiments; the basic physics have been well known for a century.
The disagreement comes about what happens next.
The planet reacts to that extra carbon dioxide, which changes everything. Most critically, the extra warmth causes more water to evaporate from the oceans. But does the water hang around and increase the height of moist air in the atmosphere, or does it simply create more clouds and rain? Back in 1980, when the carbon dioxide theory started, no one knew. The alarmists guessed that it would increase the height of moist air around the planet, which would warm the planet even further, because the moist air is also a greenhouse gas.
This is the core idea of every official climate model: For each bit of warming due to carbon dioxide, they claim it ends up causing three bits of warming due to the extra moist air. The climate models amplify the carbon dioxide warming by a factor of three — so two-thirds of their projected warming is due to extra moist air (and other factors); only one-third is due to extra carbon dioxide.
That’s the core of the issue. All the disagreements and misunderstandings spring from this. The alarmist case is based on this guess about moisture in the atmosphere, and there is simply no evidence for the amplification that is at the core of their alarmism.
Weather balloons had been measuring the atmosphere since the 1960s, many thousands of them every year. The climate models all predict that as the planet warms, a hot spot of moist air will develop over the tropics about 10 kilometres up, as the layer of moist air expands upwards into the cool dry air above. During the warming of the late 1970s, ’80s and ’90s, the weather balloons found no hot spot. None at all. Not even a small one. This evidence proves that the climate models are fundamentally flawed, that they greatly overestimate the temperature increases due to carbon dioxide.”
His full comments are posted here – http://opinion.financialpost.com/2011/04/07/climate-models-go-cold/
Ken McMurtrie (Jan. 21, 2012 at 12:03 am):
Like Lord Monckton, Dr. Evans bases his argument upon the legitimacy of the equilibrium climate sensitivity (TECS) as a concept and disputes only the magnitude of TECS. However, the proposition that TECS is scientifically legitimate is false.
By definition, TECS is the increase in the global equilibrium surface air temperature (GESAT) in response to a doubling of the atmospheric CO2 concentration, but the GESAT is not an observable feature of the climate system. It follows from the lack of observability that when Evans, Monckton or anyone else makes a claim about the magnitude of TECS, this claim cannot be tested. It follows that claims of this sort are not scientific claims, by the definition of “scientific.”
If climatological claims were scientific, they would be framed in terms of observable entities. These entities are called “statistical events.”
When the methodology of a study is “scientific,” this study centers on the study’s complete set of statistical events, the study’s so-called “statistical population.” A sample from this population provides the sole basis for testing whatever theory or model emerges from the study. Surprisingly, after spending 100 billion US$ on research, global climatologists have yet to identify their study’s statistical population. In the years in which I designed and managed scientific studies for a living, to identify a study’s statistical population was always the first order of business, for if this task were not completed, the study would not be scientific.
By the way, in those same years I learned that people with PhD degrees in science or engineering from elite research universities usually were ignorant of basics in the design of a scientific study, including the necessity for a statistical population. Apparently, in this respect, academia had inadequately trained its graduates.
In describing global climatology’s statistical population, one of the tasks would be to identify the period over which a climatological variable is averaged. According to the World Meteorological Organization, the “classical” period is 3 decades.
The period of a climatological event can be no less than the averaging period for the variables that define this event’s outcomes. Thus, taking the averaging period to be the “classical” period, the period of an event can be no less than 3 decades. If this is so, then the recent decade-long hiatus in the escalation of global temperatures is irrelevant. That this hiatus seems relevant to a person suggests that person’s ignorance of elements in the framing of a scientific study.
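As a toy illustration of the point about averaging periods (the annual series below is invented, and the 30-year default is the “classical” period the comment cites):

```python
def climate_averages(annual_temps, period=30):
    """Collapse annual mean temperatures into non-overlapping
    period-long climate averages; trailing years that do not fill
    a complete period are discarded."""
    n = len(annual_temps) // period
    return [sum(annual_temps[i * period:(i + 1) * period]) / period
            for i in range(n)]

# Sixty years of invented data yield only two climatological outcomes,
# so a 10-year 'hiatus' is too short to register as a distinct event:
temps = [14.0 + 0.01 * year for year in range(60)]
print(climate_averages(temps))
```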
Terry Oldberg says:
January 20, 2012 at 9:17 am
Myrrh (Jan. 20, 2012 at 2:30 am):
IMO, the blame for the phenomenon you describe can be pinned on academia, for allowing science students to receive degrees while knowing nothing about the inductive branch of logic. In consequence of their ignorance, in selecting the inferences that are made by their models, theorists use intuitive rules of thumb called “heuristics.” In each case in which a particular heuristic selects a particular inference for being made, a different heuristic selects a different inference. In this way, the method of heuristics violates the law of non-contradiction. Using the violated law as a false premise to an argument, one can extract from the evidence whatever conclusion one wishes. In climatology, this kind of argument has generated the false conclusion that a model need not be testable in order to be scientific.
===============
I’ve been told that ‘modern science’ is beyond my grasp. So this is how, as I’ve just posted elsewhere, ‘science is subjective interpretation’ comes about? So heuristics itself has acquired a modern re-interpretation, from trial and error in the process of discovery to whatever one imagines without the next step? No wonder they never get to, and are so dismissive of, testing.
Werner – I’ll have a read through the discussion tomorrow, but I’m not sure how much time I’ve got in the next few days to take part in it. Except one of your links has just given me a nudge.
Ken McMurtrie says:
January 21, 2012 at 12:03 am
It’s a simple explanation, but is it correct?
“Carbon dioxide is a greenhouse gas, and other things being equal, the more carbon dioxide in the air, the warmer the planet. Every bit of carbon dioxide that we emit warms the planet. But the issue is not whether carbon dioxide warms the planet, but how much.”
============
Ken – that’s a fairly standard argument around here from those anti-AGW, but not one held by me. I say that carbon dioxide, being fully part of the Water Cycle, cools the planet; in other words, that ‘greenhouse gas warming’ is a fiction. My proof is that they have taken the water cycle out of their ‘energy budget’, the AGW KT97 and variations, which both pro and anti-AGW use, and which is where the arguments about just how much warming CO2 is adding come from.
I’ve expanded on it and put in a link to this a few posts back.
Terry Oldberg (21/1, 9:50am).
You overwhelm me with scientific jargon. This approach may be what often happens within the AGW world in supporting their beliefs. I am not saying that it is necessarily wrong, but that it consequently becomes more difficult to understand and therefore to assess its validity.
For another example, I commented earlier (20th, 3:51pm) about the ‘science of doom’ post, where it was claimed that evidence based on gas absorption spectrum graphs proved CO2 was a powerful “GHG” sufficient to drive GW. It was difficult to home in on the specific incorrect or illogical parameter(s), but I fail to be convinced that the proof was actually there.
First you say that TECS (‘equilibrium climate sensitivity’) is not a scientifically legitimate concept, therefore its magnitude is not an arguable factor.
Then, that TECS is the “increase in the global equilibrium surface air temperature (GESAT) in response to a doubling of atmospheric CO2 concentration.” Why specifically a ‘doubling’ and not an increase, or decrease for that matter? You go on to say that GESAT is not an observable feature, therefore not meaningful.
This is followed by a blurb on statistics which is completely irrelevant to climate “science”. The values of temperature, radiation energies and energy balance are almost meaningless. They are averages over such wide-ranging temperature variables, seriously challenged computations, and even unknowns like clouds and cosmic-ray effects, together with solar cycles, sunspots, solar magnetic influences, winds, ocean currents, ocean heat, pollution, night-and-day variations, equatorial and polar extremes; the list goes on and on.
I agree that short-term trends, even if legitimately arrived at, are not statistically significant. It was a serious mistake for the IPCC representation of projected global temperatures to show a smooth curve and not include all the well-known cycles. But this was only one of their many mistakes. Their credibility is shattered, and the claim of CO2 as a GT driver is unsupportable.
Statistical analyses require inputs of substance and of mathematically definable value and accuracy. None of the climate parameters meets these criteria.
Ignorance exists in both ‘camps’, perhaps, but scientific integrity is more evident in the case of the ‘unbelieving’ supporters, and a lack of honesty is demonstrably a feature of the AGW proponents.
Ken McMurtrie (Jan. 22, 2012 at 4:07 am):
Thanks for taking the time to respond! Sorry about the jargon. Perhaps if we work together, I can help you to understand what I’m saying.
TECS is the proportionality constant in the equation
T – To = TECS * ( log2 C – log2 Co )
where log2 represents the logarithm to the base 2, T represents the global equilibrium surface air temperature at time t, To represents the global equilibrium surface air temperature at time to, C represents the CO2 concentration at time t and Co represents the CO2 concentration at time to. In the circumstance that C is twice Co, ( log2 C – log2 Co ) has a value of 1 and it follows that
T – To = TECS
Thus, it is often said that TECS is the “increase in the GESAT from a doubling of the CO2 concentration.” Note, however, that TECS is also defined in circumstances in which C is not twice Co.
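The defining relation above can be sketched numerically; the function below simply evaluates T – To = TECS * ( log2 C – log2 Co ), and the 3.0 K sensitivity in the example is an arbitrary illustrative value, not a claimed estimate:

```python
import math

def warming(tecs, c, c0):
    """Equilibrium warming T - To for a change in CO2 concentration
    from c0 to c, per T - To = TECS * (log2 C - log2 Co)."""
    return tecs * (math.log2(c) - math.log2(c0))

# A doubling (560 vs. 280 ppm) yields TECS degrees of warming by
# construction, whatever value TECS takes (3.0 K is illustrative):
print(warming(3.0, 560.0, 280.0))
# The relation is equally defined when C is not twice Co:
print(warming(3.0, 390.0, 280.0))
```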
The term “equilibrium temperature” may be unfamiliar. It is the temperature that is attained when the magnitudes of all of the various heat fluxes are held constant and one waits for an infinite amount of time before measuring the temperature. Engineers call the same concept the “steady state temperature.” In reality, we cannot wait for an infinite amount of time and it follows that the equilibrium temperature cannot be observed.
While the equilibrium temperature is a valid and useful theoretical concept, it is not a useful empirical concept, in view of the fact that its magnitude cannot be observed. Thus, the empirical side of climatology has to be built upon measurable temperatures rather than equilibrium temperatures. In particular, the empirical side cannot be built upon the concept of the GESAT but can be built upon the concept of the GSAT (global surface air temperature), for the magnitude of the GSAT is observable.
By the definition of a “climate,” the GSAT is not an instantaneous temperature but rather is a time average of a temperature over a specified period. The time average can be taken to be the outcome of a statistical event for which the endtime is the endtime of the averaging period. The starttime of this event can be no greater than the starttime of the averaging period else the GSAT cannot be computed.
Events that do not overlap in time are said to be “statistically independent.” In concept, statistically independent events stretch billions of years backward in time and billions of years forward. The complete set of these events is an example of a “statistical population.”
A statistically independent event in which the GSAT is measured at the end of the event and the state of the climate is measured at the beginning is said to be “observed.” A set of observed events is called a “statistical sample.”
In concept, the elements of the statistical population bear a one-to-one relationship with the predictions of a climatological theory (aka model). This theory is susceptible to testing in which predicted values of the GSAT are compared to observed values in the statistical sample; the observed events in the sample must not have been used in the construction of the theory.
The events in the above referenced statistical sample provide the sole basis for testing the model. If there is not a population and sample, the theory is not testable. If it is not testable, this theory is not scientific, by the definition of “science.”
The IPCC’s inquiry references no statistical population or sample. You can verify this for yourself by searching the report of Working Group I in Assessment Report 4 on the word “population.” Thus, the IPCC’s inquiry was not a scientific one.
For interest, the paper here on forecasting reliability and the IPCC:
GLOBAL WARMING: FORECASTS BY SCIENTISTS VERSUS SCIENTIFIC FORECASTS by Kesten C. Green and J. Scott Armstrong
http://www.forecastingprinciples.com/files/WarmAudit31.pdf
A FORECASTING AUDIT FOR GLOBAL WARMING
In order to audit the forecasting processes described in Chapter 8 of the IPCC’s report, …

The IPCC WG1 Report was regarded as providing the most credible long-term forecasts of global average temperatures by 31 of the 51 scientists and others involved in forecasting climate change who responded to our survey. We found no references in the 1056-page Report to the primary sources of information on forecasting methods, despite the fact that these are conveniently available in books, articles, and websites. We audited the forecasting processes described in Chapter 8 of the IPCC’s WG1 Report to assess the extent to which they complied with forecasting principles. We found enough information to make judgments on 89 out of a total of 140 forecasting principles. The forecasting procedures that were described violated 72 principles. Many of the violations were, by themselves, critical.

The forecasts in the Report were not the outcome of scientific procedures. In effect, they were the opinions of scientists transformed by mathematics and obscured by complex writing. Research on forecasting has shown that experts’ predictions are not useful in situations involving uncertainty and complexity. We have been unable to identify any scientific forecasts of global warming. …
#######
A fudge I’ve seen repeated a few times is the defence that ‘the IPCC doesn’t make predictions, it makes forecasts’, but as I learned from the above, that’s a real science discipline…
Myrrh (Jan. 22, 2012 at 11:20 am):
“Prediction” and “forecast” are synonymous but neither term is synonymous with “projection.” In their study, Green and Armstrong found that climatologists used the three terms synonymously.
In AR4, climatologists employed this usage in falsely implying the existence of predictions from the IPCC models and thus one’s ability to test these models. In reality, the IPCC models made only “projections” and these projections did not support testing of the models.
Thanks Terry for your response (22/1, 9:43am), and attempt to clarify your earlier comments.
With respect, I will be satisfied with not trying too hard to get my brain around the statistical procedures as you offer them.
My point being, as I tried to stress in my comment: In the climate change field, other than time periods and some physical laws which may or may not be precisely and appropriately applied, there are no parameters, variable or constant, of sufficiently known accuracy for a statistical procedure to have any meaningful outcome. At least not sufficiently meaningful to draw firm conclusions, even ones with degrees of certainty attached.
If the science was that ‘real’ and definable, we wouldn’t be experiencing all this interminable debating. Someone would be saying “Aha, that’s what is really going on!” and there would be a general “consensus”, ha ha 🙂
Regards!
Ah – maybe I got that wrong, and it’s the word ‘projections’ I’ve heard as defence against failed predictions, with the claim that the IPCC doesn’t make predictions from models, but projections.
However,
http://www.ipcc-data.org/ddc_definitions.html
Definition of Terms Used Within the DDC Pages
“Projection
The term “projection” is used in two senses in the climate change literature. In general usage, a projection can be regarded as any description of the future and the pathway leading to it. However, a more specific interpretation has been attached to the term “climate projection” by the IPCC when referring to model-derived estimates of future climate.
Forecast/Prediction
When a projection is branded “most likely” it becomes a forecast or prediction. A forecast is often obtained using deterministic models, possibly a set of these, outputs of which can enable some level of confidence to be attached to projections.”
Hmm, just found a yahoo answer
http://ca.answers.yahoo.com/question/index?qid=20110616120409AAqtnpv which talks about the wording for confidence levels and makes a distinction between “very likely” and “most likely” – it says there’s no definition of “most likely”, makes a reasonable stab at guessing what it means, but is left bemused. Now I do recall, with this reminder, that “most likely” and “likely” are directly linked to actual confidence levels, but the definition page I posted does have “most likely” as an actual prediction.
So they do make predictions, and they say this is in context of climate models, so, maybe somewhere in all the heavy tomes of reports there are most likelies as predictions which some have taken to be very likelies – which can be dismissed as only a confidence level and subject to change from better models etc. ..But I’m not looking for them…. 🙂
Actually, I made a start but found the first link to reports was blank and all the snapshots had disappeared from archives, I really don’t have the time…
I’ve just found a page from someone showing an analysis against actual temperatures who uses “most likely” to describe what we’re seeing, but it’s not used in the spiel about the lines, so I take it this refers to the middle solid between the likelies? http://www.wunderground.com/blog/Seastep/comment.html?entrynum=2
OK, one more:
http://www.ipcc.ch/publications_and_data/ar4/wg2/en/ch3s3-3-1.html
So, definitely a forecast/prediction as they define it.
Myrrh (Jan. 22, 2012 at 4:12 pm):
Contrary to what the IPCC says, a “projection” is not a “prediction” under any circumstances. A “prediction” is an extrapolation to the outcome of a statistical event. The complete set of statistical events form a “statistical population” but the IPCC has yet to identify the statistical population underlying its conjectures.
Ken McMurtrie (Jan. 22, 2012 at 3:36 pm):
In addition to pondering the uncertainties in the parameters of a climate model, I wish you would ponder the uncertainty in the identity of the model. Do we know that the data are distributed as represented by this or that parametric model? For example, do we know that the data are distributed according to the univariate normal model? Generally, the answer is no.