
Whoo boy… just a few days ago a new peer-reviewed paper published in Science argued that climate sensitivity might be lower than the IPCC stated in AR4. Now we have this damning admission from Dr. Tom Wigley of NCAR that it can’t be determined at all from the data we have. Of course, they’d never say such things publicly.
Bold mine. From email 0303.txt
cc: Simon Tett <sfbtett@meto.xxx>
date: Fri, 30 Jun 2000 12:30:43 -0600 (MDT)
from: Tom Wigley <wigley@meeker.xxxx>
subject: Re: PRESCIENT: Draft plan — updated
to: Keith Briffa <k.briffa@uea.xxx>
Keith and Simon (and no-one else),
Paleo data cannot inform us *directly* about how the climate sensitivity
(as climate sensitivity is defined). Note the stressed word. The whole
point here is that the text cannot afford to make statements that are
manifestly incorrect. This is *not* mere pedantry. If you can tell me
where or why the above statement is wrong, then please do so.
Quantifying climate sensitivity from real world data cannot even be done
using present-day data, including satellite data. If you think that one
could do better with paleo data, then you’re fooling yourself. This is
fine, but there is no need to try to fool others by making extravagant
claims.
Tom
On Fri, 30 Jun 2000, Keith Briffa wrote:
> Dear all ,
> I should first say that I have communicating directly with Simon on a
> few points, but realize that it is better to send these comments to
> everyone. My only feeling now is that we are tinkering too much at the
> margins and have passed the point of diminishing returns for this effort
> some time ago. As long as the plan does not give a false impression of
> exclusion to some of the community , it is time to get it out. The open
> meeting will provide an opportunity for soliciting the full range of
> potential proposals. The SSC will then have to decide on the balance of
> priorities. The plan expresses the rationale of the Thematic Programme well
> enough now.
> In the area of pedantry, however, I do not like the inclusion of the
> statement
> saying that palaeo -data are not likely to be able to inform us directly about
> climate sensitivity . This is a moot point , and even if true , is not needed.
> However, I do feel we need to put a limit on discussion and issue this call
> now.
> At 04:22 PM 6/30/00 +0100, Simon Tett wrote:
> >Dear All,
> > I got some more faxed comments from Tom and have incorporated
> > them into
> >the draft. I attach it for you all to look at.
> >Tom made two comments which I think need to be drawn to your attention.
> >
> >1) The current draft has a tone that suggests that model development and
> >simulations would not be funded by PRESCIENT. I don’t think that was our
> >intention so I’ve added some text which I hope reduces that danger. Some
> >of that added text is ugly! (it was friday after all!) Please let me
> >know what you think!
> >
> >2) Tom also made a comment about paleo-estimates of climate sensitivity
> >– the current text reflects (I hope) his faxed comment. However, I
> >don’t think I agree with it! Comments please.
> >
> >3) The draft contains various comments which I’d appreciate responses on
> >as well.
> >
> >Simon
>
> —
> Dr. Keith Briffa, Climatic Research Unit, University of East Anglia,
> Norwich, NR4 7TJ, United Kingdom
> Phone: xxxx
>
>
**********************************************************
Tom M.L. Wigley
Senior Scientist
ACACIA Program Director
National Center for Atmospheric Research
P.O. Box 3000
Boulder, CO 80307-3000
USA
Phone: xxxx
Fax: xxxx
E-mail: wigley@xxxx
Web: http://www.acacia.ucar.edu
**********************************************************
Just check the date that e-mail was sent: June 2000!
Yes, I had to look twice – because the Team knew then that they couldn’t quantify climate sensitivity, but pressed on nevertheless, so that our politicians are now strangling our economies in order to prevent temperatures from rising more than two degrees Celsius.
What have they done since?
Obfuscate, deny, lie.
Have they advanced their ‘science’?
We all know the answer to that.
The diagram cannot possibly be correct. Even the 0.6C must be too high. For there to be 0.6C due to CO2, nearly ALL of the temperature rise out of the LIA would have to be due to CO2 alone and nothing else. I find that hard to believe.
Are they talking about funding from Prescient Weather Ltd?
Of course you can’t quantify climate sensitivity directly, without doubling CO2 and waiting to see what happens. This is neither news, a secret “admission”, nor “damning”. It just means that it must be quantified indirectly, using a physical calculation to relate what you observe, such as temperature data, to climate sensitivity. This is perfectly possible but obviously depends on the assumptions in your calculation as well as the data used (hence the different answers obtained by different studies).
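As an illustration of the “indirect” route described above (my sketch, not part of the comment): using the simplified Myhre et al. (1998) forcing expression F ≈ 5.35·ln(C/C0) W/m², an observed warming can be scaled to an equivalent per-doubling figure, ignoring ocean lag and all non-CO2 forcings. The ΔT and concentration values below are hypothetical placeholders:

```python
import math

def implied_sensitivity(delta_t, c, c0):
    """Scale an observed warming delta_t (deg C) over a CO2 change c0 -> c
    to an equivalent warming per doubling, assuming forcing proportional
    to ln(C/C0) (simplified Myhre et al. 1998 expression)."""
    f = 5.35 * math.log(c / c0)   # forcing from the CO2 change, W/m^2
    f2x = 5.35 * math.log(2.0)    # forcing per doubling, ~3.7 W/m^2
    return delta_t * f2x / f

# Hypothetical illustration: 0.8 deg C observed over a 290 -> 390 ppm rise
print(round(implied_sensitivity(0.8, 390.0, 290.0), 2))
```

The answer depends entirely on the assumptions fed in (how much of the observed warming is attributed to CO2, what other forcings did), which is exactly why different studies land on different numbers.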
Follow the Money…
When I read these posts, I’m just astonished. But big money can lead to big lies. Once you cross that boundary of coverups, collusion and coercion, it’s hard to step back. I hope all the big players lose their jobs and end up teaching chemistry at a high school in Albuquerque NM.
AYZ,
Climate sensitivity to CO2 amounts to a WAG. The fact that carbon dioxide has risen by more than 40% while the global temperature has been flat for many years is strong evidence that temperature sensitivity to carbon dioxide has been vastly overestimated.
Maybe grant money expenditure is a more reliable indicator of what not to do ….
So far as I can tell, the only estimates of the climate sensitivity for a doubling of CO2 are based on the assumption that such estimates can be made by looking ONLY at radiation effects. Conduction, convection and the latent heat of water are ignored. Personally, I find this difficult to believe, yet I can find no justification for this assumption. Does anyone know where it has been justified?
We need to start referring to it as the huge mountain of evidence showing that AGW alarmism is a fraud, whenever the occasion arises to slip the phrase naturally into conversation.
@crosspatch says:
November 28, 2011 at 12:14 pm
//////////////////////////////////////////////////////////////////
I made a similar observation a few days ago (may be a week ago) when commenting on the recent paper that suggests that sensitivity is likely to be less than the IPCC claim.
I suggested that if one compares the 1940 to 2000 temp rise, which consists of natural variation PLUS CO2, with the 1900 to 1940 temp rise, which consists SOLELY of natural variation (pre-1940 there were relatively few manmade CO2 emissions), then the difference in temperature rise gives some indication of CO2 sensitivity, on a rough-and-ready ballpark basis.
The difference between these two parts of 20th century warming is only about 0.1degC over a 60 year period in which CO2 went up from about 290 ppm to about 390 ppm. This would suggest that a roughly 33% rise in CO2 concentration creates about a 0.1degC rise in temperature, which would place atmospheric sensitivity to CO2 at no more than 0.3degC per doubling. In fact less if sensitivity is logarithmic.
Of course such a comparison is rather simplistic, and it assumes that the natural variation forcing present between 1900 and 1940 is still ongoing, and ongoing at about the same amplitude. There is no positive evidence of that, nor positive evidence to refute it. The null hypothesis would be to assume that the natural variation forcing was still ongoing during the 1940 to 2000 period, so I consider the ballpark figure to carry some weight.
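For what it’s worth, the arithmetic in the comment above can be checked in a few lines (a minimal sketch; the 0.1 degC and 290–390 ppm figures are the commenter’s own, not independently verified):

```python
import math

delta_t = 0.1         # deg C attributed to CO2 in the comment
c0, c = 290.0, 390.0  # ppm, the commenter's start/end concentrations

# Linear scaling: a ~34% rise gave 0.1 deg C, so a 100% rise (a doubling)
# gives roughly three times that.
linear = delta_t * (2.0 - 1.0) / (c / c0 - 1.0)

# Logarithmic scaling gives a somewhat smaller per-doubling figure,
# matching the comment's "in fact less if sensitivity is logarithmic".
logarithmic = delta_t * math.log(2.0) / math.log(c / c0)

print(round(linear, 2), round(logarithmic, 2))
```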
Given large enough error bars, anything can be explained.
AYZ says:
November 28, 2011 at 12:24 pm
This is perfectly possible but obviously depends on the assumptions in your calculation as well as the data used (hence the different answers obtained by different studies).
So there is no way of testing the various answers obtained by different studies to gauge their accuracy. In other words, “Quantifying climate sensitivity from real world data cannot even be done”.
Thanks for playing.
We really should start referring to the growing mountain of evidence that Climate Change Alarmism is a fraud. Just slip it in there whenever the conversation turns in that direction.
Observations have consistently shown NOTHING can “trap” heat – engineers have been searching for this holy grail and have never found it.
Perfect insulation is simply a myth – everything, in the absence of an energy source, is cooling down – always.
The best our atmosphere can do is reduce the heat build-up during daylight hours through convection of warmed air; and the oceans, and to a lesser extent the atmosphere, lose heat more slowly than would occur without them. Before everything becomes very uncomfortable, the Earth rotates and that big nuclear furnace commences heating again.
The climate scientists have gotten away with the myth that the Sun can’t heat the Earth and “greenhouse fases” somehow account for 33 degrees C.
Why isn’t this nonsense challenged – we know the sun heats the moon to 120 degrees C during the lunar day – NASA says so – although NASA’s involvement shakes my confidence a little. Some idiots say it is because the lunar day is longer than a day on Earth, but this displays ignorance – a given radiative flux is associated with a MAXIMUM temperature, as given by the Stefan-Boltzmann equation.
Plug in 1368 W/sq m – the solar constant – into the Stefan-Boltzmann equation and the answer is ~120 degrees C.
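For reference, that number drops straight out of inverting the Stefan-Boltzmann law, S = σT⁴; a minimal sketch assuming zero albedo and emissivity 1:

```python
# Maximum blackbody temperature for a given flux at normal incidence,
# from the Stefan-Boltzmann law: S = sigma * T^4  =>  T = (S / sigma)^0.25
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def max_temp_c(flux_w_m2):
    """Equilibrium temperature (deg C) of a perfect absorber/emitter."""
    return (flux_w_m2 / SIGMA) ** 0.25 - 273.15

print(round(max_temp_c(1368.0)))  # solar constant -> roughly 121 C
```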
How do they get away with this obfuscation of quoting averages when all that matters is the maximum?
How do they get away with the “Energy Budget” and the famous “Here we assume a “solar constant” of 1367 W m-2 (Hartmann 1994), and because the incoming solar radiation is one-quarter of this, that is, 342 W m-2, a planetary albedo of 31% is implied.”
It is this “incoming solar radiation is a quarter” rubbish that supports the whole of the rest of the gibberish.
Please, someone – where did the other three quarters go? It was there a millimeter above the atmosphere/space boundary – I know there is no distinct cut-off.
And if the incoming solar radiation is a mere 342 W/sq m, as these people claim, then why does the IPCC use this in AR4?
“Between 1902 and 1957, Charles Abbot and a number of other scientists around the globe made thousands of measurements of TSI from mountain sites. Values ranged from 1,322 to 1,465 W m–2, which encompasses the current estimate of 1,365 W m–2.”
As these were terrestrial observations, shouldn’t they have observed 342 W/sq m?
Or did they remember to multiply by 4?
What a joke – “science” built on either stupidity or a lie – take your choice!
Not sure what “greenhouse fases” are.
crosspatch says:
November 28, 2011 at 12:14 pm
But isn’t that the real point of the diagram? Until recently, the warmist message was that the powerful greenhouse gas CO2 utterly dominated natural climate variability. On that basis then, the blue line in the diagram would show the upper bound of climate sensitivity to increases in atmospheric CO2 concentration. Now that nature is not cooperating, the warmists must embrace some natural climate variability as partially negating the effects of rising CO2 in order to save their “cause”.
All the data sets raise quality issues, but we do have a century or so of (dubious quality) observational data from which a stab can be made. I would suggest that a review of this observational data suggests that atmospheric/climate sensitivity is probably within the range of 0.6 degC to nil.
AYZ says: November 28, 2011 at 12:24 pm says
“Of course you can’t quantify climate sensitivity directly, without doubling CO2 and waiting to see what happens…”
I agree, strictly speaking. However, as noted above, we have a century and a half of observational data during which time CO2 levels have increased by about 1/3rd. Leaving aside the argument that the CO2 increase is a response to the observed temperature increase, and assuming instead that CO2 is to some extent a temperature driver, this amount of data does permit us to make an educated guess at a ballpark figure for CO2 sensitivity. Last week, I commented:
richard verney says:
November 25, 2011 at 2:04 am
Why must they always use models? Models are GIGO. We have observational data which would give us a better projection.
If one considers the HADCRUT3 data set (and we all know the issues with data sets such as this) for the period say 1900 to 1940, one observes warming of about 0.4 deg. This is predominantly due to natural variation, since there was relatively little increase in atmospheric CO2 during this period. If that is compared to the period 1940 to 2010, one observes a warming of about 0.5 deg. This period comprises natural variation plus an increase in CO2 of about 1/3rd. Simplistically, this suggests that natural forcings are running at a rate of 0.4 deg C per 40 years, i.e., 0.1 deg C per decade.
If the natural variation drivers that were present during the period 1900 to 1940 are still operating during the period 1940 to 2010, and operating with the same force (admittedly we do not know whether this is so, or whether those natural forcings are operating with more or less force), it suggests that the CO2 forcing component cannot be more than 0.1 deg C (i.e., 0.5 degC – 0.4 degC), and arguably nil, since during the 70 year period 1940 to 2010 temperatures rose by 0.5 deg C, which is 0.07 deg C per decade, and this is less than the 0.1 deg C per decade of natural forcings which were operating during the 1900/1940 period.
Of course, one can take different periods and different data sets, but all suggest that the response to an increase of say 33% in CO2 concentrations is modest. If the response to CO2 is logarithmic, then a doubling of CO2 will result in less than 3 times the response to the 1/3rd increase of CO2 seen these past 70 to 80 years.
Observational data during the instrument period suggests that the climate sensitivity to CO2 is small. This is, of course, one reason why model projections are running so warm when their projected results are viewed against current observational data.
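The “less than 3 times” figure above follows directly from the logarithm: under a logarithmic response, the multiplier from a 1/3rd increase up to a full doubling is ln 2 / ln(4/3) ≈ 2.4, versus exactly 3 under a linear response. A quick check:

```python
import math

# Ratio of the response to a full CO2 doubling vs a 1/3rd increase,
# under logarithmic and linear response assumptions.
log_ratio = math.log(2.0) / math.log(4.0 / 3.0)
linear_ratio = (2.0 - 1.0) / (4.0 / 3.0 - 1.0)

print(round(log_ratio, 2), round(linear_ratio, 2))
```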
I agree with what you say. The entire “thing” seems to be the need for some catastrophic consequence of CO2 emission, well, actually, conventional energy production. The rest of the exercise is then a 30 year period of flailing as they develop models and fit curves and select time series to justify shutting down conventional energy production in the Western economies. That seems to be the primary goal here. Anyone exposing all that curve fitting and various whittling of data sets to fit model projections is viciously attacked by being accused of launching a “vicious attack”! The projection is astounding!
Imagine I were stealing shovels from the garages of my neighbors. Now imagine the neighbors get together to have a conference about the missing shovels. Maybe they notice that I am one of the few not missing a shovel, and Mr. Smith across the street mentions that he has seen me in the neighbors’ yards at various times. Thinking quickly, I mention that I have been very careful not to leave my garage open when Mr. Smith across the street is at home (implying that he is the thief and is trying to frame me), and that I have a cat that often goes missing that I must sometimes search for.
Now I have diverted attention away from myself and put Mr. Smith in a position of possibly having to prove he didn’t do something, which is impossible. You can’t prove something DIDN’T happen, you can only prove something that DID happen. In a similar fashion, Mann, Jones, Hansen, et al put the “skeptics” in a position of having to prove warming HASN’T happened, which can’t be done.
So when someone finds a flaw in their data and/or methods, they turn the tables and launch a vicious attack on the accuser including going over their heads to their superiors and to the institution itself accusing that person of having launched a “vicious attack” on the highly esteemed “climate science community” which these few seem to like to speak for in the royal fashion. Then rather than defend their own data and methods, they place the “accuser” on the defensive to defend theirs. And they get away with it every single time. Imagine Mann and Jones actually having to defend their conclusions! That would be heresy. The whole thing stinks.
Hey Rosco
Clouds reflect almost like a mirror. Ice reflects like a mirror. At low angles, water reflects. Ice high in the stratosphere reflects.
Does that get rid of energy in your budget?
It is also interesting that “thousands of measurements of TSI from mountain sites. Values ranged from 1,322 to 1,465 W m–2, which encompasses the current estimate of 1,365 W m–2.”
So scientists made measurements of the solar “constant” that varied by almost 10% – that seems a little higher than so-called experts today acknowledge. If even half of the 10% variance is accurate, it accounts for way more warming than observed.
70% of 1465 W/sq m (~1025 W/sq m) put through Stefan-Boltzmann gives ~94 degrees C as the maximum, versus about 87.5 degrees C for 70% of 1368 W/sq m.
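Those two numbers can be checked by inverting the Stefan-Boltzmann law on the absorbed (70%) flux; a quick sketch, assuming emissivity 1 and normal incidence:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def absorbed_max_temp_c(flux_w_m2, absorbed_fraction=0.7):
    """Blackbody temperature (deg C) for the absorbed portion of a flux."""
    s = flux_w_m2 * absorbed_fraction
    return (s / SIGMA) ** 0.25 - 273.15

print(round(absorbed_max_temp_c(1465.0), 1),
      round(absorbed_max_temp_c(1368.0), 1))
```

This lands close to the ~94 and ~87.5 degree C figures quoted above.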
The equipment these guys had was probably not as sophisticated as today’s, but they weren’t corrupted by a crusade – in this case I’ll take their word for it and believe the so-called “solar constant” isn’t!
After all our Sun is known as a “variable” star so why should anything be constant ?
re: Jaye Bass: I hope all the big players lose their jobs ending up teaching chemistry at a high school in Albuquerque NM.
What has Albuquerque ever done to you, Jaye?
Another theme you might notice with “The Cause” is that the slightest flaw even in how something is worded in a skeptical paper will apparently invalidate their entire work, while flaws in their own work are always “insignificant”.
Q: How do you know an IPCC scientist is lying?
A: He talks to the press.
If it can’t be accurately quantified, perhaps it’s worse than we thought!?
Rosco says:
November 28, 2011 at 1:23 pm
“It is also interesting that “thousands of measurements of TSI from mountain sites. ”
Any clouds?
Is there such a thing as funding sensitivity?