Quote of the week #9 – "negative thermometers"

[Image: qotw_cropped]

Image from WUWT reader “Boudu”

It has been a while since I had a QOTW, but the last couple of weeks have been full of travel, and I’ve been out of the comment loop until recently. But this response from RyanO to the incorrigible commenter “TCO”, over at the Air Vent, left me in stitches:

Negative thermometers ARE sh**. 100% sh**. I shouldn’t even need to say it to make it so. If the math results in negative thermometers, then something is wrong with the math.

Yet we have ample evidence of negative thermometers (actual surface stations measuring air temperature where the resultant data is inverted after processing) in the Steig et al. “Antarctica is Warming” paper (Nature, Jan 22, 2009), thanks to the careful analysis of Jeff Id and RyanO.

Here’s one view of a negative thermometer:

[Image: negative_thermometer]

And here’s what they look like in the Steig et al paper:

[Image: bar plot of station weights and trends]

Jeff Id writes:

It is of course nonsensical to flip temperature data upside down when averaging but that is exactly what Steig et al does. This alone should call into question the paper’s result.

You can read all about it here and here.

Editor
June 11, 2009 6:17 am

jeez (21:57:25) :

anubisxiii
I think you might be on to something:
Imaginary temperature.

And from there, a climate system that has both real and imaginary temperature components would be called complex. Since we already know our climate is complex, this merely confirms the postulate. Our job is complete.

George
June 11, 2009 6:50 am

Rich (05:33:55) :
Of course it is doctored. [snip – just a bit over the top, see my response to Rich above. Anthony]
/sarcasm

Tom in Florida
June 11, 2009 6:51 am

I suppose we should also have a negative barometer, a negative anemometer and a negative odometer.

William Sears
June 11, 2009 6:59 am

Although not applicable here, negative absolute temperature (in kelvins) does exist. It is produced by a population inversion, as seen in stimulated emission (lasers). The energy required is quite finite. It must be noted that negative temperatures are not below absolute zero, but above infinite temperature. At infinite temperature all levels are equally populated. This follows from a statistical mechanics definition of temperature and does not apply to a largely classical system as used in climatology. Here, temperature can be defined as internal energy per degree of freedom, in appropriate units, which will never be negative.

You can, of course, have negative Celsius temperatures, which could be averaged in the same way as any group of numbers. Except that an average temperature is a meaningless concept! Temperature is an intensive parameter, not an extensive one such as volume. If you can’t define a total temperature, you cannot define an average.

I also assume that all Celsius temperatures in Antarctica are negative, so I have no idea what Steig et al. are doing. Some sort of strange weighting, no doubt. Cheers.
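[For reference, a minimal sketch of the statistical-mechanics definition being invoked here, using an idealized two-level system purely for illustration:]

```latex
% Statistical-mechanics definition of temperature:
\frac{1}{T} \;=\; \left(\frac{\partial S}{\partial E}\right)_{N,V}

% Idealized two-level system: N particles, level spacing \epsilon, n of them excited:
S = k_B \ln\binom{N}{n}, \qquad E = n\epsilon
\quad\Longrightarrow\quad
\frac{1}{T} \approx \frac{k_B}{\epsilon}\,\ln\frac{N-n}{n}

% n < N/2  ->  T > 0
% n = N/2  ->  equal populations, T -> infinity
% n > N/2  ->  population inversion, T < 0 ("above infinite temperature")
```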

John Galt
June 11, 2009 7:06 am

Negative thermometers are used to measure the negative heat given off by negative energy generators. If we get enough of those negative energy generators going, it will reduce the UHI effect and negate the need to paint our roofs white.

David Ball
June 11, 2009 7:36 am

Come on people, the thermometer is half full, ……… :^]

wws
June 11, 2009 7:38 am

I propose the use of an additional factor to be used in combination with ‘g’. (In fact, I believe Steig, Mann, et al. have been using this factor extensively for years.)
It is known as ‘ff’, and in any climate formula it will be used to modify the measured value (mv) in this way:
(mv)(g)(ff) = whatever you want it to be.
‘ff’, of course, is a situation specific constantly variable value. As Ric Werme has pointed out, you could not expect a complex situation to require anything less.

pyromancer76
June 11, 2009 7:49 am

Public ridicule is the best antidote to poisonous lying pretending to be Divine Truth. Thanks to Anthony, Jeff Id especially, and Ryan O; now commenters can pile it on and readers around the world can join in with guffaws.
Personally, I like this one: John W. (05:54:19) :
Oh, come on people! Everyone knows that negative temperatures (in degrees Kelvin) are necessary to explain how the Dark Suckers in your house work. We only call them “light bulbs” because that’s what you get after they suck up all the dark.
;^)
And I subscribed to Nature for years, but I stopped a number of years ago. I wonder how many others want a reputable science publication, not “yellow journalism”.
Finally, Steig’s pretense at measuring Antarctica’s “warming” with only a few, mostly peninsula, stations along with switching some others, I find to be most reprehensible — even with turning some thermometers upside down to get desired results. [snip]

John Shoesmith
June 11, 2009 8:16 am

I think we are missing a great opportunity here. I propose a circular thermometer with the bulb at the bottom. High temperatures would be read on the Left side and low temperatures would be read on the Right. If the mercury joins at the top entropy is reached and we are taxed into oblivion.

David Ball
June 11, 2009 8:50 am

My father was interviewed by an Italian organization. They put him comment to comment with Stephen Schneider and they seem to give equal time to both. Perhaps the paradigm is shifting. I do not speak Italian, so I would welcome any translation and information as to whether this is as even-handed as it seems. Too good to be true? http://www.avoicomunicare.it/#topvideo

MartinGAtkins
June 11, 2009 9:37 am

Negative temperature is by definition a condition where energy is absorbed but nothing is produced. This is achieved by most wind turbines.

mbabbitt
June 11, 2009 9:37 am

I guess it’s like a proctologist looking for throat cancer.

AEGeneral
June 11, 2009 9:40 am

rbateman (00:53:19) :
I imagine at this point that the code for the models has to be spaghettied beyond all recognition. No wonder they won’t show it.

Does their profession not have any kind of quality control procedures that have to be adhered to when using models? Any minimum guidelines for disclosure? Or are they still claiming proprietary rights to the code (which is laughable to me)?
Lord knows if I do a financial projection with a model, I’ve got pretty specific guidelines I have to follow, and my end-users would pale in comparison to those who write policy based on this.

Sean P.
June 11, 2009 9:53 am

George (06:50:22) :
Of course it is doctored. Do you think any member of the AGW church would even ‘touch’ a mercury thermometer? Touch alone is enough to kill you.
They won’t touch a mercury thermometer but a CFL is perfectly safe. LOL

June 11, 2009 10:51 am

I just got in some big crazy battle about how it might be reasonable to have inverted thermometers. There is absolutely no hope that I’ll ever figure people out.

Gary Hladik
June 11, 2009 11:13 am

Anthony, thanks for the link to the “perfect thermometer” article. Measurement is the foundation of science, and it’s good to be reminded from time to time that the foundation of CAGW “science” is basically sand.

TAG
June 11, 2009 11:18 am

Someone on the Air Vent provided a layman’s explanation.
The method used by Steig and the others is a form of weighted averaging. It is like trying to determine an average price for gold by taking the values in all markets and weighting them by the size of their own market. This creates a valid average.
If someone tried to do this and somehow ended up with a negative weight, then it would be clear that there is something wrong with the method. Thus the Steig method is in error because it has negative weights. The explanation for why it is wrong is that the method he uses to determine the weights is sensitive to noise, and by truncating the number of PCs at 3, he is introducing noise.
This is the explanation provided at the Air Vent as I “understand” it.
The rationalizations being given for possible negative covariances are wrong because this calculation has nothing to do with covariances. This is a simple averaging.
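[A minimal numerical sketch of the point above; this is not the Steig et al. code, and the station values and weights are invented purely for illustration:]

```python
import numpy as np

# Toy sketch only -- NOT the Steig et al. RegEM code. It illustrates the
# arithmetic point above: in a reconstruction built as a weighted average,
#     recon = sum_i w_i * station_i,
# a station with a negative weight contributes its readings upside down.

station = np.array([-1.0, 0.0, +2.0])   # hypothetical anomaly readings, deg C

print("w = +0.25:", +0.25 * station)    # warm reading pushes the average up
print("w = -0.25:", -0.25 * station)    # the same warm reading now pushes it down

# The sanity check described above: a physically meaningful spatial average
# should never need a negative station weight.
weights = np.array([0.40, 0.35, -0.25])  # made-up weights, one negative
if (weights < 0).any():
    print("negative station weight detected -- the averaging method is suspect")
```

In other words, a negative weight is equivalent to handing the averaging step an upside-down thermometer.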

John Galt
June 11, 2009 12:28 pm

AEGeneral (09:40:05) :
rbateman (00:53:19) :
I imagine at this point that the code for the models has to be spaghettied beyond all recognition. No wonder they won’t show it.
Does their profession not have any kind of quality control procedures that have to be adhered to when using models? Any minimum guidelines for disclosure? Or are they still claiming proprietary rights to the code (which is laughable to me)?
Lord knows if I do a financial projection with a model, I’ve got pretty specific guidelines I have to follow, and my end-users would pale in comparison to those who write policy based on this.

Can you imagine Einstein saying, “Here are the results of my calculations, but I’m not going to show you the data input or how I processed the data”? That’s what we get with these climate models.
Now how can a computer climate model study be ‘peer-reviewed’ without the actual input and the source code also being included?
More important, how can the results be replicated? I know, just run the model again with the same inputs and you get the same results. See, it’s verified.
These models should all be fully disclosed. Inputs, adjustments, full source code and full documentation.

June 11, 2009 12:33 pm

ROM (00:28:56),
Nature and Science risk going the way of professional wrestling, and for the same reasons.

Antonio San
June 11, 2009 12:45 pm

Well, if these guys have their say, soon thermometers will become a forbidden piece of hardware; instead we’ll get the CO2 concentration info… indeed, scary “science”:
“The proportionality of global warming to cumulative carbon emissions
H. Damon Matthews1, Nathan P. Gillett2, Peter A. Stott3 & Kirsten Zickfeld2
1. Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve Blvd W., Montreal, Quebec, H3G 1M8, Canada
2. Canadian Centre for Climate Modelling and Analysis, Environment Canada, 3800 Finnerty Road, Victoria, British Columbia, V8P 5C2, Canada
3. Met Office Hadley Centre, FitzRoy Road, Exeter, Devon, EX1 3PB, UK
Nature 459, 829-832 (11 June 2009) | doi:10.1038/nature08047
http://www.nature.com/nature/journal/v459/n7248/abs/nature08047.html?lang=en
The global temperature response to increasing atmospheric CO2 is often quantified by metrics such as equilibrium climate sensitivity and transient climate response. These approaches, however, do not account for carbon cycle feedbacks and therefore do not fully represent the net response of the Earth system to anthropogenic CO2 emissions. Climate–carbon modelling experiments have shown that: (1) the warming per unit CO2 emitted does not depend on the background CO2 concentration; (2) the total allowable emissions for climate stabilization do not depend on the timing of those emissions; and (3) the temperature response to a pulse of CO2 is approximately constant on timescales of decades to centuries. Here we generalize these results and show that the carbon–climate response (CCR), defined as the ratio of temperature change to cumulative carbon emissions, is approximately independent of both the atmospheric CO2 concentration and its rate of change on these timescales.
From observational constraints, we estimate CCR to be in the range 1.0–2.1 °C per trillion tonnes of carbon (Tt C) emitted (5th to 95th percentiles), consistent with twenty-first-century CCR values simulated by climate–carbon models. Uncertainty in land-use CO2 emissions and aerosol forcing, however, means that higher observationally constrained values cannot be excluded. The CCR, when evaluated from climate–carbon models under idealized conditions, represents a simple yet robust metric for comparing models, which aggregates both climate feedbacks and carbon cycle feedbacks. CCR is also likely to be a useful concept for climate change mitigation and policy; by combining the uncertainties associated with climate sensitivity, carbon sinks and climate–carbon feedbacks into a single quantity, the CCR allows CO2-induced global mean temperature change to be inferred directly from cumulative carbon emissions.”

John W.
June 11, 2009 1:20 pm

Antonio San (12:45:24) :

Climate–carbon modelling experiments

OK. “Modeling experiments.” When I read that, the meaning I get is that they are running an unvalidated, unverified model and calling the run an experiment. Do I understand that correctly? Are these people really asserting, either explicitly or implicitly, that they don’t have to look at the real world because they can get all the information they need by simulating the real world? If this is what they are doing, could someone please explain in what way, shape, form or manner their activity has anything to do with science?

George E. Smith
June 11, 2009 1:45 pm

Need I point out that “imaginary numbers” are in fact just that. They don’t exist; except that we defined them to exist. Well, nothing else in mathematics exists either, so why not imaginary numbers. And the concept of the imaginary being perpendicular to the real is also just a figment of a definition of a method of representing complex numbers, which also have no real existence.
So if you want to define “pi” as -[sqrt(-1)]*ln(-1), nobody is going to stop you; but trying to calculate that value, to verify if it is correct; well, that is a different matter. It is like Cardan’s solution to the roots of a cubic equation; the only case which it can solve is the case of all real roots, but it gives you the roots as complex numbers, so you still can’t compute them.
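[As an aside, on the principal branch of the complex logarithm that expression does evaluate to pi; a quick, purely illustrative check:]

```python
import cmath

# George's expression: pi = -sqrt(-1) * ln(-1).
# On the principal branch, ln(-1) = i*pi, so -i * (i*pi) = pi.
value = -cmath.sqrt(-1) * cmath.log(-1)
print(value, cmath.pi)   # (3.141592653589793+0j) 3.141592653589793
```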
I had a colleague who had a perfectly rational argument for why it is that we are all born knowing everything; just ask our parents.
And as our lives progress we gradually discover that some of what we know is no longer true, until finally we know nothing at all, at which time we are due to leave this vale of tears.
But none of us are born stupid; that we have to be taught; and there are plenty of people who are willing and able to teach stupidity.
Just try to get a rational question answered over at Real Climate, to see how the Gurus of Stupid hold their disciples’ rapt attention.
George; who learns at WUWT.

George E. Smith
June 11, 2009 1:58 pm

Speaking of negative thermometers, in standard optical ray tracing software it is conventional to treat ordinary mirrors as the boundary between two media with refractive indices of +1 and -1; if you do that, you can calculate the ray propagation by treating it as simply a refraction across a +1/-1 boundary.
So what if you have a mirror inside a piece of glass? If the refractive index before the mirror is +n, why not have the index following the mirror be -n, so the ratio of the two is -1 as before.
So I once asked a ray tracing software vendor why his program would crash if I entered a negative refractive index (other than -1). He replied that it was not relevant since negative refractive index materials do not exist; so he refused to change his code (Fortran) so it accepted -n just as easily as it accepted -1.
So much for him and his now off-the-market program; we now do have optical materials with negative refractive indices, and the refracted ray now lies on the same side of the normal as the incident ray.
You reject a good idea, and you go out of business.
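[A minimal sketch of the signed-index convention described above; this is not any vendor's actual code, and the function name and numbers are illustrative only:]

```python
import math

def refract_angle(n1, n2, theta1_deg):
    """Snell's law with signed indices: n1*sin(theta1) = n2*sin(theta2).
    A negative refracted angle means the ray lies on the same side of the
    normal as the incident ray (reflection, or a negative-index medium)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1:
        return None  # no real refracted ray (total internal reflection)
    return math.degrees(math.asin(s))

print(refract_angle(1.0,  1.5, 30))   # ordinary refraction into glass, ~ +19.5 deg
print(refract_angle(1.0, -1.0, 30))   # the +1/-1 "mirror" convention,    -30 deg
print(refract_angle(1.5, -1.5, 30))   # mirror buried in glass, ratio -1,  -30 deg
print(refract_angle(1.0, -1.5, 30))   # a true negative-index medium,   ~ -19.5 deg
```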

Ray
June 11, 2009 2:38 pm

We have to get the temperature to go up…
Waitresses cover up against cold as burned Maine coffee shop reopens in tent
By THE ASSOCIATED PRESS – 1 day ago
VASSALBORO, Maine — A topless coffee shop in Maine is now shopless after a fire.
But waitresses are serving coffee again after the Grand View Coffee Shop in Vassalboro, just north of Augusta, returned to business this week in a tent. One difference though.
The waitresses are covering up because of a recent cold snap, with many of them sporting sweat shirts.
Owner Donald Crabtree says the waitresses are volunteering their time and working for tips as he spruces up a couple of rooms that weren’t damaged by the fire that ravaged his shop in a converted motel.