The bottom of the data page for the GISS Land-Ocean Temperature Index used to (past tense) include a useful reference. But that reference was deleted in January 2016 and has been gone ever since, as far as I can tell.
The data page for the GISS Land-Ocean Temperature Index used to contain a statement that began:
Best estimate for absolute global mean for 1951-1980 is 14.0 deg-C or 57.2 deg-F, …
In other words, it was a factor for converting their annual anomaly data, which are referenced to the base period of 1951-1980, from anomalies to absolute temperatures.
The statement continued:
…so add that to the temperature change if you want to use an absolute scale
(this note applies to global annual means only, J-D and D-N !)
Below that, GISS provided a few examples, as can be seen in a screencap of a December 2015 data page, Figure 1, taken from a Wayback Machine-archived webpage here.
Figure 1
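The conversion that deleted note described is simple addition. A minimal sketch (the 0.85 anomaly below is a made-up example value, not a GISS figure):

```python
# Sketch of the conversion from the deleted GISS note: add the stated
# 1951-1980 absolute-mean estimate (14.0 deg-C) to an annual anomaly.
BASE_1951_1980_C = 14.0   # GISS's stated best estimate, deg C

def anomaly_to_absolute_c(anomaly_c):
    """Convert a global annual anomaly (J-D or D-N) to absolute deg C."""
    return BASE_1951_1980_C + anomaly_c

# Made-up example anomaly of +0.85 deg C:
absolute_c = anomaly_to_absolute_c(0.85)
absolute_f = absolute_c * 9 / 5 + 32
print(round(absolute_c, 2), round(absolute_f, 2))   # 14.85 58.73
```

Note that a zero anomaly recovers GISS's quoted 14.0 deg-C / 57.2 deg-F base value.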
A month later, the factor for converting their data from anomaly form to absolute form was gone, having magically disappeared. As far as I know, it’s been gone ever since. To confirm the disappearance, Figure 2 includes a screencap of the bottom of the GISS January 2016 data page from the Wayback Machine archive. See the webpage here.
Figure 2
GISS continues to list that absolute value on their The Elusive Absolute Surface Air Temperature (SAT) webpage here. See the Q&A that reads (my boldface):
What do I do if I need absolute SATs, not anomalies?
A. In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures. In the remaining cases, you have to pick one of the available climatologies and add the anomalies (with respect to the proper base period) to it. For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.
Hmm, “…the most trusted models produce a value of roughly 14°C…” Curiously, for the period of 1951-1980, the average absolute global mean surface temperature is 15.1 deg C for the 6 ensemble members of the GISS Model E2 that are stored in the CMIP5 archive with historical and RCP8.5 forcings. If you wish to confirm that little tidbit of information, you can download the monthly global surface air temperatures (TAS) from the GISS Model E2 in absolute form for those ensemble members stored in the CMIP5 archive with those forcings (and others) at the KNMI Climate Explorer. Isn’t the GISS Model E2 GISS’s “most trusted” model?
SIDE NOTE TO GAVIN: Maybe GISS should consider updating that Q&A to reflect the outputs of the latest and greatest GISS climate models.
NOTE REGARDING MY USE OF THE TERM ABSOLUTE TEMPERATURE
The term Absolute is commonly used by the climate science community when discussing Earth’s surface temperature when they aren’t using anomalies. See the quotes from GISS above.
[End note]
I’ll let you speculate as to why GISS deleted that useful adjustment factor.
That’s it for this post. Have fun in the comments and enjoy the rest of your day.
STANDARD CLOSING REQUEST
Please purchase my recently published ebooks. As many of you know, this year I published 2 ebooks that are available through Amazon in Kindle format:
- Dad, Why Are You A Global Warming Denier? (For an overview, the blog post that introduced it is here.)
- Dad, Is Climate Getting Worse in the United States? (See the blog post here for an overview.)
To those of you who have purchased them, thank you. To those of you who will purchase them, thank you, too.
Regards,
Bob
‘For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F’
So, anywhere between 56 and 58? Remind me again how much it is thought to have risen, because if they can’t say how much it was to start with, how can they know how much it’s risen?
So the overall historic rise in temperature is less than the variability in estimating the historic mean?
Which explains why the passage had to be disappeared.
At least the SIDC still has a live link to the old sunspot record, but it abandoned updating it in June 2015, so I thought it was worth keeping the record going, though the conversion is approximate. I am not aware of any web link that has continued with the old count.
November solar activity was almost static; the ‘classic’ sunspot count (Wolf SSN) rose a fraction of a point to 4.5, while the new SIDC reconstructed number was at 5.9.
Composite graph is here
SC24 is nearing what might be the start of a prolonged minimum (possibly a late start of SC25 too); not even a dead cat could bounce back from the levels recorded during the last 12 months.
Adjustments can make a dead cat spring back to life.
I suspect this is why the Paris Accord etc. avoids absolute temperature in its target scenarios. A target of a return to 1950 temperatures plus a smidge may not be a very attractive proposition to some.
For my part I recall those winter chilblains back in those days.
“include a useful reference”
It isn’t a useful reference, and in the section cited (The Elusive Absolute Surface Air Temperature) they explain why. What GISS actually calculates is a global anomaly average. They explain why that is the right thing to do, and I spend weary hours explaining it here too. That is their product. They say “In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures.” and then “In the remaining cases, you have to pick “. IOW, don’t do it, it’s a dumb thing to do, but if you really insist….
If you do insist on adding someone’s estimate of global average temperature, you can take the responsibility of that yourself. GISS isn’t adding anything here by picking a figure for you, especially when they so explicitly advise against doing it. It’s just pointless, and the note should never have been there.
I have read that we need the anomalies because change is all that counts. I’m not buying it. Show me the evidence that change is all that is important.
(You’re right on time here with the comment.)
“because change is all that counts”
Change is what GISS can tell you about. If you think you really want to know the “Elusive Absolute Surface Air Temperature”, that is up to you. Ironically, sceptics often talk approvingly of stuff like Essex and McKitrick Does a Global Temperature Exist?. That is the usual muddle that comes from refusing to distinguish between average temperature and average anomaly, but under the muddle there is a point. It’s hard to derive an average temperature (not anomaly) by sampling, and it doesn’t much matter for anything anyway. And those things go together. If it mattered, it would have consequences, and we could measure it via those consequences. But it doesn’t, and we can’t. Anomaly, on the other hand…
Change isn’t all that easy. They make modifications to account for changes to surroundings, replaced sensors, etc. Those changes are rarely explained, but seem to change annually (at least).
Actual measurements would allow anyone to draw their own conclusions, or at least develop their own hypotheses. As it is, only GISS has any idea of the integrity of their dataset.
As an engineer working for an aerospace company, our customers would never have accepted that. Then again, they could pull funding. GISS doesn’t have that concern.
Stokes said, “Anomaly, on the other hand…”
“HEADLINE; 48,000 premature deaths resulting from temperature anomaly.” /sarc
Biological organisms respond to actual temperatures, not anomalies. Water changes phases at particular temperatures, not at particular anomalies. Solubility of oxygen and carbon dioxide are controlled by temperatures, not anomalies. Absolute humidity is controlled by temperature, not anomalies. So, ‘absolute’ temperatures ARE important.
Clyde,
Especially when one invokes SB and raises them to the power of 4.
Geoff.
“So, ‘absolute’ temperatures ARE important.”
Yes. But the spatial average of ‘absolute’ temperature is not. That won’t kill anyone either. Whereas if average anomaly is higher than usual, then chances are it’s hotter than usual where you are. In places where that matters, it matters.
“They explain why that is the right thing to do, and I spend weary hours explaining it here too.”
However what the GCM produce are absolute temperatures, not anomalies. And what thermometers give are absolute temperatures, not anomalies.
“And what thermometers give are absolute temperatures”
Yes. But there is no thermometer that gives a global average temperature. That is the quantity whose pursuit is futile. And it is a quantity that GCMs also do not characterise very well. There are reasons for that.
No one lives at the average temperature therefore temperature anomaly is only interesting to those who need to make it interesting to formulate material to proliferate doom and gloom messaging about Co2.
As we cannot stop our climate from changing the actual thermometer reading specific to where you live is the only pertinent data. Everything else just fuels paranoia about our climate which is how all of this nonsense started in the first place. And the reason why Wattsupwiththat.com is so critical to unpicking the fraud.
NASA says temperature has risen by 1C since 1880; some say 2C since 1820. But in 1850 there were 3 temperature stations for the whole planet. Therefore the farrago of cant, humbug and hysteria relates an average from 3 stations to satellite data which covers nearly 360 degrees of the planet, with small exceptions at both poles.
Therefore when comparisons are made those comparisons equate to apples and pears and maybe a few oranges and more than a few lemons. Because of the gross lack of information everything the IPCC pontificates about is irrelevant.
Maybe if we have a few centuries of satellite data we might just be able to gain an understanding of some of the way in which our climate functions. Until then the abiding level of fearmongering is just noise. If anyone believes humanity can manipulate, modify, frustrate, abort or change the way in which our climate changes by nibbling at Co2 emissions they are indulging in Alice in Wonderland fantasy.
If the 100,000 billion tons of CO2 released between 2000 and 2010, nearly one third of all CO2 ever emitted, failed to cause a climate apocalypse, how come the projected 4,100 billion tons of CO2 the IPCC predicts will be emitted between now and 2100 has that potential?
As I remember, from 1998 temperature plateaued for 18 years and 9 months, and because it did, RSS modified their behaviour, because it wasn’t in tune with the global rumble of climate change fantasy needed to extort more trillions from the working poor to the already rich.
“No one lives at the average temperature therefore temperature anomaly is only interesting to those who need to make it interesting…”
Here we go round and round again. Yes, no-one lives at the global average temperature. But we do experience anomalies, and they do have real consequences. And if the average anomaly goes up, we are all likely to be affected, in various ways.
Nick said “But we do experience anomalies, and they do have real consequences. And if the average anomaly goes up, we are all likely to be affected, in various ways.”
Yes, affected in very positive ways. I know that to be correct because my wife said so this morning, right here in snow country. She is never wrong.
And if the average anomaly goes down I suspect the impact will be greater than would be if it increases. At least climate history indicates that would be the case.
Nick, your comment begins, “It isn’t a useful reference…”
It most certainly was a useful reference. In fact, at one time, GISS must’ve thought it was useful because they included it on that data page. With it gone from that data webpage, we now have to go searching around the GISS webpages or through internet archives to find it when we want it.
The fact that GISS doesn’t use absolutes to calculate global mean surface temperature is immaterial, especially when Berkeley Earth has shown that calculating global mean surface temperatures using absolutes is possible.
Much of the remainder of your comment is pure speculation interlaced with comical extremely one-sided interpretation, like “IOW, don’t do it, it’s a dumb thing to do, but if you really insist….” and there’s no reason for me to reply to that sort of nonsense.
Nice try at misdirection, though, Nick. But everyone here can see through stuff like that.
Regards,
Bob
Bob,
“In fact, at one time, GISS must’ve thought it was useful”
Stuff happens. Someone must have once thought it was useful, but when GISS thought more about it, they decided, rightly, that it wasn’t.
“the remainder of your comment is pure speculation”
It isn’t speculation. When GISS says “In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures.” it isn’t speculation to say that GISS is advising against quoting an absolute average temperature. And has been, for a long time.
Nick, the fact that a political body is advising me to not look at available data is disconcerting.
…These aren’t the droids you’re looking for… to coin an apt phrase…
“…It isn’t speculation. When GISS says “In 99.9% of the cases you’ll find that anomalies are exactly what you need…”
It’s arrogant. What they are probably saying is “In 99.9% of the cases WE use the anomalies…” How would they know what consumers of the data need? They presume that their use is the only meaningful use. How hard is it to include the baseline number? They have to know it. Obviously it’s just embarrassing to demonstrate that their error margin is so wide for an historic number.
“How would they know what consumers of the data need? “
Providers of data are generally expected to advise on what they think their data is useful for.
“How hard is it to include the baseline number? They have to know it.”
No, they don’t know it, and explain why. It isn’t a “baseline” number. They did provide at one stage a reference to other data that they thought people could use if they really wanted to. But, thinking that was a futile effort, they no longer wanted to be responsible for it.
I suspect someone also pointed out it isn’t a science baseline and can’t be called that in any scientific sense.
Nick said: (How would they know what consumers of the data need? “
Providers of data are generally expected to advise on what they think their data is useful for.
“How hard is it to include the baseline number? They have to know it.”
No, they don’t know it, and explain why. It isn’t a “baseline” number. )
Wait a minute Nick, do you have any idea of how stupid this sounds to a reasonably educated layman reading this post? Let’s see: GISS subtracts an unknown number from an unknown baseline number and gets an anomaly that you believe in? What, out of thin air!
That is bizarre. The only time I have seen answers like that espoused by someone educated is when the elephant called “politics” is sitting in the room.
“do you have any idea of how stupid this sounds to a reasonably educated layman reading this post?”
I guess that depends on “reasonably”. I’ll set it out again:
1. For each site and month, GISS calculates a “normal” based on history (1951-80). That is a known number.
2. For each data point, GISS subtracts the relevant normal to create an anomaly. That is a known number.
3. GISS then spatially averages the anomalies. That is the anomaly average.
4. GISS does not spatially average the normals, or the sums of normal and anomaly. They explain why here. That would, if performed, be the baseline number being discussed here. But they don’t.
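The four steps above can be sketched in a few lines. This is a toy illustration with invented station data (the real GISTEMP analysis grids the stations and area-weights the spatial average, which this skips):

```python
# Toy sketch of steps 1-3, with invented station data.
base_period = {                                # readings during 1951-80
    "station_A": {"Jan": [2.0, 1.5, 2.5]},     # a cold site
    "station_B": {"Jan": [26.0, 25.5, 26.5]},  # a hot site
}
current = {"station_A": ("Jan", 3.1), "station_B": ("Jan", 26.9)}

# Step 1: a "normal" per station and month, from the base period.
normals = {stn: {m: sum(v) / len(v) for m, v in months.items()}
           for stn, months in base_period.items()}

# Step 2: subtract the relevant normal to get an anomaly per reading.
anomalies = {stn: t - normals[stn][m] for stn, (m, t) in current.items()}

# Step 3: average the anomalies (unweighted here; GISS area-weights).
mean_anomaly = sum(anomalies.values()) / len(anomalies)

# Step 4 is the one GISS does NOT perform: averaging the normals
# themselves (here that would give (2.0 + 26.0) / 2 = 14.0).
print(round(mean_anomaly, 2))   # 1.0
```

The point of the sketch: the anomaly average (step 3) never touches the normals, which is why no “baseline number for the average” falls out of the procedure.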
Ok Nick, now I understand why they prefer anomalies. Anomalies are a smaller number with the full range of change, so they look far more meaningful, and the change looks far larger than if they were actual temperatures. Actual temperature changes would not scare anyone, and they could do little to obscure that fact. Not good for funding if somebody could easily lay hands on the real average global temps, so they don’t calculate them. Protect the funding.
“but we do experience anomalies, and they do have consequences” They may well have consequences, but 1C since 1880, and a few hundredths of a degree, not even the difference in temperature between the top of your head and the bottom of your feet, hardly deserves the degree of hysteria it gets, does it?
Unless you can find a way of persuading mother nature to find another hobby, wittering on about anomalies being better than actual temperatures, or vice versa, is for the birds.
Each is important in its own way for those whose specific interests need one or the other for their deliberations.
Forget the hyperbole; the climate will continue to change, and nibbling at CO2 cannot make a difference, let alone manage climate change or abort its progress. The debate is academic, dancing on the head of a pin.
Nick, you quoted me incorrectly and incompletely, from what I had written in my earlier reply to you. I wrote, full sentence, “Much of the remainder of your comment is pure speculation interlaced with comical extremely one-sided interpretation, like ‘IOW, don’t do it, it’s a dumb thing to do, but if you really insist….’ and there’s no reason for me to reply to that sort of nonsense.”
You, on the other hand, quoted me out of context without ellipses at either end “the remainder of your comment is pure speculation”.
Lousy debate strategy, Nick. You then continued:
“It isn’t speculation. When GISS says ‘In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures.’ it isn’t speculation to say that GISS is advising against quoting an absolute average temperature. And has been, for a long time.”
Wrong again, Nick. They aren’t “advising” anything. They are stating what they believe to be fact, that anomalies are exactly what’s needed in most cases. How do I know? Because GISS continues, “In the remaining cases…”
In the same comment, you stated “Stuff happens. Someone must have once thought it was useful, but when GISS thought more about it, they decided, rightly, that it wasn’t.”
Again, Nick, another example of your pure speculation interlaced with comical extremely one-sided interpretation.
Nick, you’re boring me with your lousy debate tactics and your comical interpretations. In other words, you’re wasting my time. And everyone who reads this thread can see what you’re doing.
You used to be a reasonable person, Nick. What happened?
Good-bye,
Bob
Really, I must have missed that Bob!
In any case, it is interesting that the NCEP initialisation temperature anomalies continue to be much closer to UAH than GISS, using the same baseline. GISS should just do the same, because if you change the baseline on the website it comes close to what these two datasets show. Given the hardcore adjustments to GISS, it’s a pointless dataset offering nothing, as it does not reflect reality in any shape or form.
Please forgive my ignorance but how do you get an anomaly without a baseline figure?
Do you not extract this from a value to determine a variance?
You get an anomaly at a time and place, relative to a baseline figure for that time and place. Then you average the anomalies. They don’t average the baselines (normals), and they explain why.
Stokes,
But, every year we get pronouncements about how Earth is going to Hell in a hand basket because the global average anomaly has increased by some hundredths of a degree. The problem is that the whole Earth isn’t increasing in lock step. That is, areas that are warming most rapidly, such as the northern high latitudes, carry more weight in calculating the annual average when anomalies are used. That is in part because, in calculating the anomalies, the range is reduced to just a few degrees, whereas the ‘absolute’ temperatures may vary over 2 orders of magnitude. Therefore, ANY anomaly represents a larger percentage of the total anomaly range than the ‘absolute’ temperatures do of their range. What I’m saying is that change carries more weight in averaging anomalies than in averaging the ‘absolute’ temperatures, because the anomalies have been reduced to nothing BUT change. Thus, anomalies accentuate change.
Stokes,
““In 99.9% of the cases…” Where are the error bars? That isn’t a scientifically derived number, it is hyperbole — bureaucratic CYA verbiage to rationalize having removed a constant that would allow one to convert a calculated number (anomaly) back to the original units of measurement, ‘absolute’ temperature. It is all disingenuous game playing and you can be depended upon to come to their defense with your sophistry to rationalize anything that the ‘consensus’ does. It is as though the alarmists can never be wrong. Which is pretty improbable!
Nick,
An anomaly of 0.5 deg C on a baseline of 12 deg C is radically different from an anomaly of 0.5 deg C on a baseline of 16 deg C, both in what the result is (a 12.5 deg C average is much colder, overall, than 16.5 deg C) and in magnitude of change as well. The baseline is critical; you’d only want to retreat to “anomaly” if the anomaly really isn’t that big of a deal.
In other words, you make a mountain out of a molehill by just focusing on the anomaly.
@Nick Stokes,
“if you insist on adding someone’s ” is not a correct statement.
I insist on being able to add their own baseline to their product, because by definition it must exist.
They do not know what I wish to do with MY product, as I am using their knowledge to extend information in a way I choose, and it just might be helpful. That they can’t seem to comprehend my desire to have and use the temperature that they are purporting doesn’t mean they have a right to slam me, nor to assume that my ideas are poor because they haven’t thought of what I might do.
Your claim of “pointless”, if the pronoun refers to creating an absolute scale, is minimally arrogant and clearly unscientific.
“Who would want more than 640k ram?”
“Why would a blind person want a stereo?”
“Why would anyone want to drive 50 mph?”
Additionally, I remember being told that ‘it’s just pointless’ when I was asking a rather ignorant lecturer about other orthogonal bases. Seems Legendre polynomials, Gram-Schmidt processes and almost any other useful methods were not known to this lecturer.
Criticizing what someone wants to investigate is for an advisor, not really for science as a whole.
Trivial example and quite relevant to the present subject:
You say the anomalies are the really important part. A rise of 3 degrees is what we need to understand. Nope: a rise of 3 degrees for ice does not determine whether it melts or sublimates. So if we want to talk sea level rise, the absolute temps, humidity and air pressure are needed.
You are welcome to be fascinated with anomalies, but they have a baseline they use; it should be clearly stated.
” because by definition it must exist”
Whose definition is that? Could you state it please?
“I am using their knowledge to extend information in a way I choose”
The figure removed isn’t their knowledge. But you can get it from the source. No-one is stopping you. GISS just are not going to take responsibility for it.
“Nope, a rise of 3 degrees for ice does not determine if it melts or sublimates.”
Nor does a global average of 14°C.
“they have a baseline they use”
They don’t have a baseline number.
@Nick Stokes From GISS faq:
“Temperature anomalies indicate how much warmer or colder it is than normal for a particular place and time. For the GISS analysis, normal always means the average over the 30-year period 1951-1980 for that place and time of year. This base period is specific to GISS, not universal.”
“Note that regional mean anomalies (in particular global anomalies) are not computed from the current absolute mean and the 1951-80 mean for that region, but from station temperature anomalies.”
They have a baseline because it is stated in their definition, as anomalies are ALWAYS the difference from the baseline. Hence a GLOBAL anomaly is by definition a difference from the global baseline.
Searching for a definition of Global Surface Temperature:
Global anomalies are computed from station anomalies. (Quite some time ago, I recall, this was ‘in essence’ a spatially weighted average. So local anomalies are basically combined in a weighted mean to get the global anomaly.)
Now, a weighted mean of local anomalies means you can also take the weighted mean of the local station baselines and get a derived global baseline. In addition, the weighted mean of the local anomalies plus the derived global baseline indeed matches (trivially) the weighted mean of the local baselines plus local anomalies.
Hence the weighting of the local baselines follows directly from the definition of anomaly they use, as they state: “anomaly ALWAYS means the difference from the local baseline.”
So GISS uses local station data as raw data, gets a baseline, then gets local anomalies, and uses these to get a global anomaly, which is by DEFINITION the variation from the global baseline they USED to give but now don’t. But we can back-derive it if we want.
Hence, by their own definition, it exists. They removed it, and it was theirs.
Your third comment is just you wishing to argue about nothing. The implications of a global raw temp of 3 could be useful in further studies that you yourself are unaware of; just because you can’t see the use of it doesn’t make it unimportant. Since raw temp is needed to know certain physical properties that anomalies can’t give, it just might be used (say, as a proxy for ice melting/freezing in areas where we don’t have regional data) because of other knowledge available.
“They have a baseline because it is stated in their definition”
They have a baseline period, during which, for each place and time of year, they compute the anomaly base and subtract it. That creates the anomalies, which they then average. As you say
“you can also take the weighted mean of the local station baselines and get a derived Global Baseline.”
You could, but they don’t. They explain why. The results are unreliable because of the inhomogeneity of temperature. So they don’t have a baseline number for the average. That is why they suggested (unwisely) that you could put together something from GCM results. It’s still true that you could, but it is no longer their suggestion.
So nice that you now ignore the fact that their definition says an anomaly is always a difference, and that their definition gives rise to a global baseline that they HAD provided before but now don’t. Their reason is not relevant to the fact that, by definition, they have defined a global baseline; they don’t like that it is used by some people, as they prefer to use anomalies, but it is there.
If a global average is unreliable then so is a global anomaly, since CLEARLY it just may be the tropics heating up, or CLEARLY it might be just the poles.
Neither inhomogeneity nor their advice to other scientists is gospel. I don’t always take the word of other scientists and frequently choose to do the work myself, as I know you do. I even go back and check old proofs and see if there are better ways.
You are welcome to continue to advise your students to the proper course.
A simple fact: that they once provided their global baseline and have now removed it means they are not happy that people are using their data in the ‘wrong way’.
I am not sorry to use their data, I have paid for it. But if everyone wants to claim the WORLD is warming but acknowledge it is truly only regionally important, then it is time to carefully look at what possible claims from any models actually get right wrt regional anomalies.
“their definition gives rise to a global baseline that they HAD provided before but now don’t”
It’s very frustrating after a thread of commentary to find that basic facts still don’t get through. No, there isn’t a definition that gives rise to a baseline that they HAD provided before. They explain, in the FAQ quoted in the head post, where the figure came from:
“In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures. In the remaining cases, you have to pick one of the available climatologies and add the anomalies (with respect to the proper base period) to it. For the global mean, the most trusted models produce a value of roughly 14°C, i.e. 57.2°F, but it may easily be anywhere between 56 and 58°F and regionally, let alone locally, the situation is even worse.”
They don’t have a number from their definition. They say you have to look to GCMs, but even then the number is very unreliable. They are really yelling at you: don’t do it. In 99.9% of cases it’s the wrong thing to do (and 99.9% of people understand that as a figure of speech, so to speak; it means effectively always). And so, you might ask, why do they even mention a number? Yes. But they are now mending their ways.
Nick,
For the ‘anomaly’ temperature to be a ‘useful’ alternative to the ‘absolute’ temperature, there is a need for the uncertainties in both to be displayed with both.
Given that statistically, the uncertainty in the anomaly has to be the same as the uncertainty of the absolute – because math can’t adjust the thermometry errors – I am at a loss as to the benefits of the anomaly method.
Many statisticians have used the measure of ‘% relative standard deviation’, or %RSD, which is simply 100 times the quotient of standard deviation and mean. Here is a problem. Suppose for some reason, for a set of annual temperatures at a place, the anomaly was calculated by subtraction of the mean of the whole set of temperatures, rather than just a 30-year average from some part of it. This results in a mean of zero for the anomaly set. The %RSD therefore involves a division by zero, which is undefined. The point is that the glib way to express standard deviation, as a figure read off an Excel page, is problematic because it depends on the selection of the base temperature, be it 1, 2, 10, 30 … all years.
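Geoff's %RSD point can be shown in a few lines (the temperature readings below are invented):

```python
import statistics

def pct_rsd(values):
    """Percent relative standard deviation: 100 * sd / mean.
    Undefined when the mean is zero, which is exactly the case for
    anomalies computed against the whole-set mean."""
    m = statistics.mean(values)
    if m == 0:
        raise ZeroDivisionError("%RSD undefined for zero-mean data")
    return 100 * statistics.stdev(values) / m

temps = [14.0, 15.0, 13.0, 16.0]                      # invented readings, deg C
anoms = [t - statistics.mean(temps) for t in temps]   # anomalies vs whole-set mean

print(round(pct_rsd(temps), 1))   # 8.9
try:
    pct_rsd(anoms)                # the zero-mean anomaly set defeats %RSD
except ZeroDivisionError as err:
    print(err)
```

Note also Nick's later objection: the 8.9 figure would change if the same readings were expressed in deg F, since %RSD is only scale-invariant, not shift-invariant.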
Relatedly, one can do many things with standard deviations as a measure of uncertainty. One can get a much reduced SD by using the square roots of the absolute numbers. We have seen from your past examples that one can get a reduced SD by subtracting a 30-year mean, but does it mean anything?
Cheers Geoff.
Geoff,
“Given that statistically, the uncertainty in the anomaly has to be the same as the uncertainty of the absolute”
Quite untrue for the global average. You could probably give a precise account of the levels in your house, relative to some reference (eg doorstep) despite being unsure of its height above sea level.
“Many statisticians have used the measure of ‘ % relative standard deviation’, or %RSD”
They would only do that if the measures were guaranteed to be positive. It is especially meaningless for temperature (not in K) when you get a different answer if you switch from F to C.
“We have seen from your past examples that one can get a reduced SD by subtracting a 30 year mean, but does it mean anything?”
You get a reduced sd if you subtract from each reading a mean for that time and place. This partitions the data into two components, and the spatial average of each part has its own sd. The anomaly component has the information that you want, and has a much lower SD. Partitioning is a no-brainer.
Nick writes,
“Given that statistically, the uncertainty in the anomaly has to be the same as the uncertainty of the absolute” Quite untrue for the global average. You could probably give a precise account of the levels in your house, relative to some reference (eg doorstep) despite being unsure of its height above sea level.
This IS true. You are introducing special conditions like doorsteps that are no part of my argument.
…………………
Nick then writes, “Many statisticians have used the measure of ‘ % relative standard deviation’, or %RSD” They would only do that if the measures were guaranteed to be positive. It is especially meaningless for temperature (not in k) when you get a different answer if you switch from F to C.
No, it does not matter what the measures are; %RSD has utility for a string of ordinary numbers irrespective of their origin.
…………….
Finally, Nick again, “We have seen from your past examples that one can get a reduced SD by subtracting a 30 year mean, but does it mean anything?” You get a reduced sd if you subtract from each reading a mean for that time and place. This partitions the data into two components, and the spatial average of each part has its own sd. The anomaly component has the information that you want, and has a much lower SD. Partitioning is a no-brainer.
You can get a reduced SD if you do any number of imagined or real transforms. I mentioned taking the square roots of the numbers as a way to reduce SD. Your creation of a device to partition the SD into 2 parts is a transform, but you have to be careful with transforms because of the likelihood that a person unaware of their history might make wrong assumptions (like that there had been no transform) and end up with nonsense.
I raised the %RSD as an easy, time-worn way to place the SD in perspective for comparison with other data, like comparing temperature variations from a hot place and a cold place. I mentioned the transform of taking the square root to show that if a further user took the SD value without knowing it was from transformed data, errors would happen. Likewise with the anomaly method. It has traps and pitfalls. But above all, it is a math device that does not overcome the intrinsic error arising from reading a thermometer in a screen.
Geoff
Geoff,
I would describe the process of de-trending a time-series, and subtracting the residual-mean from the residuals, as normalizing the data. That isn’t what is being done with temperature data.
Commonly, in other areas of science, the concept of an anomaly is that of something which is unusual or above a background level, such as in geochemical sampling. In that case, it is sufficient to just subtract the background, or lowest level. That emphasizes the values that are of interest. So, a drilling program can outline an ore body and estimate the volume or tonnage of the element of interest. If the background values are non-trivial, then the correct estimation of the recoverable element would require adding back the background level. In most cases, one needs to avoid transforms that are not recoverable, because information is lost. Climatologists don’t seem to appreciate the preservation of information, or being able to see the Big Picture.
“They say “In 99.9% of the cases you’ll find that anomalies are exactly what you need, not absolute temperatures.” and then “In the remaining cases, you have to pick “. IOW, don’t do it, it’s a dumb thing to do, but if you really insist….”
Nope. It depends on what people want. There is no “IOW don’t do it”. These are supposed to be scientists. Scientists are supposed to be able to express themselves succinctly. IOW they should just have let the matter rest rather than confusing things.
Nice try though.
Nick, you are one of those that make me bang my head on the desk!
What do you think “change” is? Or is this some attempt to misapply General Relativity to Climate Science?
CHANGE is the difference between two measurements (or estimates), separated by time, of the same thing. If you do NOT have two measurements (or estimates), you CANNOT determine the change.
If GISS reports, say, that “there has been a +1.5C change in average global temperature over the last decade,” that does not mean ??? + 1.5C – it means “our estimate of the average global temperature one decade ago” + 1.5C.
Now, I’m not calling it fraudulent – but it is quite unreasonable to tell people that they don’t need to know what “our estimate of the average global temperature one decade ago” is, that all they need to know is “the change that we calculated from our estimate” – because when that estimate is modified, or replaced with some other estimate, there is no indication. Which Bob points out in his post – what ARE they using for the estimate? Is it the 14C that is now separated from the change data it applies to? Is it the 15.1C that is the current ensemble mean? Is it something else? (The possibility of “something else” does lead to suspicion that it is “whatever makes the best headlines for today’s press release.”)
It makes the anomalies meaningless if you don’t know where to start. You can say that temperature is up from 57.2 F (and only down slightly if the start is 58 F), or calculate the ideal at 280-285 ppmv to be 59.0 F and then project future warming based on that ideal (so much CO2 will raise the temperature by x amount), while using the lower F figure to prove your point as far as anomalies go. Or deliberately hide the fact that temperatures aren’t near the ideal. If you are starting at 1 C below the normal absolute temperature, you can raise the temperature and have it appear as if warming is occurring. (Not that I don’t believe some warming has occurred since the 1970s; I disagree with the cause. Two separate issues.)
Playing both sides of the coin. You can’t lose, except it’s wrong.
Is this the info you are talking about? I know I had saved a more recent version maybe up to 2017, but I can’t locate it. I prefer showing this as a truer illustration of what’s going on with earth temperatures:
Some years ago there were stories about how the insurance companies had made (I don’t remember the exact numbers, but these will be in the ballpark enough to make the point) 3 billion in profits in the past year.
(Big numbers to give the impression that something nefarious was going on.)
Another story I read put that into perspective. Those big numbers only represented a 2% or 3% profit. Not so “nefarious” sounding.
So, what trend in temperature shows scarier changes? Absolutes or anomalies?
Standard Atmosphere has been 15C at 1013 mb since before the 90s
Bob,
So are you saying that instead of using the temperature calculated for the baseline that defines the anomalies, they are (or did) list the output of a model? Was the model output actually used to calculate anomalies instead of an empirical 30-year average?
I recently stated that the baseline can be arbitrary, and no less than the exalted Steven Mosher eloquently demolished my claim with the unassailable response, “wrong.” It would appear that you have found evidence that the baseline isn’t what it used to be.
Clyde, you’ve misinterpreted the post. My post was only about a conversion factor (for converting their annual anomaly data to absolute temperatures) that had existed at one time on a GISS data webpage but was then removed. Nothing more, nothing less.
Regards,
Bob
Bob,
And my question is about that missing conversion factor. What confused me was your quote, “…add the anomalies (with respect to the proper base period) to it. For the global mean, the most trusted MODELS produce a value of roughly 14°C,…”
If the anomalies are derived by subtracting the 30-year base period average from the ‘absolute’ temperatures, I would expect to get back to the original temperatures by adding the base period average. The reference to a “trusted model” makes it look like the inverse operation is not simply a recovery of data by reversing the arithmetic operation. Perhaps the operations of interpolating and extrapolating anomalies change things. But, it seems that the Q&A section leaves something to be desired as to what has been done and how a questioning mind would undo it.
This usually presages something they are going to fiddle in the ‘data’. I recall the NOAA(?) sea level graph several years ago quietly stopped adding data points (Argo float data?) after SLR dipped and flattened. They came back with a factor that adjusted for isostatic drops in the sea bottom caused by long-term rebound of land once under a few kilometres of ice. Trouble is, this added a sea-volume factor, resulting in a new SL that sits above the actual water surface, in the atmosphere, so the meaning of the term as a physical thing is lost. This was when the expensive Argo buoys refused to get with the program and were naughtily reporting cooling, exactly what you would expect if sea level was levelling off. Recall the Karlization of the “Pause” in 2015 by Karl on the eve of his retirement, no less (and Hansen’s jiggering of GISS in 2007 on the eve of his retirement; up till then, 1937 still held the US high-temperature record). I predicted here at WUWT that SLR was going to get a workover a few months before it came to pass. So watch for GISS to get a twist! They are so highly leveraged with cooling the past to maintain correlation with CO2 rise, and hoping and praying that warming will kick in doesn’t seem certain enough, but they can’t let the correlation deteriorate. Soon Nuuk, Greenland will have warmer temps for the 19th century than Albany, NY.
I have always wondered why the most precise and accurate method of measuring the broad oceans’ temperatures (ARGO) had to be “adjusted” using the results of random buckets of water.
Notice that all the big weather sites have eliminated as much info as they can about record highs and record lows, and the years of said records, over the past several years. Sometimes things are just so important that you just HAVE to lie.
Yes, interesting that it is almost impossible to get record high and low temperatures anymore. Every bit of media is locked into the global warming meme.
The one television outlet that isn’t is SKY News out of Australia. Andrew Bolt of the Bolt Report tells it like it is.
There is one massive problem with temperature that is consistently overlooked. Greenhouse warming concerns trapped energy, and there is no linear correlation between energy and temperature. This is especially evident with water. The amount of energy required to take ice at 0 degrees to water at 1 degree is more than 10 times the energy to go from 1 degree to 2 degrees. Anomalies are useless, since a 0-to-1 degree change cannot be equivalent to a 1-to-2 degree change if energy is what is being measured. Climate math only works if degrees of temperature are being trapped, not energy.
The use of anomalies assumes that temperature measurements are of a homogeneous material with a completely linear relation between temperature and energy. This would be fine for low-accuracy work, but the extreme precision implied by a 1.5 degree temperature change requires very precise measurements. If the Earth’s absolute temperature is 287 kelvin, a 1.5 degree temperature change is 0.52%. This implies that energy flow is being measured to fractions of a percent.
This is absolute nonsense. The enormous amounts of energy flow required to melt and freeze both Arctic and Antarctic ice are highly variable and are not measured to anywhere near the implied precision. This is why there are El Niño and La Niña events, where small changes in energy flow cause large temperature swings.
The energy buffering effect is very obvious when looking at Arctic temperatures. During the Arctic winter when there is no water phase change, the temperatures fluctuate wildly.
During the warmer months when the ice is either melting or freezing, the temperature stays very constant even though there are enormous energy flows. There are Hiroshima levels of energy flows with little or no change in temperature. Ignoring these energy flows when ice covers such a large part of our planet is inane, especially when such high precision is implied.
The same reasoning also applies to the phase change of liquid water to vapor and back again. The difference is that the energy flows are substantially larger than for the change from liquid water to ice and vice versa. Even worse, there is substantially more water than there is ice.
This doesn’t even require a computer model to prove. Anyone who has dropped a few cubes of ice into a drink is well aware of how a small amount of ice can greatly reduce the temperature of a significantly larger amount of water.
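The ice-cube arithmetic checks out with textbook constants (latent heat of fusion of ice ≈ 334 J/g, specific heat of liquid water ≈ 4.18 J/(g·K)); the 30 g ice / 250 g drink quantities below are just an assumed example:

```python
L_FUSION = 334.0  # J/g, latent heat of fusion of ice (textbook value)
C_WATER = 4.18    # J/(g*K), specific heat of liquid water

# Ratio check for the claim above: taking 1 g of ice at 0 C to water
# at 1 C (melt, then warm) versus warming that water from 1 C to 2 C.
ratio = (L_FUSION + C_WATER) / C_WATER
print(round(ratio))  # roughly 80, comfortably "more than 10 times"

# Assumed example: 30 g of ice dropped into 250 g of drink.
# Energy absorbed just by melting, before the meltwater warms at all:
melt_energy = 30 * L_FUSION                    # joules
# Sensible-heat temperature drop that energy produces in the drink:
drop_deg = melt_energy / (250 * C_WATER)
print(round(melt_energy), round(drop_deg, 1))  # ~10000 J, ~9.6 C drop
```

Almost all of the cooling comes from the melting term, which is the commenter’s point about why non-melting “stones” disappoint.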
And anyone who has tried those ‘stones’ that you can use to cool your drink normally gets frustrated that they don’t work properly. This is precisely because almost all of the cooling of ice in drinks is caused by the melting of the ice, not the temperature difference.
A Zeeman,
Great comment, thanks. It really is all about energy and water, the condensing greenhouse gas.
The Paris climate agreement aims to restrict global warming to within 2℃ above ‘pre-industrial levels’ (or more recently to 1.5℃). Why use such a poor baseline? Temperature measurements in pre-industrial times are now regarded as archaic. Why not use the current temperature as a baseline?
Surely it can’t be that reporting an increase of just 0.5℃, to 15.5℃, can be presented as “catastrophic”? Or are we already living in catastrophic times compared to the utopia of 1750 or 1850 temperatures?
And please explain how people are able to live comfortably in places like Moscow with an average temperature of 6℃, and in Singapore with an average temperature of 26℃.
This becomes interesting, but too late for this post discussion. I am trying to understand what Nick said. The last posts reminded me about the political view for the absolutes vs anomalies discussion.
As a scientist I would use anomalies to analyze changes in sparse, inaccurate data. The words accurate and precise have specific meanings: https://www.mathsisfun.com/accuracy-precision.html. Thermometers are precise but not very accurate. When I look at the thermometers on the north and south sides of my house, I get repeatable numbers that are not the same, and neither is the one I need for deciding what clothes to wear today. Problems with measurement accuracy are well covered in this blog, so I skip them now. Satellites need to be calibrated with balloons to get absolute values.
Let’s assume that global temperature is changing due to changes in the Sun. The long-term data of each single weather station tells us quite a lot about the changes. But if the local environment changes or we change the thermometers, these time series are not OK anymore. We are in a swamp of corrections.
We don’t have a spatially representative sample of weather stations. Anomalies do a better job in this situation. The swamp of extrapolation looks very bad with absolute data when we don’t have a clue about the accurate true temperatures over a long history.
As far as I have seen, GISS nowadays uses the MERRA2 reanalysis to ground-truth GISTEMP LOTI (for statements about the seasonal cycle, absolute temps, etc.).
Since the MERRA2 1981-2010 average is 14.26 C, and the GISTEMP LOTI anomaly for this period averages 0.42 C (on the 1951-1980 base), the global average temperature for the 1951-1980 period should be 13.84 C.
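Spelling out that arithmetic, using only the two numbers quoted in the comment:

```python
# Shifting a reanalysis absolute mean onto the GISTEMP 1951-1980 baseline.
merra2_abs_1981_2010 = 14.26  # deg C, MERRA2 absolute global mean, 1981-2010
loti_anom_1981_2010 = 0.42    # deg C, GISTEMP LOTI anomaly (1951-1980 base)

# If 1981-2010 sits +0.42 C above the 1951-1980 base, then the implied
# 1951-1980 absolute mean is the difference of the two figures:
abs_1951_1980 = merra2_abs_1981_2010 - loti_anom_1981_2010
print(round(abs_1951_1980, 2))  # 13.84
```

Notably, 13.84 C sits close to the 14.0 C conversion factor GISS removed from its data page, and well below the 15.1 C Model E2 ensemble mean discussed in the post.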
Navigating this fascinating thread on relatives and absolutes put me into a trance and I began to imagine what was being discussed was fashion design. It is discussed with as much verve… and if one imagines that the ‘absolute’ temps are the human form being clothed or draped or bound or decorated or smothered. But the anomalies are not as simple as things shrinking in the wash. They are the way fabric slides over the body like weather systems on a planet. A belt cinched at the waist separating opposing cyclonics, fabric of wind and moisture drawn tight over mountainous regions. Things billow and nature has mechanisms for pleating with pressure waves and readjusting. Oceans and tectonics rise and fall as breath. It seems that data modellers are tailors, making continual alterations in what is presented by their models, that may represent some idealized view of a planet. Their designs become baggy over time and additional accessories like pads are added to the shoulders that change the ‘drape’ and hide the declines.
And then Bob Tisdale breaks the spell for a moment by asking, but what *is* the actual shape of a woman? It is important! and heated discussion ensues. And all the while as the data drifts away from the points and planes that shape the clothes, the figure characterized by these models begins to resemble those tall floppy noodle-people one sees in used car lots or cell phone kiosks. In constant anomalous movement without solid form.
Bob, I meant to ask a couple of questions earlier regarding the US data record:
1. Outside of the half-degree Celsius increase in temperature for the US for the 20th century added by retrospectively adjusting USHCN data, does anyone know what trend the temperature record reflected prior to being adjusted?
“These adjustments caused an increase of about 0.5°C in the US mean for the period from 1900 to 1990.”
Source:
https://data.giss.nasa.gov/gistemp/faq/#q211
The explanation on the above link is difficult to follow, mentioning adjusting USHCN data after analysis by Dr. Hansen.
Years later, in 2011 (after the half a degree had been added to the US temperature record based on USHCN data), a US Government report was published which found that almost half of the USHCN weather stations had been incorrectly sited and were prone to recording higher temperatures because they were affected by their surroundings.
Source: https://www.gao.gov/assets/70/68744.pdf https://www.gao.gov/assets/70/68746.txt
The original data was adjusted, adding half a degree of heat, using data that the US GAO subsequently found to be coming from incorrectly sited weather stations, which amounted to almost 50% of them when faults were listed.
2. My second question is, does the added half degree increase which was adjusted into the US temperature record for the 20th century include data from the weather stations that were subsequently found to have been incorrectly sited?
If so, it includes two adjustments, both causing an increased temperature to show on the record, one from the analysis by Dr. Hansen, and the second from inadvertently being wrongly sited to begin with.