Japan's 'Cool Hand Luke' moment for surface temperature

At NASA’s Climate 365, there is an interesting story posted with this statement and a graph:

Some say scientists can’t agree on Earth’s temperature changes

Each year, four international science institutions compile temperature data from thousands of stations around the world and make independent judgments about whether the year was warmer or cooler than average. “The official records vary slightly because of subtle differences in the way we analyze the data,” said Reto Ruedy, climate scientist at NASA’s Goddard Institute for Space Studies. “But they also agree extraordinarily well.”

All four records show peaks and valleys in sync with each other. All show rapid warming in the past few decades. All show the last decade has been the warmest on record.

In sync? Weellll, not quite. Japan apparently hasn’t ‘got their mind right‘ yet as the graph shows:

[Figure: Surfacetemps_japan – the Climate 365 chart comparing the NASA, NOAA, Met Office and JMA surface temperature records]

Here is where it gets interesting. Note the purple line after the year 2000.

The Japanese data line in purple is about 0.25 °C cooler than the NASA, NOAA, and Met Office data sets after the year 2000. That is partly due to the anomaly baselines chosen by the different agencies, as the two comparison graphs shown below illustrate:

[Figure: Surfacetemps_japan3 – JMA comparison graph]

Source: http://ds.data.jma.go.jp/tcc/tcc/news/press_20120202.pdf

[Figure: Surfacetemps_japan2]

NASA GISS uses a 1951-1980 average for its anomaly baseline, while the Japan Meteorological Agency uses a 1981-2010 baseline, and that explains the offset between the 0.48 and ~0.23 °C values. It does not, however, explain the divergence when all of the data are plotted together using the same 1951-1980 anomaly baseline that NASA used, which is explained in more detail at the link provided in the Climate 365 post to NASA's Earth Observatory story here:

Source: http://earthobservatory.nasa.gov/IOTD/view.php?id=80167

In that EO story they explain:

The map at the top depicts temperature anomalies, or changes, by region in 2012; it does not show absolute temperature. Reds and blues show how much warmer or cooler each area was in 2012 compared to an averaged base period from 1951–1980. For more explanation of how the analysis works, read World of Change: Global Temperatures.

The justification for using the outdated 1951-1980 baseline is humorous, bold mine:

The data set begins in 1880 because observations did not have sufficient global coverage prior to that time. The period of 1951-1980 was chosen largely because the U.S. National Weather Service uses a three-decade period to define “normal” or average temperature. The GISS temperature analysis effort began around 1980, so the most recent 30 years was 1951-1980. It is also a period when many of today’s adults grew up, so it is a common reference that many people can remember.

So, the choice seems to be more about feeling than hard science, kind of like the time in June 1988 when Jim Hansen and his sponsor, Senator Tim Wirth, turned off the air conditioning in the Senate hearing room (to make it feel hotter) as they first tried to sell the global warming issue.

But, back to the issue at hand. The baseline difference doesn’t explain the divergence.
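A quick way to see why: re-baselining an anomaly series only subtracts a constant, so it can shift a curve up or down but cannot make two curves drift apart over time. A minimal sketch with a made-up series (not the actual GISS or JMA data):

```python
import numpy as np

# Made-up annual anomaly series, 1891-2012, expressed against a 1951-1980 baseline
years = np.arange(1891, 2013)
rng = np.random.default_rng(0)
anom_5180 = 0.006 * (years - 1951) + rng.normal(0.0, 0.1, years.size)

# Re-express the same series against a 1981-2010 baseline:
# subtract that period's mean, which is a single constant
offset = anom_5180[(years >= 1981) & (years <= 2010)].mean()
anom_8110 = anom_5180 - offset

# The two versions differ by the same constant everywhere, so a baseline
# choice can offset curves but cannot create a divergence that grows after 2000
print(f"baseline offset = {offset:.2f} C")
print(np.allclose(anom_5180 - anom_8110, offset))  # True
```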

Perhaps it has to do with all of the adjustments NOAA and GISS make; perhaps it is a difference in methodology in computing the global surface average, and then the anomaly, post-2000. Perhaps it has to do with sea surface temperature, which Japan's Met agency relies on heavily but handles differently. A hint comes in this process explanation, seen here:

http://ds.data.jma.go.jp/tcc/tcc/products/gwp/temp/explanation.html

Global Average Surface Temperature Anomalies

JMA estimates global temperature anomalies using data combined not only over land but also over ocean areas. The land part of the combined data for the period before 2000 consists of GHCN (Global Historical Climatology Network) information provided by NCDC (the U.S.A.’s National Climatic Data Center), while that for the period after 2001 consists of CLIMAT messages archived at JMA. The oceanic part of the combined data consists of JMA’s own long-term sea surface temperature analysis data, known as COBE-SST (see the articles in TCC News No.1 and this report).

The procedure for estimating the global mean temperature anomaly is outlined below.

1) An average is obtained for monthly-mean temperature anomalies against the 1971-2000 baseline over land in each 5° x 5° grid box worldwide.

2) An average is obtained for monthly mean sea surface temperature anomalies against the 1971-2000 baseline in each 5° x 5° grid box worldwide in which at least one in-situ observation exists.

3) An average is obtained for the values in 1) and 2) according to the land-to-ocean ratio for each grid box.

4) Monthly mean global temperature anomaly is obtained by averaging the anomalies of all the grid boxes weighted with the area of the grid box.

5) Annual and seasonal mean global temperature anomalies are obtained by averaging monthly-mean global temperature anomalies.

6) The baseline period is adjusted to 1981-2010.
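As an aside, for readers who want to see what steps 3) through 5) amount to in practice, here is a minimal sketch of the area-weighted land/ocean combination on a 5° x 5° grid. The anomaly fields and land fractions are made up, and the missing-data handling JMA describes is omitted; this is not JMA's code, just the arithmetic:

```python
import numpy as np

nlat, nlon = 36, 72                      # 5-degree by 5-degree grid boxes
lat_centers = np.arange(-87.5, 90, 5.0)  # grid-box center latitudes

# Made-up monthly anomaly fields (deg C) and a made-up land fraction per box;
# in reality these would come from the station/CLIMAT analysis and COBE-SST
rng = np.random.default_rng(1)
land_anom = rng.normal(0.3, 0.5, (12, nlat, nlon))
ocean_anom = rng.normal(0.2, 0.3, (12, nlat, nlon))
land_frac = rng.uniform(0.0, 1.0, (nlat, nlon))

# Step 3: blend land and ocean anomalies by the land-to-ocean ratio of each box
combined = land_frac * land_anom + (1.0 - land_frac) * ocean_anom

# Step 4: area-weight each box by cos(latitude), since boxes shrink toward the poles
weights = np.cos(np.deg2rad(lat_centers))[:, None] * np.ones((nlat, nlon))
monthly_global = (combined * weights).sum(axis=(1, 2)) / weights.sum()

# Step 5: the annual mean is the average of the twelve monthly global anomalies
annual_global = monthly_global.mean()
print(f"annual global anomaly: {annual_global:+.2f} C")
```

Step 6) is then just the constant baseline shift discussed above.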

Note what I highlighted in red:

…for the period after 2001 consists of CLIMAT messages archived at JMA

That along with:

The oceanic part of the combined data consists of JMA’s own long-term sea surface temperature analysis data, known as COBE-SST

is very telling, because it suggests that Japan is using an entirely different method for both land and sea data. For the post-2001 land data, it suggests they use the CLIMAT data as is, rather than the "value added" processing that NCDC/NOAA and NASA GISS apply. The Met Office gets the NCDC/NOAA data already pre-processed with the GHCN3 algorithms. NASA GISS deconstructs the data and then applies its own set of sausage-factory adjustments, which is why its anomaly is often the highest of all the data sets.

Prior to 2001, Japan's Met Agency uses the GHCN data, which is pre-processed and adjusted through another sausage recipe pioneered by Dr. Thomas Peterson at NCDC:

The land part of the combined data for the period before 2000 consists of GHCN (Global Historical Climatology Network) information provided by NCDC

A good example of the GHCN sausage is Darwin, Australia, as analysed by Willis Eschenbach:

[Figure: GHCN homogeneity adjustments to Darwin Airport combined record]

So, it appears that Japan's Meteorological Agency is using adjusted GHCN data up to the year 2000, and from 2001 on it is using the CLIMAT report data as is, without adjustments. To me, this clearly explains the divergence: look at the NASA plot magnified and note when the divergence starts. The annotation marks in magenta are mine:

[Figure: Surfacetemps_japan4 – magnified NASA plot with magenta annotations marking the divergence]
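Rather than eyeballing the magnified plot, readers can locate the onset of the divergence by putting two annual series on the same baseline and differencing them. A minimal sketch (the arrays below are synthetic stand-ins with an artificial post-2001 step, purely to show the mechanics; the real exercise would load the published GISS and JMA annual means):

```python
import numpy as np

# Synthetic stand-ins; in practice, load the published GISS and JMA annual global means here
years = np.arange(1891, 2013)
rng = np.random.default_rng(2)
trend = 0.006 * (years - 1951)
giss = trend + rng.normal(0.0, 0.08, years.size)
jma = trend + rng.normal(0.0, 0.08, years.size) - 0.15 * (years >= 2001)  # artificial post-2001 step

def rebaseline(series, yrs, start=1951, end=1980):
    """Express a series as anomalies from its own start-to-end mean."""
    mask = (yrs >= start) & (yrs <= end)
    return series - series[mask].mean()

diff = rebaseline(giss, years) - rebaseline(jma, years)

# Compare the mean difference before and after 2001 to locate any step
print("mean GISS-JMA, 1981-2000:", round(diff[(years >= 1981) & (years <= 2000)].mean(), 3))
print("mean GISS-JMA, 2001-2012:", round(diff[years >= 2001].mean(), 3))
```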

If anyone ever needed a clear example of how NOAA's and NASA's post facto adjustments to the surface temperature record increase the temperature, this is it.

Now, does anyone want to bet that the activist scientists at NOAA/NCDC (Peterson) and NASA (Hansen) will start lobbying Japan to change its methodology to match theirs?

After all, the scientists in Japan “need to get their mind right” if they are going to be able to claim “scientists agree on Earth’s temperature changes”, when right now they clearly don’t.

P.S.

BTW if anyone wants to analyze the Japanese data, here is the source for it:

http://ds.data.jma.go.jp/tcc/tcc/products/gwp/temp/map/download.html

It is gridded, and I don’t have software handy at the moment to work with gridded data, but some other readers might.

UPDATE: Tim Channon at Tallbloke’s has plotted the gridded data and offers a graph, see here: http://tallbloke.wordpress.com/2013/02/01/jmas-global-surface-temperature-gridded-first-look/

Les Johnson

My discussion with Gavin on that chart…
https://twitter.com/ClimateOfGavin/status/296435881156427776

How long can NOAA or GISS continue to cook the numbers? Eventually people are going to wonder why ice is not melting at 0 degrees Celsius any longer.

Doug Huffman

Thanks for the “got their mind right” citation. I was looking for Captain’s “What we’ve got here is [a] failure to communicate.”
Captain: You gonna get used to wearing them chains after a while, Luke. Don’t you never stop listening to them clinking, ’cause they gonna remind you what I been saying for your own good.
Luke: I wish you’d stop being so good to me, Cap’n.
Captain: Don't you ever talk that way to me. (pause, then hitting him) NEVER! NEVER! (Luke rolls down the hill; to other prisoners) What we've got here is failure to communicate. Some men you just can't reach. So you get what we had here last week, which is the way he wants it. Well, he gets it. I don't like it any more than you men.
Our ‘chains’ are of ignorance.

Les Johnson

Click on Gavin’s name or picture…..

pochas

Yea, Anthony! An anomaly chart or data without the baseline info clearly indicated is not real data. It’s likely tomfoolery.

Stephen Richards

The inscrutable Japanese. I don't think they will rush to be indoctrinated by the Yanks, unlike the UK Met Office, who obviously saw the benefits very early.

Mike Bromley the Canucklehead back in Kurdistan but actually in Switzerland

What a frustrating equivocation. "The warmest on record"… by a fraction of a degree above some cooler 'norm'. Yes, Senator Wirth, you can't 'feel' that difference. If we couldn't track this minuscule hair-splitting hiccup in the temperature, would we even notice anything was other than normal?
No. But NASA, and GISS, and that cheap phoney religious zealot Jim Hansen, (I’m sorry, but it really does come down to an ad hominem…he LIED with all the theatrics), have an interest, at every moment, to communicate hyperbole on the issue. The warmest on record? Yeah, since they started looking. Too bad about ALL of earth history. 30 stupid years doesn’t register, at all, out of 4+ billion’s worth of variation. To conclude that it is the warmest decade is nothing more than a hubris-sodden exhalation of self-importance for a group of otherwise boring and somewhat bitter ‘scientists’.
End of rant. Well, no, not really.

Stephen Richards

“But they also agree extraordinarily well.”
They ruddy well should!! It’s the same data up to 2000 !!

What a coincidence that 1951 through 1980 was the coolest 30-year period in the 20th century;)

Athlete

Nick Stokes Zeke Steve Mosher chiming in to defend BEST in 3, 2, 1 ….

Warren

Is Climate 365 really a NASA effort? I can't find reference to it anywhere on their website.
And the chart is deceptive well beyond what you indicate here, Anthony. Major problems: 1. It stops in 2006 when it represents data available through 2012 at the time of its creation. 2. There are far more than 4 organizations that compute/estimate average temperature during this period.
I concluded that this is the result of some schmuck trying to deceive, not really NASA. Am I right?
REPLY: “Is Climate 365 really a NASA effort?” yes see the about page, http://climate365.tumblr.com/about
About Climate 365
Climate 365 is a joint project of NASA’s Earth Science News Team, communications teams at Goddard Space Flight Center and Jet Propulsion Laboratory and the NASA web sites Earth Observatory and Global Climate Change.
Questions? Contact: patrick.lynch@nasa.gov

-Anthony

RHS

Wonder how long it will be until the Japanese data is left off the chart altogether?
And what is the justification for just these four? Surely there has got to be a broader data set than NOAA. My vague understanding is that their distribution is terribly limited, with greater than 90% of their stations in the CONUS. And even then the remainder are mostly spread across Pacific islands, which is not terribly global, as it leaves out the Arctic, Antarctica, Europe, Asia, India, Australia, Africa, etc., etc.
And while I'm on a soapbox, what, wait a minute? I almost forgot to consider the source.
End critical thinking, begin activism at its most annoying.

davidmhoffer

Gee, they didn’t include the satellite data at all. Wonder why that is? It only has the best spatial and temporal coverage there is, it only cost a few billion to put those satellites up in space, but what the heck, what’s a few billion and the most accurate data we have compared to the fate of the world?
That said, my objection remains that averaging anomaly data is meaningless. An anomaly of 1 in the arctic represents a change in energy balance of less than one third the amount of an anomaly of 1 in the tropics. Averaging them together results in a meaningless number.

Kev-in-Uk

I still get annoyed with the 'warmest decade evah' type crapola. Not just because of the fixed numbers by NASA/GISS, etc., but simply because if there is still an ongoing natural warming (a la a recovery from the LIA, or indeed a recovery from the last ice age) it stands to reason that over any period of time where such warming takes place, the last decade will nearly always be the warmest – until it reaches the 'top' and starts to cool (or level off) again. I suppose if we were currently reliving the 70s the claim would be 'coolest decade evah'?
In other words, if the climate boys accept that there is 'some' underlying natural warming – I believe most do (?) – then it stands to reason that the last decade will nearly always be the warmest! Why they ever mention it just astounds me. (I actually had to explain this to a geography school teacher a couple of years ago, and he conceded the point, despite trying to argue from the usual warmista stance for ages!)

Marcos

I once found a NASA web page that explained the use of 'normals' and stated specifically that they were never intended for use as a measure of climate change, and the reasons why not. For the life of me, I haven't been able to find that link…

John F. Hultquist

You have this part in bold:
It is also a period when many of today’s adults grew up, so it is a common reference that many people can remember.
This is the language usage from the 1930s when the international standard for weather reporting was established (as was the 30 year period). When that was the intent, it made sense.
The only reason for using a 30 year period for what they are now doing, that is 'climate science' (sic), is to confuse issues. They say "many people can remember" the 1950s! Well, I do. A few of those still employed by NASA might. Still, that's a stretch. Someone wants to use a set of numbers giving a lower average base rather than using all the data, because all the data would give a slightly higher average. We learned this trick in high school math class, or maybe earlier.
“Humorous,” indeed. You are being diplomatic.

George Mullerleili

Sen. Wirth's comments in the video are pretty telling. The meme is all about "staging" events, not science. Apparently no one has figured out yet how to turn nearly two decades of static global temperature into a staged event for global warming.

John R. Walker

Faced with Japanese data and NASA GISS data – which one is really inscrutable?

John F. Hultquist

Marcos says:
January 31, 2013 at 9:42 am

~~~~~~~~~~~~~~~~~~~~~~
See my comment at 11:51, here:
http://wattsupwiththat.com/2012/08/24/a-different-take-on-the-hottest-month-on-record/

tmonroe

Somebody posted this on Facebook. I pointed out that the warming rate from 1910 to 1940 looks faster than the warming rate from 1970 to 2000… So some of the questions I have are: weren't human beings putting out *way* less CO2 in 1910? I mean, the automobile was barely invented back then – and wasn't human population around 1.5 billion? (compared to somewhere around 9 billion today). Where is the blip for the Depression? Where is the spike for WW2? You say the climate doesn't respond that quickly? Then why did this rate start in 1910? Even in this graph, they make a statement that the current rate is unprecedented… yet 100 years ago there was an even faster rate of warming than today.
At least they are honest – no trend in the last 15 years or so. Perhaps the only question their graph raises is: so, you acknowledge no warming for the last 15 years then? Are the true believers blind to this?
Starting the graph at 1910 is also misleading. What does the graph do before that? My assertion is that *something* happened to get from mile-thick ice sheets (in much of North America) to today's relatively ice-free world. Yes, I acknowledge they have some proxy data that shows relatively gradual change over time… but what would a quick, 30-year 0.25 degree uptick in temperature look like in a proxy that is showing 12,000+ years? Would it even be visible?
The graph also makes a bald-faced lie. Very few "deniers" believe that the climate stays at one set temperature all the time. How many times do we have to repeat "yes, it is getting warmer"? The argument has never ever been whether it has been warming (depending on the start date). The argument has *always* been "how much are we responsible for?" They are liars, and should be called out as such.
These activists are only making a straw-man argument. Absolutely nobody will tell you that it isn't warmer today than it was 12,000 years ago during an ice age. Yes, there is a normal warming rate. Most of us will even admit that CO2 may have some small impact on temperature. They need to stick to the point: what they want to do is to drive civilization back at least 100 years (and cull 8 billion people), all based on an assertion that the rate of change will increase someday, even if it is flat today…

astateofdenmark

Do they have any given reason for not including the satellite data? Seems a significant oversight.

Theo Goodwin

Mike Bromley the Canucklehead back in Kurdistan but actually in Switzerland says:
January 31, 2013 at 9:16 am
“No. But NASA, and GISS, and that cheap phoney religious zealot Jim Hansen, (I’m sorry, but it really does come down to an ad hominem…he LIED with all the theatrics), have an interest, at every moment, to communicate hyperbole on the issue.”
Accusing someone of lying does not make your statement an ad hominem. In any case, most of us agree that he lied. It seems to me that he is a serial liar or deluded.

astateofdenmark

To answer my own question, seems not.

The problem with a graph like this is that they are using a bad metric. When you have a set of data like that (adjusted or unadjusted, although raw data is better), the metric should be the average of all observations – one number with a 95% C.I. Then calculate the deviation from "normal".
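For what it's worth, that metric is simple to compute; a minimal sketch with made-up observations (the reference "normal" here is arbitrary):

```python
import numpy as np
from scipy import stats

obs = np.array([14.2, 14.5, 13.9, 14.8, 14.1, 14.6, 14.3])   # made-up absolute readings, deg C
mean = obs.mean()
sem = stats.sem(obs)                                          # standard error of the mean
lo, hi = stats.t.interval(0.95, obs.size - 1, loc=mean, scale=sem)

normal = 14.0                                                 # whatever reference "normal" is chosen
print(f"mean = {mean:.2f} C, 95% CI = ({lo:.2f}, {hi:.2f}), deviation from normal = {mean - normal:+.2f} C")
```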

Theo Goodwin

davidmhoffer says:
January 31, 2013 at 9:31 am
“That said, my objection remains that averaging anomaly data is meaningless. An anomaly of 1 in the arctic represents a change in energy balance of less than one third the amount of an anomaly of 1 in the tropics. Averaging them together results in a meaningless number.”
Yes, using anomalies is a great way of side tracking a science. Maybe some day climate scientists will switch from anomalies to some measurable feature of the environment.

Werner Brozek

Japan has 1998 as the hottest year. As well, 4 of the data sets below agree with this, namely RSS, UAH, Hadcrut3 and Hadsst2. For further details as to how 2012 ended compared to the warmest year for each set, keep reading.
How 2012 Ended on Six Data Sets
Note the bolded numbers for each data set where the lower bolded number is the highest anomaly recorded in 2012 and the higher one is the all time record so far.

With the UAH anomaly for December at 0.202, the average for 2012 is (-0.134 -0.135 + 0.051 + 0.232 + 0.179 + 0.235 + 0.130 + 0.208 + 0.339 + 0.333 + 0.281 + 0.202)/12 = 0.161. This would rank 9th. 1998 was the warmest at 0.42. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2011 was 0.132 and it came in 10th.
With the GISS anomaly for December at 0.44, the average for 2012 is (0.36 + 0.39 + 0.49 + 0.60 + 0.70 + 0.59 + 0.51 + 0.57 + 0.66 + 0.70 + 0.68 + 0.44)/12 = 0.56. This would rank 9th. 2010 was the warmest at 0.66. The highest ever monthly anomaly was in January of 2007 when it reached 0.93. The anomaly in 2011 was 0.54 and it came in 10th.
With the Hadcrut3 anomaly for December at 0.233, the average for 2012 is (0.206 + 0.186 + 0.290 + 0.499 + 0.483 + 0.482 + 0.445 + 0.513 + 0.514 + 0.499 + 0.482 + 0.233)/12 = 0.403. This would rank 10th. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2011 was 0.340 and it came in 13th.
With the sea surface anomaly for December at 0.342, the average for the year is (0.203 + 0.230 + 0.241 + 0.292 + 0.339 + 0.352 + 0.385 + 0.440 + 0.449 + 0.432 + 0.399 + 0.342)/12 = 0.342. This would rank 8th. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. The anomaly in 2011 was 0.273 and it came in 13th.
With the RSS anomaly for December at 0.101, the average for the year is (-0.060 -0.123 + 0.071 + 0.330 + 0.231 + 0.337 + 0.290 + 0.255 + 0.383 + 0.294 + 0.195 + 0.101)/12 = 0.192. This would rank 11th. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2011 was 0.147 and it came in 13th.
With the Hadcrut4 anomaly for December at 0.269, the average for 2012 is (0.288 + 0.208 + 0.339 + 0.525 + 0.531 + 0.506 + 0.470 + 0.532 + 0.515 + 0.524 + 0.512 + 0.269)/12 = 0.436. This would rank 10th. 2010 was the warmest at 0.54. The highest ever monthly anomaly was in January of 2007 when it reached 0.818. The anomaly in 2011 was 0.399 and it came in 13th.
If you would like to see the above month to month changes illustrated graphically, see:
http://www.woodfortrees.org/plot/wti/from:2012/plot/gistemp/from:2012/plot/uah/from:2012/plot/rss/from:2012/plot/hadsst2gl/from:2012/plot/hadcrut4gl/from:2012/plot/hadcrut3gl/from:2012
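For anyone who wants to reproduce those annual figures, the arithmetic is just a twelve-month mean. A minimal sketch using the UAH monthly values quoted above (rounding conventions may differ slightly from the official rankings):

```python
# UAH monthly anomalies for 2012 as quoted above (deg C)
uah_2012 = [-0.134, -0.135, 0.051, 0.232, 0.179, 0.235,
            0.130, 0.208, 0.339, 0.333, 0.281, 0.202]

annual = sum(uah_2012) / len(uah_2012)
print(f"UAH 2012 annual mean anomaly: {annual:.3f} C")  # ~0.160
```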

Matthew W

Les Johnson says:
January 31, 2013 at 9:11 am
Click on Gavin’s name or picture…..
============================================
No thanks

RHS says:
January 31, 2013 at 9:27 am
Surely there has got to be a broader data set than NOAA. My vague understanding is their distribution is terribly limited with greater than 90% of their station in the CONUS. And even then the remaining are mostly spread across Pacific Islands which is not terribly global as it leaves out the Arctic, Antarctica, Europe, Asia, India, Australia, Africa, etc, etc.

Map of NCDC station locations; these should be only stations that have at least 240 days/year of data from 1950-2010.

thelastdemocrat

A thought for everyone: look at the graph again. There is no hockey stick.
The graph starts low and goes up.
Rhetorically, to use a temp graph to claim that things used to be at a normal level without influence by humans, then started going up with human influence, the flat part of the hockey stick is necessary.
Has the hockey stick become so unpopular that the rhetoric is carried on, but with the hope that the flat part of the hockey stick is no longer needed? Has the era of the hockey stick ended?
If you used this as a discussion point with a warmer cult member, you could have them declare that temps were stable, then humans influenced them to go up. Then, ask them to point out the stable period. There is none.

I think the right way to measure deviation from any kind of "normal" is against a moving 30-year normal.
It seems to me a year-by-year update is the right thing to do, but the minimum should be a decadal one.
I.e., the year 2012 should be referenced to either a 30-year normal of 1981 to 2011, or at least to 1980 to 2010.
Otherwise it seems apples and oranges to me.
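A moving 30-year normal like that is straightforward to compute; a minimal sketch with pandas, using made-up annual values and taking the 30 years ending the previous year as each year's "normal":

```python
import numpy as np
import pandas as pd

years = np.arange(1951, 2013)
rng = np.random.default_rng(3)
temp = pd.Series(14.0 + 0.01 * (years - 1951) + rng.normal(0, 0.1, years.size), index=years)

# "Normal" for each year = mean of the 30 preceding years (shift(1) excludes the year itself)
normal_30 = temp.shift(1).rolling(window=30, min_periods=30).mean()
anomaly_vs_moving_normal = temp - normal_30

print(round(anomaly_vs_moving_normal.loc[2012], 3))   # 2012 relative to the 1982-2011 mean
```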

Frank K.

Matthew W says:
January 31, 2013 at 10:18 am
Les Johnson says:
January 31, 2013 at 9:11 am
Click on Gavin’s name or picture…..
============================================
No thanks

I agree…I was about to, but then why give them any web traffic – not worth it. Anyway – isn't Gavin supposed to be working on proper documentation for Model E or something? Nahhh…too busy being a climate rock star…

This is actually hiding the disagreement. Include the satellite measurements, too, then we can really see the divergence.

My guess is that the same person who released the Climategate emails also convinced someone to use the Japanese data on this graph, thereby exposing GHCN as the hothead in the GAT crowd.

Jon

It’s pretty obvious that the climate has warmed over the past few decades … what is interesting is that this warming is strongly correlated with the rate of movement of the North Pole. See here: http://www.appinsys.com/globalwarming/earthmagneticfield.htm

son of mulder

Has anyone looked at the individual graphs on the absolute temperature scale and then overlaid them to see how they actually disagree with each other in absolute terms? I'd like to see such a graph, particularly as on the anomaly chart the Japanese data is somewhat lower than the other anomalies around 1900 as well.
Anomaly graphs allow the potential use of smoke and mirrors.

Taphonomic

“The GISS temperature analysis effort began around 1980, so the most recent 30 years was 1951-1980. It is also a period when many of today’s adults grew up, so it is a common reference that many people can remember.”
That appears to be either a rather blatant misstatement or severely outdated. Referring to pre-1980 as a period that many people can remember is humorous. To realize how funny this really is, one should consult the Beloit College Mindset lists. Each year since 1998 (i.e., for the class of 2002), Beloit College has published a list of things to which freshmen don't have a living, historical reference. They have published 14 more lists, updating them each year.
Below are links to the oldest, 1998, which would be for the class of 2002
http://www.beloit.edu/mindset/2002/
Examples:
The people starting college this fall across the nation were born in 1980.
They are too young to remember the Space Shuttle Challenger blowing up.
They never had a polio shot, and likely, do not know what it is.
Bottle caps have not always been screw off, but have always been plastic. They have no idea what a pull top can looks like. (on this particular one, I once had to explain the term “church key” to a young friend who didn’t know that there were cans before pull tops.)
Atari pre-dates them, as do vinyl albums.
and the most recent, 2012, which would be for the class of 2016.
http://www.beloit.edu/mindset/2016/
Examples:
For this generation of entering college students, born in 1994, Kurt Cobain, Jacqueline Kennedy Onassis, Richard Nixon and John Wayne Gacy have always been dead.
Benjamin Braddock, having given up both a career in plastics and a relationship with Mrs. Robinson, could be their grandfather.
Outdated icons with images of floppy discs for “save,” a telephone for “phone,” and a snail mail envelope for “mail” have oddly decorated their tablets and smart phone screens.
Star Wars has always been just a film, not a defense strategy.
They have had to incessantly remind their parents not to refer to their CDs and DVDs as “tapes.”
There have always been blue M&Ms, but no tan ones.

Bloke down the pub

In the graphic above, the Japanese data seems to end before the others, which all have an uptick.

Werner Brozek

From 1995, the Japanese curve is very similar to both Hadcrut3 and RSS as can be seen below. That is what “agreement” looks like.
http://www.woodfortrees.org/plot/hadcrut3gl/from:1995/plot/rss/from:1995

Steve McIntyre

UCAR ( http://climatedataguide.ucar.edu/guidance/sst-data-cobe-centennial-situ-observation-based-estimates ) says that, relative to HadSST, the Japanese "bias adjustments" are "somewhat primitive".

Graeme W

astateofdenmark says:
January 31, 2013 at 9:54 am
Do they have any given reason for not including the satellite data? Seems a significant oversight.

I can come up with an easy justification for not including the satellite data. The graph uses a baseline of 1951-1980. We don’t have satellite data for that period (only the very end of the period), so they wouldn’t have been able to calculate correct anomalies.
Of course, if they used the same baseline as the original Japanese data, they would have been able to include the satellite data for the relevant period…

David L.

I find it amazing that people think the accuracy and precision of a global temperature average is that good. Or that a global average means anything.

MikeB

All four records show peaks and valleys in sync with each other. All show rapid warming in the past few decades. All show the last decade has been the warmest on record.

Whoops. As I read "All four records show peaks and valleys in sync with each other. All show rapid warming in the past few decades. All show the last decade…", I expected the logical sequence to be "All show that there has been no further warming in the last decade". I expected that because, firstly, it is true, and secondly, it continues to describe how global temperatures have fluctuated. But instead, it says "the last decade has been the warmest on record".
This is what we call sophistry. That is, the statement is not in itself untrue, not an outright lie, but it is carefully designed to be deceptive. It is designed to draw attention away from the fact that temperatures have now stopped rising (and no climate model predicted that, and current theory cannot account for it!)
This is sophistry – i.e. dishonesty. Why do GISS feel it necessary to stoop to that? Does it help their case, do you think?
Over at James Delingpole's blog there is a quote from the famous James Lovelock, once the high priest of the global warming green movement. It says:

I am James Lovelock, scientist and author, known as the originator of Gaia theory, a view of the Earth that sees it as a self-regulating entity that keeps the surface environment always fit for life… I am an environmentalist and founder member of the Greens but I bow my head in shame at the thought that our original good intentions should have been so misunderstood and misapplied. We never intended a fundamentalist Green movement that rejected all energy sources other than renewable, nor did we expect the Greens to cast aside our priceless ecological heritage because of their failure to understand that the needs of the Earth are not separable from human needs. We need take care that the spinning windmills do not become like the statues on Easter Island, monuments of a failed civilisation.

Maybe one day, in the not too distant future GISS could also become honest and encompass such humility?

rw

Dear James Lovelock,
You forgot you were living in Salem.

If you use RSM (reference station method) or CAM (common anomaly method), and IF you base your data on GHCN or CLIMAT, then there are two periods you could use as base periods to MAXIMIZE the number of stations used:
1950-1981 or 1960-1991. Note that GISS, which uses RSM, and CRU, which uses CAM, each select one of these periods. The difference between these two periods is that one (1950-1981) gives you a few more stations in the NH, while 1960-1991 gives you a few more stations in the southern hemisphere.
When Japan selects the period they do, they also change the distribution and number of stations used in each hemisphere. So a good comparison will control for that spatial difference in sampling.
The right approach, pioneered by skeptic Jeff Id and statistician Roman M, just uses all the data without a base period. This approach was enhanced by Nick Stokes and Tamino, and finally improved upon by Berkeley. Bottom line, you don't need base periods, and more importantly, choosing a base period can alter your sampling (a toy sketch of this follows below).
On SST, Japan takes this route:
"3) An average is obtained for the values in 1) and 2) according to the land-to-ocean ratio for each grid box."
That approach is rather crude. If you simply weight according to land-ocean ratio, you are neglecting the issue that each portion of the grid may have more or fewer measures, which will over-weight land portions that are under-sampled and under-weight those that are over-sampled. CRU has a better method for calculating grids where land and ocean are part of the grid.
The better method, of course, is not to grid the world at all, or to grid at a finer level. Errr… just krige it.
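The point about the base period altering the sampling can be shown with a toy example: under a common-anomaly approach a station only contributes if it has enough data inside the chosen base period, so switching the period changes which stations survive. A minimal sketch with synthetic station records and a crude coverage rule (not any agency's actual criteria):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic station records: each station reports for a random span of years
def make_station():
    start = rng.integers(1900, 1990)
    end = rng.integers(start + 20, 2013)
    years = np.arange(start, end)
    return dict(zip(years, 14.0 + rng.normal(0, 0.5, years.size)))

stations = [make_station() for _ in range(500)]

def usable(station, base_start, base_end, min_years=25):
    """Common-anomaly rule of thumb: a station needs most of the base period present."""
    have = sum(base_start <= y <= base_end for y in station)
    return have >= min_years

for base in [(1951, 1980), (1961, 1990), (1981, 2010)]:
    n = sum(usable(s, *base) for s in stations)
    print(f"base period {base[0]}-{base[1]}: {n} of {len(stations)} stations usable")
```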

Reblogged this on acckkii.

thunderloon

I just can't see a baseline that avoids your dataset as responsible science. It's like putting all the bells and whistles AND a more powerful engine in the test-drive car but never mentioning it has 90 more HP and a better air conditioner.
Honestly, the entire output becomes apples and orangutans, two completely unrelated forms of data.

Let me get this straight: according to the boffins, the gas medium that we live in has warmed by about 1 °C since 1900?
I don’t know whether to laugh or cry at that one.

rgbatduke

Gee, they didn’t include the satellite data at all. Wonder why that is? It only has the best spatial and temporal coverage there is, it only cost a few billion to put those satellites up in space, but what the heck, what’s a few billion and the most accurate data we have compared to the fate of the world?
It would actually be very interesting to see just the post 1988 tail of this data plotted against the satellite data. The GISS and HADCRUT records, IIRC, are currently constrained by the fact that if they add any more artificial warming on at this point, the divergence from the satellites will be too great and will tip their hand. From the look of things — without much resolution at this scale — the Japanese data is much more in alignment with the satellites, and hence is actually moderately believable.
Your point on anomalies is also well made. But that is only one of the many aspects of lying with statistics in play at this point, and not the worst of them.
Looking back at the correction graph in the top article I am once again struck by the prospect of performing a statistical analysis of the corrections applied to the data, under a null hypothesis of fair and unbiased corrections that would move station readings up as often as it moves them down. After all, I can think of no good reason that readings of thermometers from long ago would be systematically biased in their errors — this violates ever so many principles of statistics. My prediction is that the p value of the existing correction set under the null hypothesis would be enough to convince any jury that they are not only biased, but openly and flagrantly biased (that is, a p value less than one in a million or thereabouts).
That actually sounds as though it would be worth a paper, or perhaps an addendum to the paper Anthony already has going on weather station siting and corrections.
rgb
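The test rgb describes is easy to set up: under unbiased corrections the sign of each net adjustment is a coin flip, so a simple binomial (sign) test applies. A minimal sketch with hypothetical counts (the real exercise would tally the actual GHCN/GISS adjustments; requires SciPy 1.7+ for binomtest):

```python
from scipy.stats import binomtest

# Hypothetical tally: out of 1000 stations with a net homogeneity adjustment,
# suppose 600 were adjusted in the direction that increases the warming trend
n_stations = 1000
n_warming = 600

result = binomtest(n_warming, n_stations, p=0.5, alternative='two-sided')
print(f"two-sided p-value under 'unbiased adjustments': {result.pvalue:.2e}")
# With the real counts the test is identical; only the numbers change.
```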

R Barker

If I wanted to know the history over a period of time of the average temperature of the surface air mass of the earth, what I really need to know is the heat content of that air mass over time relative to some reference point.

Ron

n'kay… fine… but what does this graph have to do with the 'A' in 'AGW'? Let's say, for the sake of argument, that all those pretty little fluctuations are natural, nothing out of the ordinary. What good is the pretty little graph and its sinister insinuation then?