Guest essay by Eric Worrall
NASA researcher Mark Richardson has completed a study comparing historical observations with climate model output, and has concluded that the historical observations have to be adjusted to reconcile them with the climate models.
The JPL Press Release:
A new NASA-led study finds that almost one-fifth of the global warming that has occurred in the past 150 years has been missed by historical records due to quirks in how global temperatures were recorded. The study explains why projections of future climate based solely on historical records estimate lower rates of warming than predictions from climate models.
The study applied the quirks in the historical records to climate model output and then performed the same calculations on both the models and the observations to make the first true apples-to-apples comparison of warming rates. With this modification, the models and observations largely agree on expected near-term global warming. The results were published in the journal Nature Climate Change. Mark Richardson of NASA’s Jet Propulsion Laboratory, Pasadena, California, is the lead author.
The Arctic is warming faster than the rest of Earth, but there are fewer historic temperature readings from there than from lower latitudes because it is so inaccessible. A data set with fewer Arctic temperature measurements naturally shows less warming than a climate model that fully represents the Arctic.
Because it isn’t possible to add more measurements from the past, the researchers instead set up the climate models to mimic the limited coverage in the historical records.
The new study also accounted for two other issues. First, the historical data mix air and water temperatures, whereas model results refer to air temperatures only. This quirk also skews the historical record toward the cool side, because water warms less than air. The final issue is that there was considerably more Arctic sea ice when temperature records began in the 1860s, and early observers recorded air temperatures over nearby land areas for the sea-ice-covered regions. As the ice melted, later observers switched to water temperatures instead. That also pushed down the reported temperature change.
Scientists have known about these quirks for some time, but this is the first study to calculate their impact. “They’re quite small on their own, but they add up in the same direction,” Richardson said. “We were surprised that they added up to such a big effect.”
These quirks hide around 19 percent of global air-temperature warming since the 1860s. That’s enough that calculations generated from historical records alone were cooler than about 90 percent of the results from the climate models that the Intergovernmental Panel on Climate Change (IPCC) uses for its authoritative assessment reports. In the apples-to-apples comparison, the historical temperature calculation was close to the middle of the range of calculations from the IPCC’s suite of models.
Any research that compares modeled and observed long-term temperature records could suffer from the same problems, Richardson said. “Researchers should be clear about how they use temperature records, to make sure that comparisons are fair. It had seemed like real-world data hinted that future global warming would be a bit less than models said. This mostly disappears in a fair comparison.”
NASA uses the vantage point of space to increase our understanding of our home planet, improve lives and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.
Read more: http://www.jpl.nasa.gov/news/news.php?feature=6576
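The coverage-masking step that the press release describes is easy to illustrate. The Python sketch below is not the authors' code (the grid, the toy warming field, and the function names are all invented for illustration); it simply computes an area-weighted global mean of a model-style temperature field twice, once over every grid cell and once over only the cells an observation-style record happens to cover. Because the undersampled high latitudes warm fastest in the toy field, the masked mean comes out lower, which is the effect the study quantifies.

```python
import numpy as np

def area_weights(lats_deg):
    """Cosine-of-latitude weights for a regular latitude grid."""
    return np.cos(np.deg2rad(lats_deg))

def masked_global_mean(field, coverage, lats_deg):
    """Area-weighted mean of a (lat x lon) anomaly field, using only
    the grid cells where the observational record has data."""
    w = area_weights(lats_deg)[:, None] * np.ones_like(field)
    w = np.where(coverage, w, 0.0)              # drop unsampled cells
    return np.sum(w * field) / np.sum(w)

# Toy example: anomalies that grow toward the poles, sampled with a
# coverage mask that thins out at high latitudes (as HadCRUT-style
# records do in the Arctic).
lats = np.linspace(-87.5, 87.5, 36)
field = 0.5 + 1.5 * (np.abs(lats)[:, None] / 90.0) * np.ones((36, 72))
rng = np.random.default_rng(0)
coverage = rng.random((36, 72)) < np.clip(1.2 - np.abs(lats)[:, None] / 90.0, 0.1, 1.0)

print("fully sampled mean:", round(masked_global_mean(field, np.ones((36, 72), bool), lats), 3))
print("coverage-masked mean:", round(masked_global_mean(field, coverage, lats), 3))
```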
The abstract of the study:
Reconciled climate response estimates from climate models and the energy budget of Earth
Climate risks increase with mean global temperature, so knowledge about the amount of future global warming should better inform risk assessments for policymakers. Expected near-term warming is encapsulated by the transient climate response (TCR), formally defined as the warming following 70 years of 1% per year increases in atmospheric CO2 concentration, by which point atmospheric CO2 has doubled. Studies based on Earth’s historical energy budget have typically estimated lower values of TCR than climate models, suggesting that some models could overestimate future warming [2]. However, energy-budget estimates rely on historical temperature records that are geographically incomplete and blend air temperatures over land and sea ice with water temperatures over open oceans. We show that there is no evidence that climate models overestimate TCR when their output is processed in the same way as the HadCRUT4 observation-based temperature record [3,4]. Models suggest that air-temperature warming is 24% greater than observed by HadCRUT4 over 1861–2009 because slower-warming regions are preferentially sampled and water warms less than air [5]. Correcting for these biases and accounting for wider uncertainties in radiative forcing based on recent evidence, we infer an observation-based best estimate for TCR of 1.66 °C, with a 5–95% range of 1.0–3.3 °C, consistent with the climate models considered in the IPCC 5th Assessment Report.
Read more (paywalled): http://www.nature.com/nclimate/journal/vaop/ncurrent/full/nclimate3066.html
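For anyone trying to square the abstract’s figures with the press release’s, a quick back-of-the-envelope check (mine, not taken from the paper) connects them:

```python
# TCR is defined at the point where CO2 has doubled under 1% per year growth:
print(1.01 ** 70)    # ~2.007, i.e. roughly a doubling after 70 years

# The abstract's "24% greater than observed" and the press release's
# "around 19 percent ... missed" are the same statement seen from opposite
# ends: if the full air-temperature warming is 1.24 times what HadCRUT4-style
# sampling reports, the fraction not captured is 0.24 / 1.24.
print(0.24 / 1.24)   # ~0.19
```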
Frankly, I don’t know why the NASA team persists in trying to justify their increasingly ridiculous adjustments to real-world observations – they seem to be receiving all the information they think they need from their computer models.
I’m not a scientist, just an obsolete, retired, old engineer, but it really irritates me to see the amount of ‘data torturing’ and plain making it up that goes on in the climate “science” establishment. I’ve worked with plenty of data in real life, and we never tried to get away with what’s acceptable in “government work” these days.
If I’d used these techniques in Professor A.D. Moore’s electrical design course back in the late ’50s, he’d probably have thrown me out on my ear.
Still waiting for the change-in-human-height adjustment… That would be where, considering people were shorter, they previously over-read the temperature, so the past was really cooler. Today people are taller, therefore they under-read the temperature relative to the past, so today it is actually warmer than the recorded temperatures. Simple logic…
Steve. I have wondered about that parallax error too. I am sure that the parallax error is nothing compared to the errors due to made-up numbers by the folks who were supposed to go out in the rain, cold and heat to read those thermometers but didn’t bother. When I worked for the government it was called pencil whipping. Of course I never did it. 🙂
Variations on that theme are “kitchen table research” and “drive-by estimating”.
I believe the Data Protection Act makes it a criminal offense for government employees to adjust that data. We need an attorney general who will start prosecuting under that act. We can start with the climate adjusters and move on to the EPA. They have all been guilty of changing the collected data to more closely match their video games and need to go to jail for it, particularly any that overwrite the original data in the process! If the original data is still available they have an out, but then their “products” need to be called something other than data, because none of it was ever observed.
Perhaps there was no INTENT to be deceptive. (The now famous Hillary defense.)
Heh.. It all just happened by accident.. 😉
I had no INTENT to rob that bank! I just happened to do it, is all. So, I get to walk free, right?
If it’s good enough for Hillary, it should be good enough for everyone else. We can save billions on new prison construction.
However, the “Precautionary Principle”, of which GW alarmists are so fond, would suggest building more prisons. 😉
I guess the next step will be to tell us that we don’t need no stinking thermometers or other ways of measuring temperatures because the models will tell us not only what the temp has been in the past but will tell us what they will be. Perhaps they’ll outlaw thermometers like they did incandescent light bulbs and toilets that use over a certain amount of water to flush.
Data? We don’t need no stinkin’ data, we have models…
“The data doesn’t matter. We’re not basing our recommendations on the data. We’re basing them on the climate models.”
~ Prof. Chris Folland ~ (Hadley Centre for Climate Prediction and Research)
RAH —
Michael Mann has said that we need only stick our heads out the window to know that climate change is real and endangering mankind — or something to that effect. His words may have been slightly different from mine, but the meaning I convey is certainly his.
Eugene WR Gallun
Is ‘out the window’ a euphemism for ‘up our аѕѕ’?
Apparently, some of these fancy water-saving toilets are a little too fancy. In a government building that shall remain nameless, the toilets are equipped with special sensors that figure out how much water to use for a flush for every individual “use”. These are powered with electricity, and apparently have no manual flushing mechanism, or at least no external button anyone can find.
Thus, if there is a power failure, the toilets cannot be flushed. Insert your own federal government joke here.
Why don’t they just use the models to tell what the temperature is?
In actual fact, they have been using computer models to determine missing data for surface temperatures for some time now.
By “determine” missing data, I assume you really mean “make up”?
I’ve seen references to this elsewhere, but the practice really worries me. If your algorithm needs a complete 100% data set and you don’t have it, you don’t create false data to fill in the gaps. You find a different algorithm that can cope with gaps.
Anything else and all you’re doing is running statistics on made-up data.
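The commenter’s point about gap-tolerant methods can be made concrete. This is a minimal sketch with synthetic data and an invented helper name (nothing here comes from any real temperature product): fit a warming trend over only the years that were actually observed, leaving the gaps as gaps instead of infilling them.

```python
import numpy as np

def trend_ignoring_gaps(years, values):
    """Least-squares trend (degrees per decade) that simply skips
    missing values (NaN) instead of infilling them."""
    years = np.asarray(years, dtype=float)
    values = np.asarray(values, dtype=float)
    ok = ~np.isnan(values)                       # keep only real observations
    slope, _ = np.polyfit(years[ok], values[ok], 1)
    return slope * 10.0                          # per decade

# Toy series with a few missing years left as NaN rather than invented.
years = np.arange(1950, 2020)
rng = np.random.default_rng(1)
temps = 0.015 * (years - 1950) + rng.normal(0.0, 0.1, years.size)
temps[[5, 6, 30, 31, 32]] = np.nan               # gaps stay gaps

print(f"trend: {trend_ignoring_gaps(years, temps):.3f} C/decade")
```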
The Met have been forecasting with models (badly) for some time now. The accuracy of the forecasts has gone down the toilet in recent years as the models “improve”.
Haven’t there been sci-fi stories about folks worshiping computers? These folks are worshiping a computer program.
Australia’s coastal desalination plants were built due to modelling which indicated permanent rain deficit. Nobody bothered about the records which showed Sydney’s driest year being 1888, the continent’s driest year likely being 1902 (but there’s talk that it was 1838 when even the ‘bidgee dried up) and the driest decade being the 1930s.
These unused desals cost Australian ratepayers up to 2 million per day. The Melbourne one suffered serious construction delays due to…
Can you guess?
It’s true that Eastern Australia has endured half a century of rainfall deficit. The catch is that it was from circa 1895 to circa 1947. This was achieved without any retro-fitting by non-Kardashian models.
So, even though my local temperature forecast is calling for cooler temperatures later this week, it is actually going to be hotter? I guess it’s time to move to the arctic so I can experience nice weather all year around and be one of the last breeding pairs of humans. 🙂
Orwell nailed it: “Those who control the present control the past — those who the past control the future.”
You have committed a thoughtcrime (and missed a control).
So, in summary, computer models will be used to determine what past temperatures should have been so that the computer models projecting the future will be more in sync with the past records. Well, since 1984 has arrived 32 years too late, I think NASA is finally on to something for once. It sure beats sending a team to Mars – way too much effort there; let’s use the computer to change past temperatures instead – great!
“It sure beats sending a team to Mars…”
Actually, sending humans to Mars is very easy. All you do is MODEL a manned expedition and — surprise! — it turns out the Martians destroyed their own planet via global warming and the last breeding pair died at the Martian south pole! 🙂
Much of our history has been modified or is currently being modified. Why omit temperature from this leftist exercise? Even my fifty-year-old unabridged Random House dictionary has become a quaint curiosity.
Now that Senator Cruz is “relieved” of any duty to support the Republican presidential candidate, perhaps he could spend his energy looking into this remarkable piece of “scientific” work. Completion and publication date: 19 January 2017 at the latest.
Love that cartoon: Mann is pulling down on one end of the chart and pushing the other end higher. Love it!
“The fault, dear Brutus, lies not in our models, but in our data”.
So there’s a new variable called “The Quirk”. Fascinating. Lol.
Does this mean that the reduced warming rate of the 21st century has now been predicted accurately?
It would save a lot of time and money if they would just calculate the historical global temperature for as far back as needed. I’ll join the consensus that proclaims the calculated temperatures to be accurate. Then I’ll bet that in ten more years, the GCMs will still be out of whack.
There is a very good answer/analysis to this paper by N. Lewis, published on J. Curry’s blog:
https://judithcurry.com/2016/07/12/are-energy-budget-climate-sensitivity-values-biased-low/
(also reblogged at S. McIntyre’s climate audit: https://climateaudit.org/2016/07/12/are-energy-budget-tcr-estimates-biased-low-as-richardson-et-al-2016-claim/ )
Also some of the comments there are very well worth the time reading them,
LoN
“A new NASA-led study finds that almost one-fifth of the global warming that has occurred in the past 150 years has been missed by historical records due to quirks in how global temperatures were recorded.”
Historical records? I wonder which bastardized dataset of historical surface temperatures they used for this study?
How do you do an accurate study when you start out with bastardized surface temperature data?
The results of the study agree with previous bastardized surface temperature data. What does that tell you?
The HadCRUT3-to-HadCRUT4 transition (here: http://www.colderside.com/Colderside/HadCRUT4.html) actually brought about 400 more mini-Urban Heat Islands into the fold, and they hindcasted them as well, thus mathematically “translating” the entire data set upward.
It still wasn’t enough, so now they are adjusting “real” readings to the overheated models.
I guess Anthony could call this post “Sunday Humor” but I’m not laughing!
The pre-satellite-era temperature of the Arctic is poorly known. There is historical evidence that, at various times, it was as warm as it is today. There are reports of ice extent that mirror modern data. Here’s an example from the 1930s.
I’m guessing that the adjusted temperatures won’t agree with the, admittedly sparse, historical record.
Another thing … what about the Antarctic temperatures? Shouldn’t they be fully represented too? They don’t seem to change much. They might go a long way toward offsetting the more dramatic Arctic temperature swings.
Bernie Madoff went to jail for adjusting return rates on stocks…
Yeah, and Bernie just swindled his victims out of a few billion dollars, whereas this CAGW hoax is swindling the U.S. taxpayers and taxpayers around the world out of TRILLIONS of dollars.
Gee whiz. Obvious malfeasance within a government agency. Somebody should turn this over to the Justice Dept. (Just not this Justice Dept.)
If historical data mixed water and air temps, which “also skews the historical record toward the cool side”, don’t they need to “adjust” the historical temperature record up to make it more accurate? Wouldn’t this reduce the amount of warming from then to now?
And to think, JPL made it all the way to Mars. Must be a few good scientists there, anyway.
Climate Computer Models are NOT science. They only do what they are programmed to do. They are computer programmers wet-dreams designed to show what their bosses demand. NOT SCIENCE, NOT REAL, NOT USEFUL, and WORTHLESS, particularly when used as a proxy for the real world. Where is that copy of the RICO rules?
So from 1939 ’til 1950, were these “Global Cooling Observations”?
Or have the “Tamperature” (thanks M’lord) adjustments thus far already eliminated that inconvenient truth?