Besting the BEST surface temperature record

Guest essay by Patrick J. Michaels and Ryan Maue, Center for the Study of Science, Cato Institute

JRA-55—BETTER THAN THE BEST GLOBAL SURFACE TEMPERATURE HISTORY, AND COOLER THAN THE REST.

Let’s face it, global surface temperature histories measured by thermometers are a mess. Recording stations come on- and offline seemingly at random. The time of day when the high and low temperatures for the previous 24 hours are recorded varies, often changing at the same station. Local conditions can bias temperatures. And the “urban heat island” can artificially warm readings in places with populations as low as 2,500. Neighboring reporting stations can diverge significantly from each other.

The list goes on. Historically, temperatures have been recorded by mercury-in-glass thermometers housed in a ventilated white box. But, especially in poorer countries, there’s little financial incentive to keep these boxes the right white, so they may darken over time. That’s guaranteed to make the thermometers read hotter than the air actually is. And the transition from glass to electronic thermometers has hardly been uniform.

Some of these problems are accounted for, resulting in some dramatic alterations of original climate records (see here for the highly cited New York Central Park adjustments), via a process called (love this word) homogenization. Others, like the problem of station darkening, are not accounted for, even though there’s pretty good evidence that it is artificially warming temperatures in poor tropical nations.


Figure 1. Difference between satellite-measured and ground-measured trends. Artificial warming is largest in the poor regions of Africa and South America. (Source: Figure 4 in McKitrick and Michaels, 2007).

There are multiple “global” temperature histories out there, but they all look pretty much the same because they all run into the problems noted above; while the applied solutions may differ slightly, the differences aren’t enough to make the records look very different. The most recent, from Berkeley Earth (originally the Berkeley Earth Surface Temperature (BEST) record), is noteworthy because it was generated from scratch (the raw data), but like all the others (all using the same data) it shows warming since 1979 (the dawn of the satellite-sensed temperature era) of around 0.18°C/decade. (Computer models, on average, say it should have been warming at around 0.25°C/decade.)

They all have a problem with temperatures over the Arctic Ocean as there’s not much data. A recent fad has been to extend the land-based data out over the ocean, but that’s very problematic as a mixed ice-water ocean should have a boundary temperature of around freezing, while the land stations can heat up way above that. This extension is in no small part responsible for a recent jump in the global surface average.

It would sure be desirable to have a global surface temperature record that suffered from none of the systematic problems noted above, and—to boot—would be measured by electronic thermometers precisely calibrated every time they were read.

Such a dream exists, in the JRA-55 dataset. The acronym refers to the Japan Meteorological Agency’s (originally) 55-year “reanalysis” data, and it updates to yesterday.

Here’s how it works. Meteorologists around the world need a simultaneous three-dimensional “snapshot” of the earth’s physical atmosphere upon which to base the forecast for the next ten to sixteen days. So, twice a day, at 0000 and 1200 Greenwich Mean Time (GMT) (1900 and 0700 EST), weather balloons are released, sensing temperature, pressure, and moisture, and are tracked to determine the wind. There’s also satellite “profile” data in the mix, but obviously that wasn’t the case when JRA-55 began, in 1958. These are then chucked into national (or private) computers that run the various weather forecast models, and the initial “analysis”, which is a three-dimensional map based upon the balloon data, provides a starting point for the weather forecast models.

Once the analyzed data had served its forecasting purpose, it was largely forgotten, until it dawned upon people that this was really good data. And so there have been a number of what are now called “reanalysis” datasets. The most recent, and the most scientifically complete, is JRA-55. In a recent paper describing, in incredible detail, how it works, the authors conclude that it is more reliable than any of the previous versions, whether produced by the Japan Meteorological Agency or elsewhere.

Remember: the thermistors are calibrated at the release point, they are all launched at the same time, there’s no white box to get dirty, and the launch sites are largely in the same place. They aren’t subject to hokey homogenizations. And the reanalysis data has no gaps, using the laws of physics and a high-resolution numerical weather prediction model that generates physically realistic Arctic temperatures, rather than the statistical machinations used in the land-based histories that inflate warming over the Arctic Ocean.

There is one possible confounding factor in that some of the launch sites are pretty close to built-up areas, or are in locations (airports) that tend to attract new infrastructure. That should mean that any warming in them is likely to be a (very slight) overestimate.

And so here are the JRA-55 surface temperature departures from the 1981-2010 average:


Figure 2. Monthly JRA-55 data beginning in January 1979, which marks the start of the satellite-sensed temperature record. The average warming rate is 0.10°C/decade and there’s a clear “pause” between the late 1990s and the beginning of the recent El Niño.

The warming rate in JRA-55 until the 2015-16 El Niño is 0.10°C/decade, or about 40% of what has been forecast for the era by the average of the UN’s 106 climate model realizations. There’s no reason to think this is going to change much in coming decades, so it’s time to scale back the forecast warming for this century from the UN’s models—which is around 2.2°C using an emissions scenario reflecting the natural gas revolution. Using straight math, that would cut 21st-century warming to around 0.9°C. Based upon literature detailed elsewhere, that seems a bit low (and it also depends upon widespread substitution of natural gas for coal-based electricity).
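A minimal sketch of the trend arithmetic above, using NumPy on a synthetic monthly series (not the actual JRA-55 data), shows how a per-decade rate and the "straight math" scaling are computed:

```python
import numpy as np

def trend_per_decade(monthly_anomalies):
    """Ordinary least-squares slope of a monthly series, in degrees per decade."""
    t_years = np.arange(len(monthly_anomalies)) / 12.0  # time axis in years
    slope_per_year, _ = np.polyfit(t_years, monthly_anomalies, 1)
    return slope_per_year * 10.0

# Synthetic stand-in: 37 years of monthly anomalies warming at 0.10 C/decade plus noise
rng = np.random.default_rng(0)
t = np.arange(37 * 12) / 12.0
series = 0.010 * t + rng.normal(0.0, 0.1, t.size)
print(trend_per_decade(series))   # close to 0.10

# "Straight math" scaling: model forecast times the observed/modelled trend ratio
print(2.2 * (0.10 / 0.25))        # about 0.88, i.e. "around 0.9"
```

The essay's 0.10°C/decade figure is this kind of least-squares slope, applied to the real monthly series.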

JRA-55 also has a rather obvious “pause” between the late 1990s and 2014, contrary to recent reports.

The fact of the matter is that what should be the most physically realistic measure of global average surface temperature is also our coolest.

November 24, 2017 6:00 am

Hooray for real data!

November 24, 2017 6:24 am

Interesting, but deja vu.

There is no balance on latitude, never mind longitude and height of measurement.
If you put the wrong results in, how can you expect the right results?
The long term average looks more or less like mine, around +0.1 K/decade, but there has already been a turning point that you would have picked up if you had looked at minima or maxima. Unfortunately, earth’s inner core has also been moving, due to the magnetic stirrer effect, which is confusing everyone by showing an unchanging, or even increasing Tmean, due to the unbalanced global reporting.
By me, there has been no warming here where I live, or even in the whole of the SH.

November 24, 2017 8:14 am

As far as I know there is nothing special about JRA-55, as it uses almost exactly the same real data as ERA (ECMWF) and NCEP/NCAR. Differences come mainly from the model, that in JRA and ERA is 4D, and NCEP/NCAR is 3D. Americans are falling behind in this 😉

I personally prefer reanalysis data to satellite or surface data. I know it is only intended for forecasting, but this is a critical mission as lives depend on it. Deviations over time due to inhomogeneities in the case of reanalysis are most likely going to average out, unlike in methods where a human hand tips the scale. And if there is a drift over time, it is likely to be both much smaller than in adjusted data and self-correcting, as data gathering improves with time.

And interestingly, all three reanalysis datasets say essentially the same thing (see the figures in the article’s linked paper). This is a huge improvement, given that we currently have two satellite datasets disagreeing, and multiple surface datasets disagreeing.

A lot of climate researchers are moving over to reanalysis data because it has the big advantage of being stable. You don’t want to base your research on a database that is significantly changed every Tuesday, as is the case with GISS. In five years your article and research are worthless, together with their conclusions. GISS has managed to become just a figure provider for alarmist media reports. Too expensive for that. It should be discontinued. The future is in reanalysis. Satellite data is still required for reanalysis.

Reply to  Javier
November 24, 2017 10:23 am

Javier said:
“The future is in reanalysis. Satellite data is still required for reanalysis.”

I disagree:
The future is to stop studying tenths-of-a-degree changes in average temperature.
They are harmless and meaningless.

and focus on real environmental issues: Gross land, air and water pollution in China,
India and other Asian nations.
Real pollution is harmful and important.

The time to cut taxpayer funded and counter productive “climate research”
by at least 90%, is NOW !

The warmunists spend their time spinning wild tales of a coming climate crisis,
while skeptics are huddled together, heads down, debating and re-debating
tenths of a degree differences in average temperature data.

The big picture is CO2 does not control the average temperature
and runaway global warming is a fairy tale — skeptics should not get
bogged down in surface data where the “fix” is in — the data are compiled
by smarmy government bureaucrat climate modelers who can’t be trusted
and repeatedly “adjusted” to show more global warming.

Temperature measurements would be important ONLY if there was a real climate problem today.

But there is no climate problem today — the climate is wonderful in 2017, and
has been getting better for humans, animals and green plants for at least 500 years!

Mike Osborne
November 24, 2017 9:08 am

Has anyone tried a simple cross plot of CO2 data against the JRA-55 data? If there is a strong correlation, it will be obvious. Last time I tried this with published data the correlation coefficient was a bit better than noise.

Reply to  Mike Osborne
November 24, 2017 5:20 pm

correlation? wrong approach guy
https://www.nature.com/articles/nclimate2568

crackers345
Reply to  Mike Osborne
November 25, 2017 2:14 am

if anything, it should be a plot of
ln(CO2) vs GMST….
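Both suggestions in this exchange, a simple cross plot with a correlation coefficient and a regression against ln(CO2) (CO2's radiative forcing is roughly logarithmic in concentration), can be sketched in a few lines. All series below are synthetic illustrations, not actual JRA-55 or CO2 records:

```python
import numpy as np

# Hypothetical annual series, 1979-2016: CO2-like concentrations (ppm)
# and a temperature anomaly with a small linear trend plus noise.
years = np.arange(1979, 2017)
co2 = 337.0 + 1.8 * (years - 1979)
rng = np.random.default_rng(1)
temp = 0.010 * (years - 1979) + rng.normal(0.0, 0.08, years.size)

# Pearson correlation coefficient of the raw cross plot
r = np.corrcoef(co2, temp)[0, 1]

# Regress temperature on ln(CO2/CO2_0); the slope times ln(2) is the
# warming per doubling implied by this (synthetic) data.
slope, intercept = np.polyfit(np.log(co2 / co2[0]), temp, 1)
per_doubling = slope * np.log(2.0)
print(round(r, 2), round(per_doubling, 2))
```

With real series the interesting part is comparing the fitted slope and its scatter, not just the correlation coefficient.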

November 24, 2017 10:31 am

” Local conditions can bias temperatures. And the “urban heat island” can artificially warm readings with population levels as low as 2500. Neighboring reporting stations can diverge significantly from each other.”

Reanalysis ingests data from a wide variety of sources. Especially sources that we do not use in surface products. Some examples:

1. Networks of sensors located by roads provided by departments of transportation.
2. Networks in urban areas – Urbannet. This data is closed to the public.
3. Networks set up by railroads in the US
4. High school networks.
5. Private industry data that is likewise not available to the public

In climate studies we avoid this data because it’s not available to the public to be checked. Also, it is overly urban. In reanalysis they don’t care about UHI, because they are trying to predict the weather in urban areas. In climate studies we either remove urban stations or we adjust the data to account for detectable UHI influence.

next

“The fact of the matter is that what should be the most physically realistic measure of global average surface temperature is also our coolest.”

1. The way the reanalysis is VERIFIED by the modelers is by comparing it to HADCRUT 4. That is, the people who build these systems use actual observations to validate their models. That tells you what they trust.

2. The Reanalysis uses AGW physics. This means if you accept the output, you are logically bound to accept the inputs and accept the physics used (radiative transfer). You cannot logically reject AGW science on one hand and accept the output of reanalysis models that use this physics on the other.

3. Over land it is WARMER than Berkeley Earth. Over the ocean it is cooler. And the reason for that is they do not make the required adjustments to the SST products they ingest.

4. It is not the most physically realistic measure. There are other reanalysis products with comparable physics. A good analyst compares all the products and explains the differences. In the end, differences between products are an estimate of the STRUCTURAL uncertainty.

If this is the best the red team has, you’d better go back to school. You see, long ago Judith Curry suggested that reanalysis was the best source of information. Ya think maybe some of us went off and looked at every reanalysis product? Ya think? Personally I’ve been studying this reanalysis data and other datasets since 2014 for my business. Comparing the global numbers is just the start. After that you have to dive into the inputs; you have to see where the products differ and explain why.

Or… you could see a conclusion you like (JRA is cool) and simply declare it the best because it fits your agenda.

Reply to  Steven Mosher
November 24, 2017 11:11 am

Masher:
Average temperature changes a few tenths of a degree from some prior base period are meaningless data, even if the data were perfectly accurate.

No one knows what average temperature is “normal”.

We don’t have real time measurements for 99.999% of Earth’s past.

Measurements before 1940 had little data from the Southern Hemisphere.

Up to half of surface grids even now are filled in with wild guess “infilling”.

Historical data are repeatedly “adjusted”, almost always “cooling the past”
or warming recent decades, in an attempt to create more global warming out of thin air.

Leftist scientists have no idea of the exact causes of climate change,
yet they create “models” of a subject they don’t understand,
simply to make their wrong, scary predictions of the future climate
seem more believable!

30 years of grossly inaccurate computer game predictions
have made government bureaucrat climate modelers
look like the clueless fools they are,
and they have given real science a bad name.

My analysis of the climate for the past 20 years resulted in
the conclusion that today’s climate is wonderful, except for pollution
in Asia that environmentalists ignore, and the best thing humans have
ever done for the climate, although inadvertently, was adding more CO2.

I support a doubling or tripling of today’s levels to optimize C3 plant growth.

The optimum climate for our planet is a climate that produces the most food from plants —
our current CO2 level of 400 ppm is not far from the bottom of the 200 to 8,000 ppm range
estimated by geologists, and far below the estimated 1,000 ppm average CO2 under which
the C3 plants were evolving.

So Masher, you keep your head down analyzing and re-analyzing
meaningless surface temperature data, and distracting skeptics from
the big picture — the climate is wonderful and has been getting better for
hundreds of years, helped by adding man-made CO2 to the air.

Sorry most of this post is over your head, Masher — get back to your precious surface temperature numbers and ignore the air, water and land pollution in Asia … even though some of the air pollution reached California — I’m assuming you are from California.

I just checked Berkeley Earth for S. Masher and found your picture:
Didn’t your mama ever tell you to comb your hair before they take a picture Masher !

henryp
Reply to  Richard Greene
November 24, 2017 1:16 pm

OK. Masher?

crackers345
Reply to  Richard Greene
November 24, 2017 5:11 pm

Richard Greene commented –
“Average temperature changes a few tenths of a degree from some prior base period are meaningless data, even if the data were perfectly accurate.”

why?

“No one knows what average temperature is “normal”.”

there is no “normal” temperature — only
the temperature
that we and all other
species have adapted to.
readapting can be
difficult, as history
has shown, especially
at the huge current
rate at which
climate is
changing

Reply to  Richard Greene
November 24, 2017 5:13 pm
Reply to  Richard Greene
November 24, 2017 5:16 pm

“Sorry most of this post is over your head, Masher — get back to your precious surface temperature numbers and ignore the air, water and land pollution in Asia … even though some of the air pollution reached California — I’m assuming you are from California.”
Nope: Born in Michigan, Lived in Chicago, LA, Sunnyvale, SF,
Now I live in Beijing and Seoul and visit SF.

Reply to  Richard Greene
November 24, 2017 5:17 pm

“Historical data are repeatedly “adjusted”, almost always “cooling the past”
or warming recent decades, in an attempt to create more global warming out of thin air.”
err no.
the raw data is WARMER before adjustments.

crackers345
Reply to  Richard Greene
November 25, 2017 1:44 am

Richard, SM has earned the right
to be called by his real name

JasG
Reply to  Steven Mosher
November 27, 2017 5:49 am

Some qualification required to dispel the rank disinformation…
1. They use actual observations to calibrate, not verify. The reconstruction of HadCRUT is not used – only individual temps of known quality, i.e. those that did not need adjustments.
2. Accepting radiative transfer gets you 1 degree of warming per doubling of CO2, which is not scary and more likely to be beneficial. Rejecting any value above that is not rejecting any science – it is rejecting models known to be inadequate.
3. All SSTs have huge error margins, and different datasets overlap. No researcher in this area trusts them before 2005, nor is there enough data to trust them below 700 m. Ask Josh Willis. After 2005 temps are far from scary, and even had to be adjusted so they did not show cooling – because the scary models were believed more than the data.
4. A good analyst would not reject what is likely to give the best data – the satellites. Again this is because they are properly calibrated – unlike surface data – and after such calibration they give the best coverage by far. When your dataset cannot match the satellites, therefore, it is very likely wrong. Every other measure of climate change, from global greening to sea level, relies on satellite data. You don’t get the best answer from averaging reconstructions that all use the same flawed data and usually similar homogenisations; rather, you should pick the one most likely to be accurate, and then, to extend it, try to find another that overlaps it in that most trusted period.

All arguments against satellites stem from a belief that there should be more warming seen because inadequate models show there should be more. It is confirmation bias writ large! The model that best shows reality (the Russian) has low CO2 sensitivity and low water vapour feedback. But such a low warming projection is not scary and so would neither affect energy policy nor maintain the huge level of research funding into such a non-event.

crackers345
Reply to  JasG
November 27, 2017 9:12 am

JasG commented – “All arguments against satellites stem from a belief that there should be more warming seen because inadequate models show there should be more. It is confirmation bias writ large!”

you’re assuming the “observed” data
are always right. but it comes from a model
too, and the history of science shows
data models are not always right – they
must be scrutinized just as much as
the theoretical models.
even the history of climate science shows
this, with the uah sign-error debacle,
and the large difference
now between uah’s and rss’s model
results for the lower troposphere.

Dr. Deanster
November 24, 2017 2:18 pm

Question ….. given that the reading on a thermometer is NEVER a direct result of all that fancy mathematical radiative physics, but rather is the result of the direction from which the wind blows, how can anyone put any stock into the validity of any of the measurements?

crackers345
Reply to  Dr. Deanster
November 25, 2017 1:40 am

huh. here i thought the reading on
a thermometer i read is the temperature
of the air around it.

no?

1sky1
November 24, 2017 3:03 pm

The warming rate in JRA-55 until the 2015-16 El Niño is 0.10°C/decade, or about 40% of what has been forecast for the era by the average of the UN’s 106 climate model realizations. There’s no reason to think this is going to change much in coming decades…

Lacking unequivocal specification of the time-interval over which “the warming rate” is estimated, this claim is surprisingly unscientific. The clear reason for thinking that the decadal rate will change markedly is the presence of strong multi-decadal and longer oscillations in global temperatures, quite independent of ENSO and other intra-decadal changes.

1sky1
November 24, 2017 3:05 pm

The second paragraph is not a quotation and should not be italicized.

November 24, 2017 4:28 pm

So, we don’t really know the global temperature.
The satellites are our best “thermometer” for a global temperature. And the data they can provide us is limited and young.
Surface station and sea data cannot give a true reflection of “GLOBAL” temperature no matter how many computer mirrors it is run through.
Satellites are the “BEST” we have for a Global temperature.
Too bad they don’t support the IPCC’s political meme.
Drs’ Roy Spencer and John Christy could be rich!
(And “Climate Science” might not need to be put in quotes anymore.)

crackers345
Reply to  Gunga Din
November 24, 2017 5:01 pm

Gunga Din commented – “The satellites are out best “thermometer” for a global temperature.”

why do you
think that?

do you know uah and rss are
now trying to calibrate
over about 11 different
satellites (i think it was, last
time I looked – maybe more
now).

Reply to  crackers345
November 24, 2017 5:18 pm

GOSH!
Then I guess we don’t know the “Global” temperature at all!
Then what are we acting on?

crackers345
Reply to  crackers345
November 24, 2017 5:33 pm

no
variable can be measured with
perfect accuracy. we can only
do the best we can
with the
data we got.

hence calibrations,
adjustments, homogenization,
etc.

but it’s clear the planet is
warming, because lots of
ice is melting and the
ocean is warming.

JasG
Reply to  crackers345
November 27, 2017 5:17 am

Every thermometer is calibrated against known temperatures. That’s how you make it accurate. Calibration is not homogenization.

crackers345
November 24, 2017 4:40 pm

the authors wrote – “Let’s face it, global surface temperature histories measured by thermometers are a mess.”

this is a huge claim that belongs in the peer reviewed literature, not a blog. the authors are scientists and know this very
well.
here they are attempting
to skirt the
peer reviewed literature
and make statements about
the science without doing the
hard work
necessary to justify those
statements.

they know this, and they
know professional scientists
are not going to give this blog
post a moment’s notice.

this is only about PR
and preaching to the
choir

Reply to  crackers345
November 24, 2017 5:06 pm

OH PLEASE!
If global surface temperature histories measured by thermometers are NOT a mess, then why did they need to be “adjusted” and “BESTed”?
You and those like you are the PR.

Reply to  Gunga Din
November 24, 2017 5:07 pm

DAMN!
I forgot to mention Treemometers!
(Sorry)

Reply to  Gunga Din
November 24, 2017 5:11 pm

when you mention treemometers you have to credit me since I invented the term.
You don’t HAVE to adjust the temperatures.
If you DON’T adjust them, there is MORE warming.
If we don’t adjust them, they are warmer than the gold-standard CRN stations.

Nick Stokes
Reply to  Gunga Din
November 24, 2017 5:37 pm

But you’ve picked up
the style.

crackers345
Reply to  Gunga Din
November 24, 2017 5:42 pm

Gunga Din – the raw data
needs to be adjusted because,
over the course of its
measurement, it has been
measured at different times,
in different places, with different
instruments.

in other words, biases
must be eliminated.

how would you prefer
to eliminate these biases in
the best possible way?

crackers345
Reply to  Gunga Din
November 24, 2017 5:42 pm

Gunga Din – bias
adjustments reduce the
long-term warming
trend.

would you prefer it to
be higher?

Reply to  Gunga Din
November 24, 2017 6:15 pm

“when you mention treemometers you have to credit me since I invented the term.”

Really you invented the term? Yet I cannot find a reference to you using it that is older than these:

1. https://wattsupwiththat.com/2007/10/27/helio-la-nina-and-bad-winters-now-nuts/
2. http://blogmasterg.com/2007/01/26/0-degrees/

crackers345
Reply to  Gunga Din
November 24, 2017 6:36 pm

Poptech – are you associated with
the annual conference in Maine? Nice
conference. Do
they know you’re commenting here
under their name?

SM did much of the
programming for BEST.

he knows this stuff better
than anyone else here, i’m
almost certain….

Reply to  Gunga Din
November 25, 2017 2:11 pm

Steven Mosher November 24, 2017 at 5:11 pm
when you mention treemometers you have to credit me since I invented the term.

Credits to you for a good word to describe the root (I would have said “roots,” but only one tree was involved) of Mann’s Hockey Stick.

Reply to  Gunga Din
November 26, 2017 4:52 pm

crackers, no I have nothing to do with that site and I am commenting under the same name I have been for about ten years now.

BTW, Mosher could not program his way out of a paper bag.

crackers345
Reply to  Gunga Din
November 26, 2017 4:54 pm

Poptech – Mosher was
an integral part of BEST.

you can disagree with him,
ya know,
without denigrating
him

Reply to  Gunga Din
November 26, 2017 5:04 pm

Stating that he is a liberal arts major with no background in science is not denigrating him, it is a fact.

He is such a computer illiterate he does not know elementary differences between Windows operating systems.

http://www.populartechnology.net/2014/07/nasa-and-usgs-does-not-know-difference.html

Reply to  crackers345
November 26, 2017 10:13 am

This planet has been warming for about 20,000 years.

That has been “normal” for 20,000 years.

Why would you leftists suddenly decide that 1 degree C. of warming since 1880 is abnormal?

Why would you want government policy based on assuming the 1975 to 2000 warming trend is permanent?

Why would you expect +3 degrees of average temperature rise for every doubling of CO2 when the simple lab experiments suggest, but don’t prove, a harmless +1 degree C. rise per doubling?

Why would anyone care if the average temperature might go up one degree C. in the next 133 to 200 years, assuming CO2 levels increase +3 ppm or +2 ppm per year?
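The doubling-time figures in that question follow from simple arithmetic (assuming a start near today's ~400 ppm and a constant linear rise):

```python
# Years for CO2 to double from ~400 ppm to ~800 ppm at a fixed rise per year
current, target = 400.0, 800.0
for rate_ppm_per_year in (3.0, 2.0):
    years_to_double = (target - current) / rate_ppm_per_year
    print(f"{rate_ppm_per_year:.0f} ppm/yr -> {years_to_double:.0f} years")
# -> 3 ppm/yr -> 133 years
# -> 2 ppm/yr -> 200 years
```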

The flat temperature trend from 2000 to 2015 clearly showed the 1975 to 2000 warming was not permanent — unless you “adjust” the historical data to make the 2000 to 2015 hiatus disappear !

Why would you leftists use EL Nino temperature peaks, unrelated to CO2, to further your cause?

Why would you leftists use measuring instruments with a margin of error of +/- 1 degree C. or more,
wild-guess temperatures in grids totaling up to half of earth’s surface,
claim a very hard to believe global average temperature margin of error of +/- 0.1 degree C.,
and then use an alleged few hundredths of a degree C. difference to claim
one year was “the warmest year evah”?

Because you leftists are extremely dishonest self-serving people, that’s why!

My best guess is you are a global warmunist eager to increase government powers
by falsely claiming a coming climate crisis, that only governments can stop.

That’s just my guess, of course — from reading your inane, science-free comments,
it appears that your only skill is poking skeptics in the eye, throwing mud at their comments
and character attacking them.

I don’t expect you to provide intelligent answers to my questions above,
because I doubt if you could.

Gabro
November 24, 2017 5:08 pm

Most of the alleged “warming” has “occurred” where there are no actual data.

This genocidal sc@m has cost the world at least tens of millions of human lives (not to mention massacred birds and bats) and tens of trillions of squandered dollars.

Reply to  Gabro
November 24, 2017 5:12 pm

Ah no.
However we do find this.
every time we recover old data… the record warms

Reply to  Steven Mosher
November 24, 2017 6:21 pm

Mosher did you ever finish your Ph.D. in English?

Gabro
Reply to  Steven Mosher
November 24, 2017 6:40 pm

SM,

Please support that preposterous assertion.

Thanks!

crackers345
Reply to  Steven Mosher
November 24, 2017 6:49 pm

gabro, see karl et al’s famous 2015
paper in Science, fig 2

Reply to  Steven Mosher
November 25, 2017 7:13 am

“SM,

Please support that preposterous assertion.

Thanks!”

crackers345
Reply to  Gabro
November 24, 2017 6:34 pm

Gabro November commented – “Most of the alleged “warming” has “occurred” where there are no actual data.”

HadCRUT goes back
to 1850.
GISS and NASA to
1880.
How much more
data do
you need to conclude
warming?
In fact, almost all the
warming has occurred
since 1970.

Gabro
Reply to  crackers345
November 24, 2017 6:38 pm

That the fictionalized “data sets” go back to the 19th century means nothing.

For all the oceans, there are no actual “surface” data, but only made-up garbage from below the surface. For much of the poorly sampled continents, such as Africa, “data” are made up. Ditto for polar regions.

The so-called “surface data sets” are works of anti-science fantasy.

crackers345
Reply to  crackers345
November 24, 2017 6:50 pm

“That the fictionalized “data sets” go back to the 19th century means nothing.”

why?

“fictionalized?” how?

crackers345
Reply to  crackers345
November 24, 2017 6:51 pm

gabro commented – “The so-called “surface data sets” are works of anti-science fantasy”

why?

crackers345
Reply to  crackers345
November 24, 2017 6:51 pm

gabro – almost all the
warming has occurred
since 1970.

Tom Halla
Reply to  crackers345
November 24, 2017 7:14 pm

crackers, I know you know better. Pretend you are totally unaware of Tony Heller’s compilation of actual, hard copy records before they were “adjusted”. Pretend the sea surface temperature records are anything but inadequate, but used to justify the adjustments nevertheless. Add in the infill for areas with no actual reporting stations, and you are selling unicorn racing results.

Gabro
Reply to  crackers345
November 24, 2017 7:00 pm

Crackers,

In the real world, warming occurred in the late 19th century, early 20th century and late 20th century, with natural cooling cycles in between the warming phases.

Actually, warming occurred after the PDO flip in 1977, then stopped in the late ’90s, and has stayed flat since then. While CO2 rose rapidly from the 1940s to ’70s, earth cooled dramatically.

Hence, there is no correlation between rising CO2 and temperature. The two trends just happened accidentally to coincide from 1977 to c. 1997 or at most 2007.

Now earth is cooling again. Arctic sea ice has been growing since 2012.

Gabro
Reply to  crackers345
November 24, 2017 7:04 pm

And the early 20th century warming is indistinguishable from the late 20th century warming, supposedly due to more CO2.

But both pale in comparison with the early 18th century warming, coming out of the depths of the LIA during the Maunder Minimum.

There is no detectable human CO2 signal in real temperature records.

Nick Stokes
Reply to  crackers345
November 24, 2017 7:33 pm

“Tony Heller’s compilation of actual, hard copy records before they were “adjusted”. “
Just dumb. Unadjusted data is fully available now. In the US, NOAA even provides facsimiles of the handwritten originals. And there is far more unadjusted data known than was available to people 40 years ago.

“And the early 20th century warming is indistinguishable from the late 20th century warming”
Nope
comment image

Dave Fair
Reply to  crackers345
November 24, 2017 11:44 pm

I always love reading Mr. Mosher’s Weed Wandering excursions. The problem for him is that no matter the data set one uses, IPCC climate models run 1.5 to 3 times too hot.

More problems: Global weather is not deteriorating as predicted, Arctic ice is not going away, …. aw hell; you all know the problems with CAGW.

Yes, some gases have radiative properties that, everything else being equal, should result in some atmospheric warming. But because Earth is a water world, evaporation, convection, phase changes, clouds and so on confound any definitive statements about the overall impacts of CO2 concentrations.

Dave Fair
Reply to  crackers345
November 24, 2017 11:46 pm

Sorry, totally misplaced comment. Of course it applies to much of the obfuscation going on here.

crackers345
Reply to  crackers345
November 25, 2017 1:12 am

Dave Fair commented – “I always love reading Mr. Mosher’s Weed Wandering excursions. The problem for him is that no matter the data set one uses, IPCC climate models run 1.5 to 3 times too hot.”

prove it.

finally, someone here prove something.
or retract.

waiting…

Dave Fair
Reply to  crackers345
November 25, 2017 10:24 am

If you are unaware of the numerous comparisons, you should not be commenting, crackers345. Paid Troll?

crackers345
Reply to  crackers345
November 25, 2017 1:13 am

Dave Fair commented – “More problems: …Arctic ice is not going away”

No? discuss:

http://psc.apl.uw.edu/wordpress/wp-content/uploads/schweiger/ice_volume/BPIOMASIceVolumeAnomalyCurrentV2.1.png

Dave Fair
Reply to  crackers345
November 25, 2017 10:36 am

So you can’t see that the decline bottomed out over 10 years ago, crackers345?

What ever happened to our experts’ ice-free predictions?

crackers345
Reply to  crackers345
November 25, 2017 1:30 am

Dave F – why doesn’t your comment
show up in my Inbox, when I’ve asked
to receive emails about new comments.

is this a known bug in
word press?

stevekeohane
Reply to  crackers345
November 25, 2017 5:27 am

Nick Stokes
Gee Nick, the second warming is slightly steeper, must be that homicidal anthropogenic CO2! Why, it must be +0.25°/century.
http://i65.tinypic.com/34goxt2.jpg

Reply to  crackers345
November 26, 2017 9:22 am

clackers, your comments at this website are even worse
than those from Masher and the Griffter !

I normally ignore surface data because satellite data have far less infilling,
are not affected by economic growth in the vicinity of the thermometers,
and correlate with weather balloon data — surface data are the outlier
that correlate with no other measurement methodologies.

The surface data show warming from 1910 to 1940
and similar warming from 1975 to 2000.
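[Editor’s note: the two-window comparison above can be checked directly by fitting least-squares trends to each period. A minimal sketch in pure Python follows; the anomaly series, drift rate, and `trend` helper are illustrative assumptions, not any real temperature record.]

```python
import random

# Synthetic annual anomaly series (illustrative assumption, not real data):
# a small linear drift of 0.005 C/yr plus noise.
rng = random.Random(0)
years = list(range(1900, 2001))
anoms = [0.005 * (y - 1900) + rng.gauss(0.0, 0.1) for y in years]

def lsq_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    num = sum((x - xbar) * (v - ybar) for x, v in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

def trend(y0, y1):
    """Linear trend over the window [y0, y1], in degrees C per decade."""
    pts = [(y, a) for y, a in zip(years, anoms) if y0 <= y <= y1]
    xs, ys = zip(*pts)
    return 10 * lsq_slope(xs, ys)

print(f"1910-1940 trend: {trend(1910, 1940):+.2f} C/decade")
print(f"1975-2000 trend: {trend(1975, 2000):+.2f} C/decade")
```

Whether two such trends are “similar” also depends on their uncertainties, which a bare slope comparison does not capture.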

You smarmy leftists want us to believe the
1910 to 1940 warming was natural while the
similar 1975 to 2000 warming was from man made CO2.

Two different causes of warming in the same century?

How can you be so sure of that?

In addition, the unproven leftist claim of two different causes
of 20th century warming means a belief that 4.5 billion years
of natural climate change suddenly ended in 1975,
and man made CO2 took over in 1975
as the average temperature controller,
with no explanation of how or why that could have happened.

If you believe that, clackers, then you can believe in any climate fairy tale.

My climate blog for non-scientists:
Please stay away clacker!
http://www.elOnionBloggle.Blogspot.com

Reply to  crackers345
November 26, 2017 10:20 am

There were very few Southern Hemisphere measurements before 1940.

And almost none before 1900.

Starting-point thermometers in the 1800s tended to read low = exaggerating actual warming.

Even today, up to half of earth’s surface has no measurements = infilled grids = wild guess data that can never be verified, or falsified.

Almost all of the warming has not happened since 1970, as you claimed, clacker; roughly half of the 1880 to 2015 claimed warming occurred before 1940.

I do not include the 2016 El Nino peak because that peak is temporary, and could not have been caused by CO2.

crackers345
Reply to  crackers345
November 26, 2017 3:46 pm

richard, as i
wrote, almost
all warming has
occurred since 1970.

if you don’t want
to include the 2016
el nino, can we exclude
the previous la ninas too,
2010-11, 2011-12, and
the weak one going now?

care to explain why el nino
years keep getting
warmer? and la nina
years too? and neutral
years?

crackers345
Reply to  crackers345
November 26, 2017 3:49 pm

richard commented – “You smarmy leftists want us to believe the
1910 to 1940 warming was natural”

please stop insulting me.

i don’t know anybody who thinks
1910-40 warming was all natural. do
you have evidence of scientists
saying that?

crackers345
Reply to  crackers345
November 26, 2017 4:15 pm

dave f – no i don’t see
any bottoming out – just
fluctuations like have
happened
before on that chart.
1981-87 stands out.

note the latest value of
the ice volume is right
on the trend line.

Reply to  crackers345
November 28, 2017 7:05 am

Mr. Clackers, I have no idea why I bother to respond to your mud slinging
but I also have no idea why I slow down to look at accidents
on the other side of a divided highway.

I have to cut and paste some of your words here
from your comments that are actually below this,
but they have no “reply” links for a more direct reply:

Clackers Sez:
“crackers345 November 26, 2017 at 3:46 pm
richard, as i
wrote, almost
all warming has
occurred since 1970.”

REPLY: Clackers you silly boy!
The only way you could even come close
to being right is to use the 2016 El Nino
heat peak as your measurement end point,
knowing that peak is temporary
and not caused by CO2.

In economics that would be like
starting a measurement with the stock market
average in March 2009 (low)
and ending the measurement with the stock market
average now (high) and then claiming
the stock market triples every eight years!
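[Editor’s note: the endpoint-sensitivity point can be made concrete with a toy series that is flat except for one warm spike in the final year. All numbers below are synthetic and assumed purely for illustration.]

```python
# A flat series with one warm spike in the last year, standing in for an
# El Nino year (synthetic numbers, for illustration only).
years = list(range(1998, 2017))
anoms = [0.0] * len(years)
anoms[-1] = 0.4                      # the spike, in degrees C

def slope_through(last_year):
    """Least-squares trend through last_year, in degrees C per decade."""
    pts = [(y, a) for y, a in zip(years, anoms) if y <= last_year]
    n = len(pts)
    xbar = sum(y for y, _ in pts) / n
    ybar = sum(a for _, a in pts) / n
    num = sum((y - xbar) * (a - ybar) for y, a in pts)
    den = sum((y - xbar) ** 2 for y, _ in pts)
    return 10 * num / den

print(f"ending 2015: {slope_through(2015):+.3f} C/decade")  # essentially zero
print(f"ending 2016: {slope_through(2016):+.3f} C/decade")  # positive, from the spike alone
```

Ending the window one year earlier or later flips the answer, which is the whole point about choosing endpoints.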


Clackers Sez:
“car to explain why el nino
years keep getting
warmer? and la nina
years too? and neutral
years?”

REPLY: Clackers you silly boy!
The last time I looked, the strong 2016 El Nino
heat peak was a mere +0.1 degrees C.
warmer than the strong 1998 El Nino heat peak,
and 0.1 degrees C. is much smaller
than any reasonable margin of error.

Clackers Sez:
crackers345 November 26, 2017 at 3:49 pm
richard commented – “You smarmy leftists want us to believe the
1910 to 1940 warming was natural”

“i don’t know anybody who thinks
1910-40 warming was all natural. do
you have evidence of scientists
saying that?”

REPLY: Clackers you silly boy!
Scientists claim man made CO2 had
little to do with 1910 to 1940 warming,
therefore the primary cause must be natural
climate variations.

Dear Mr. Clackers:
If you want more respect at this website,
it would help to use your real name
like I do.

And once in a while share a URL
that you consider a good source of climate data,
so we can better understand your biases.

Also, format your comments so they are not
so hard to read — I format many of my comments
with a space between every sentence because of
problems with my eyesight — but that’s easier to read
than your formatting.

Most important,
say something of value
rather than throwing mud every time.
State a fact.
Present data.
State a conclusion, and then back it up.
Refute other comments with data, facts and logic.

The coming global warming catastrophe is “because we say so” leftist nonsense.

The water vapor positive feedback theory is “because we say so” leftist nonsense.

Your claim that almost all warming (since 1880) was after 1970, is nonsense too.

If you stop a measurement at a temporary El Nino peak to show more warming,
that is bias.

And I do think you leftists are smarmy people — that is my opinion based on
64 years of observations. I am personally a libertarian and an atheist.

My pet peeves are politicians making empty promises, such as Trump,
and people who make scary predictions and then
tell others how to live (usually leftists and religious leaders).

My climate blog for non-scientists:
No wild guess predictions to the future climate:
It will either get warmer or cooler.
http://www.elOnionBloggle.Blogspot.com

Toneb
November 25, 2017 8:53 am

Why does your second red line not least-squares fit the plots properly?
It is plainly not steep enough.

Dave Fair
Reply to  Toneb
November 25, 2017 11:56 am

I just assumed, Toneb, that he left out the Super El Nino as an unwarranted end effect.

Dave Fair
Reply to  Toneb
November 25, 2017 11:58 am

Actually, Toneb, you could be correct. I took a closer look.

Glen Martin
November 28, 2017 12:49 pm

I caught this little detail:

“Historically, temperatures have been recorded by mercury-in-glass thermometers housed in a ventilated white box. But, especially in poorer countries, there’s little financial incentive to keep these boxes the right white, so they may darken over time. ”

Not only could the boxes darken over time, but if they are painted infrequently, the homogenization process could treat each new paint job as a station move to a cooler site while ignoring the gradual warming in between, since that was happening at all the stations. Thus the past would get cooled a little bit each time one of the boxes got a new paint job.
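[Editor’s note: this repainting scenario can be sketched numerically. The toy model below assumes a flat true climate, a shelter darkening at 0.02 °C/yr, and a repaint in 2010 that resets the bias; the step-adjustment shown is a naive caricature for illustration, not any agency’s actual homogenization code.]

```python
# Toy model of the repainting scenario (all numbers assumed for illustration).
years = list(range(2000, 2020))
true_temp = 15.0                                  # flat "true" climate, deg C

def shelter_bias(year):
    """Warm bias from a darkening shelter, reset by a 2010 repaint."""
    start = 2010 if year >= 2010 else 2000
    return 0.02 * (year - start)                  # darkens at +0.02 C/yr

recorded = [true_temp + shelter_bias(y) for y in years]

# Naive homogenization: treat the apparent downward step at the repaint as
# a "station move" and shift the whole earlier segment to match.
i = years.index(2010)
step = recorded[i] - recorded[i - 1]              # about -0.18 C
adjusted = [t + step for t in recorded[:i]] + recorded[i:]

# The adjusted series now warms end-to-end even though the true climate
# was perfectly flat: the past has been "cooled" by the paint job.
print(round(adjusted[-1] - adjusted[0], 2))
```

In this caricature the adjusted record gains roughly twice the drift accumulated between repaints, all of it spurious.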

michaelspj
Reply to  Glen Martin
November 28, 2017 3:37 pm

Thanks! Glad someone noticed.