2017 Global temperatures are leveling off – near 1980 temperature anomaly (depending on who you ask)

From the “dashed hopes for the warmest year evar!” department comes this update from Dr. Ryan Maue on the global surface temperature:

Via Twitter:

Global temperatures have generally settled to +0.26°C compared to 1981-2010 climatology continuing downward glide thru 2017 (black line)

He adds:

Tropical vs. non-tropical temperature anomalies have balanced out mostly for the past few months. No El Niño suggests continued T levels:

In a nutshell, what Dr. Maue is saying is that without a strong El Niño event to boost temperature, global temperatures are stabilizing around +0.26°C. FYI, the NCEP data used in these plots is from the NOAA National Centers for Environmental Prediction. The data is available here: http://cfs.ncep.noaa.gov/cfsr/downloads/

Recently, we covered a story from the Australian Bureau of Meteorology (BoM) that called off their El Niño watch. BoM says:

All eight international models surveyed by the Bureau of Meteorology now suggest tropical Pacific Ocean temperatures are likely to remain ENSO-neutral for the second half of 2017.

It’s a tough business to be in when your CO2-driven “climate change” can’t get there unless a natural ENSO event pushes up the temperature for you. Meanwhile, Justin Gillis at the New York Times claims “Earth Scorching CO2” is higher than ever while temperatures stabilize at a value that is about the same as 1980’s (0.27°C), according to NASA’s GISTEMP:

Land-ocean temperature index, 1880 to present, with base period 1951-1980. The solid black line is the global annual mean and the solid red line is the five-year lowess smooth. The blue uncertainty bars (95% confidence limit) account only for incomplete spatial sampling. [This is an update of Fig. 9a in Hansen et al. (2010).]

Looks like that big El Niño driven peak in GISTEMP of 0.98°C for 2016 could be coming down in 2017 if the current values hold and ENSO neutral conditions remain.

Just look at the sea surface temperatures, there’s not a lot of warm water:

We live in interesting times.


NOTE: I expected some complaints about comparing GISS and NCEP graphs, and there were plenty. I did it to illustrate a point.

Which one is the RIGHT temperature anomaly? Anomalies are all products of their baselines, and baselines are a choice of the publisher.

If NASA GISS is to be believed as the world’s most cited source for global temperature, then 0.27C is correct for 1980.

Unfortunately, they have been living in the past, and refuse to update their baseline. UAH did it, RSS did it, NOAA/NCEP did it….why not GISS? The answer: Gavin Schmidt.

Not sure about BEST: They don’t list their baseline period in their graph: http://berkeleyearth.org/wp-content/uploads/2015/03/land-and-ocean-summary-large.png

This is why absolute temperatures don’t suffer from the choices made by the researchers for the anomaly baseline. With absolute values there’s no musical chairs.

It would be nice if GISS got with the program and used the 1981-2010 baseline like the other datasets, or if all the climate data publishers agreed on using one baseline. For example, here’s a BEST plot with all the baselines adjusted to NASA GISS 1951-1980.
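Adjusting a dataset to a common baseline is mechanically trivial, which makes the lack of standardization all the more puzzling. Here is a minimal Python sketch of the arithmetic (the anomaly series is made up for illustration – it is not real GISS or NCEP data):

```python
def rebaseline(anomalies, new_base):
    """Shift a {year: anomaly} series onto a new baseline period.

    The rebaselined anomaly is the old anomaly minus the mean of the
    old anomalies over the new baseline years (all must be present).
    """
    start, end = new_base
    years = range(start, end + 1)
    base_mean = sum(anomalies[y] for y in years) / len(years)
    return {y: t - base_mean for y, t in anomalies.items()}

# A made-up series on a 1951-1980 baseline, shifted to 1981-2010.
series = {y: 0.01 * (y - 1950) for y in range(1951, 2017)}
shifted = rebaseline(series, (1981, 2010))

# Every year moves by the same constant, so trends are unchanged;
# only the zero point the reader sees is different.
offset = series[2016] - shifted[2016]
```

Since rebaselining only subtracts a constant, it changes no trends – only the zero point presented to the public.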

The general public really doesn’t care or know about anomaly baselines – they just want to know what today’s temperature is relative to the past.

Standardizing on one baseline for all climate datasets would make that easier for public consumption. I’m sure that call for standardization of baselines will fall on deaf ears at NASA GISS, where the lead researcher, Gavin Schmidt, is so petty he can’t even appear on the same TV set with another researcher.

Let the squawking begin.

UPDATE: To further illustrate the point about different baselines giving different results to the public, here is the HadCRUT4 data, which uses a 1961 to 1990 baseline:

Source: Hadley Climate Research Unit https://crudata.uea.ac.uk/~timo/diag/tempdiag.htm

According to their data (which is mostly the same raw GHCN data used by NASA GISS, plus some others), their 1980 temperature anomaly was somewhere around 0.1°C (see the green lines’ intersection), where GISS says 0.27°C.

Again, why can’t climate science do a simple thing like standardize on a baseline period?

I’ve added a caveat in the title to reflect this: (depending on who you ask)

185 thoughts on “2017 Global temperatures are leveling off – near 1980 temperature anomaly (depending on who you ask)”

  1. I wonder if anyone has ever considered that CO2 might act as a stabilizing agent?…like a buffer
    …and like any other buffer, when it’s low everything goes whack a mole
    /just thinking

  2. How can you say that 2017 is like 1980? The base periods on your figures are different: 1951-1980 for GISS and 1981-2010 for the first figure.

    • Yes I realize that, and expected complaints, in fact I counted on them…but here’s the deal: which one is the RIGHT temperature? hmm? Anomalies are all products of their baselines, and baselines are a choice of the publisher.
      If NASA GISS is to be believed as the world’s most cited source for global temperature, then 0.27C is correct for 1980.
      Unfortunately, they have been living in the past, and refuse to update their baseline. UAH did it, RSS did it, NOAA/NCEP did it….why not GISS? The answer: Gavin Schmidt.
      Not sure about BEST: They don’t list their baseline period in their graph: http://berkeleyearth.org/wp-content/uploads/2015/03/land-and-ocean-summary-large.png
      This is why absolute temperatures don’t suffer from the choices made by the researchers for the anomaly baseline. With absolute values there’s no musical chairs.
      Cue the anomaly defenders, Stokes and Hausfather in 3…2…1

      • There is no right baseline. You simply cannot compare anomalies with different baselines without first putting them on the same baseline.

      • Anthony, jeez, this is a goof-up. Get on top of it.
        Neither base period is more valid than the other, but you cannot pick the 2017 anomaly of one dataset and compare it to the 1980 anomaly from another with a completely different baseline period.
        Apples, oranges, etc.

        2017 Global temperatures are leveling off – near 1980 temperature anomaly

        No they are not. Look at any dataset you wish to choose and you will find that is not the case.
        CNN style retraction needed ASAP.

      • Maybe, build a list of the more rabid NASA “temperatures du jour” defenders?
        Along with their standard strawman claims, instead of honest rebuttals along with their determined focus on irrelevant minutia.
        Then it can be a “fill in the dots and draw the lines to funding” exercise.

      • Here’s a discussion of determining absolute temperatures and why anomalies are easier: https://wattsupwiththat.com/2014/01/26/why-arent-global-surface-temperature-data-produced-in-absolute-form/
        For example, many parts of the world have a poor record of absolute temperature due to factors such as elevation diversity, even though those regions contain points – such as cities in mountainous areas – with weather records. So much of a specific region of the world is better known in terms of its anomaly with respect to some baseline than in terms of the absolute temperature of most of that region. And surface temperature can’t be measured by satellites as accurately as that of regions of the atmosphere, because surface emissivity varies more than that of the satellite-measured parts of the atmosphere, among other factors.

      • Javier,
        Anomaly variability in the satellite record is about 1.3 degrees C, so two percent of that would be within measurement error, at 0.026 degrees. Where I live, annual variation can be from -37 to +47 degrees. Yesterday’s was from 7 to 36.

        • Gabro,
          The temperature difference between 1980 and 2017 is way above 2% and therefore significant. The warming is real.
          As the earth is a spheroid with an inconstant orientation towards the sun, local conditions can vary hugely. Average changes for the entire surface are, however, much, much smaller. The difference between the Last Glacial Maximum and the Holocene is believed to have been only 4-5 degrees C.

      • IMO, half a degree per century is nothing about which to worry. Indeed, it’s a good thing.
        But even that amount of warming for another 63 or 100 years is unlikely, since we’re due for another 30-year cooling cycle, as in the 1940s to ’70s and 1880s to 1910s.

        • I am not worried either, but there is a story about a frog.
          Whether we are due for cooling or not, you should remember that this has been a consistently failed prediction for the past 15 years. Predicting is difficult, especially about the future.

          • Whether we are due or not for cooling you should remember that has been a consistently failed prediction for the past 15 years.

            And the same can be said about warming. What we are seeing is the downwind impact of tropical water vapor, and as the ocean warm pools move from place to place over the decade(s), it alters the land surface temperature average on decadal time frames.
            Watch those sst anomalies. Large areas of the US are having cooler than average temps. It’s 63.5F, and the days are already getting shorter.

          • It’s 63.5F, and the days are already getting shorter.

            Yes. Winter is coming. I have heard that somewhere. In the meantime I am going to enjoy summer.

          • In the ’60s summers would be cool until early July, then July thru Aug were hot, then things would cool off again. Depending on where the air comes from, it’s a 15-20°F difference.

      • Javier,
        There is at least 2% cooling from 1988 to 2017. Is that 30-year interval significant, too, then?

        • There is at least 2% cooling from 1988 to 2017
          2% of Earth’s temperature is 0.02*288K = 5.8 K. That would be significant, but the problem is the inappropriate use of a percentage.

        • Gabro,
          If you are comparing the average temperature for two different years, the interval between them is irrelevant.
          According to UAH the temperature difference between 1988 and 2017 is significant. The question is what does that mean. 1988 was a strong El Niño year, and 2017 is not. And saying that since 1988 there has been cooling would be an obvious mistake.

      • Javier June 27, 2017 at 8:36 am
        I haven’t predicted that it would start 15 years ago. Maybe someone else did. But 15 years ago, I expected that the late 20th century warming cycle would end about 30 years after it started.
        The earliest I would have expected cooling was 2006, ie 30 years after the dramatic PDO flip of 1977, which caused whatever warming actually has been observed since then. It’s too soon to say that that cooling hasn’t indeed begun, since the super El Nino may have masked the signal.

        • Gabro,
          What you (or I) expect or predict is pretty much irrelevant.
          One of the few things that has been solidly demonstrated in climatology is that nobody has a clue of what temperatures are going to do in the future.

      • lsvalgaard June 27, 2017 at 8:43 am
        I’m using it the same way as did Javier, looking not at absolute temperature but at the warming or cooling from a 30 year average baseline, ie an anomaly.
        The fact is that there has been no warming since 1988 but rather cooling, when comparing those two years.

          • No, not really, it’s done the opposite way on a regular basis in newspaper articles…ie. temp today is x warmer now than some date in the past.
            It’s like the song, “Does anybody really know what time it is?” With anomalies and differing baselines presented to the public, does anybody really know what the temperature is, or was? Some standard for baselines in the climate community would solve this. That’s my point.

          • That is weather, not climate.

            That depends on what you are talking about, the measured average temperature at a location, or the calculated average temperature at that particular spot?
            Because if it’s measurements, that weather is a tiny part of climate. And when weather is controlled by long period features, such as decadal ocean cycles, weather averages into climate, and that “climate” is going to have a decadal cycle.

          • No, not really, it’s done the opposite way on a regular basis in newspaper articles…ie. temp today is x warmer now than some date in the past

            One of the reasons I used the prior days temps for that same station.
            I wanted to capture that stations change.

        • If you want to compare the climates in 1988 and 2017 you should compute the 30-year averages centered on 1988 and 2017 and compare those.

          • Agreed, I can do that. But the public can’t… and they shouldn’t have to. That’s the point of making this comparison: to show how different baselines are giving people different answers. Why can’t the climate community standardize on a single baseline for public presentations? If they did, there would never be issues over comparisons of one graph versus another.

      • Javier June 27, 2017 at 8:46 am
        Picking any arbitrary year is a mistake, as with 1980. But Warmunistas point to 2015 and 2016 despite their being super El Nino years.
        The trend in UAH since 1979, despite our just coming off a super El Nino, is barely positive. Hence, I agree with you that no worrisome warming is occurring. However, I’d go farther and say that no significant warming has happened, since the past warming cycle is in no way any different from prior warming cycles within prior centennial-scale warmings, such as the Medieval WP, where “significant” means attributable to human activity, outside of natural variability. IOW, there is no human signal in the data.

      • lsvalgaard June 27, 2017 at 8:48 am
        They weren’t carefully picked, as in cherry picked. One is this year and the other is 30 years ago, a traditional climate interval. The trend during that interval would also be about flat, although I haven’t computed it.
        I can’t center a 30-year interval on 2017. I can only end one then.
        The interval centered on 1988, ie 1974-2003, may well prove warmer than 2003-32.

        • picking a single year is not correct. And you CAN do the average centered on 2017: just wait until 2032. THEN you can make a meaningful comparison.

      • you should compute the 30-year averages centered on 1988 and 2017 and compare those.
        That only works if the temps are in terms of a common baseline, such as Celsius. As soon as you use anomalies based on the past, where you are also adjusting the past, your results are not to be trusted.

      • Javier June 27, 2017 at 9:04 am
        My prediction matters because my US Representative, who’s in the House leadership, relies upon it. Unfortunately, my two Senators don’t.

      • Javier,
        Whether the US continues subsidizing windmills matters very much. His district has more of them than any other in the country. A lot of his campaign contributors have gotten rich off them. If he votes against subsidies, so might other members.
        So the fact that he is convinced that earth is liable to cool again, as it did twice before since the end of the LIA, is significant for US “climate” policy.
        Some climatologists have better records predicting than others, by relying upon climate history, rather than tarot. Or extrapolation of the latest trend indefinitely.
        The one thing we can be sure about climate is that it will change. Hence, cooling is certain sooner or later. I could be wrong about when it will start, or has started. I won’t be wrong that climate will cool, if not in a decade, then a century or millennium or three.

      • lsvalgaard June 27, 2017 at 9:04 am
        Reality has certainly not matched Hansen’s bias; in 1988 he predicted runaway global warming. Here it is 30 years later, and UAH finds global temperature cooler than in 1988, despite CO2 rising at or above his highest estimated rate.
        Thus, CACA is yet again falsified.

      • Which means that he, his successor Schmidt, GISS, NASA and NOAA cannot be trusted, and shouldn’t be.

      • Who cares if the temperatures are up or down, what matters is what proportion is human-induced and what is natural ?
        The natural we can do nothing about.
        The human we can do something about, although destroying the energy-dependent modern world to do this needs public input – not the idiotic, sociopathic ‘elites’ imposing it on everyone else.
        This all looks to me to be the result of the natural restoration of climate due to the end of the Little Ice Age (started and ended by natural processes).

      • Moa,
        Yes, the vast majority, if not all the warming observed since the depths of the LIA, c. AD 1690, has been natural. Obviously same goes for prior such fluctuations in the Holocene and previous interglacials.
        Earth has not yet enjoyed in the Current Warming Period, since c. AD 1850, a single 50-year interval as warm as at least three such during the Medieval WP, and more during the Roman, Minoan and Egyptian WPs, to say nothing of the long Holocene Climatic Optimum.
        Until and unless an important human signal be teased out of genuine climatic data, then there is no reason to worry, let alone dismantle the global economic system which feeds, clothes, houses, educates, warms, cools and provides work and play for going on eight billion people.

      • Javier: “The temperature difference between 1980 and 2017 is way above 2% and therefore significant. The warming is real.”
        The warming is real but here is a different perspective on it.
        Here’s Hansen’s 1999 U.S. surface temperature chart:
        On the Hansen chart you can see that 1998 is the hottest point on the chart with the exception of the 1930’s, which is 0.5C hotter than 1998, and this also makes the 1930’s 0.4C hotter than 2016.
        So, yes there has been warming from 1980 to 2017. 1980 is one of the colder years on record, so it’s no wonder we have warming. But we had even more warming from 1910 to 1940, and the 2017 temperatures are about 0.7C cooler than the 1930’s.
        The warming from 1910 to 1940 is considered to be natural variability, and there is no reason to assume the similar warming from 1980 to today is not also natural variability.
        If you want to argue that the Hansen 1999 U.S. temperature profile does not represent the Global temperature profile, I would say you are wrong. All unmodified charts from around the world resemble the Hansen U.S. chart temperature profile. They definitely do not resemble the bogus, bastardized Hockey Stick charts the Alarmists have dishonestly created (see Climategate) to sell the CAGW narrative.
        According to the Hansen U.S. 1999 chart, in combination with the UAH satellite chart, which Gabro reproduced above, we have been in a temperature downtrend since the 1930’s, and we will have to go at least 0.7C higher from here to break the downward trendline.
        No CAGW to see here.

      • Here everyone goes again – averaging averages of intensive variables. This is like comparing the average telephone numbers in two different telephone books to 3 places of decimals. Mathematically perfectly correct, logically worthless.
        Average temperatures give no information on the amount of energy in the lower atmosphere. They cannot even indicate whether the amount of energy in the lower atmosphere is going up or down.

    • Correct. Headline should read same anomaly, not temperature.
      But actually the temperatures are pretty close, because GISTEMP’s past has been cooled so much and present warmed.

    • : 1951-1980 for GISS
      GISS adjusts the past every day or two, which makes a mockery of using the past as a baseline. In effect the baseline is constantly changing. One might as well redefine the value of zero every couple of days. Hopefully in the right direction when paying bills and the opposite when depositing the pay check.

    • “CNN style retraction needed ASAP.”
      I hear you, but no, I’m trying to illustrate a point here. Read the update.
      The point is: why can’t climate science pick a standard baseline period so that any comparison between datasets is valid and simple without having to employ conversions?
      An even more basic question: what was the temperature of the Earth in 1980? Depending on who you ask, you’ll get different numbers.
      It seems that with climate science losing the battle of opinion, this would be a good thing to do: A Standardized Surface Temperature Anomaly across all datasets – SSTA

      • Never happen as long as “consensus climate science” rules, since how then could the gatekeepers keep cooking the books with constant adjustments, in order to keep up the scare and keep the funding trough taps flowing?

      • The caveat is not good enough. The headline is still grossly misleading. Perhaps this would be better:
        “Why can’t climate scientists standardize on a common baseline?”
        Makes your point right up front.

  3. The oceans look like they’ve lost all of their excess heat. So is the average baseline average, or is it the average of the bottom?
    If not, there is more heat to lose. And here in NE Ohio it’s going to have a high of ~70°F, partly cloudy. The last 2 days had highs of 75°F.

  4. 2017 Global temperatures are leveling off – near 1980 temperatures

    Meanwhile, Justin Gillis at the New York Times claims “Earth Scorching CO2” is higher than ever while temperatures stabilize at a value that is the same as about 1980 (0.27°C), according NASA’s GISTEMP

    Not so, I’m afraid: Both graphs show anomalies, not temperatures. Both have different base periods.

  5. GISTEMP of 0.27 degrees C and NCEP CFSR/CFSv2 of 0.264 degrees C are an apple and an orange. The 1980 GISTEMP value of 0.27 C is 0.27 C warmer than its 1951-1980 average. The NCEP CFSR/CFSv2 value of 0.264 C is 0.264 degrees C warmer than its 1981-2010 average.

  6. Just on the face of it, this makes no sense. First you say that temperatures are leveling off at 0.26 C degrees above 1981-2010 levels using NCEP data. Then you say that temps are leveling off near 1980 levels. This is completely contradictory. How can temps be both above 1981-2010 levels but at 1980 levels? Reading on, it becomes clear how you do it: You are comparing the NCEP anomaly with the GISS anomaly. This, of course, is totally illegitimate. The GISS anomaly is based on a 1951-1980 baseline while the NCEP anomaly is based on a 1981-2010 baseline. You are comparing apples and oranges. Were you unaware of your mistake or were you hoping we wouldn’t notice?
    [read the story -mod]

  7. Anthony is showing just how misleading it is for temperature to be presented in terms of an arbitrary baseline that has not been agreed as an international standard.
    We are asked to make decisions on trillions of dollars when the data itself has become a propaganda tool, with scientists themselves largely to blame.

    • Well, I think this is misleading if not just worse, and shouldn’t have been put out like this.

      • shouldn’t have been put out like this.
        The best way to get someone to fix a problem is to SHOW it is a problem.
        There is ZERO good reason for different groups to use different baselines, all with different values for ZERO.

  8. Now I understand why climate science prefers anomalies to real temperatures – SO much easier to fudge whichever way you want. Nice tool.

    • And why they prefer using temperatures and vague terms like hotter, rather than actually measure atmospheric heat content in kilojoules per kilogram which is what they claim to be concerned about.

  9. Curiosity question, I realize that the WMO and friends consider 30 years to be suitable for anomaly baselines, but does anybody have a longer term baseline, i.e. 60+ years? And if not, is there a “how-to” on how to create one?

    • Since 1880, there have been only 2.283 such intervals, but you create one by averaging all the annual temperatures from, say, 1881-1940 or 1951-2010.
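As a sketch of that recipe in Python (the annual means below are synthetic stand-ins for a real dataset – nothing here is actual GISS, NCEP, or HadCRUT data):

```python
# Synthetic annual global means (deg C), purely illustrative.
annual = {year: 14.0 + 0.005 * (year - 1880) for year in range(1880, 2017)}

def baseline_mean(annual, start, end):
    """Mean of the annual values over an inclusive range of years."""
    years = range(start, end + 1)
    return sum(annual[y] for y in years) / len(years)

# A 60-year baseline, e.g. 1881-1940, and anomalies relative to it.
base = baseline_mean(annual, 1881, 1940)
anomalies = {y: t - base for y, t in annual.items()}
```

Nothing about the arithmetic privileges 30 years; the WMO's 30-year normal is a convention, not a mathematical requirement.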

  10. Why is it so difficult for climate scientists to agree on a common baseline for climate data?
    Using an average of the past is nonsense because climate science routinely breaks the first rule of data integrity: they adjust the past. If accountants did this they would go to jail.
    As soon as you adjust the past this invalidates any baseline that is an average of the past.
    For years 14.5C was the accepted average temperature of the earth. This is the obvious baseline that should have been agreed internationally as the common baseline for all anomalies.
    As Anthony has rightfully shown, the current situation is nonsense.

    • For years 14.5C was the accepted average temperature of the earth.

      The real problem is that we do not know the average temperature of the earth with an acceptable degree of precision.
      Another serious problem is that models can’t do real temperatures without being all over the place. That’s why they have to work with anomalies. Otherwise their failure would be obvious to all.

      • Javier,
        I agree “that we do not know the average temperature of the earth with an acceptable degree of precision.” Doesn’t this also mean that we can’t calculate a global temperature anomaly with an acceptable degree of precision?
        Regarding climatology. Rather than comparing the change in temperature anomalies relative to a climatology baseline, shouldn’t we be looking at changes in the climatology baseline over time?

        • Doesn’t this also mean that we can’t calculate a global temperature anomaly with an acceptable degree of precision?

          Theoretically no: calculating the anomaly only requires the station data and a consistent methodology. In principle you can calculate the difference between two unknowns with great precision because you measure the changes, not the absolutes. Obviously I am not going to defend the methodology behind temperature data. I am just talking in general.
          Regarding climatology, there is an absurd reductionism of climate to temperature changes, and of these to anomaly changes. It doesn’t make much sense, but that is the way humans are. We need a number to anchor our thoughts, even if it is totally meaningless, like the famous two degrees that we should avoid, which is a totally made-up number.
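The point that a consistent methodology can measure changes precisely even when the absolute value is poorly known can be illustrated with a toy Python example (not any real station methodology): a constant bias in a station's readings shifts every absolute value but cancels out of the anomaly.

```python
def anomalies(readings, base_start, base_end):
    """Station anomalies relative to the mean over a baseline year range."""
    base = [t for y, t in readings.items() if base_start <= y <= base_end]
    mean = sum(base) / len(base)
    return {y: t - mean for y, t in readings.items()}

# True station temperatures (unknowable in practice) and the same
# series with a constant +2.0 deg C bias, e.g. a miscalibrated sensor.
true_temps = {y: 10.0 + 0.01 * (y - 1950) for y in range(1951, 2001)}
biased = {y: t + 2.0 for y, t in true_temps.items()}

# The bias shifts every absolute reading but cancels out of the anomaly.
a_true = anomalies(true_temps, 1951, 1980)
a_biased = anomalies(biased, 1951, 1980)
```

Of course this only holds for biases that are constant over time; station moves, equipment changes, and urbanization are exactly the time-varying biases that make the real problem hard.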

      • The real problem is that we do not know the average temperature of the earth with an acceptable degree of precision.
        It is worse than that, much, much worse. There is no international standard for calculating the average temperature, and depending upon the algorithm you choose, it is possible to show that the earth on average is both warming and/or cooling at the same time.
        In other words, global warming may be as much a product of the method used to calculate global average temperature as anything else.

        • global warming may be as much a product of the method used to calculate global average temperature as anything else.

          The amount of warming is up for discussion, but the warming is not. The biological response to the warming by species and ecosystems is very clear.

      • Ferd,
        Note that the late 20th century warming cycle was a little smaller than the early 20th century warming, and the 1930s retain the heat record in the raw data, not the 1990s.

      • ferdberple June 27, 2017 at 9:48 am
        Now that is man-made global warming. Maybe women, too, depending upon which mendacious, crooked bureaucrats made the unwarranted “adjustments”.

      • Javier June 27, 2017 at 10:57 am
        The issue is that whatever warming has occurred since CO2 took off after WWII, which is slight, is well within normal bounds, so there is no detectable human footprint. What is detectable is the observation that CO2 released by human activity has greened the earth, especially in arid regions. Warming effect, not so much.

  11. So you’re comparing anomalies from different datasets and you’re doing so, according to your addendum, “to illustrate a point”. But the point of your original article was completely clear: That 2017 temperatures are just about the same as 1980 temperatures. It’s right there in the title of your article. But now you seem to be abandoning that point completely.
    Please clarify the point of your article. Are you maintaining that there has been little if any warming over the past 37 years? If so, your method is completely illegitimate and your conclusion is misguided. If your point is something about the desirability of standardizing baselines, then why didn’t you say so in the original article? And if your point is now the latter, that certainly is ironic, because you specifically did not standardize the baselines of the two datasets in the comparison that you made.
    (By the way, baselines don’t need to be “updated”. They are what they are, and once established, they don’t change.)

  12. @ lsvalgaard June 27, 2017 at 8:43 am
    Good morning Leif,
    Your comment reminded me of —
    About 50 years ago (±10), a meteorology textbook was published wherein the temperatures were converted from C to F. The process also resulted in the latitude and longitude on the maps being likewise converted. The degree symbol ( ° ) is not often such a problem, but percentages and nominal, ordinal, interval, and ratio scales are.

  13. Uh oh. NYT keeps pushing climate por.n while El Scorchio can’t seem to get it up this year. Looks like the flaccid CAGW hypothesis could use a little data pumping to firm it up.

  14. Javier ….. as you say, predicting is difficult, especially about the future. Same holds for this notion by the IPCC that we are going to see between 2-6 C increase in temp. Like you said, they’ve been predicting warming for 15 years, and save for a few elninos …. just hasn’t happened.

    • Correct. Contrary to models and climastrologists’ predictions since at least 1988, temperature was flat between the two super El Ninos, despite steady rise in CO2. The only possible “warming” remotely plausibly attributable to humans is the accidental fact that the super El Nino peak of 2016 was ever so slightly warmer than that of 1999. So, essentially, no warming for 17 years. And it’s looking as if the zero trend will continue, if not cooling in the offing.

    • Dr. Deanster,
      I agree that future temperature predictions are as likely to fail whether they are towards the warming side as towards the cooling side. And the more extreme the predictions are, the more likely they will fail.
      In my opinion it is very likely that future temperatures will fluctuate, showing some warming or some cooling at times.

  15. “The general public really doesn’t care or know about anomaly baselines – they just want to know what today’s temperature is relative to the past.”
    No they don’t. They don’t give a c.rap. All they care about is what’s today’s temp relative to their comfort level or other practical consideration.

      • I’d bet that no more than 1% of the world’s population even thinks about catastrophic global warming day-to-day. And that is being extraordinarily generous.

  16. Leif and others spoke well on the problem of using comparisons that have different baselines. So my question to those commenters is: “What is the best baseline?” “There is none” is a possible answer.

    • And that’s my point, see the update. If you ask CRU, you get a different answer than GISS. Why can’t climate science standardize on a baseline period?

      • “…as long as everyone uses [or adjusts to] the same baseline.”
        Right. So why DON’T THEY? (the publishers of the data and choosers of the baseline)
        Imagine if monthly sunspot counts were expressed in anomalies using different baselines by different researchers. NOAA might use the 20th century average as a baseline, NASA might use the last 10 solar cycles as a baseline, SIDC might use a baseline from 1800 to 2000.
        It would get pretty ridiculous pretty quick in reporting “sunspot anomalies” to the public. Just look how much trouble you had getting the recent correction to sunspot numbers accepted…now you have people referring to old and new sets.

        • Imagine if monthly sunspot counts were expressed in anomalies using different baselines by different researchers.
          This is actually what has happened, with different observers defining different baselines. The difficulty is in ‘harmonizing’ the baselines. And the difficulty in that is the assumption that the definition of solar activity [e.g. “what is a group?”] does not vary with time [which it actually does, in poorly known ways]. The analogous problem with global temperature is the changing distribution and density of stations as well as the changing environment [less rural].
          All that said, there is really no excuse for not using the same baseline [recognizing the uncertainty when going back in time].

    • There are offsets that you can add or subtract to compensate for baseline changes. It varies a little from month to month and supplier to supplier. I posted a table here. GISS land/ocean for January is reasonably typical. It goes

      1951-80  0
      1961-90  0.102
      1971-00  0.242
      1981-10  0.428

      IOW if you compare NCEP with a 1981-2010 baseline to GISS without conversion, you are adding in a 0.428 difference.
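A minimal sketch of the conversion the table above implies. The offsets are the GISS land/ocean January values quoted in the comment, and should be treated as illustrative only; the dictionary and function names are invented.

```python
# Offsets (deg C) to ADD to an anomaly on the given base period to
# re-express it relative to the 1951-80 base. Values are the GISS
# land/ocean January figures quoted in the comment, used illustratively.
OFFSETS_VS_1951_80 = {
    "1951-80": 0.0,
    "1961-90": 0.102,
    "1971-00": 0.242,
    "1981-10": 0.428,
}

def to_1951_80(anomaly, base):
    """Re-express an anomaly (deg C) from `base` onto the 1951-80 base."""
    return anomaly + OFFSETS_VS_1951_80[base]

# An NCEP-style +0.26 C on a 1981-2010 base becomes roughly +0.69 C on a
# 1951-80 base -- not directly comparable to 1980's +0.27 C without this step.
print(round(to_1951_80(0.26, "1981-10"), 3))
```

The point of the exercise: comparing the raw +0.26 and +0.27 numbers without applying an offset silently hides the 0.428 difference between bases.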

  17. There is no need for, or even justification for, a moving baseline. The concern expressed by alarmists is that industrialization is responsible for increased release of CO2 and consequent warming. Thus, the appropriate baseline for the argument is any pre-industrialization 30-year period.
    A moving baseline is much like the infamous shell game. It becomes difficult to know which shell the pea is under. That is, it is difficult to make comparisons and predictions when different baselines are used routinely. But, maybe that is the intent!
    Finally, the calculated baseline is actually an artificial construct. The global standard deviation is quite large for a 30-year period. Therefore, currently, an average is calculated and it is assigned a precision that is essentially the same as the annual/monthly average that is used to compute anomalies. If the data analysis isn’t going to be rigorous, one might as well pick some arbitrary number such as 14.000 deg C and compute anomalies from that and drop the pretenses. That is, say, “Assuming a pre-industrial global average temperature of exactly 14 deg C, it is defined as the baseline temperature for computing anomalies.” It won’t make much difference in the reported results, but one can then easily make comparisons between reports from different times and authors without needing to know what baseline was used for the particular report.
    What I have said above is still valid if anomalies are computed at the station level instead of at the global level. If one adds (or subtracts) a constant to the baseline, a computed anomaly will differ only by that constant. Any subsequent operations, such as calculation of trend lines or converting back to actual temperatures, will not be affected by the constant.

    • The truth about “climate change” is that the globe is warmer than it was 30,000 years ago, cooler than it was 3000 years ago, warmer than it was 300 years ago and cooler than it was 30 years ago.

      • Three years ago would be weather, rather than climate, if there be such a thing as global weather.

        • Three years ago would be weather

          The weather 3 years ago left its impression in climate (big or tiny). Just look at what an El Nino does to climate, and that is weather.

      • The powers of ten alternation breaks down at 300 Ka, since that was also during a glaciation, so was colder than now. However 3 Ma was warmer than now.
        Climate constantly changes, has usually been warmer during the Phanerozoic Eon (last 540 million years), and nothing the least bit out of the ordinary or worrisome is happening as a result of a fourth molecule of vital plant nutrient (photosynthesis fuel) in 10,000 dry air molecules.

      • Yes, every year’s average WX goes into the computation of climatically significant averages of 30, 100, 300, 1000 years, etc.
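The invariance claimed in comment 17 is easy to check numerically. This is a self-contained sketch with invented temperatures: shifting the baseline by a constant shifts every anomaly by that constant but leaves the fitted trend untouched.

```python
def slope(ys):
    """Ordinary least-squares slope of ys against 0..n-1."""
    n = len(ys)
    xs = range(n)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# Invented absolute temperatures (deg C), purely for demonstration.
temps = [14.02, 14.05, 14.01, 14.11, 14.09, 14.15]

anoms_a = [t - 14.0 for t in temps]  # anomalies against a 14.0 C baseline
anoms_b = [t - 13.9 for t in temps]  # same data, baseline shifted by 0.1 C

# Every anomaly differs by exactly the 0.1 C constant...
assert all(abs((a - b) + 0.1) < 1e-12 for a, b in zip(anoms_a, anoms_b))
# ...and the least-squares trend is identical under either baseline.
assert abs(slope(anoms_a) - slope(anoms_b)) < 1e-12
```

This is why the choice of baseline, including an arbitrary "exactly 14 deg C" convention, cannot change any trend result; only the reported offset moves.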

  18. LOL, due to methodology changes, the “global” temperature for …hmmm, I think it was 1997, as stated by NASA was over 1C warmer than the current temperatures. You have to work it out from their anomaly and baseline since they don’t state the temperature directly. It’s amusing to see how much wiggle room there actually is in the processing.

  19. Why can’t climate science standardize on a baseline period

    But even with an equivalent period, unless they use the same process for computing a baseline, they won’t be the same.
    After listening to Mosher, I’m not sure how much of BEST is even measurement; their baseline isn’t going to be equal to the CRU or GISS baseline.

  20. Even the IPCC can’t decide on a baseline:
    13.3 Defining the Baseline
    A baseline period is needed to define the observed climate with which climate change information is usually combined to create a climate scenario. When using climate model results for scenario construction, the baseline also serves as the reference period from which the modelled future change in climate is calculated.
    13.3.1 The Choice of Baseline Period
    The choice of baseline period has often been governed by availability of the required climate data. Examples of adopted baseline periods include 1931 to 1960 (Leemans and Solomon, 1993), 1951 to 1980 (Smith and Pitts, 1997), or 1961 to 1990 (Kittel et al., 1995; Hulme et al., 1999b).
    There may be climatological reasons to favour earlier baseline periods over later ones (IPCC, 1994). For example, later periods such as 1961 to 1990 are likely to have larger anthropogenic trends embedded in the climate data, especially the effects of sulphate aerosols over regions such as Europe and eastern USA (Karl et al., 1996). In this regard, the “ideal” baseline period would be in the 19th century when anthropogenic effects on global climate were negligible. Most impact assessments, however, seek to determine the effect of climate change with respect to “the present”, and therefore recent baseline periods such as 1961 to 1990 are usually favoured. A further attraction of using 1961 to 1990 is that observational climate data coverage and availability are generally better for this period compared to earlier ones.
    Whatever baseline period is adopted, it is important to acknowledge that there are differences between climatological averages based on century-long data (e.g., Legates and Wilmott, 1990) and those based on sub-periods. Moreover, different 30-year periods have been shown to exhibit differences in regional annual mean baseline temperature and precipitation of up to ±0.5ºC and ±15% respectively (Hulme and New, 1997; Visser et al., 2000; see also Chapter 2).
    13.3.2 The Adequacy of Baseline Climatological Data
    The adequacy of observed baseline climate data sets can only be evaluated in the context of particular climate scenario construction methods, since different methods have differing demands for baseline climate data.
    There are an increasing number of gridded global (e.g., Leemans and Cramer, 1991; New et al., 1999) and national (e.g., Kittel et al., 1995, 1997; Frei and Schär, 1998) climate data sets describing mean surface climate, although few describe inter-annual climate variability (see Kittel et al., 1997; Xie and Arkin, 1997; New et al., 2000). Differences between alternative gridded regional or global baseline climate data sets may be large, and these may induce non-trivial differences in climate change impacts that use climate scenarios incorporating different baseline climate data (e.g., Arnell, 1999). These differences may be as much a function of different interpolation methods and station densities as they are of errors in observations or the result of sampling different time periods (Hulme and New, 1997; New, 1999). A common problem that some methods endeavour to correct is systematic biases in station locations (e.g., towards low elevation sites). The adequacy of different techniques (e.g., Daly et al., 1994; Hutchinson, 1995; New et al., 1999) to interpolate station records under conditions of varying station density and/or different topography has not been systematically evaluated.

  21. The discussion comparing anomalies with different baselines is germane, however the big take away here, to me, got lost in this matter. I have been talking (here) about the lack of warm water back when the 2015-16 El Nino was rising. I was a bit surprised (and suspicious) of how high it got, but then, not surprised at how fast it dropped (record decline) in 2017. I suggested to Tisdale at the time that he, or someone more knowledgeable than I CALCULATE the bounds of likely temperature from the thin warm layer to see where the temps are likely to go (remember the experts were thinking a continuing or a repeat El Nino was in the offing).
    I’ve also more recently taken up this lack of warm water, and a disconnect between surface temperatures and the (restricted) ENSO zone as an indicator of whither temperatures. Cold water is not so much welling up at the eastern end of the equator as slanting down into the equatorial zone from cold blobs in the NH and SH, with an impotent W. Pacific Warm Pool – cool at both ends. Also, the rather quick change from persistent warm blobs in the El Nino development period to persistent cold blobs in the temperate zones since then looked like world temperatures were going to follow these cooling effects and ignore the equatorial band. These too might have been calculated by the specialists to give a forecast (as I did a year ago by eyeball).
    I checked to see if I was typing in Russian because my entreaties didn’t seem to interest a generally argumentative, sharp crowd here at WUWT. I was even beginning to think that only Ben Santer and Michael Mann saw my offerings, noting that the latter at least tweets instantly after a controversial blog post appears on WUWT, so he’s watching. Hey, I’m only a geologist and engineer – so what do I know. Anyway, thanks to Ryan Maue my analyses have been belatedly independently corroborated. Maybe now some PhD nouveau climate student will do the calculations.

  22. Anthony, I understand the point you are trying to make, but only after reading your comments further down. The article was not clear, I honestly thought your vacation was causing you to go senile.
    Was there something in particular that triggered this post? I have never seen the general public comparing anomalies, most of the alarmists use only their favored temperature set, GISS, not specific anomaly values, except in reference to the ‘scary’ 2 degree threshold, but that at least does have a quasi-standard baseline.
    As to your point, yes, it would be nice to see a standard baseline used in the climate science community. Either that or attempts at absolute temperatures, but that is a much more difficult, if not impossible task.

    “there is no international standard to calculate the average temperature, and depending upon the algorithm you choose, it is possible to show the earth on average is both warming and/or cooling at the same time.”
    You are right. My results show it is already cooling, if you look at a globally balanced sample of maxima or minima, in degrees C/annum.

    • Forrest Gardener
      Because it fits 100%? To define a function you need at least 4 points.
      Admittedly, the period I looked at (1973-2015) is approximately half a Gleissberg cycle.
      So, the whole wave is a sine wave, wavelength 87 years.
      Still, I think for the half GB cycle the parabola proves my point, i.e. all warming and cooling is natural.
      Man-made warming either does not exist or is too small to even make a dent in what nature gives us,
      hence my correlation of 100% for the speed of warming/cooling.
      they had that already more or less figured out before they started with the CO2 nonsense:

  24. Rick Perry, Sec. of Energy, is discussing the need for nuclear energy and environmental issues right now in today’s White House Press briefing.

  25. Rick Perry: “Climate changes, always has. Mankind is contributing to it. The question is how much. Let’s have a conversation about that.”

    • Humans contribute to local climate change, but it doesn’t add up to enough to noticeably affect the global average. Unless your only data come from urban heat islands which used to be cool, dark forest.

  26. I was just looking at these National Weather Service guidelines:
    1. Place the thermometer 5 feet above the ground (+/- 1 ft.). A thermometer too low will pick up excess heat from the ground and a thermometer too high will likely have too cool of a temperature due to natural cooling aloft. 5 ft. is just right.
    2. The thermometer must be placed in the shade. If you put your thermometer in full sunlight, direct radiation from the sun is going to result in a temperature higher than what it should be.
    3. Have good air flow for your thermometer. This keeps air circulating around the thermometer, maintaining a balance with the surrounding environment. Therefore, it is important to make sure there are no obstructions blocking your thermometer such as trees or buildings. The more open, the better.
    4. Place the thermometer over a grassy or dirt surface. Concrete and pavement attract much more heat than grass. That is why cities are often warmer compared to suburbs. It is recommended to keep the thermometer at least 100 ft. from any paved or concrete surfaces to prevent an erroneously high temperature measurement.
    5. Keep the thermometer covered. When precipitation falls, you do not want your thermometer to get wet as that could permanently damage it. A Stevenson screen is a great place to store thermometers and other instruments as they provide cover as well as adequate ventilation. If you can’t get one, a simple solar radiation shield is adequate.

    … and wondering how a person places a thermometer in the shade and out of direct sunlight AND keeps it in the open and positioned for good air circulation at the same time?
    I still see the perfect temperature-measuring spot as elusive. How does anybody agree that they are even measuring the same thing consistently from one location on Earth to the next ?
    Forget whether experience agrees with a scientific guess or not, Mr. Feynman (yeah, THAT Feynman) — just tell me how the heck do I even take a blasted temperature measurement to help confirm a scientific guess or not !

    • Measuring in precisely the same way, at the same times of day at exactly the same spot will tell you about the changes there over time. But how many such good locations are there? And how can they represent the planet?
      Hence, satellites and balloons are the only even remotely good enough data for scientific purposes. Floating temperature gauges in the oceans move around too much. Even balloons aren’t sampling exactly the same volumes of the atmosphere.

      • Measuring in precisely the same way, at the same times of day at exactly the same spot will tell you about the changes there over time. But how many such good locations are there?

        Probably not many. But as long as they in general do the same bad thing, the day-to-day change will be as good as it can get.
        This is one of the reasons I follow the day-to-day change at a single station: subtract yesterday’s Tmin from today’s Tmin as the difference, the intra-day change Tmax − Tmin, and the average of Tmin and Tmax.
        You can look at all three and get a good idea of what’s happening where we have surface stations, and how they change from day to day over long periods https://i2.wp.com/micro6500blog.files.wordpress.com/2017/01/1980-series1.png
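A minimal sketch of the three per-station series the comment describes: the day-over-day change in Tmin, the intra-day range Tmax − Tmin, and the daily mean of Tmin and Tmax. The function name and readings are invented for illustration.

```python
def station_series(tmin, tmax):
    """Given parallel daily lists of Tmin/Tmax (deg C), return three series:
    day-over-day Tmin change, diurnal range, and daily Tmin/Tmax mean."""
    day_over_day = [b - a for a, b in zip(tmin, tmin[1:])]   # today - yesterday
    diurnal_range = [hi - lo for lo, hi in zip(tmin, tmax)]  # intra-day swing
    daily_mean = [(lo + hi) / 2 for lo, hi in zip(tmin, tmax)]
    return day_over_day, diurnal_range, daily_mean

# Invented four-day record for one station.
tmin = [10.1, 11.4, 9.8, 10.6]
tmax = [21.0, 22.3, 19.9, 21.5]
dod, rng, mean = station_series(tmin, tmax)
```

Because each series is a difference or average taken at one station, any constant siting bias at that station cancels out of the day-to-day change, which is the comment's rationale for tracking it.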

  27. NASA (the National Adjusting the Science Association) CAN’T use 1981-2010 as a baseline period, because they haven’t yet decided how hot or cold that period was. Cooling the past can’t begin while you’re still using part of that period in your hottest evah decade claims.

  28. Thanks for the CFS data link.
    The low solar activity cooling regime is in place & will continue until SC25 starts, 2019-20.
    Using my tried and true F10.7-TSI-SST model, which is based on an intimate, perfect working knowledge of the temporal relationship between these three measures, I’ve estimated Had3SST will drop a further 0.27C to 2020 from the Dec 2016 Had3SST value of 0.447C, to 0.178C (+.05/-.1), putting the end-of-solar-cycle value (if it were to end at the year-end) somewhere between just above and just below the cycle SST yearly starting value of 0.141C in 2008.
    CFSv2 2m will drop along with it, possibly going to “zero” or negative by 2019-20.
    This estimate will be updated in January 2018 after the 2017 numbers are all in. It could go lower.
    The very best way to keep up on ocean warming/cooling is the daily 7-day SSTa change:

    • Using my tried and true F10.7-TSI-SST model, which is based on an intimate, perfect working knowledge of the temporal relationship between these three measures
      Only in religion and cults does one find ‘intimate perfect working knowledge’. Not in science…

      • The perfect knowledge of these relationships and principles is based on science, research, & work.
        Cult? At least I’m not involved in the continual promotion of your cult of personality. 😉
        You have no idea what I’ve done, so the attitude towards me is wholly unwarranted.
        A few days ago you were trying to tell me there is no such F10.7-TSI-SST relationship. Instead you presented the rather false and pathetic formula, as you did here today, that you use to describe the sun–earth temperature relationship, to support your theory that the sun only warms by 0.1C. The formula this mathematical theory is based on has no predictive power through a solar cycle, unlike my work.
        My work is powerfully predictive and had immediate application to my knowing the timing of the 2015-16 ENSO. Three years ago I said on an ENSO blog post here,
        “…Climate change comes from solar changes. Solar activity ramped up late last year and has since tapered off. The “recharge” of the oceans from that rampup is now dissipating. If and only if there is another spike in solar activity this year will there be an El Nino.
        I said that then because I knew at that time that all the ENSOs at the top of the solar cycle occurred above the 120 sfu level AND are delayed due to the temporal relationship of F10.7cm to TSI.
        The green arrow in the image below signifies the time when I first plotted and realized this relationship.
        The rest of my model involved first smoothing the daily data and finding more confirmation of the temporal relationship, and creating a very nice low error TSI predictor based on the SWPC monthly SSN/F10.7cm forecast. I used the SWPC 2016 F10.7 forecast in late 2015 to forecast the Had3SST change over the year based on a second empirically derived regression formula of TSI-SST. I was less than 3% off.
        The SC24 TSI rise & maximum drove the whole 0.6C 2008-2016 SST spike, and is now cooling us off.
        The solar cycle influence for SC24 was 0.6C, not 0.1C. You’re way off Leif. My stuff works.
        The SC24 solar cycle influence driving the 2009-10 and 2015-16 ENSO is very apparent.

  29. Anthony,
    I thought you would know this. The WMO defines climatological reference periods ending with the last complete decade, while for comparison purposes it also establishes a fixed reference period. It is then up to research institutions to adhere to this standard or not.

    4.8.1 Periods of calculation
    Under the current WMO Technical Regulations, recognising the realities of a changing climate, climatological standard normals are defined as averages of climatological data computed for successive 30-year periods, updated every ten years, with the first year of the period ending in 1, and the last year, with 0. That is, consecutive 30-year normals include: 1 January 1981 to 31 December 2010, 1 January 1991 to 31 December 2020, and so forth. Countries should calculate climatological standard normals as soon as possible after the end of the decennium. Climatological standard normals periods should be adhered to whenever possible in order to allow for a uniform basis for international comparison.
    Also under the WMO Technical Regulations, recognising the need for a stable base for long-term climate change and variability assessment, a fixed reference period is defined as the 30-year period 1 January 1961 to 31 December 1990. This period should be used to compare climate change and variability across all countries relative to this standard reference period. It will remain fixed in perpetuity, or until there is a sound scientific reason to change it.

    So now you know. Some are using the climatologial standard normal 1981-2010, that will be changed to 1991-2020 in less than 3 years, while others are using the fixed reference period 1961-1990. Both are doing it in accordance to WMO guidelines.

    • “that will be changed to 1991-2020 in less than 3 years”
      There are simple practical considerations that cause people to make different choices, which the WMO recognises here. GISS uses 1951-1980 because that was the most recent 30-year period when they started. And they now have a large base of published numerical data. If they changed, then whenever you saw GISS data you’d have to look up which anomaly base they were using.
      On the other hand, the anomaly base temperature is supposed to be your best estimator of present data. That ensures that you don’t have to worry about whether the sample for a given month includes the right balance of warm and cold places, because you have subtracted out the difference. If there has been significant drift since the base period, this works less well. That is why, when HadCRUT (using 1961-90) included more Arctic stations in V4, the anomaly (and trend) went up.
      Then there is the issue that the anomaly base is first used for individual stations, so you have some work to do if they don’t have data in the period. That is another reason why 1961-90 is popular with people using GHCN; they can include more stations. It’s also why they don’t use more than 30 years. However, the issue is manageable – BEST uses the same least squares system I do, which doesn’t need a fixed period at station level. But you do need eventually to decide on a reference period.
      So NOAA uses 1961-90 for its initial average calculation. Once the data has been aggregated, you can convert to any other base just by subtracting the average for that period. So NOAA converts to 20th century for a lot of reporting.
      Satellite data, of course, has to use some period since 1979.
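The conversion described above ("you can convert to any other base just by subtracting the average for that period") can be sketched in a few lines. The years and anomaly values below are invented, and the function name is not from any of the agencies' code.

```python
def rebase(series, new_base_years):
    """series: {year: anomaly}. Return the series re-expressed so that its
    mean over new_base_years (those present in the data) is zero."""
    ref = [series[y] for y in new_base_years if y in series]
    offset = sum(ref) / len(ref)
    return {y: v - offset for y, v in series.items()}

# Invented sparse anomaly series (deg C) on some original base.
anoms = {1951: -0.05, 1960: 0.00, 1975: 0.10, 1990: 0.35, 2016: 0.98}

# Put it on a 1961-90 base: subtract the mean over that window.
rebased = rebase(anoms, range(1961, 1991))
```

By construction, the rebased series averages to zero over the new reference window, which is all a base period means; the shape and trend of the series are unchanged.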

    • The 30 year periods of which you write were decided on back in the mid-1930s.
      That was a time of printed materials, before computers (as we know them), and before the United Nations. Moreover, the issue of interest was more about meteorology, and less about using scary climate to justify “social justice” — or whatever it is that is going on.
      As he who is about to go on vacation says, “why can’t climate science do a simple thing like standardize on a baseline period?”
      But I would add, it doesn’t have to be 30 years.
      I vote for 73, it is a nice prime number.

    • Javier,
      Rather than a step up, as during the switch from 1971-2000 to 1981-2010, Warmunistas are liable to get a nasty surprise after the switch to 1991-2020. For that matter, if the rest of this year, 2018 and 2019 drop back under the present baseline, as after the last super El Nino, the dreaded Pause will be on again.

      • “to get a nasty surprise after the switch to 1991-2020”
        No-one gets a surprise. It is an elementary calculation, which does not cause difficulty in the real world.

      • Nick,
        Not sure you’re familiar with the real world, but the surprise I have in mind is that the next 30 years are liable to be cooler than the past 30 years, as demonstrated by the past 300, 3000, 30,000, 300,000, 3 million, 30 million, 300 million and three billion years of climate history.

    • Is that graph in Fahrenheit?
      And, yes, that presentation makes it look as though there is very little change, but sometimes even small changes can have important consequences. Going from -0.5C to 0.5C – just one degree – means that my ice lolly drops off the stick and turns to slush. Probably ruins my trousers. This is the sort of horror that Margaret Thatcher was warning us about, and why we have to hang everyone who breathes out.
      So perhaps a rethink on the presentation.

  30. This post should be deleted and rewritten. It should be arguing that temps are increasing at similar rates when looking 37 years apart. In particular, looking at two ENSO-neutral years we have:
    1980: +0.27°C warmer than the 1951-1980 baseline
    2017: +0.26°C warmer compared to the 1981-2010 baseline
    And the later baseline was also warmer.
    I hate to say it but that sounds like an argument that temperatures are continuing to increase with a near linear trend. That’s not what most WattsUpWithThat readers, including myself, were expecting 3 years ago.

  31. “I’ve added a caveat in the title to reflect this: (depending on who you ask)”
    It doesn’t reflect it very well. There isn’t any data source saying that the temperature now is similar to 1980. If you don’t take account of base differences, you can mix them to get any number you like. GISS then with NCEP now, it’s a small difference. If you compare GISS now with NCEP then, the difference is huge.

      • Average UAH anomaly for first five months of 2017: +0.31 (thanks to high May figure)
        Average UAH anomaly for first five months of 1980: +0.01
        So I was off by a tenth of a degree. The anomaly gain from the first five months of 1980 to the same months of 2017 is 0.3 degrees C, not 0.2. That equates to 0.8 degrees per century.
        Still not very scary, half a degree more warming in the next 63 years. And the preliminary May figure might be revised downward.

    • “a 0.2 degree C difference in 37 years huge”
      ERA-Interim, using a 1981-2010 base, says the temperature in 1980 was negative. GISS says the 2016 average was about 1°C. That shows the boost you can get with base-period fiddling.

      • Nick,
        As you are well aware, 2016 was a super El Nino year.
        If you use the same base, as I did above, for UAH, you get a gain of only 0.3 degree C for 2017 over the same part of 1980, for which year the average monthly anomaly was indeed very slightly negative.

      • Gabro,
        You’re missing my point, which is the difference you can make using different base periods in a comparison. As Joe Bastardi points out below, a reasonable estimate of warming from 1980 to now is about 0.45C. As I point out above, a reasonable estimate of the difference between a 1951-80 and a 1981-2010 base is 0.42. So if you use the 1951-80 base for 1980, and the 1981-2010 base for now, you get something like 0.45 − 0.42 = not much change. But if you do it the other way, you get 0.45 + 0.42 = 0.87C, which would be a large change in 37 years.

      • Nick,
        I get your point.
        Mine is that in UAH, 2017 has been only 0.3 degrees C warmer than the first five months of 1980. Hardly anything to get worked up about.
        Of course you’re right that the 1950s, ’60s and ’70s were cooler, so you’d get a bigger difference with an earlier baseline. That is, until you go back to the 1920s, ’30s and ’40s.
        If you compared the past 30 years with the 30 years around 1690, we would indeed be importantly warmer. But the same interval would be cooler than the 30 years from 1181 to 1210, or many other such intervals in the Medieval WP.
        The warming since 1979 is trivial.
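The shell game in the exchange above reduces to two lines of arithmetic, using the figures quoted there (~0.45 C of consistent-baseline warming since 1980, and a ~0.42 C offset between the 1951-80 and 1981-2010 bases):

```python
warming = 0.45      # consistent-baseline warming estimate, 1980 -> now (deg C)
base_offset = 0.42  # offset between the 1951-80 and 1981-2010 bases (deg C)

# Old base for 1980, new base for now: the offset cancels most of the warming.
apparent_small = warming - base_offset
# New base for 1980, old base for now: the offset is added on top.
apparent_large = warming + base_offset

print(round(apparent_small, 2), round(apparent_large, 2))
```

The same underlying change can thus be presented as roughly 0.03 C or 0.87 C over 37 years, depending purely on which base period is attached to which end of the comparison.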

  32. I think Anthony needs a holiday. Maybe someone in New Zealand can show him how to convert temperatures from one baseline to another. Ask a high school student, it’s part of the curriculum.

    • We are not talking about a single thermometer, we are talking about a homogenised average. It’s simple algebra. I note Anthony has now totally rewritten this article to cover his tracks 😉

    • See Mosher’s comment below about converting between absolute temperatures and anomalies. This is not rocket or climate science.

  33. I much prefer to see the actual temperatures, rather than contemplate anomalies to the second decimal.
    NOAA reported: “The combined global average temperature over the land and ocean surfaces for April 2017 was 0.90°C (1.62°F) above the 20th century average of 13.7°C (56.7°F)—the second highest April temperature since global records began in 1880, trailing 2016 by 0.17°C (0.31°F) and ahead of 2010 by 0.07°C (0.13°F).”
    Mmm, so approaching a searing 15°C, boy the world is steaming /sarc.
    And for May, NOAA reported: “Averaged as a whole, the global land and ocean temperature for May 2017 was 0.83°C (1.49°F) above the 20th century average of 14.8°C (58.6°F) and the third highest May in the 138-year global records, behind 2016 (+0.89°C / +1.60°F) and 2015 (+0.86°C / +1.55°F). ”
    But to keep up their hottest “evar” meme, they also noted: ‘May 2017 was characterized by warmer- to much-warmer-than-average conditions across most of the world’s land and ocean surfaces. However, near- to cooler-than-average conditions were present across the eastern half of the contiguous U.S., eastern Europe, western and north-central Russia, as well as parts of the northern and southern Atlantic Ocean, northern and southern Pacific Ocean, and the tropical Indian Ocean.”
    “The global land-only surface temperature was the coolest May land temperature since 2011 and the seventh highest since global records began in 1880 at 1.15°C (2.07°F) above the 20th century average 11.1°C (52.0°F).” Wow, the land temperature averaged only 12°C in May, no wonder I have the heater on.

    • “Wow, the land temperature averaged only 12°C in May, no wonder I have the heater on.”
      That is why it is very foolish of NOAA to quote such an average in these reports (they explain here, S 7). Not only is it almost impossible to measure properly, but it is meaningless. Most places were nowhere near 12°C in May. But if you say that the average anomaly was 1° (it wasn’t, in May), then there is a reasonable chance that it was warmer than usual where you are, whatever usual is.

  34. Buuuu …buuu …. buuuu …. buuuuuut … I just read in my MSM feed …. Seaaaaaaaaaa levelllllllllllllll ….. FIFFFFFFFFFtyyyyyy …. perCENNNNNNT …. FAAAAASterrrrrrrrrr ………
    …. OVErrrrrrrrrrr … the .,… PAST …. TooooooWENNNNNty YEEEEEAAAaaaarrrrrrrrrrrrrrs!!!!
    Extra … Extra …. read all about it!

  35. Gavin says that he can accurately and precisely take Earth’s surface temperature with 50 stations. If they all have continuous records since 1880, stations uniformly maintained during that time, with no switch to electronic thermometers, and cover the land surface uniformly, to include elevation differences and are all in areas which have been rural all that time, away from pavement, then, yes, maybe, theoretically, but for the land only. But there are few if any such sites.
    Antarctica’s fringes only started getting measured continuously in the 20th century, and at the South Pole only since 1957, IIRC. There has been no warming at the SP, which is precisely where it should be most evident, according to AGW theory.

    • Joe,
      Anth@ny has clarified that the point he was making was precisely about the problems with using T anomalies from different baseline periods. He maybe should have made the point more explicitly rather than relying upon sub rosa satire.

    • PS: Some would say that 0.3 degrees C warmer for the first five months of 2017 over 1980 is actually fairly near.

  36. Simply no place left globally (the oceans) for warmth to come from.
    Few of us Meteorologists put any real credence in a trace gas (vital to life).

  37. We calculate absolute temperature.
    You can choose any base period you like.
    For display purposes and consistency with the most widely used series (HadCRUT) we use 1951-80.
    You can play with periods to your heart’s content if you first do the real series in absolute as we do.
    It’s not an issue worth discussing.
    Waste of time.
    Not scientifically relevant.

    • …You use the same time period (51 – 80) when “scientists” like yourself were claiming the next “Ice Age” was just around the corner…..Things that make you go…hmmmmmm……

  38. If you use baselines from different time periods, you get nonsense. We should be more patient with the climate. Climate is slow, like evolution. It’s no use watching evolution from one month to the next.

    • Except that new species do spring into existence in a single generation.
      New climates, not so much.

  39. Well, there certainly has been lots of squawking, both here and elsewhere. Some of the elsewhere squawking was downright mean, but that’s OK; it’s part of the rigid mindset those people have: they fear change, and they fear different ways of looking at things.
    Joe Bastardi and Leif Svalgaard (among others) point out that the comparison is ridiculous, and that is indeed the point; It is. But how is the public supposed to be able to interpret these differing surface temperature presentations done on different baselines? The answer is: they can’t, unless there is a standardized baseline.
    All the squawking elsewhere has shown me that there’s really no interest in the climate science community in coming up with a standardized baseline for public surface temperature presentations; they’d rather defend their own work and declare “Watts is an idiot for saying so”. It’s pretty typical, and exactly what I expected. They don’t like change, and they really don’t like anyone else suggesting that the way they present temperature data might not be in the best public interest, because after all, they are saviors of the planet and who are you to question us.
    As Steve Mosher likes to say: Too Funny!
    On the plus side, I’ve been given a marvelous gift from all this squawking. Watch this space after I return. For now I’m closing the thread, as I’m heading out, but there will be a new post sometime after I return on this very topic.
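The baseline argument running through comments 37–39 comes down to simple arithmetic: anomalies computed against different reference periods differ only by a constant offset, namely the difference between the two baseline means of the same series. A minimal sketch of that conversion, using illustrative numbers (not official series means):

```python
# Illustrative sketch: re-expressing a temperature anomaly against a
# different baseline period. The baseline means below are made-up
# placeholders, not official GISTEMP/HadCRUT values.

mean_1981_2010 = 14.36  # hypothetical absolute mean for 1981-2010, deg C
mean_1951_1980 = 14.10  # hypothetical absolute mean for 1951-1980, deg C

# The warmer 1981-2010 baseline sits this far above the 1951-1980 one:
offset = mean_1981_2010 - mean_1951_1980  # ~0.26 deg C here

def rebaseline(anomaly_1981_2010: float) -> float:
    """Express an anomaly relative to 1981-2010 as one relative to 1951-1980.

    Shifting baselines only adds a constant; the shape of the series
    (trends, peaks) is unchanged, which is why the choice of baseline
    is cosmetic for analysis but confusing for public comparisons.
    """
    return anomaly_1981_2010 + offset

# With these made-up means, +0.26 vs 1981-2010 becomes +0.52 vs 1951-1980:
print(round(rebaseline(0.26), 2))
```

This is the sense in which Gavin’s comment 37 is right that the base period is “not scientifically relevant”: the series is the same, shifted by a constant. It is also the sense in which comment 39 is right that the public can’t compare two charts drawn on different baselines without knowing that offset.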

Comments are closed.