Guest essay by Paul Homewood
WUWT carried the story yesterday of the paper by Kodra & Ganguly, forecasting a wider range of temperature extremes in the future.
According to the Northeastern University press release, using climate models and reanalysis datasets, the authors found that
While global temperature is indeed increasing, so too is the variability in temperature extremes. For instance, while each year’s average hottest and coldest temperatures will likely rise, those averages will also tend to fall within a wider range of potential high and low temperature extremes than are currently being observed.
But is there any evidence that this has been happening? We can check what’s been happening in the US by using the US Climate Extremes Index, produced by NOAA.
Of course, the US only accounts for 2% of the Earth’s surface (except when there is a polar vortex, a mild winter or a drought in California), but it seems a sensible place to start. We also know that climate models often bear very little resemblance to reality!
Just to recap, the US Climate Extremes Index, or CEI, is based on an aggregate set of conventional climate extreme indicators which, at the present time, include the following types of data:
- monthly maximum and minimum temperature
- daily precipitation
- monthly Palmer Drought Severity Index (PDSI)
- landfalling tropical storm and hurricane wind velocity.
In terms of temperature, the CEI comprises two indicators:
- The sum of (a) percentage of the United States with maximum temperatures much below normal and (b) percentage of the United States with maximum temperatures much above normal.
- The sum of (a) percentage of the United States with minimum temperatures much below normal and (b) percentage of the United States with minimum temperatures much above normal.
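To make the construction concrete, here is a minimal sketch of how such an indicator could be computed. It assumes a hypothetical table of seasonal-average temperatures with one row per year and one column per climate division, weighted equally; NOAA’s actual calculation uses proper area weighting from the Climate Divisional Database and the adjusted USHCN data described in the footnote, so treat this purely as an illustration.

```python
# Minimal sketch of a CEI-style temperature indicator. `tmax` is a
# hypothetical DataFrame of summer-average maximum temperatures,
# one row per year and one column per climate division; equal division
# weights stand in for NOAA's real area weighting.
import numpy as np
import pandas as pd

def temperature_indicator(values: pd.DataFrame) -> pd.DataFrame:
    """Percent of divisions 'much above' / 'much below' normal each year,
    where 'much above (below) normal' means the upper (lower) tenth
    percentile of that division's period of record."""
    hi = values.quantile(0.90)                        # 90th percentile per division
    lo = values.quantile(0.10)                        # 10th percentile per division
    much_above = values.gt(hi, axis=1).mean(axis=1) * 100
    much_below = values.lt(lo, axis=1).mean(axis=1) * 100
    return pd.DataFrame({"much_above_pct": much_above,
                         "much_below_pct": much_below,
                         "indicator": much_above + much_below})

# Demo with random numbers standing in for real divisional averages.
rng = np.random.default_rng(0)
tmax = pd.DataFrame(rng.normal(30.0, 2.0, size=(104, 344)),
                    index=range(1910, 2014))
print(temperature_indicator(tmax).head())
```

With white-noise input the yearly indicator fluctuates around the 20% expected value mentioned in NOAA’s definition, which is the baseline against which the plots below should be read.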
So, for instance, we can plot maximum temperatures during summer months:
http://www.ncdc.noaa.gov/extremes/cei/graph/1/06-08
And, minimum temperatures in winter:
http://www.ncdc.noaa.gov/extremes/cei/graph/2/12-02
The reds indicate the percentage of the US that was “much above normal”, and the blues the percentage “much below normal”. The CEI also lists the actual percentages, so we can plot the “much aboves” in summer, and the “much belows” in winter, thus:
The trend is towards an increasing percentage with much-above-normal summer temperatures, although recent years seem to be at similar levels to the 1930s. (The CEI is based upon adjusted temperatures, before anyone asks.)
In winter, though, the trend is decreasing.
We can now combine the summer and winter sets together.
[I have simply added together the percentages, although of course some areas could have experienced both hot and cold – think of it as an index].
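In code terms, the combination is just an element-wise sum of the two yearly percentage series, which is why overlapping areas can be counted twice. A trivial sketch with made-up numbers (the real values come from the CEI pages linked above):

```python
import pandas as pd

# Made-up percentages for three years, purely to show the arithmetic.
summer_much_above = pd.Series({1934: 40.0, 1970: 2.0, 2012: 55.0})
winter_much_below = pd.Series({1934: 10.0, 1970: 1.0, 2012: 0.5})

# An index, not a true area percentage, since hot and cold areas can overlap.
combined = summer_much_above.add(winter_much_below, fill_value=0)
print(combined)
```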
Clearly, the overall trend is towards a reduction in temperature extremes. In other words, the area of the US experiencing unusually high or low temperatures is tending to shrink. (Although it is interesting to note the relative absence of such extremes in the years around 1970.)
Of course, although this analysis tells us about the area of the country affected, it does not say anything about how extreme the temperatures are. But we can check this very simply, using the NCDC Climate At A Glance datasets.
The graph below shows the difference each year between winter and summer temperatures, for the country as a whole, along with a 10-Year average. As can be seen, the variation from winter to summer has been getting smaller in recent years.
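For anyone wanting to reproduce that graph, the calculation is straightforward. The sketch below assumes a pandas Series of monthly CONUS-average temperatures (such as can be downloaded from Climate at a Glance); it groups December with the following January and February, takes the summer-minus-winter difference each year, and applies a 10-year rolling mean (shown centred here; a trailing mean would lag by roughly five years). The synthetic data is only a stand-in for the real series.

```python
import numpy as np
import pandas as pd

def winter_summer_range(monthly: pd.Series) -> pd.Series:
    """Summer (Jun-Aug) minus winter (Dec-Feb) mean temperature per year.
    December is grouped with the following year's January and February."""
    m = monthly.index.month
    winter_year = monthly.index.year + (m == 12).astype(int)
    is_winter = m.isin([12, 1, 2])
    is_summer = m.isin([6, 7, 8])
    winter = monthly[is_winter].groupby(winter_year[is_winter]).mean()
    summer = monthly[is_summer].groupby(monthly.index.year[is_summer]).mean()
    return (summer - winter).dropna()

# Synthetic monthly series standing in for the real Climate at a Glance data.
idx = pd.date_range("1895-01-01", "2013-12-01", freq="MS")
noise = np.random.default_rng(1).normal(0.0, 1.5, len(idx))
monthly = pd.Series(52 - 22 * np.cos(2 * np.pi * (idx.month - 1) / 12) + noise,
                    index=idx)

annual_range = winter_summer_range(monthly)
smoothed = annual_range.rolling(10, center=True).mean()  # centred 10-year mean
print(annual_range.tail())
```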
The most extreme year was 1936, when the hottest summer on record (even after adjustments) followed the second coldest winter. I wonder how their models account for that?
FOOTNOTE
NOAA offer this definition of how they calculate their index:
The U.S. CEI is based on an aggregate set of conventional climate extreme indicators which, at the present time, include the following types of data:
- monthly maximum and minimum temperature
- daily precipitation
- monthly Palmer Drought Severity Index (PDSI)
- landfalling tropical storm and hurricane wind velocity*
* experimental (not used with the Regional CEI)
Each indicator has been selected based on its reliability, length of record, availability, and its relevance to changes in climate extremes.
Mean maximum and minimum temperature stations were selected from the U.S. Historical Climatology Network (USHCN) (Karl et al. 1990). Stations chosen for use in the CEI must have a low percentage of missing data within each year as well as for the entire period of record. Data used were adjusted for inhomogeneities: a priori adjustments included observing time biases (Karl et al. 1986), urban heat island effects (Karl et al. 1988), and the bias introduced by the introduction of the maximum-minimum thermistor and its instrument shelter (Quayle et al. 1991); a posteriori adjustments included station and instrumentation changes (Karl and Williams 1987). In April 2008, maximum and minimum temperature data from the USHCN were replaced by the revised USHCN version 2 dataset. In October 2012, a refined USHCN version 2.5 was released and replaced version 2 data for maximum and minimum temperature indicators.
The U.S. CEI is the arithmetic average of the following five or six indicators of the percentage of the conterminous U.S. area:
- The sum of (a) percentage of the United States with maximum temperatures much below normal and (b) percentage of the United States with maximum temperatures much above normal.
- The sum of (a) percentage of the United States with minimum temperatures much below normal and (b) percentage of the United States with minimum temperatures much above normal.
In each case, we define much above (below) normal or extreme conditions as those falling in the upper (lower) tenth percentile of the local period of record. In any given year, each of the five indicators has an expected value of 20%, in that 10% of all observed values should fall, in the long-term average, in each tenth percentile, and there are two such sets in each indicator.
A value of 0% for the CEI, the lower limit, indicates that no portion of the period of record was subject to any of the extremes of temperature or precipitation considered in the index. In contrast, a value of 100% would mean that the entire country had extreme conditions throughout the year for each of the five/six indicators, a virtually impossible scenario. The long-term variation or change of this index represents the tendency for extremes of climate to either decrease, increase, or remain the same.
The index is built up from the Climate Divisional Database, and therefore reflects the area of the US, rather than a simple percentage of stations.
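For completeness, the overall index is then just the arithmetic average of the individual indicator percentages. A minimal sketch (the values and labels below are invented placeholders, not real CEI output; NOAA’s definition page lists the actual five or six indicators):

```python
# Hypothetical indicator values for one year: each is the percentage of the
# conterminous US falling in the "much above" plus "much below" categories,
# so each has an expected value of about 20% under the tenth-percentile rule.
indicators = {
    "max_temperature":  24.0,
    "min_temperature":  18.0,
    "drought_severity": 21.0,
    "precipitation_1":  17.0,
    "precipitation_2":  20.0,
}

cei = sum(indicators.values()) / len(indicators)  # arithmetic average of indicators
print(f"CEI = {cei:.1f}%")                        # -> CEI = 20.0%
```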
Mosher’s folly revealed: “If you live in the illusion that there is a historical truth that you can recover….”
Those are the deconstructionist words of French philosophy, not the words of science, spoken by a trained deconstructionist English major; in other words, a historical revisionist. But there *is* a historical truth, exclaimed the little boy.
All I’ve read so far has led me to think the following about climate and radiative gases:
– The primary effect of radiative gases is to smear heat around the atmosphere and resist large changes in air temperature. This is why humid tropical areas have little temperature variation across the day, while deserts have massive diurnal variation. It also explains the minimal temperature variation in the atmosphere of Venus despite its remarkably slow axial rotation.
– There is also an impact of changes in radiative gas levels on global average temperatures, but it appears to be trivial compared to the smoothing and smearing effects. Small changes in cloud levels appear to have as much influence on global average temperatures as, if not more than, large changes in radiative gas levels.
– Greater warming in the Arctic than elsewhere makes sense on this basis, but the different situation in the Antarctic suggests that changes in radiative gas levels aren’t the dominant driver of global temperature change. The reverse may be true. It appears reasonable, though, to state that man’s CO2 emissions have played a significant part in the increases of the last century.
– You would expect that, ceteris paribus, this would lead to higher minimum temps and lower maximum temps, with little change to ‘averages’. UHI, land use changes, data quality and confirmation bias impacting historic temperature adjustments may be causing a small exaggeration of this average temperature increase.
I’m not convinced by the analysis shown in this piece, but it certainly makes more sense than the gloom-and-doom political statements, sometimes masquerading as scientific studies, that seem designed to scare people into thinking a world with more greenhouse gases means more extremes of weather and much higher average temperatures. Fewer extremes appear the more logical expectation, while empirical data suggest modest changes to global averages.
Not sure if I have a clue… but here I go. Are these temps representative of an associated area, on a Thiessen polygon system like you would use for rainfall? For example, do two records covering all of North Dakota and eastern Montana get the same weight as two recordings near New York City? Then, on the same polygon system, you have eight million people thinking it is hot in NYC and 700,000 of us freezing. There could be some inconsistencies in perception.
Extreme weather is extreme weather. We care about it as it can be dangerous. That would be the point of preventing it by trying to cut CO2. It makes no sense to call a mild winter “extreme weather”. It is less dangerous, not more.
Also the definitions of “extreme” are: reaching a high or the highest degree; very great. or furthest from the center or a given point; outermost. And to be extreme in the sense of a warm winter day, you would need to be further from the long-term mean in the warm direction than very cold days are in the cold direction. In a place like St. Louis, for example, say the average winter day is 40 C. They may have cold records of -10 C. So to be as extreme you would need to have a winter day that was 90 C. At least that’s one way of looking at it. Having a trend that winters are getting warmer by a few tenths of a degree per decade is NOT extreme.
At least if you want to use English as defined. But if you choose to use your own version of English where the words have special meanings I think it is only fair to let us know that in advance.
These two posts are directed at a few people above who want to count warm winters as extreme.
plowboy55 says:
July 31, 2014 at 3:51 pm
———————————————
The temperature is recorded at numerous points and then ‘gridded’, so that what you suggest regarding undue weighting to lots of stations in a small area is minimised (but there are other problems, aplenty). You bring up a good point about perception though.
If a greater and greater % of the population live in cities, with those cities growing larger, two things happen. First, larger cities create a larger heat-island effect, whereby all the concrete and tarmac infrastructure soak up solar heat during the day and release it during the night, making the nights and average temperatures hotter. Second, more people than before experience this, so more people are of the opinion that “it’s getting warmer” than would be the case if the rural/urban population balance was unchanged over the decades.
This urbanisation process makes it easier for the case of man-made global warming to be pushed, of which I’m sure the political advocates are well aware. If the end of this century sees almost everyone living in ever-larger conurbations, as per the UN Agenda 21, with rural temperature stations being increasingly phased out, the surface temperature record will have a large (artificial) component of warming and the general populace will find it congruent with their perception. Sneaky buggers.
Bill_W I think you mean F not C
40 degrees C would be 100 degrees F
-10 C would be 20 F
90 C would be over 200 F!
So, the climate really is changing:
it is getting calmer.
First, it’s good to see that NOAA have found themselves a proper filter in the form of the 9 point binomial. At least peaks and troughs are in about the right places.
Unfortunately the last “Climate at a Glance” graph goes back to a crappy running average, which they have not even managed to centre correctly. As a result the 1936 peak leaves the RM obviously too high for the following 10 years.
Oh well, I suppose with their “limited” resources, at least it’s a start.
As time passes the number of records will decline. On the first day of recording any data, everything is a record, but unlikely to be exceptional.
Can anyone tell us the impact of time on the frequency of records, were the data random rather than trended?
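For what it’s worth: for untrended (independent, identically distributed) data the chance that year n sets a new record is 1/n, so the expected number of records after n years is the harmonic number 1 + 1/2 + … + 1/n, roughly ln(n). A quick simulation sketch, purely illustrative:

```python
import numpy as np

def count_records(x: np.ndarray) -> int:
    """New record highs in a sequence (first value counts; assumes no ties)."""
    return int(np.sum(x >= np.maximum.accumulate(x)))

rng = np.random.default_rng(42)
n_years, n_trials = 100, 10_000
records = [count_records(rng.normal(size=n_years)) for _ in range(n_trials)]

harmonic = sum(1.0 / k for k in range(1, n_years + 1))
print(f"simulated mean records in {n_years} years: {np.mean(records):.2f}")
print(f"theoretical (harmonic number):            {harmonic:.2f}")
```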
I’m just waiting for the alarmists to say how terrible it is that extremes of weather are decreasing.
Paul, it would be helpful to include the definitions of what you are plotting. What is “normal”, and what does “much above” mean?
http://www.ncdc.noaa.gov/extremes/cei/definition
In each case, we define much above (below) normal or extreme conditions as those falling in the upper (lower) tenth percentile of the local, period of record. In any given year, each of the five indicators has an expected value of 20%, in that 10% of all observed values should fall, in the long-term average,
As per usual with this kind of propaganda, there is the stupid idea that the average over some arbitrary period is somehow “normal”, which suggests to the reader that any deviation is “abnormal”.
Then the top and bottom 10% are defined as being “extreme”. While this is technically correct, in the sense that the two ends of a piece of wood can be called the extremes, it is not the way this will be read by the lay public at whom this presentation is aimed, who will understand it as OMG extreme weather. Indeed this is clearly the spin they are trying to put on it.
As always you should include a definition of what you are plotting. I had to go and dig this out from NOAA to understand what you were showing. I suggest you include the paragraph or link above in the article.
Peter K/Mi Cro:
In fact I referred to them as urban-heat-island (dense met network) and rural-cold-island (sparse met network) effects. The former is over-emphasised in the averaging of global temperature and the latter is less emphasised. Thus, global temperature is over-estimated. The second issue is that prior to 1957 the unit of measurement was different from the unit of measurement from 1957 onwards. Here, while averaging, the former gives less error while the latter gives more error. For example, 23.45 = 23.5 and 23.35 = 23.3, and thus the difference of 0.1 changes to 0.2 between °F and °C. So, after 1956 the averaging has a positive effect. Also, with the passing of time met stations are changed (place, instruments, shelter, etc.), and the number of stations has changed with time.
Dr. S. Jeevananda Reddy
re. Greg Goodman says: July 31, 2014 at 8:40 pm
Apologies to NOAA: the last graph with the crappy running mean that was not centred correctly was Paul’s handiwork, not NOAA’s. I was misled by the link below it, which I understood to be a link to the original.
It was also incorrectly described as a “ten year average”, which would have one point every ten years. What is shown is a ten-year running mean (a crappy distorting filter) with a 5-year lag.
Hopefully Paul will take a tip from NOAA and find himself a proper filter.
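To make the lag point explicit, here is a tiny sketch with a synthetic linear trend: a trailing 10-point running mean sits (window − 1)/2 = 4.5 years behind the data, whereas a centred one stays close to it. (How pandas aligns the centred window is an implementation detail; the figures in the comments assume its default behaviour.)

```python
import pandas as pd

years = pd.Index(range(1900, 2001), name="year")
trend = pd.Series(years.astype(float), index=years)  # rises 1 unit per year

trailing = trend.rolling(10).mean()                   # trailing 10-year mean
centred  = trend.rolling(10, center=True).mean()      # centred 10-year mean

print((trend - trailing).dropna().unique())  # [4.5]  -> roughly 5 years behind
print((trend - centred).dropna().unique())   # [0.5]  -> essentially in step
```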
The idea of displaying the summer average / winter average difference is a good one.
The reduction in the annual variation from 40F in 1900 to about 30F in 2000 is notable.
“The reduction in the annual variation from 40F in 1900 to about 30F in 2000 is notable.”
But is it real, or an artifact of the poor sampling in 1900.
vuk’ says:
An above-average annual hurricane count is often included among extreme weather events.
Hurricane occurrence is closely related to the AMO (Atlantic Multidecadal Oscillation).
====
You mislabelled the blue line. I’m guessing it is lagged 20 or so years.
Yes, accumulated cyclone (hurricane) energy does seem to follow AMO quite closely:
http://climategrog.wordpress.com/?attachment_id=215
Ryan Maue’s graphs show major storms and total storm count have both peaked around 2000-2005.
Cooling signs.
plowboy55 asked if there are inconsistencies due to population size differences between NY and Montana.
My answer: Montana is part of flyover country for the Liberals who think nothing worth noting happens between I-5 and I-95.
I might be missing something here but isn’t this more or less what you’d expect to see the longer data is being gathered where the climate is stable (but possibly cyclic)?
Greg Goodman says: July 31, 2014 at 9:50 pm
………….
Hi Greg
The strong link between multi-decadal variability in Arctic pressure and the N. Atlantic SST, going back to the 1860s, appears to be reflected in the frequency of hurricane events.
Greg Goodman
Paul, it would be helpful to include the definitions of what you are plotting. What is “normal”, and what does “much above” mean?
Good point, thanks Greg.
I’ve added a footnote.
Keith, you were doing great until: “It appears reasonable, though, to state that man’s CO2 emissions have played a significant part in the increases of the last century.”
You seem to be pretty much on point about everything. But keep in mind that mankind is only responsible for 15 ppm of the 400 ppm of CO2. Just one more small jump and you’ve got it all.
Cheers!
Eric
But…but…but…UHI effect is likely the cause of warming winter extremes. And that…wait!…we can’t say that because we KNOW that UHI is NOT the cause of summer warming extremes. Never mind.
CET shows summer v winter extremes also declining in England
http://notalotofpeopleknowthat.wordpress.com/2014/08/01/temperature-variation-becoming-less-extreme-in-england/
“Those are the deconstructionist words of French philosophy, not the words of science, spoken by a trained deconstructionist English major; in other words, a historical revisionist. But there *is* a historical truth, exclaimed the little boy.”
Actually, not.
Suppose you have a written record that the min was 56 F in New York City on April 1, 1909.
Is that the truth? How do you check it?
You can assume it is the truth. Of course, if you read the protocol for observers you will see that the procedure REQUIRES them to round up or down.
So, what was the min temperature on April 1, 1909? You don’t know. You certainly can’t test the proposition
that it was 56 F, and if you believe the written record you only know this:
1. if you believe the observer followed the procedure
2. if you believe he wrote down the number correctly
3. if you believe the thermometer was accurate
4. Then you know that the min temperature after rounding was 56.
So, once you give up the illusion that you know the truth, you are left with your best estimate GIVEN
the data and GIVEN the acceptance of certain assumptions.
On the other hand, if I tell you that F=ma, then you can make a prediction about the future and test it.
It has nothing to do with French philosophy. Deconstruction would admit none of what I stated above.
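For what it’s worth, the rounding point can be stated very simply: under the assumptions listed above, a recorded whole-degree value only pins the true reading to a half-degree interval either side. A purely illustrative sketch:

```python
def rounding_interval(recorded: float, precision: float = 1.0):
    """Interval the true value must lie in if the only error is rounding to
    the nearest `precision` (observer, transcription and instrument errors
    are the separate assumptions listed above)."""
    return recorded - precision / 2, recorded + precision / 2

print(rounding_interval(56.0))  # -> (55.5, 56.5)
```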
@Steven Mosher 8/1 8:13 pm
So, once you give up the illusion that you know the truth, you are left with your best estimate GIVEN the data and GIVEN the acceptance of certain assumptions.
On the other hand, if I tell you that F=ma, then you can make a prediction about the future and test it.
What the deuce are you going on about?
Of course you can make a prediction — it is easy to make predictions that fail.
Yes, you can make an accurate prediction about the future
GIVEN F=ma,
GIVEN you know the frame of reference,
GIVEN the measurements of F, m, a, are made accurately,
GIVEN they were written down accurately,
GIVEN the procedures are followed accurately.
Heck, you even have to assume constancy of the measurements over the future to achieve an Accurate prediction. Or you have to accurately predict the changes anticipated in your measurements to make a good prediction. Even mass isn’t necessarily a constant. Ever hear of a rocket, a case where dm/dt is less than zero by quite a bit?
We can launch a satellite into orbit using an 8-minute time-dependent vector with eight degrees of freedom, and air resistance in subsonic, transonic and supersonic regimes as a function of time, altitude and orientation. We can, with lots of practice and some risk of mechanical failure. We can.
But how is our ability to use F=ma to launch a satellite in the future any justification for discounting the truth of what someone recorded in a temperature log in 1950?