(Image Credit: WoodForTrees.com)
By Werner Brozek, edited and with introduction by WUWT regular “Just The Facts”
Your help is needed in building a regular temperature trend analysis for WUWT. With much attention being focused on how much warming, or lack thereof, has occurred in Earth's recent past (1, 2, 3, 4), it seems worthwhile to establish a regular update that provides a comprehensive summary of the key temperature records and their associated trends. Fortunately, WUWT regular Werner Brozek has been compiling just such an update and posting it in comments on WUWT and Roy Spencer's website. We would therefore like to present an expanded version of Werner's analysis for your input and scrutiny before finalizing the content and form of these regular updates. Please review the following and let us know whether it appears to be factually accurate, what you think of the layout and content, whether certain links should be images or images should instead be links, and any additional improvements that can be made. There are a few additional specific questions included in Werner's analysis below. Thank you for your input. JTF
—
Temperature Trend Analysis
By Werner Brozek
This analysis has three sections covering six data sets, including GISS, Hadcrut3, Hadsst2, Hadcrut4, RSS and UAH:
Section 1 provides the furthest date in the past for which the slope is at least slightly negative.
Section 2 provides the longest time for which the warming is NOT significant at the 95% level.
Section 3 provides rankings of the various data sets, assuming the present average anomaly holds for the rest of the year.
Section 1
This analysis runs from the latest date for which data are available on WoodForTrees.com (WFT) back to the furthest date in the past for which the slope is at least slightly negative. So if the slope from September is +4 x 10^-4 but it is -4 x 10^-4 from October, I give the time from October, so no one can accuse me of being less than honest when I say the slope is flat from a certain month. (A small sketch of how this start-date search can be reproduced outside WFT follows the list below.)
Across all data sets, the periods with a slope that is at least very slightly negative range from 8 years and 3 months to an even 16 years.
1. UAH Troposphere Temperature: since October 2004 or 8 years, 3 months (goes to December)
2. NASA GISS Surface Temperature: since May 2001 or 11 years, 7 months (goes to November)
3. Wood For Trees Temperature Index: since December 2000 or 11 years, 9 months (goes to August)
4. Hadley Center (HadCrut3) Surface Temperature: since May 1997 or 15 years, 7 months (goes to November)
5. Hadley Center (HADSST2) Sea Surface Temperatures: since March 1997 or 15 years, 8 months (goes to October)
6. RSS Troposphere Temperature: since January 1997 or 16 years (goes to December) RSS is 192/204 or 94% of the way to Ben Santer’s 17 years.
7. Hadley Center (Hadcrut4) Surface Temperature: since December 2000 or an even 12 years (goes to November)
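For anyone who wants to reproduce these start dates outside WFT, here is a minimal sketch of the search described above. It is only an illustration: the function assumes you already have one data set's monthly anomalies in a plain Python list (the variable names are placeholders, not real downloads), fits an ordinary least-squares line from each candidate start month to the end of the series, and reports the furthest-back start for which the slope is zero or negative.

```python
import numpy as np

def earliest_flat_start(anoms, start_year, start_month):
    """Furthest-back (year, month) for which the OLS slope of the series
    from that month through the last available month is <= 0."""
    n = len(anoms)
    for i in range(n - 24):                 # insist on at least ~2 years of data
        y = np.asarray(anoms[i:], dtype=float)
        x = np.arange(len(y))
        slope = np.polyfit(x, y, 1)[0]      # anomaly units per month
        if slope <= 0:
            months = start_month - 1 + i    # months elapsed since the series start
            return start_year + months // 12, months % 12 + 1
    return None

# Placeholder usage: 'rss_monthly' would be the RSS anomalies from January 1979 on.
# print(earliest_flat_start(rss_monthly, 1979, 1))   # expected: (1997, 1)
```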
Here they are illustrated graphically;
you can recreate the graph directly here.
Here is an alternate graphical illustration;
you can recreate the graph directly here.
(Which of these illustrations do you prefer? Are they too cluttered to include in one graph? If so, how can we make this more user friendly?)
Section 2
For this analysis, data was retrieved from SkepticalScience.com. This analysis indicates for how long there has not been significant warming at the 95% level on various data sets.
For RSS the warming is NOT significant for 23 years.
For RSS: +0.130 +/-0.136 C/decade at the two sigma level from 1990
For UAH, the warming is NOT significant for 19 years.
For UAH: 0.143 +/- 0.173 C/decade at the two sigma level from 1994
For Hadcrut3, the warming is NOT significant for 19 years.
For Hadcrut3: 0.098 +/- 0.113 C/decade at the two sigma level from 1994
For Hadcrut4, the warming is NOT significant for 18 years.
For Hadcrut4: 0.098 +/- 0.111 C/decade at the two sigma level from 1995
For GISS, the warming is NOT significant for 17 years.
For GISS: 0.113 +/- 0.122 C/decade at the two sigma level from 1996
(Note that we have concerns with using data from SkepticalScience.com; however, we have not identified another source for these data. Does anyone know of a reliable alternative source where these data points can be readily accessed?)
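For reference, the basic calculation behind figures such as "+0.130 +/- 0.136 C/decade" can be sketched as follows. This is a deliberately naive version: it uses plain ordinary-least-squares errors, whereas the SkepticalScience calculator also corrects the uncertainty for autocorrelation in the monthly data, which widens the interval. Treat it as an illustration of the idea rather than a substitute for that tool; the input variable is a placeholder for whichever monthly series you feed it.

```python
import numpy as np

def trend_with_two_sigma(anoms):
    """OLS trend of a monthly anomaly series in C/decade, with a naive
    two-sigma uncertainty (no autocorrelation correction)."""
    y = np.asarray(anoms, dtype=float)
    t = np.arange(len(y)) / 120.0                  # time in decades
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    se = np.sqrt(np.sum(resid ** 2) / (len(y) - 2) / np.sum((t - t.mean()) ** 2))
    return slope, 2.0 * se

# Placeholder usage: 'rss_from_1990' would be the RSS anomalies from January 1990 on.
# trend, unc = trend_with_two_sigma(rss_from_1990)
# print(f"{trend:+.3f} +/- {unc:.3f} C/decade (naive errors)")
```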
Section 3
This section provides the latest monthly anomalies in order from January on. The bolded one is the highest for the year so far. I am treating all months equally and adding all anomalies and then dividing by the total number of months. This should not make a difference to the relative ranking at the end of the year unless there is a virtual tie between two years. After I give the average anomaly so far, I say where the year would rank if the anomaly were to stay that way for the rest of the year. I also show the warmest year on each data set along with the warmest month ever recorded on each data set. Then I show the previous year’s anomaly and rank.
The 2011 rankings for GISS, Hadcrut3, Hadsst2, and Hadcrut4 can be deduced through each linked source.
The latest rankings for UAH can be found here.
The rankings for RSS to the end of 2011 can be found here. (Others may also be found here)
With the UAH anomaly for December at 0.202, the average for the twelve months of the year is (-0.134 -0.135 + 0.051 + 0.232 + 0.179 + 0.235 + 0.130 + 0.208 + 0.339 + 0.333 + 0.282 + 0.202)/12 = 0.16. This would rank 9th. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.66. The anomaly in 2011 was 0.130 and it will come in 10th.
With the GISS anomaly for November at 0.68, the average for the first eleven months of the year is (0.32 + 0.37 + 0.45 + 0.54 + 0.67 + 0.56 + 0.46 + 0.58 + 0.62 + 0.68 + 0.68)/11 = 0.54. This would rank 9th if it stayed this way. 2010 was the warmest at 0.63. The highest ever monthly anomalies were in March of 2002 and January of 2007, both reaching 0.89. The anomaly in 2011 was 0.514 and it will come in 10th assuming 2012 comes in 9th or warmer.
With the Hadcrut3 anomaly for November at 0.480, the average for the first eleven months of the year is (0.217 + 0.194 + 0.305 + 0.481 + 0.473 + 0.477 + 0.445 + 0.512 + 0.514 + 0.491 + 0.480)/11 = 0.417. This would rank 9th if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the previous time that a Hadcrut3 record was not beaten in 10 years or less. The anomaly in 2011 was 0.340 and it will come in 13th.
With the Hadsst2 anomaly for October at 0.428, the average for the first ten months of the year is (0.203 + 0.230 + 0.241 + 0.292 + 0.339 + 0.351 + 0.385 + 0.440 + 0.449 + 0.428)/10 = 0.336. This would rank 9th if it stayed this way. 1998 was the warmest at 0.451. The highest ever monthly anomaly was in August of 1998 when it reached 0.555. The anomaly in 2011 was 0.273 and it will come in 13th.
With the RSS anomaly for November at 0.195, the average for the first eleven months of the year is (-0.060 -0.123 + 0.071 + 0.330 + 0.231 + 0.337 + 0.290 + 0.255 + 0.383 + 0.294 + 0.195)/11 = 0.200. This would rank 11th if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2011 was 0.147 and it will come in 13th.
With the Hadcrut4 anomaly for November at 0.512, the average for the first eleven months of the year is (0.288 + 0.208 + 0.339 + 0.525 + 0.531 + 0.506 + 0.470 + 0.532 + 0.515 + 0.524 + 0.512)/11 = 0.45. This would rank 9th if it stayed this way. 2010 was the warmest at 0.54. The highest ever monthly anomaly was in January of 2007 when it reached 0.818. The anomaly in 2011 was 0.399 and it will come in 13th.
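The averaging and ranking arithmetic above is easy to script. Here is a minimal sketch using the 2012 UAH monthly values quoted earlier in this section; the dictionary of prior annual means is deliberately partial and only for illustration (the 1998 and 2011 values are from the text above), so just the mechanics are meant to carry over.

```python
# 2012 UAH monthly anomalies as quoted above (January through December).
uah_2012 = [-0.134, -0.135, 0.051, 0.232, 0.179, 0.235,
            0.130, 0.208, 0.339, 0.333, 0.282, 0.202]

ytd_mean = sum(uah_2012) / len(uah_2012)
print(f"Average so far: {ytd_mean:.2f}")             # 0.16

# Partial, illustrative table of prior annual means (year -> mean anomaly);
# the real rankings come from the sources linked above.
prior_annual_means = {1998: 0.419, 2011: 0.130}

rank = 1 + sum(1 for v in prior_annual_means.values() if v > ytd_mean)
print(f"2012 would rank {rank} among these years if the average held")
```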
Here are the above month to month changes illustrated graphically;
you can recreate the graph directly here.
Appendix
In addition to the layout above, we also considered providing a summary for each temperature record, as illustrated below for RSS. Please let us know whether you find this format advantageous/preferable compared to the category breakout above, and also please let us know if there are any additional analyses that might be valuable to incorporate.
RSS
1. With the RSS anomaly for November at 0.195, the average for the first eleven months of the year is (-0.060 -0.123 + 0.071 + 0.330 + 0.231 + 0.337 + 0.290 + 0.255 + 0.383 + 0.294 + 0.195)/11 = 0.200. This would rank 11th if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2011 was 0.147 and it will come in 13th.
The rankings for RSS to the end of 2011 can be found here.
2. RSS has a flat slope since January 1997 or 16 years (goes to December). See:
Recreate graph here.
3. For RSS the warming is NOT significant for 23 years.
For RSS: +0.130 +/-0.136 C/decade at the two sigma level from 1990
See here.
Put in 1990 for the start date; put in 2013 for the end date; click the RSS button; then calculate.
About the Author: Werner Brozek was working on his metallurgical engineering degree using a slide rule when the first men landed on the moon. Now he enjoys playing with new toys such as the WFT graphs. Werner retired in 2011 after teaching high school physics and chemistry for 39 years.
—
Please let us know your thoughts and recommendations in comments below. Thanks Werner & Just The Facts
Monckton of Brenchley says:
January 7, 2013 at 9:00 am
Yes, but that first graph needs to be followed immediately by a second showing the 350 year slow rise of temperatures from 1650 through 2012!
Followed by a simple but clear verbal statement
There was a routine to scrape CRN data via the web posted back in 2007 on Climate Audit:
http://climateaudit.org/2007/08/07/scraping-uscrn-data/
There was also an article that lists the various sources of data here:
http://climateaudit.org/station-data/
The script here is old; it might work, but it might need some tweaking if the website has changed since then:
http://www.climateaudit.info/scripts/station/read.uscrn.txt
If you want CRN data, I wrote a package to do that back in 2011.
Paul, if you want help with CRN and you know R, the data can be pulled down and formatted rather easily. The difficult part is creating a US average from this data. Simply averaging the series will give you a poor result; for that I have a few standard methods, but it takes a bit more skill.
Thanks both, but it's that level of skill / determination / time that I'm short of, and why I haven't already done it. Basically, to add a dataset I need a single publicly available file with monthly globally averaged data in it (actually for NSIDC I fetch monthly files and grep/sort them to make one). Steve McIntyre's script handles only one station at a time, and whole PhDs have been written on how to combine multi-station data into a global/regional average!
Of course, someone else could do the scraping and republish it, then I could pull it from them!
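To make the point about averaging concrete, here is a toy sketch (invented station locations and values, not real CRN data) contrasting a simple mean of station anomalies with a crude gridded, cosine-of-latitude weighted mean. A real CRN average would also need anomaly baselines, missing-data handling and a more careful grid, which is exactly the skill gap being discussed.

```python
import numpy as np
from collections import defaultdict

# Invented example: (lat, lon, monthly anomaly) for a handful of stations.
stations = [(64.8, -147.7, -1.2), (61.2, -149.9, -1.0),   # two nearby Alaska stations
            (40.0, -105.3, 0.3), (35.2, -111.7, 0.4),
            (30.5, -84.3, 0.2)]

# Naive average: every station counts equally, so the two clustered
# Alaska stations dominate the result.
naive = np.mean([a for _, _, a in stations])

# Crude alternative: bin stations into 5x5 degree cells, average within
# each cell, then weight cells by cos(latitude) to approximate area.
cells = defaultdict(list)
for lat, lon, a in stations:
    cells[(int(lat // 5), int(lon // 5))].append((lat, a))

num = den = 0.0
for members in cells.values():
    cell_lat = np.mean([lat for lat, _ in members])
    w = np.cos(np.radians(cell_lat))
    num += w * np.mean([a for _, a in members])
    den += w
gridded = num / den

print(f"naive mean:   {naive:+.2f}")
print(f"gridded mean: {gridded:+.2f}")
```

In this toy case the two clustered Alaska stations dominate the naive mean but share a single grid cell in the weighted version, which is the essence of why simply averaging the series gives a poor result.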
Gras Albert says: January 6, 2013 at 3:08 pm
I prepared this graph from data extracted from WFT, it presents decadal trends in temperature anomaly increases/decreases since 1987 along with that of CO2, I thought the graph made the relationship between CO2 forcing and temperature change starkly obvious…
It makes it obvious to me there is no relationship!
Well, CRN is probably easier to do than others because there are no UHI or any other "adjustments" required, which is why I am a fan of the network. But what I am trying to understand is why there doesn't seem to be a problem producing a monthly average using a much more difficult network like USHCN, where there is a lot of fiddling for station/equipment changes and UHI, yet the network that requires no such fiddling seems to be *more* difficult to produce a monthly average for, since nobody has done it yet.
Slightly off topic, but as good a place as any to raise this issue.
It is a real problem for me (red-green color blind) to determine the colors of the traces on the graphs posted on WoodForTrees. It would be a huge help to me and others who are colorblind (about 10% of your viewers) if you could either increase the line width on the chart traces by 2x-3x or provide a check box where the user could choose a line width for the chart.
Beyond that, I think this is a great idea and visual representations can be very effective and powerful ways to quickly communicate data relationships.
Larry
pkatt says:
January 6, 2013 at 4:17 pm
I should add that I don’t mean to be harsh, but if you put the same data on a chart with +5/-5 degrees off of the zero line, what you see is a little wiggle that doesn’t even hit a degree.
————————
Try putting it on a graph that starts at 0 K and see what kind of line you get!
Showing the data at different scales is a good way to show how trivial the issue is. If displayed on several different scales, it is easy to see that most of the alarm is manufactured. This has already been done over at Junk Science, as I recall; he showed how changing the range and scaling of the y-axis changes the "apparent severity of the issue".
I can’t find it right now, but the same data was graphed with various vertical scales and ranges on the Y axis.
Larry
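Larry's point about scale is easy to demonstrate. The sketch below uses a made-up anomaly series standing in for real data and plots it twice with matplotlib, once autoscaled and once on a fixed -5 to +5 C axis; on the second panel the wiggle barely registers.

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up monthly anomaly series standing in for a real data set.
rng = np.random.default_rng(0)
months = np.arange(1979, 2013, 1 / 12)
anoms = 0.015 * (months - 1979) + 0.15 * rng.standard_normal(len(months))

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(7, 6), sharex=True)
ax1.plot(months, anoms)
ax1.set_title("Autoscaled y-axis")
ax2.plot(months, anoms)
ax2.set_ylim(-5, 5)                      # same data on a fixed +/-5 C axis
ax2.set_title("Fixed -5 to +5 C y-axis")
ax2.set_xlabel("Year")
for ax in (ax1, ax2):
    ax.set_ylabel("Anomaly (C)")
fig.tight_layout()
plt.show()
```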
Monckton of Brenchley says:
“Today’s high CO2 levels – the 97% natural and the 3% human-released”
This may lead people to conclude that human activities have added only 3% to atmospheric CO2. In the interests of clarity, Monckton should point out that the 97% natural contribution refers to CO2 being recycled through the biosphere, whereas the 3% is that added to the atmosphere by the burning of fossil fuels, which has seen the CO2 concentration rise from 280 ppm at the beginning of the industrial revolution to 390 ppm today. This is a rise of 39%.
I am also unclear about what time period the 3% covers. According to the following sources, the rise in atmospheric CO2 averaged about 2.0 ppm per year over the decade 2000-2009, which is only about a 0.5% rise per year.
Tans, Pieter. “Trends in Carbon Dioxide”. NOAA/ESRL. http://www.esrl.noaa.gov/gmd/ccgg/trends/.
Carbon Budget 2009 Highlights, globalcarbonproject.org, http://www.globalcarbonproject.org/carbonbudget/09/hl-full.htm
Someone’s had some cherry-picking fun with that graph, haven’t they? 🙂
http://www.woodfortrees.org/plot/gistemp/from:1966/plot/gistemp/from:2001.95/trend/plot/gistemp/from:1996.8/to:2001.95/trend/plot/gistemp/from:1987/to:1996.8/trend/plot/gistemp/from:1977/to:1987/trend/plot/gistemp/from:1966/to:1977/trend/plot/gistemp/from:1966/trend
I'd like to see "Data Visualization" gurus such as Stephen Few or Edward Tufte (or comparable disciples) take a hand in cleaning the imagery up.
The bits where the lines run over the titles/legends are particularly distracting.
Mr Shehan attributes to me a statement that I did not make. Some 40 per cent of the CO2 in the air is anthropogenic, not the 3 per cent that Mr Shehan attributes to me.
woodfortrees (Paul Clark) says:
January 7, 2013 at 8:42 am
> For adding CRN / NOAA and anything else, I need:
> URL to data source file in moderately sane text format (I’m used to reading
> most format horrors now, but basically it needs monthly anomaly data)
Here’s a core dump of what I have available for monthly data
GISS (needs to be reformatted)
http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
Hadley 3 (needs to be reformatted)
http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt
Hadley 3 early monthly data (same data set as above, but in a nice columnar format, and also comes out a week or so earlier than the above version)
http://hadobs.metoffice.com/hadcrut3/diagnostics/global/nh+sh/monthly
Hadley 4 monthly data
http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.1.1.0.monthly_ns_avg.txt
NOAA monthly data (Seems to be refreshed/updated every day or 2)
Note that even though it’s a text file, it gets mime-type “bin”, and web browsers have problems displaying it. Downloading it directly is better.
ftp://ftp.ncdc.noaa.gov/pub/data/anomalies/monthly.land_ocean.90S.90N.df_1901-2000mean.dat
various CET monthly data main page (actual data, not anomalies)
http://www.metoffice.gov.uk/hadobs/hadcet/data/download.html
Monthly sunspot numbers
http://solarscience.msfc.nasa.gov/greenwch/spot_num.txt
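As a rough illustration of what "reformatting" one of these files involves, here is a sketch that pulls the HadCRUT4 time series listed above and keeps just the year/month and the anomaly. It assumes the file is whitespace-delimited with a YYYY/MM date in the first column and the global anomaly in the second, which is how the HadCRUT4 monthly series was laid out at the time; the column layout should be checked against the actual file before relying on it.

```python
import urllib.request

URL = ("http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/"
       "time_series/HadCRUT.4.1.1.0.monthly_ns_avg.txt")

def fetch_hadcrut4_monthly(url=URL):
    """Return a list of ((year, month), anomaly) tuples.

    Assumes whitespace-delimited rows with 'YYYY/MM' in column 1 and the
    global anomaly in column 2 -- verify against the file itself.
    """
    rows = []
    with urllib.request.urlopen(url) as resp:
        for line in resp.read().decode("ascii", "replace").splitlines():
            parts = line.split()
            if len(parts) < 2 or "/" not in parts[0]:
                continue                       # skip headers or malformed rows
            year, month = parts[0].split("/")
            rows.append(((int(year), int(month)), float(parts[1])))
    return rows

# data = fetch_hadcrut4_monthly()
# print(data[-1])   # most recent (year, month) and anomaly
```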
Very good Werner, Just The Facts,
This highlights what an excellent resource WoodForTrees is. Thanks Paul!
woodfortrees (Paul Clark) says:
January 7, 2013 at 8:42 am
First up, WTI has now been updated to use UAH 5.5
Thank you very much!
Earlier I had said
3. Combination of 4 global temperatures: since December 2000 or 11 years, 9 months (goes to August)
With the update, it would now be since December 2000 or 12 years (goes to November)
See:
http://www.woodfortrees.org/plot/wti/from:2000.9/plot/wti/from:2000.9/trend
Monckton of Brenchley says:
January 7, 2013 at 9:00 am
The period should be the longest period for which the combined mean of the RSS and UAH monthly global mean lower-troposphere temperature anomalies exhibits a trend that is no greater than statistically-indistinguishable from zero to 95% confidence (i.e., the warming, if any, should not exceed 0.05 K over the period).
In light of this, my question for Paul Clark is if this can be easily done. Thanks!
Werner Brozek says: January 6, 2013 at 7:49 pm
I would just like to make some general responses from my perspective so far, in no particular order. Justthefacts may also decide which of the other correlations that have been mentioned could be looked at, beyond what my focus has been.
I think there may be two articles coming out of this exercise. One will be similar to the current one, with a few additions and tweaks; the second is more in the vein of what Monckton of Brenchley has suggested. The first article is intended for a more targeted audience and will provide an in-depth and multifaceted analysis of the temperature trends. The second article is intended to isolate the most salient points and package them for a broad audience. I think we should focus on polishing up the first article, and then we can begin work on the second.
I agree that sine waves are more accurate; however, the statements by NOAA and Santer seem to imply that things are wrong if the linear trend is 0 for 15 or 17 years. If these are their goal posts, we have to use linear trends to show that Earth has scored a goal or is close to it. How could one even define a goal in terms of a sine wave?
I agree; the goal here is to document how long there hasn't been warming, and a linear trend does that best.
As far as bar graphs are concerned, at least the graph with the 7 lines can be viewed as a sideways bar graph to see for how long various data sets show no trend. Mind you, the bars are as thin as lines. But for other bar graphs, I would need someone with good computer expertise.
I think we are OK with lines versus bars. I would prefer to use WFT for charts, versus making them ourselves, in order to maintain independence in the calculations. Additionally, I'd prefer to keep the production of the content of this article completely in your hands, so that you can readily reproduce it and aren't dependent on others to complete it.
As far as the "0" is concerned, that is a big problem. Just to illustrate, suppose a data set said the anomaly is 0.2. What would this mean? For GISS, it is lower than 25th; for UAH, it is in 7th place; for Hadcrut3, it is in 19th place, etc. So to put things into perspective for the present year, I give the ranking if a certain anomaly were to continue for the rest of the year. So if we just put all the numbers into a bar graph, some bars would be very high but would mean little due to the baseline that was used.
Baselines are a mess, as John Christy pointed out a while back, even the RSS and UAH anomalies are not comparable because they use different base periods, i.e., “RSS only uses 1979-1998 (20 years) while UAH uses the WMO standard of 1981-2010.” There’s nothing that we can do about this.
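To see why the base period matters, here is a toy sketch with invented numbers (not real RSS or UAH data): the same underlying series is turned into anomalies against a 1979-1998 base and against a 1981-2010 base, and the two anomaly series differ by a constant offset. That offset is the reason published RSS and UAH anomalies cannot be compared directly.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 2013)
# One invented 'absolute' temperature series (C); the same data feed both anomaly series.
temps = 14.0 + 0.015 * (years - 1979) + 0.1 * rng.standard_normal(len(years))

base_a = temps[(years >= 1979) & (years <= 1998)].mean()   # RSS-style base period
base_b = temps[(years >= 1981) & (years <= 2010)].mean()   # UAH/WMO-style base period

anom_a = temps - base_a
anom_b = temps - base_b

# Same data, different baselines: the anomalies differ by a constant offset.
print(f"constant offset between the two anomaly series: {base_b - base_a:+.3f} C")
```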
I can do what WFT allows me to do, although there may be some tricks I have not figured out yet. I agree that the graph with 14 different things is not useful.
I don't know if it's not useful. It's definitely messy and isn't suitable for a mass audience, but it does paint a picture.
Would it work if I made 7 graphs, one for each data set? And for each data set, I would start the year from the point where the warming is not significant and draw a trend line from that point. Then on the same graph, I would draw a line for the longest line where the slope is 0. For RSS, it would look as follows:
http://www.woodfortrees.org/plot/rss/from:1990/plot/rss/from:1990/trend/plot/rss/from:1997/trend
I think we may want to create a section at the bottom of the article with an analysis for each data set, as you did already for RSS, so that readers can view the data however they prefer.
Werner Brozek says: January 7, 2013 at 4:53 pm
An aside here: 28 hours, 4,080 views, 69 comments, and no one seems to have identified any flaws in your analysis. Good job. Maybe the Warmists are just going to concede that it's not getting warmer…
Monckton of Brenchley says:
January 7, 2013 at 9:00 am
The period should be the longest period for which the combined mean of the RSS and UAH monthly global mean lower-troposphere temperature anomalies exhibits a trend that is no greater than statistically-indistinguishable from zero to 95% confidence (i.e., the warming, if any, should not exceed 0.05 K over the period).
If I could just address this one point for now: I can do one or the other, but not both. See the graph below, which shows RSS with a slope of 0 for 16 years and the other slope from 1990 with the error bars for the 95% confidence. The slope of the middle line is 0.0128/year, and since it goes for 23 years, the total rise is 0.0128 x 23 = 0.29 C. To have an increase of no more than 0.05 K, the time would be somewhere between 16 years and 23 years for RSS.
http://www.woodfortrees.org/plot/rss/from:1990/plot/rss/from:1990/trend/plot/rss/from:1990/detrend:0.3128/trend/plot/rss/from:1990/detrend:-0.3128/trend/plot/rss/from:1997/trend
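For what it is worth, both of Monckton of Brenchley's conditions could be checked together outside WFT. The sketch below is only a shape-of-the-calculation illustration with naive OLS errors (no autocorrelation correction, which would lengthen the qualifying period): it walks candidate start months from the beginning of a placeholder monthly series and returns the earliest start for which the trend is statistically indistinguishable from zero at two sigma and the implied total rise over the period is no more than 0.05 K.

```python
import numpy as np

def longest_monckton_period(anoms):
    """Earliest start index for which (a) the OLS trend's 2-sigma interval
    includes zero and (b) trend times period length is at most 0.05 K.
    Naive OLS errors only; a proper treatment would correct for autocorrelation."""
    y_all = np.asarray(anoms, dtype=float)
    n = len(y_all)
    for i in range(n - 24):                      # need at least ~2 years
        y = y_all[i:]
        t = np.arange(len(y)) / 12.0             # time in years
        slope, intercept = np.polyfit(t, y, 1)
        resid = y - (slope * t + intercept)
        se = np.sqrt(np.sum(resid ** 2) / (len(y) - 2) / np.sum((t - t.mean()) ** 2))
        total_rise = slope * t[-1]               # K over the whole period
        if abs(slope) <= 2 * se and total_rise <= 0.05:
            return i                             # index of the qualifying start month
    return None

# Placeholder usage: start = longest_monckton_period(rss_monthly)
```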
RACookPE1978 says:
January 7, 2013 at 9:20 am
> “No one denies the world is warming. It has been naturally warming for 350
> years, and that continuous slow natural warming is feeding the hungry, clothing
> the naked, and saving the sick, and heating the cold.
Make that 18,000 years. Another graphic we need is the retreat of ice since the peak of the last ice age. At the peak of that ice age, most of Canada and the northern tier of the US was covered by ice sheets 1 or 2 *MILES* thick. For the past 18,000 years, the arctic ice cap has been in a general retreat with a few speed bumps along the way (e.g. the Younger Dryas and the Little Ice Age). Why are the warmists freaking out over the last 30 years? It's merely continuing the pattern of the last 18,000.
Another embarrassing fact is that 18,000 years ago, when the ice sheets began retreating, there were fewer than a million human beings on the planet, mostly living in caves unless they were in the tropics. So even if humanity were to have a massive holocaust, and the few survivors went back to being hunter-gatherers living in animal skins, the arctic ice would still continue retreating.
Gras Albert says: January 7, 2013 at 12:42 am
Hmmmm, perhaps the graph is not as clear as I thought, 🙂
It's a plot of 'ten year' trends: the oldest decadal trend starts in 1987 and finishes in 1996, and the most recent starts in 2003 and finishes in 2012, so it is current through 2012. The graph shows that the rate of warming per decade peaked in 1992 and has decreased every decade since, with all but UAH now showing decadal cooling…
Got it. I think it would need a bunch of explanation and background to support it, as the knee-jerk reaction, which was admittedly mine, is that the data are cherry-picked. Perhaps if you did it as a family of graphs showing the full trend and decadal trend for each record (or perhaps one for the satellite records and one for the surface records), which then builds up to your final graph, it would be clearer. Also, you would need to copiously document the process and data sources used, ideally mirroring the results using a publicly available resource, as your graph is bound to garner howls and intense scrutiny from those who disagree with what it illustrates. If you are interested and open to investing the time, I would be happy to help you develop it into an article once we complete this article and a few others in the works.
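For anyone wanting to reproduce the general construction behind Gras Albert's chart, here is a minimal pandas sketch of trailing ten-year trends. The series name is a placeholder and the windows here step month by month rather than in whatever increments he used, so it only illustrates the idea.

```python
import numpy as np
import pandas as pd

def decadal_trends(monthly):
    """Given a pandas Series of monthly anomalies indexed by date,
    return the OLS trend (C/decade) of each trailing 120-month window."""
    def window_trend(values):
        t = np.arange(len(values)) / 120.0       # time in decades
        return np.polyfit(t, values, 1)[0]
    return monthly.rolling(window=120).apply(window_trend, raw=True)

# Placeholder usage, assuming 'rss' is a Series of monthly anomalies:
# trends = decadal_trends(rss)
# trends.plot()   # rate of warming per decade, window ending at each month
```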
Werner Brozek says: January 6, 2013 at 9:45 pm
I just thought of an addition to my earlier email. I added two lines to the graphic that I put up before to show the +/- parts of the 95% uncertainty. Does this look better:
http://www.woodfortrees.org/plot/rss/from:1990/plot/rss/from:1990/trend/plot/rss/from:1990/detrend:0.3128/trend/plot/rss/from:1990/detrend:-0.3128/trend/plot/rss/from:1997/trend
I’d be inclined to put all three versions in as links and use this one;
http://www.woodfortrees.org/plot/rss/from:1990/plot/rss/from:1990/trend/plot/rss/from:1997/trend
as the image.
Gras Albert says: January 6, 2013 at 3:08 pm
I prepared this graph from data extracted from WFT, it presents decadal trends in temperature anomaly increases/decreases since 1987 along with that of CO2, I thought the graph made the relationship between CO2 forcing and temperature change starkly obvious…
CoRev says: January 6, 2013 at 6:17 pm
This is another example of how to show temp vs CO2: http://woodfortrees.org/plot/rss/from:1980/to:2013/scale:50/plot/esrl-co2/from:1980/to:2013/offset:-350/plot/rss/from:1998/to:2013/scale:50/trend/plot/esrl-co2/from:1998/to:2013/offset:-350/trend/plot/uah/from:1998/to:2013/scale:50/plot/uah/from:1998/to:2013/scale:50/trend
I like the inclusion of CO2 in at least one of the header graphics.
Werner, what do you think about creating an introductory and/or concluding chart that includes CO2, along with the key temperature trends? We could draft a paragraph or two in support, which discusses what it illustrates in dry and academic terms.