Guest Post by Werner Brozek (Edited By Just The Facts)
GISS and other data sets are poised to set new records in 2014, even without an El Nino. The present GISS record is a two-way tie for first place, with 2005 and 2010 both showing an average anomaly of 0.65. (By the way, in January of this year, it was stated that the 2010 anomaly was 0.67. I do not understand how 2010 lost 0.02 °C over the last four months.) The present average over the first four months of 2014 is 0.64, so it is only 0.01 lower. Of greater significance, however, is that the April anomaly was 0.73. If this anomaly were to continue for the rest of the year, the old GISS record would be shattered.
Below, I will provide the corresponding information for the other five data sets that I am following. All differences will be provided to the nearest 1/100 degree.
The current Hadsst3 average is 0.370, which is only 0.05 below its record of 0.416. And as is the case with GISS, the April anomaly was a huge 0.478, so if this anomaly were to continue for the rest of 2014, Hadsst3 would also set a new record.
The current Hadcrut3 average is 0.455, which is only 0.09 below its record of 0.548. And as is the case with GISS, the April anomaly was a huge 0.592, so if this anomaly were to continue for the rest of 2014, Hadcrut3 would virtually tie its record.
The current Hadcrut4 average is 0.500, which is only 0.05 below its record of 0.547. And as is the case with GISS, the April anomaly was a huge 0.641, so if this anomaly were to continue for the rest of 2014, Hadcrut4 would also set a new record.
The current RSS average is 0.222. This is 0.33 below the 1998 record of 0.550. This record seems safe for this year. Even the April anomaly of 0.251 would not challenge the record if it continued for the rest of the year.
The current UAH average is 0.171. This is 0.25 below the 1998 record of 0.419. This record seems safe for this year. Even the April anomaly of 0.184 would not challenge the record if it continued for the rest of the year. (Note: This applies to version 5.5.)
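To make the projections above explicit, here is a minimal sketch (my own, not from the original post) of the arithmetic: average the months already in hand together with the latest anomaly repeated for the remaining months of the year. The monthly values are the 2014 numbers from the table in Section 3; the function name is hypothetical.

```python
# A minimal sketch of the "if the latest anomaly continued for the rest of the
# year" projection used above. Monthly values are the 2014 numbers quoted in
# the table in Section 3; the records are from row 4 of that table.

def projected_annual_mean(months_so_far, assumed_rest_of_year):
    """Average the months already reported plus the assumed anomaly
    repeated for the remaining months of the year."""
    remaining = 12 - len(months_so_far)
    return (sum(months_so_far) + assumed_rest_of_year * remaining) / 12.0

# GISS, January-April 2014; the record to beat is 0.65 (2005/2010 tie).
giss_2014 = [0.68, 0.44, 0.70, 0.73]
print(round(projected_annual_mean(giss_2014, giss_2014[-1]), 2))   # ~0.70 > 0.65

# Hadcrut3, January-April 2014; the record to beat is 0.548 (1998).
had3_2014 = [0.472, 0.264, 0.491, 0.592]
print(round(projected_annual_mean(had3_2014, had3_2014[-1]), 3))   # ~0.546, a virtual tie
```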
In the table, I have added a row 15, labelled 15.dif, where I give the above differences between rows 4 and 13. Since no data set is currently in first place, all of these numbers are positive, indicating that the present record is still higher than the present average in every case.
In the parts below, as in the previous posts, we will present you with the latest facts. The information will be presented in three sections and an appendix.
The first section will show for how long there has been no warming on several data sets.
The second section will show for how long there has been no statistically significant warming on several data sets.
The third section will show how 2014 to date compares with 2013 and the warmest years and months on record so far.
The appendix will illustrate sections 1 and 2 in a different way. Graphs and a table will be used to illustrate the data.
Section 1
This analysis uses the latest month for which data is available on WoodForTrees.com (WFT). All of the data on WFT is also available at the specific sources as outlined below. We start with the present and go back to the furthest month in the past for which the slope is at least slightly negative. So if the slope from September is +4 x 10^-4 but it is -4 x 10^-4 from October, we give the time from October, so no one can accuse us of being less than honest when we say the slope is flat from a certain month. (A rough code sketch of this search appears after the list below.)
On the data sets below, the periods over which the slope is at least very slightly negative range from 9 years and 8 months to 17 years and 9 months.
1. For GISS, the slope is flat since November 2001 or 12 years, 6 months. (goes to April)
2. For Hadcrut3, the slope is flat since August 2000 or 13 years, 9 months. (goes to April) The latest spike caused the time to start after the 1998 El Nino.
3. For a combination of GISS, Hadcrut3, UAH and RSS, the slope is flat since December 2000 or 13 years, 5 months. (goes to April)
4. For Hadcrut4, the slope is flat since January 2001 or 13 years, 4 months. (goes to April)
5. For Hadsst3, the slope is flat since December 2000 or 13 years, 5 months. (goes to April)
6. For UAH, the slope is flat since September 2004 or 9 years, 8 months. (goes to April using version 5.5)
7. For RSS, the slope is flat since August 1996 or 17 years, 9 months (goes to April).
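As referenced above, here is a rough sketch of the flat-period search. WFT does the actual trend fitting in this post; the sketch below simply illustrates the idea with an ordinary least-squares fit, and the function name and use of numpy are my own choices.

```python
# A rough sketch of the Section 1 search: find the furthest month in the past
# from which the least-squares slope to the present is not positive.
# Input: monthly anomalies, oldest first, ending at the latest month.

import numpy as np

def longest_flat_period(anomalies):
    """Return the number of trailing months over which the OLS slope is <= 0.
    Start months are scanned from oldest to newest; the first (earliest) hit
    gives the longest flat period ending at the present."""
    y = np.asarray(anomalies, dtype=float)
    n = len(y)
    for start in range(n - 2):                 # need at least 3 points for a trend
        x = np.arange(n - start)
        slope = np.polyfit(x, y[start:], 1)[0]
        if slope <= 0:
            return n - start                   # months from this start to the present
    return 0

# Hypothetical usage:
# months = longest_flat_period(rss_monthly_anomalies)
# print(months // 12, "years,", months % 12, "months")
```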
The next graph shows just the lines to illustrate the above. Think of it as a sideways bar graph where the lengths of the lines indicate the relative times where the slope is 0. In addition, the upward sloping blue line indicates that CO2 has steadily increased over this period.

When two things are plotted as I have done, the left axis shows only the temperature anomaly scale. The actual numbers are meaningless, since all slopes are essentially zero; as well, I have offset the lines so they are evenly spaced. No numbers are given for CO2. Some have asked that the log of the CO2 concentration be plotted; however, WFT does not offer this option. The upward-sloping CO2 line only shows that while CO2 has been going up over the last 17 years, temperatures have been flat for varying periods on the various data sets.
The next graph shows the above, but this time, the actual plotted points are shown along with the slope lines and the CO2 is omitted.

Section 2
For this analysis, data was retrieved from Nick Stokes’ Trendviewer page. This analysis indicates for how long there has not been statistically significant warming according to Nick’s criteria. Data go to their latest update for each set. In every case, note that the lower error bar is negative so a slope of 0 cannot be ruled out from the month indicated.
On several different data sets, there has been no statistically significant warming for between 16 and 21 years.
The details for several sets are below.
For UAH: Since February 1996: CI from -0.043 to 2.349
For RSS: Since November 1992: CI from -0.022 to 1.867
For Hadcrut4: Since October 1996: CI from -0.033 to 1.192
For Hadsst3: Since January 1993: CI from -0.016 to 1.813
For GISS: Since August 1997: CI from -0.008 to 1.233
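Nick's Trendviewer uses its own methodology, so the sketch below will not reproduce his numbers exactly; it is only meant to illustrate the general idea behind such confidence intervals: an ordinary least-squares trend whose uncertainty is widened to allow for month-to-month autocorrelation (an AR(1) "effective sample size" correction, one common choice), with the result quoted in degrees per century. Warming is "not statistically significant" in this sense when the lower limit is below zero.

```python
# An illustrative trend-with-confidence-interval calculation, NOT Nick Stokes'
# actual method: OLS slope plus an approximate 95% interval, with the residual
# lag-1 autocorrelation used to shrink the effective number of data points.

import numpy as np

def trend_with_ci(monthly_anomalies):
    """Return (trend, lower, upper) in degrees per century for monthly data."""
    y = np.asarray(monthly_anomalies, dtype=float)
    n = len(y)
    x = np.arange(n)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]      # lag-1 autocorrelation
    n_eff = n * (1 - r1) / (1 + r1)                    # Quenouille-style adjustment
    s = np.sqrt(np.sum(resid ** 2) / (n_eff - 2))
    se_slope = s / np.sqrt(np.sum((x - x.mean()) ** 2))
    half = 1.96 * se_slope
    per_century = 1200.0                               # per month -> per century
    return slope * per_century, (slope - half) * per_century, (slope + half) * per_century
```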
Section 3
This section shows data about 2014 and other information in the form of a table. The table lists the six data sources along the top, and the source row is repeated further down so that the column headings remain visible at all times. The sources are UAH, RSS, Hadcrut4, Hadcrut3, Hadsst3, and GISS.
Down the left-hand column are the following rows:
1. 13ra: This is the final ranking for 2013 on each data set.
2. 13a: Here I give the average anomaly for 2013.
3. year: This indicates the warmest year on record so far for that particular data set. Note that two of the data sets have 2010 as the warmest year and four have 1998 as the warmest year.
4. ano: This is the average of the monthly anomalies of the warmest year just above.
5.mon: This is the month where that particular data set showed the highest anomaly. The months are identified by the first three letters of the month and the last two numbers of the year.
6. ano: This is the anomaly of the month just above.
7. y/m: This is the longest period of time where the slope is not positive given in years/months. So 16/2 means that for 16 years and 2 months the slope is essentially 0.
8. sig: This is the first month for which warming is not statistically significant according to Nick’s criteria. The first three letters of the month are followed by the last two numbers of the year.
9. Jan: This is the January 2014 anomaly for that particular data set.
10.Feb: This is the February 2014 anomaly for that particular data set, etc.
13.ave: This is the average anomaly of all months to date taken by adding all numbers and dividing by the number of months. However if the data set itself gives that average, I may use their number. Sometimes the number in the third decimal place differs slightly, presumably due to all months not having the same number of days.
14.rnk: This is the rank that each particular data set would have if the anomaly above were to remain that way for the rest of the year. It will not, but think of it as an update 15 minutes into a game. Due to different base periods, the rank is more meaningful than the average anomaly.
15.dif: This is row 4 minus row 13. A difference of less than 0.10 at this point in the year means that a 2014 record is possible for four of the data sets. Both of the satellite data sets would need a miracle to set a record this year, in my opinion.
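For completeness, here is a minimal sketch of how rows 13 to 15 are derived from the monthly values. The list of previous annual means is a placeholder; the full annual lists are not reproduced in this post.

```python
# A minimal sketch of rows 13-15: year-to-date mean, the rank it would hold
# among previous annual means if it stayed this way, and the gap to the record.
# 'previous_annual_means' is a placeholder for a data set's full list of annual
# anomalies; it is not reproduced here.

def ytd_rows(monthly_2014, previous_annual_means):
    ytd = sum(monthly_2014) / len(monthly_2014)                   # row 13 (ave)
    rank = 1 + sum(1 for a in previous_annual_means if a > ytd)   # row 14 (rnk)
    dif = max(previous_annual_means) - ytd                        # row 15 (dif)
    return ytd, rank, dif

# Hypothetical usage with the GISS monthly values from the table below:
# ytd, rank, dif = ytd_rows([0.68, 0.44, 0.70, 0.73], giss_annual_means)
```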
| Source | UAH | RSS | Had4 | Had3 | Sst3 | GISS |
|---|---|---|---|---|---|---|
| 1. 13ra | 7th | 10th | 8th | 6th | 6th | 7th |
| 2. 13a | 0.197 | 0.218 | 0.486 | 0.459 | 0.376 | 0.59 |
| 3. year | 1998 | 1998 | 2010 | 1998 | 1998 | 2010 |
| 4. ano | 0.419 | 0.55 | 0.547 | 0.548 | 0.416 | 0.65 |
| 5.mon | Apr98 | Apr98 | Jan07 | Feb98 | Jul98 | Jan07 |
| 6. ano | 0.662 | 0.857 | 0.829 | 0.756 | 0.526 | 0.92 |
| 7. y/m | 9/8 | 17/9 | 13/4 | 13/9 | 13/5 | 12/6 |
| 8. sig | Feb96 | Nov92 | Oct96 | | Jan93 | Aug97 |
| Source | UAH | RSS | Had4 | Had3 | Sst3 | GISS |
| 9.Jan | 0.236 | 0.262 | 0.507 | 0.472 | 0.342 | 0.68 |
| 10.Feb | 0.127 | 0.161 | 0.304 | 0.264 | 0.314 | 0.44 |
| 11.Mar | 0.137 | 0.214 | 0.544 | 0.491 | 0.347 | 0.70 |
| 12.Apr | 0.184 | 0.251 | 0.641 | 0.592 | 0.478 | 0.73 |
| Source | UAH | RSS | Had4 | Had3 | Sst3 | GISS |
| 13.ave | 0.171 | 0.222 | 0.500 | 0.455 | 0.370 | 0.64 |
| 14.rnk | 10th | 9th | 5th | 7th | 7th | 3rd |
| 15.dif | 0.25 | 0.33 | 0.05 | 0.09 | 0.05 | 0.01 |
If you wish to verify all of the latest anomalies, go to the following:
For UAH, version 5.5 was used since that is what WFT used.
http://vortex.nsstc.uah.edu/public/msu/t2lt/tltglhmam_5.5.txt
For RSS, see: ftp://ftp.ssmi.com/msu/monthly_time_series/rss_monthly_msu_amsu_channel_tlt_anomalies_land_and_ocean_v03_3.txt
For Hadcrut4, see: http://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/time_series/HadCRUT.4.2.0.0.monthly_ns_avg.txt
For Hadcrut3, see: http://www.cru.uea.ac.uk/cru/data/temperature/HadCRUT3-gl.dat
For Hadsst3, see: http://www.cru.uea.ac.uk/cru/data/temperature/HadSST3-gl.dat
For GISS, see: http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
To see all points since January 2013 in the form of a graph, see the WFT graph below.

As you can see, all lines have been offset so they all start at the same place in January 2013. This makes it easy to compare January 2013 with the latest anomaly.
Appendix
In this part, we are summarizing data for each set separately.
RSS
The slope is flat since August 1996 or 17 years, 9 months. (goes to April)
For RSS: There is no statistically significant warming since November 1992: CI from -0.022 to 1.867.
The RSS average anomaly so far for 2014 is 0.222. This would rank it as 9th place if it stayed this way. 1998 was the warmest at 0.55. The highest ever monthly anomaly was in April of 1998 when it reached 0.857. The anomaly in 2013 was 0.218 and it is ranked 10th.
UAH
The slope is flat since September 2004 or 9 years, 8 months. (goes to April using version 5.5 according to WFT)
For UAH: There is no statistically significant warming since February 1996: CI from -0.043 to 2.349. (This is using version 5.6 according to Nick’s program.)
The UAH average anomaly so far for 2014 is 0.171. This would rank it as 10th place if it stayed this way. 1998 was the warmest at 0.419. The highest ever monthly anomaly was in April of 1998 when it reached 0.662. The anomaly in 2013 was 0.197 and it is ranked 7th.
Hadcrut4
The slope is flat since January 2001 or 13 years, 4 months. (goes to April)
For Hadcrut4: There is no statistically significant warming since October 1996: CI from -0.033 to 1.192.
The Hadcrut4 average anomaly so far for 2014 is 0.500. This would rank it as 5th place if it stayed this way. 2010 was the warmest at 0.547. The highest ever monthly anomaly was in January of 2007 when it reached 0.829. The anomaly in 2013 was 0.486 and it is ranked 8th.
Hadcrut3
The slope is flat since August 2000 or 13 years, 9 months. (goes to April)
The Hadcrut3 average anomaly so far for 2014 is 0.455. This would rank it as 7th place if it stayed this way. 1998 was the warmest at 0.548. The highest ever monthly anomaly was in February of 1998 when it reached 0.756. One has to go back to the 1940s to find the last time a Hadcrut3 record stood unbeaten for more than 10 years. The anomaly in 2013 was 0.459 and it is ranked 6th.
Hadsst3
For Hadsst3, the slope is flat since December 2000 or 13 years and 5 months. (goes to April).
For Hadsst3: There is no statistically significant warming since January 1993: CI from -0.016 to 1.813.
The Hadsst3 average anomaly so far for 2014 is 0.370. This would rank it as 7th place if it stayed this way. 1998 was the warmest at 0.416. The highest ever monthly anomaly was in July of 1998 when it reached 0.526. The anomaly in 2013 was 0.376 and it is ranked 6th.
GISS
The slope is flat since November 2001 or 12 years, 6 months. (goes to April)
For GISS: There is no statistically significant warming since August 1997: CI from -0.008 to 1.233.
The GISS average anomaly so far for 2014 is 0.64. This would rank it as 3rd place if it stayed this way. 2010 and 2005 were the warmest at 0.65. The highest ever monthly anomaly was in January of 2007 when it reached 0.92. The anomaly in 2013 was 0.59 and it is ranked 7th.
Conclusion
Even without an El Nino, it appears likely that some records will be set on at least some of the surface data sets, but not on the satellite data sets.
Richard Mallett says:
May 25, 2014 at 8:02 pm
So, if the satellite data cannot be trusted, and GISS cannot be trusted, and NOAA NCDC presumably is too untrustworthy even to be considered, that just leaves HadCRUT4.
You may wish to check out the following. Hadcrut4 had barely come out before it needed adjustments. And can you guess the direction of those adjustments? See:
http://wattsupwiththat.com/2013/05/12/met-office-hadley-centre-and-climatic-research-unit-hadcrut4-and-crutem4-temperature-data-sets-adjustedcorrectedupdated-can-you-guess-the-impact/
Here is a comment I made then:
“From 1997 to 2012 is 16 years. Here are the changes in thousandths of a degree with the new version of Hadcrut4 being higher than the old version in all cases. So starting with 1997, the numbers are 2, 8, 3, 3, 4, 7, 7, 7, 5, 4, 5, 5, 5, 7, 8, and 15. The 0.015 was for 2012. What are the chances that the average anomaly goes up for 16 straight years by pure chance alone if a number of new sites are discovered? Assuming a 50% chance that the anomaly could go either way, the chances of 16 straight years of rises is 1 in 2^16 or 1 in 65,536. Of course this does not prove fraud, but considering that “HadCRUT4 was introduced in March 2012”, it just begs the question why it needed a major overhaul only a year later.
I believe people should not wonder why suspicions are aroused as to whether or not everything is kosher.”
So if none of the data sets can be trusted, how do we know that the pause is real?
When drought is upon us here on the West Coast, I expect hot daytime temps and low nighttime temps. A case in point: during a drought, California is often the place of record highs and lows within a 24-hour period. El Nino will reverse that with its incoming clouds and rain. The daytime temps will temper a bit and the nighttime temps will warm up.
wbrozek says: May 25, 2014 at 6:26 pm
“I can understand that there could possibly be good reasons why things could be adjusted from a hundred years ago, but why would 2010 be adjusted in the last 4 months? And why would the all time record month of January 2007 go down from 0.94 to 0.92 over the last 4 months?”
Here is the GISS log of changes. They record that:
“January 21, 2014: The GISS analysis was repeated this morning based on today’s status of the GHCN data. The changes were well within the margin of error, e.g. the L-OTI mean for 2013 changed from 0.6048+-0.02°C to 0.6065+-0.02°C, a change of less than 0.002°C. However, rounding to 2 digits for the L-OTI table changed the 0.60°C used in some documents prepared last week to 0.61°C. This minuscule change also moved year 2013 from a tie for the 7th place to a tie for the 6th place in the GISS ranking of warmest years, demonstrating how non-robust these rankings are.”
Clearly the change in January reflects a catch-up to past changes in GHCN. GHCN changes as belated data comes in, and also if the adjustment algorithm changes. GISS now relies on the GHCN homogenization; they don’t do their own
As I recall, Hansen’s “automatic” routines at NASA-GISS recalculate every month every temperature ever recorded based on the latest (most recent!) night time light index record for that station’s area. This is because Hansen/NASA/GISS was desperate at that time to show as wide a temperature record as possible for as long as possible, while maintaining the “perfect-untarnished-records” for as many different sites as possible.
Thus, he HAD TO create a temperature record of “perfect stations” while appearing to compensate for urban heat islands, while reading old records that admittedly varied in location, were missing records, and were taken at various times of day by stations of varying quality and by many, many thousands of individuals, each of whom varied from day to day in how accurately they did (or did not) follow exact procedures. From these competing needs, and with a cooperative peer-reviewed paper back in the 1980s, he developed his 600 km “smoothing” of records (later expanded into a 1200 km “smoothing” of temperature data) because “as long as the trends were the same the anomalies could be treated as the same…”
Worse, if today’s values for ANY station change, miss a digit, or are “blacked out” in the monthly night time light index, then the “smoothing” algorithm, the station-break algorithm and the station history algorithm re-calculate and re-set ALL previous station daily averages (and ALL previous and current station anomalies!) going all the way back to the first record written down.
Hansen’s magical computer routines, as I understand, have never been changed: Every month the past temperature records ARE CHANGED based on the most recent night time light index based on some 1980-level “standard” for urban, semi-urban, and rural NASA index. I have read of several computer programmers who have attempted to both re-create his program, and others who have attempted to debug (re-read and sort out) what few parts of his program have been released. None were successful. I know of no one who has gotten access to the entire routine, nor anybody (outside of his original FORTRAN punch cards) who can duplicate his methods or results.
I can find no one who can explain why NASA-GISS must recalculate every month temperature records written by hand in 1915.
Nick Stokes says:
May 25, 2014 at 8:55 pm
Thank you.
So on January 20, the year 2013 had an anomaly of 0.60 to 2 significant digits.
On January 22, it had an anomaly of 0.61 to 2 significant digits.
Today, 2013 has an anomaly of 0.59 to 2 significant digits.
In January 2014, 2003 was also 0.60, but today it is 0.59.
GISS is really a pain to keep up to date.
The comments about GISS’ “dancing data” intrigued me. I’ve been downloading most major world datasets for a few years. GISS is available at http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt and I whipped up a quickie bash script to go through my downloaded files and see what GISS has been reporting for 2010. Note that GISS reports a whole-number value, i.e. the anomaly * 100; divide by 100 to get the actual number. The following is the output. The columns are…
1) The month at which the downloaded data ends. Obviously, it can start no earlier than 201012.
2) The sum of the 12 monthly anomalies (January-December) for 2010.
3) The average anomaly for 2010.
Remember to divide columns 2 and 3 by 100.
201012 759 63
201101 757 63
201102 757 63
201103 758 63
201104 760 63
201105 757 63
201106 757 63
201107 760 63
201108 760 63
201109 760 63
201110 756 63
201111 760 63
201112 757 63
201201 756 63
201202 752 63
201203 756 63
201204 750 63
201205 757 63
201206 752 63
201207 749 62
201208 755 63
201209 751 63
201210 769 64
201211 767 64
201212 791 66
201301 796 66
201302 795 66
201303 800 67
201304 797 66
201305 798 67
201306 795 66
201307 801 67
201308 799 67
201309 809 67
201310 804 67
201311 800 67
201312 800 67
201401 794 66
201402 806 67
201403 792 66
201404 785 65
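For anyone wanting to repeat this check, here is a rough Python equivalent of the bash pass described above (not the original script): the archive filename pattern is hypothetical, and the usual GLB.Ts+dSST.txt layout, with data rows starting with the year followed by twelve monthly values in hundredths of a degree, is assumed.

```python
# A rough Python equivalent of the bash pass described above: for each archived
# copy of GLB.Ts+dSST.txt, find the 2010 row, sum its 12 monthly values
# (in hundredths of a degree) and print the sum and the rounded annual mean.

import glob

def report_2010(filename):
    with open(filename) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0] == "2010" and len(parts) >= 13:
                months = [int(v) for v in parts[1:13]]
                print(filename, sum(months), round(sum(months) / 12))
                return

for fname in sorted(glob.glob("GLB.Ts+dSST_*.txt")):   # hypothetical archive names
    report_2010(fname)
```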
Nick Stokes says:
> May 25, 2014 at 8:55 pm
>
> Clearly the change in January reflects a catch-up to past
> changes in GHCN. GHCN changes as belated data comes
> in, and also if the adjustment algorithm changes. GISS now
> relies on the GHCN homogenization; they don’t do their own
I’ve heard of “belated data” but please don’t tell me that there’s “belated data” all the way back to 1880, coming in every month.
Werner: two comments:
First: “If we average a thousand readings that are only accurate to the nearest degree, we can easily get an average to the nearest 1/1000 degree.”
Sure, you can take the numbers out to as many decimal places as you want, but the numbers beyond the decimal are meaningless. Significant figures dictate that when your readings are only accurate to the nearest degree, any calculated answer is also only accurate to the nearest degree. Only freshman chemistry students and climastrologists make this mistake.
Second: “…it appears likely that some records will be set on at least some surface data sets, however not on the satellite data sets.”
Exactly! And this is the crux of the issue. Why is there a growing discrepancy between surface data sets and satellite data sets? One of the data sets is wrong, but which one and why? That should be the topic of your next post.
Why are the anomalies associated with the satellite measurements significantly smaller than those from the surface measurements? Is there a potential upward bias in the surface measurements due to thermometer placement, i.e. the heat island effect?
RACookPE1978 says: May 25, 2014 at 9:04 pm
“Hansen’s magical computer routines, as I understand, have never been changed: Every month the past temperature records ARE CHANGED based on the most recent night time light index based on some 1980-level “standard” for urban, semi-urban, and rural NASA index. I have read of several computer programmers who have attempted to both re-create his program, and others who have attempted to debug (re-read and sort out) what few parts of his program have been released. None were successful. I know of no one who has gotten access to the entire routine, nor anybody (outside of his original FORTRAN punch cards) who can duplicate his methods or results.”
Not true. To start at the end, the code is here. And here is a project by British software developers to port GISTEMP to Python. They were able to reproduce the results perfectly.
And it is quite untrue that Hansen’s routines have never been changed. There was a radical change when GHCN V3 came out, and GISS removed their own homogenisation, relying entirely on GHCN.
Ebeni asked: “I am a lay person so excuse me if I am missing something. Is there something in climate science that dispenses with the entire discipline of measurement accuracy and confidence and application of significant figures?”
The rabbit hole is much deeper than mere statistical sloppiness, now that the latest hockey stick was wholly fabricated: statistics were not even needed, just a spurious data drop-off at the end, after some input data was re-dated, producing as a pure artifact the blade that made world headlines:
http://s6.postimg.org/jb6qe15rl/Marcott_2013_Eye_Candy.jpg
Innovations that dispense with scientific integrity include these:
Steig’s innovation of illegally spreading Antarctic Peninsula warming over the whole continent, or Mann’s innovation of ignoring the majority of proxies that show a bowl instead of a hockey stick, or Mann’s innovation of then using an algorithm to cherry pick noisy proxies that lined up with thermometer plots so he could call this objectively unbiased “filtering,” or Marcott’s innovation of a pure data drop-off hockey stick blade, or the innovation of creating virtual sea levels that are then labeled as “sea level,” or the innovation of calling debate foes “deniers” in press releases for papers, or the innovation of allowing a single tree to create a hockey stick shape, or the innovation of raising confidence levels from 90% to 95% after further deviation of temperature from predictions, or the innovation of invoking consensus as a *scientific* instead of profoundly anti-scientific principle, or the innovation of Hiding The Decline by just throwing away new data, or the innovation of claiming that greater uncertainty equates with greater urgency and risk, or the innovation of dissolving sea shells in acid and extrapolating to the whole ocean, or the innovation of calling mild ocean neutralization “acidification,” or the innovation of finding four dead polar bears and expanding that to species endangerment, or the innovation of using satellite data to up-adjust the global average temperature in a way that the same satellite data in fact falsifies, or the innovation of doing risk analysis devoid of any and all benefit analysis as balance, or the innovation of “reversing the null hypothesis,” or the innovation of theoretically hiding heat in an ocean that shows no corresponding extra volume expansion, or the innovation of calling climate model runs “experiments,” or the innovation of invoking a surge of weather intensity in abstracts as actual weather intensity has declined, or the innovation of referencing IPCC reports which themselves reference activist literature almost as much as peer reviewed science, or the innovation of using mere mentions of man made warming in abstracts as offering empirical support *of* that theory, or the innovation of NASA itself not using NASA satellite data in their only temperature product, or the innovation of asserting that recent temperature variation is outside of natural variability without mentioning the near exact precedent for it in the first half of the thermometer record, or the innovation of claiming the 350 year old Central England record that falsifies climate alarm is merely an insignificant local affair that just by chance shows near exact correlation with the global average plots, or the innovation of using the systematic mismatch between tide gauges (relative to land) and satellite altimetry (absolute) to imply a sudden burst in sea level rise that is falsified by the tide gauge data itself.
Walter Dnes says: May 25, 2014 at 9:31 pm
“I’ve heard of “belated data” but please don’t tell me that there’s “belated data” all the way back to 1880, coming in every month.”
Again, you should look at the log of changes.
“December 14, 2011: GHCN v2 and USHCN data were replaced by the adjusted GHCN v3 data.”
Switching from GISS to GHCN homogenisation
“September 26, 2012: NOAA/NCDC replaced GHCN v3.1 by GHCN v3.2. Hence the GISS analysis is based on that product starting 9/14/2012. Version v3.2 differs from v3.1 by minor changes in the homogenization of the unadjusted data.”
“January 16, 2013: Starting with the January 2013 update, NCDC’s ERSST v3b data will be used to estimate the surface air temperature anomalies over the ocean instead of a combination of Reynold’s OISST (1982 to present) and data obtained from the Hadley Center (1880-1981).”
These changes can well affect data back to 1880. GISTEMP isn’t just station readings.
Louis Hooffstetter says:
May 25, 2014 at 9:32 pm
Thank you!
As a retired physics teacher, I certainly agree with you on significant digits. However, there is no way that I will take all of the given numbers and reduce them to the nearest 1/10 of a degree. The lack of recent warming is plain whether we use numbers to the nearest 1/10 or the nearest 1/1000 of a degree, and that is what I hope to show.
As for showing which data set is wrong, I am really bothered by the fact that UAH and RSS are so far apart with respect to the time for a slope of 0. That is something that needs to be sorted out by the people capable of doing so. I certainly cannot touch that one.
@RACookPE1978:
I ported GIStemp to Linux and made it go. I’ve not done the newer version (maybe I ought to…) after they claimed to no longer be doing their own homogenizing. To call GISS “data” is, IMHO, an error. It is a “data food product” at best. (In America, synthetic non-cheese dairy products must be labeled “cheese food product” so you know they aren’t really cheese. 😉)
It’s just GHCN after NOAA got done fudging it with more fudging added via a variety of odd machinations involving a load of averaging and (in the older version at least) a set of “serial homogenizing” that could smear temperatures from 1200 km away into other places. But then could repeat that process up to 3 times in different sections of code. (as ‘fill in missing data’, as ‘homogenize the values’ and as ‘make temperatures into grid / box scores’… each sequentially done and using smeared data food product from the prior step…)
IMHO it is not much more than an exercise in data smearing, over averaging and dancing in the error bands of those averaging processes, and expecting truth to come out of a data blender…
It’s a bit dated now, but more than you ever wanted to know about GISS GIStemp, and how the deed is done, can be found here:
https://chiefio.wordpress.com/gistemp/
Later I figured out that the real data-buggery had been moved upstream to NOAA / NCDC and got to looking at how the GHCN was molested into a rise. Since HADCRUT uses the GHCN as well (they said they lost their real data, but that it could be retrieved from the GHCN for all intents and purposes, so anyone needing to say they do not use GHCN needs to take it up with them…), the bottom line is that there is ONE set of source data that gets folded, spindled and homogenized by NCDC, smeared around by GIStemp, and more or less used in HADCRUT (which gets a different Sea Surface mix, but not by much). To call them “different data sets” is a stretch. More like different molestations of the same original data.
More here: https://chiefio.wordpress.com/category/ncdc-ghcn-issues/
The bottom line is that we don’t have any real “surface temperature data set”. We have a collection of badly abused numbers that were, once upon a time, based on temperatures; but have long since ceased being historical recordings of temperatures.
Louis Hooffstetter says: May 25, 2014 at 9:32 pm
“Why is there a growing discrepancy between surface data sets and satellite data sets? One of the data sets is wrong, but which one and why? That should be the topic of your next post.”
It was the topic of this very recent post. But the divergence between UAH and the surface indices is a lot less than the divergence between UAH and RSS.
A “pause”? What “pause”?
Rick Adkison says:
May 25, 2014 at 9:34 pm
Why are the anomalies associated with the satellite measurements significantly smaller than those from the surface measurements?
The satellite anomalies are measured relative to a later base period, which was warmer than the earlier base periods used by the surface data sets. That is why I believe the rank number is much more meaningful than the anomaly number.
If you want to see what the anomaly is, based solely on station measurements (no interpolation for places not actually measured, no other adjustments) follow the link in my name.
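For readers who still want to compare the anomaly numbers directly, the sketch below shows the usual fix: shift each series so that its mean over a common reference period is zero, after which the anomalies, and not just the ranks, are comparable. The 1981-2010 window and the function name are illustrative choices, not a statement of the actual UAH or GISS base periods.

```python
# A minimal sketch of re-baselining annual anomalies to a common reference
# period so that series built on different base periods can be compared.
# The default window below is illustrative only.

import numpy as np

def rebaseline(anomalies, years, new_base=(1981, 2010)):
    """Shift a series of annual anomalies so its mean over new_base is zero."""
    a = np.asarray(anomalies, dtype=float)
    yrs = np.asarray(years)
    mask = (yrs >= new_base[0]) & (yrs <= new_base[1])
    return a - a[mask].mean()

# Hypothetical usage:
# giss_common = rebaseline(giss_annual, giss_years)
# uah_common = rebaseline(uah_annual, uah_years)
```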
Nick Stokes says:
> May 25, 2014 at 9:54 pm
>
> Again, you should look at the log of changes.
I think we’re talking past each other. You pointed to 3 changes during the past 3 years. My point is that data going back to the 1880’s *CHANGES EVERY SINGLE MONTH* that I download it. The changes may be relatively minor, but they do happen. Can someone do a FOIA request to get GISS monthly anomaly downloads back to day 1? It would be interesting.
alex says:
May 25, 2014 at 10:03 pm
A “pause”? What “pause”?
It is the one that Trenberth is trying to explain via heat in the deep ocean.
Werner Brozek says:
May 25, 2014 at 6:59 pm
If we average a thousand readings that are only accurate to the nearest degree, we can easily get an average to the nearest 1/1000 degree.
Increased readings give greater precision, not greater accuracy. In addition, this only applies to measurements of the same thing. Measurements of temperatures in different places around the world are not measurements of the same object. If it were true that greater accuracy could be achieved simply by taking more readings, we could get the Earth’s population to hold their fingers in the air to estimate temperature, then calculate an average to high accuracy.
Dr Burns: “Increased readings give greater precision, not greater accuracy.”
Thank you, that is precise!
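The point in this exchange is easy to illustrate with a toy simulation (purely my own; the 0.3 bias and 0.5 noise figures are made up): averaging many rounded readings of the same quantity beats down the random error roughly as 1/sqrt(N), but a systematic bias shared by the readings survives no matter how many are averaged.

```python
# A toy simulation of precision vs. accuracy: many readings rounded to the
# nearest degree average out the random error, but a shared systematic bias
# (here 0.3) remains in the mean however many readings are taken.

import random

random.seed(1)
true_value = 20.0
shared_bias = 0.3          # e.g. a siting/instrument bias common to all readings

for n in (10, 1000, 100000):
    readings = [round(true_value + shared_bias + random.gauss(0, 0.5)) for _ in range(n)]
    print(n, round(sum(readings) / n, 3))   # tends toward ~20.3, not 20.0
```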
Of course it’ll break a record. That was the objective for this year: to break the record by hook or by crook. Obviously it’s been adjusted already. It shows the temps way above everyone else.
This is all about denying the pause exists. These people are criminals.
Ronan Connolly has a detailed critique of how poorly GISS handles the UHI effect.
http://notalotofpeopleknowthat.wordpress.com/2014/05/20/is-the-giss-adjustment-for-uhi-adequate/