Guest essay by Neil Catto
The CET record started in 1659, close to the minimum of the Little Ice Age. As such, it is no surprise that last year (2014) was the warmest on record; it would appear to be a natural recovery. The annual mean of the monthly temperatures has increased from 8.87 Deg C in 1659 to 10.95 Deg C in 2014, which equates to 0.06 Deg C/decade.
I used the CET mean monthly data 1659-2014: Downloaded 6th Jan 2015 http://www.metoffice.gov.uk/hadobs/hadcet/data/download.html
My main interest in this data set is to gain a better understanding of the difference between natural variation and AGW. I consider the CET a reasonable representation of Northern Hemisphere trends. In 1739 Mount Tarumae in Japan erupted with a force of VEI 5. The mean monthly CET temperature in 1739 was 9.21 Deg C; in 1740 there was a significant drop to 6.84 Deg C, and in 1741 a recovery to 9.32 Deg C. This natural occurrence produced the equivalent of a temperature drop of -23.5 Deg C/decade and a recovery of 24.6 Deg C/decade. With natural variation of this magnitude I have never understood the alarm about 2.0 deg C/decade; human life survived and grew exponentially in numbers.
The previous time I had downloaded CET data was 22nd May 2013. Out of interest I thought I would compare the two data sets. The results were interesting, to say the least.
Fig 1: Anomalies between CET downloaded in May 2013 and CET downloaded in Jan 2015 (data to Dec 2014)
It is noticeable that nearly every adjustment is positive, with no negative changes. The whole data set shows an average increase of 0.03 Deg C in 20 months or equivalent to 0.18 Deg C/decade.
Discussion:
What is the reason for these data adjustments?
How often and by how much are these data adjusted?
Is this anthropogenic warming caused by man-made adjustments?
Fig 1 would have been more meaningful if the Y scale was adjusted (!) to show the full extent of the negative-going spikes. Why “clip” the plot?
“Why “clip” the plot?”
It has not been clipped. The anomalies are all positive; there are no “negative-going spikes” …that is one of the writer’s main points. Why are all of the adjustments positive?
Would you prefer to see a -0.01 under the zero line where no anomalies reached it? …You would probably complain that graphing the -0.01 was unnecessary!
I understand that this article implies that
a) CET temperatures have been “adjusted”
b) it is unclear whether these adjustments led to the conclusions of AGW being whipped up.
There was no standard scale of temperature measurement until Huygens, in 1665, proposed the boiling and freezing points of water as references. So each thermometer used in the early CET days, until scientific standards were imposed, ought to be known and referred to individually to make readings comparable. Have adjustments been made to these thermometer readings? Who performed them, when, and on what basis? Are these adjustments made “bona fide” or, as one might assume knowing about the procedures of a certain Mann and that ilk, malevolently? Are the resulting, adjusted data correct?
From the Met Office Hadley Centre on CET:
“These daily and monthly temperatures are representative of a roughly triangular area of the United Kingdom enclosed by Lancashire, London and Bristol. The monthly series, which begins in 1659, is the longest available instrumental record of temperature in the world. The daily series begins in 1772. Manley (1953, 1974) compiled most of the monthly series, covering 1659 to 1973. These data were updated to 1991 by Parker et al (1992), who also calculated the daily series. Both series are now kept up to date by the Climate Data Monitoring section of the Hadley Centre, Met Office. Since 1974 the data have been adjusted to allow for urban warming.”
Since 1974 the data have been adjusted to allow for urban warming.
That does not mean, “Only the data collected since 1974 is corrected for UHI effects.”
It means what it says, that is, “The entire historical CET dataset has been rejiggered as UKMO sees fit under the guise of urban warming adjustment. We started these adjustments in 1974.”
It seems the explanation for the adjustment may be on their page.
“Since 1974 the data have been adjusted to allow for urban warming.”
The normal convention is that current data is held correct, so past data is adjusted relative to it. If current data should be adjusted down for UHI, the effect will be that past data is adjusted up. This could well be just the adjustment for 2014. It’s true that the “keep the present correct” convention is not really appropriate for UHI, but it may be what they do.
So why isn’t the past adjustment uniform? That could well be rounding. Clearly it is rounded to two figures. If the months were adjusted, then rounded, then annually averaged, this scatter could well result.
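A minimal sketch of that rounding mechanism (made-up monthly values and a hypothetical uniform shift, not the actual adjustment) suggests how one small change can show up as a scatter of annual differences:

```python
import random

random.seed(1)

def annual_from_rounded_months(raw_months, shift=0.0):
    """Apply a shift to full-precision monthly means, round each month to one
    decimal place (as published), then average and round the annual value to
    two decimal places -- the order of operations suggested above."""
    rounded = [round(m + shift, 1) for m in raw_months]
    return round(sum(rounded) / 12, 2)

# Made-up "full precision" monthly means for a few hypothetical years; the
# +0.02 shift stands in for whatever adjustment was actually applied.
for year in range(5):
    raw = [random.uniform(2.0, 17.0) for _ in range(12)]
    diff = annual_from_rounded_months(raw, 0.02) - annual_from_rounded_months(raw)
    print(f"hypothetical year {year}: annual difference {diff:+.2f}")

# The same uniform monthly shift shows up as 0.00, 0.02, 0.03 ... in the
# annual column, because each month independently crosses (or fails to cross)
# a one-decimal rounding boundary before the annual average is taken.
```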
It’s an explanation of how they do things. Unfortunately, what they are doing is insane.
I’ve just had to have a drink to get into the ‘logic’ (I use the term loosely) of your statement.
Let me see:
The present is always correct so the past has to be adjusted to reflect that.
If the present needs to be adjusted then the past has to be adjusted too.
Sounds like ‘heads I win tails you lose’
That doesn’t sound insane to you?
No, Alex, I think Nick was explaining how the two ends of the series are relative: if one end goes down it’s the same as if the other had gone up – relatively.
Harry
He said it is adjusted. It’s not like one end is adjusted and therefore the trend line ‘naturally’ changes. It sounds to me, from what he says, that both ends are adjusted. I am not accusing him of doing this, just interpreting what he is telling me about what ‘they’ do.
It’s just a convention. If an instrument is moved, for example, you have to adjust one record to meet the other. The present is always with us, so it makes a reasonable reference point. With UHI, it might seem less logical, thinking that the past is right and now is wrong. But that’s not really sustainable. Urbanisation might be artificial, but it isn’t going to go away.
Alex,
I heartily endorse your logic visualization technique.
To me, the insanity is not necessarily all the adjustments and deletions, but the notion that what you wind up with is a very good facsimile of reality.
The expectation on the skeptic side that there is a different way that will render the Truth is equally deluded.
Without all the hubris, zeal and hysteria, studying the weather would be good, clean fun with elements of practical merit, but I’ll confess, it wouldn’t have drawn my interest to the insane extent it has.
“The normal convention is that current data is held correct, so past data is adjusted relative to it. ”
IMHO that is crazy logic.
‘Current’ data is current on one day only.
And here we have yet another definition of insanity.
Stokes is telling us that since changes need to be made to the current measurements because of Urban Warming (UHI), we hold current measurements as correct even though we know they are not, and change the past measurements — the ones before urbanization. In other words, if today’s temp is wrong (too high) I don’t reduce today’s temperature at all. No sir, that would be bad for the scam, so I pretend to know how to adjust temperatures way in the past as if I had a time machine.
An honest group of men (good luck finding such men in climatology) would adjust the current temperatures downward due to the urban heat island (UHI) effect. Instead they use the hottest temperature they can find and adjust every other measurement to that.
They admit this bogus operation and pretend it is “science”. I guess they think some hand-waving and statistical mumbo-jumbo will fool us all.
And this site will automatically send your comment to moderation if you use the Fr*ud word? Oh my.
If there is a growing UHI problem that overstates a temperature trend, there are two logical ways to adjust to get a ‘true’ trend: either cool the present or warm the past. Cooling the present results in a discrepancy with current actual observation, so the preference is to warm the past. The NASA GISS website uses Tokyo as the example.
The problem is that when you actually compare what is done, the past is cooled on average, not warmed. This is so even where there should be no adjustment at all because there is no UHI.
Many specific examples for many land temperature series are provided in the essay “When Data Isn’t” in Blowing Smoke. This CET post perhaps adds to the pile of examples.
BS Nick….I am always finding on my “Accuweather” app that the current temperature and the predicted temperature are not the same.
Example: the 8am temp was forecast to be 50F and yet the current temp at 8am is 46F. AND we have clowns shoving hundredths of a degree down our throats!!!!
Pure and simple BULLSHYT.
If the temp cannot be accurately measured for me today by all of our technology, no one can tell me what it will be in 100 years. PERIOD.
This really depends on where you are relative to the reporting station. I’m about 4 miles from mine according to “Accuweather” (yes, I use the term VERY loosely), and probably 150ft higher. The two will rarely, if ever, exactly agree.
So the past is always wrong Nick?
When the temperature trend falls, those climate data detail devils will ‘readjust’ the data down?
Honest data is kept as-is, period. Adjustments are kept as-is, period.
All reports using the data are identified as ‘adjusted’ if so, and then the adjustments are identified in detail.
e.g. ** Report uses TOB adjustments; TOB adjustments applied 1930…2014 inclusive, on a monthly basis.
Tell me that, when the data is so frequently manipulated/tortured, readers would not immediately question the efficacy of the report and ask for an ‘original’ untouched report.
Dear Mr Catto
Below is a link to the work of Professor Gordon Manley. The CET record, to 1974, is shown with actual temperatures. You of course may already be aware of this; if not, it should help in showing what Jones et al have done with the record before 1974. The paper is worth a read in any case, as the Professor identifies the effects of urbanisation on temperature readings.
If the link doesn’t work, Google Manley and Royal Meteorological Society
http://www.rmets.org/sites/default/files/qj74manley.pdf
Regards
Thankfully the “official” data is homogenised in the Met Office blender- http://www.metoffice.gov.uk/hadobs/hadcet/Parker_etalIJOC1992_dailyCET.pdf . Manley gives a different version http://www.metoffice.gov.uk/hadobs/hadcet/Parker_etalIJOC1992_dailyCET.pdf which shows a much more steady increase from the LIA. No wonder the results keep changing!
I would prefer to look at the figures another way ignoring CO2, adjustments etc
2 degrees C in 355 years equates to a warming rate of 0.6 degrees C per century, which is well within the bounds of natural variability?
An over simplification of course but Je Suis Charlie
There is nothing suspicious about this:
A few months ago I wrote to the Met Office about a small error in their annual data:
CET monthly data is given to one decimal place; the twelve monthly values are added together, divided by 12 and rounded (by the usual rule) to two decimal places.
I repeated the process, and a zero difference between the annual values so calculated (from the monthly data) and the Met Office annual numbers (last column in http://www.metoffice.gov.uk/hadobs/hadcet/cetml1659on.dat) confirms the method.
Since the monthly data are built from the daily numbers and months are of different lengths, this is the wrong method, and the Met Office annual data are fractionally wrong as a result.
As a test (using 1900-2013 data) I multiplied each month’s value by its number of days, added all the months, then divided the sum by 365 or 366 as appropriate.
This method gives annual data which is fractionally warmer, mainly because of the short February; the warm, long Jul & Aug are balanced by the cold, long Dec & Jan, and the remaining months make only a tiny difference.
http://www.vukcevic.talktalk.net/CETisCool.gif
This is not much, but it is still important: in every single year the second decimal digit (hundredths of a degree) is wrong; the maximum difference is 0.07 C and the minimum 0.01 C.
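A minimal sketch of the two calculations, with illustrative monthly values rather than actual CET numbers:

```python
# Sketch of the two annual-mean methods described above, with illustrative
# monthly values (not actual CET data).
DAYS = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]   # non-leap year

monthly = [3.4, 3.9, 5.7, 8.1, 11.2, 14.3, 16.5, 16.2, 13.6, 10.1, 6.3, 4.2]

# Method apparently used for the published annual column: simple average of
# the twelve monthly values, rounded to two decimal places.
simple = round(sum(monthly) / 12, 2)

# Day-weighted method: weight each month by its length before averaging.
weighted = round(sum(t * d for t, d in zip(monthly, DAYS)) / sum(DAYS), 2)

print(simple, weighted, round(weighted - simple, 2))
# The cold February gets less weight in the day-weighted average because it
# is short, so the weighted annual figure comes out a few hundredths warmer.
```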
I also discussed the subject with Tony Brown (TonyB) via email, and may even have commented on it at WUWT.
That’s very informative, thanks.
If the effect is mainly due to February being short but cold, your graph suggests February has been getting colder since about 1995.
You are entirely correct.
February had a nearly flat trend for 250 years (1730 – 1980)
http://www.vukcevic.talktalk.net/CET-Feb.gif
(I alerted the Met Office to the February effect and the error in their data in early August 2014)
It has, but strangely, January has been getting warmer.
January used to be much colder than February but now they are much closer (30 year mean), although that may be a return to the situation in the early 1700’s.
Are you saying that the adjustments depicted in Mr. Catto’s graph all result from correcting the month-weighting error you brought to their attention? If so, you may want to suggest to Mr. Watts or Mr. Catto that an update be made to the head post.
I only did the test for 1900-2013, then wrote to the Met Office. As far as I can see my graph is identical to the one posted by Mr. Catto (post 1900). I am only a visitor here and not qualified to comment on editorial matters at WUWT.
This is actually not the responsibility of Mr Watts, Mr Catto or anybody – except the custodians of the CET who made the change! I accept that the change needed to be made, and that it appears to have little effect on the trend – but why is it not documented, so that people who use the data KNOW WHAT HAS BEEN DONE?
Sorry for shouting, but as someone who uses databases regularly, this kind of lack of documentation really screws up my day.
The oldest CET data file I can find is from July 2012 and I did a comparison between that file and the end of 2014.
As far as I can tell from the graph, the annual adjustments are the same as above, and the average adjustment is 0.02796c.
It is possible that this was a “one off” adjustment and consequently translating it into a 0.18c/decade may not be valid.
I don’t know when these adjustments were made, but it is possible that there is an explanation somewhere on the MO website.
It should also be remembered that the locations of the sites used to calculate CET have changed over the years.
By the way, the warmest year in the CET record was not 2014, but the 12 months ending April 2007, at 11.63c, compared to calendar year 2014 at 10.88c.
I found a file for 2007, and there were no changes to figures between 2007 and 2012.
“By the way, the warmest year in the CET record was not 2014, but the 12 months ending April 2007, at 11.63c, compared to calendar year 2014 at 10.88c.”
Then by that reckoning the 2006 record had already been beaten in 2007! Funny that nobody mentioned that until now.
Plus, if you’re going to take the warmest consecutive 12 month period in a temperature record as the warmest ‘year’, then you have to also accept that the warmest ‘year’ in HadCRUT4 ended in September 2014, and in both NOAA and GISS it ended in November 2014.
By that system we don’t need to await the December figures; 2014 already ends the warmest ‘year’ in all three surface data sets.
Yes!
Can you explain to me why we should only use (arbitrary) calendar years and ignore all 12 month periods?
It is sheer chance that we measure our “year” from January to December.
Rolling annual figures tell us more than calendar year figures do.
Why not use the daily data? Or, on the other hand, why limit it to a 12 month period? Why not 6 months, 3 months… 63 months?!
Whole calendar years is probably an easier metric for the average punter to grasp, isn’t it?
The 12 month period has a natural, astronomical basis.
Not using rolling 12 month periods misses the fact that annual CET reached 11.63c.
You could use daily data, but it has complications, e.g. leap years, which make it more difficult to calculate.
Averages are calculated over 3 month periods, e.g. winter/summer which also have astronomical significance.
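As a minimal sketch of how the rolling figures can be computed (made-up numbers, simple unweighted 12-month means; a real calculation would read the CET monthly file and, strictly, weight by days in the month as discussed elsewhere in the thread):

```python
def rolling_12_month_means(monthly_series):
    """monthly_series: a flat list of monthly mean temperatures in date order.
    Returns (index_of_last_month_in_window, simple 12-month mean) pairs."""
    return [(i, sum(monthly_series[i - 11:i + 1]) / 12.0)
            for i in range(11, len(monthly_series))]

# Two made-up years of monthly values; the second December is unusually cold.
demo = [3.0, 4.0, 6.0, 8.0, 11.0, 14.0, 16.5, 16.0, 13.5, 10.0, 6.0, 4.0,
        4.5, 5.0, 7.0, 9.0, 12.0, 15.0, 17.0, 16.5, 14.0, 10.5, 6.5, 3.0]

end_index, best_mean = max(rolling_12_month_means(demo), key=lambda t: t[1])
print(f"warmest 12-month window ends at month index {end_index}, "
      f"mean {best_mean:.2f}")
# Here the warmest 'year' is the 12 months ending in November of year two,
# not either calendar year.
```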
You didn’t really answer my question, merely replaced it with several others.
David R- So what?
Sorry for repeating this from previous threads, but I can’t seem to get an answer.
Its about adjustments, so seems pertinent on this thread.
I thought that USHCN and USCRN were very different systems, using totally different climate stations.
How can it be that on this page…(link below) they match so closely ?
Thanks for any explanations.
http://www.ncdc.noaa.gov/temp-and-precip/national-temperature-index/time-series?datasets%5B%5D=uscrn&datasets%5B%5D=cmbushcn&parameter=anom-tavg&time_scale=p12&begyear=2005&endyear=2014&month=12
The CET record is often claimed to be the longest continuous temperature series, but this is far from the truth: it is in fact a record compiled from many differing measurement sites that have changed frequently – not least since the 1950’s, when sites like Ringway (now a major airport) were replaced by other sites including Rothamsted, which is close to another major airport, Luton. The CET data we see presented most often is the annual mean temperature, which is misleading, as the maximum and minimum data (not available in as much detail from the beginning of the series) show different trends, with the minimum values having been subject to substantial adjustments. To be fair to the UK Met Office, the uncertainties in the values and the problems in compiling the record have been openly published in papers by Parker and Horton with which anyone considering this record needs to be familiar:
http://www.metoffice.gov.uk/hadobs/hadcet/ParkerHorton_CET_IJOC_2005.pdf
https://cet365.wordpress.com/
Also important is the trend in rainfall over the period – not just its annual distribution but also the soil types at the stations used in the record on which it falls, since soil moisture content and the ease with which that moisture is released to the atmosphere can have substantial effects on minimum and maximum surface temperatures, cooling the maximum and warming the minimum. These effects depend on surface solar insolation (hence cloud cover) as well as wind speed and other factors considered in this model by Berg and colleagues:
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-13-00591.1
and this work by Munro and colleagues
http://www.geomet.uni-koeln.de/fileadmin/Dokumente/Arbeitsgruppen/Meteorologie/Shao__Yaping/pdfs/Munro_etal_1998_land_surface_interactions_AUS.pdf
and this modelling study by Hirsch and colleagues:
http://gewex.org/2014conf/pdfs/Hirsch_11-7.pdf
or this by Herb and colleagues:
http://static.msi.umn.edu/rreports/2008/319.pdf
There are several other studies but these serve to show that evaluating surface temperature without regard to factors such as rainfall, solar surface insolation and soil conditions leads to erroneous results, especially in the homogenization of data from sites with differing profiles for these factors and especially for the UK where surface insolation has increased since the 1960’s.
This link between temperature, soil conditions, solar insolation and rainfall has been well known since the Koppen climate zone classification was first promoted, and in effect means that global temperature trends have little meaning, especially when presented as annual means. The only trends that have some value are those within climate zones (such as the work presented by Frank Lansner of Hidethedecline), and especially those on the periphery of such zones, which indicate the direction of the climate trend – for example the cycles in climate linked to the Sahel area of Africa, which changes as the Hadley cell expands and retreats from the equator, bringing changes in rainfall and surface insolation. Perhaps of more importance in this regard is the expansion/contraction of the cereal-growing areas of the Northern Hemisphere that brought crop failures, hence widespread famine, during the Little Ice Age – a situation that looks like it is about to repeat.
To get back to the CET record: anyone familiar with the climate of the UK will see that the record is a mixture of data from several climate zones and urbanized areas and, as such, whilst of interest as a guide to long-term trends, lacks the precision to detect real changes in climate over decadal periods. Also, discussing the mean CET trend is misleading, as one needs to look at the mean minimum and maximum values and consider the impact of UHI as well as rainfall, soil moisture, surface insolation and wind speed. Finally, it is interesting to observe that a recognized measure of these values in agriculture is the Class A Pan Evaporation data, which globally shows no clear warming trend, though there are local trends depending on the factors discussed above:
http://www.science.org.au/sites/default/files/user-content/nc-ess-pan-evap.pdf
First: The temperature data in these files are not anomalies. They are absolute temperatures.
Second: The whole series in the annual column has been shifted upward. So there is no change in trend.
Third: I had a version of the file downloaded in December 2013. Comparing this file to the newest gives the same result as in this post. But when I compute the mean of the monthly values from the two files, the results are identical; the monthly values are actually unchanged. So, computing the mean of the monthly values from the newest file and comparing those with the annual column of the new file, I get the same result as comparing the yearly columns from the two files.
That means: This is just a glitch in the year column. An error. It is not an adjustment at all. And the error will be corrected.
That correction might of course result in some interesting thoughts from some.
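For anyone who wants to repeat that cross-check, a minimal sketch (the file names are placeholders, and the assumed layout of one row per year with 12 monthly values plus an annual column, along with the header handling, would need verifying against the actual downloads):

```python
# Sketch of the cross-check described above: compare monthly values and the
# annual column between two downloads of the CET monthly file.
# Assumption: each data row is "year, 12 monthly values, annual value";
# anything else (headers, incomplete current-year rows) is skipped.
def load_cet(path):
    years = {}
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) != 14 or not parts[0].isdigit():
                continue
            months = [float(x) for x in parts[1:13]]
            years[int(parts[0])] = (months, float(parts[13]))
    return years

old = load_cet("cetml_dec2013_download.dat")   # placeholder file names
new = load_cet("cetml_jan2015_download.dat")

for year in sorted(set(old) & set(new)):
    old_months, old_annual = old[year]
    new_months, new_annual = new[year]
    if old_months != new_months:
        print(year, "monthly values differ")
    if abs(old_annual - new_annual) > 1e-9:
        print(year, f"annual column differs by {new_annual - old_annual:+.2f}")
```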
rooter,
Thanks for that. I agree that there is no change in trend.
The entire ‘man-made global warming’ scare is based on just a few years of anomalous warming. That has stopped. But the scare continues as if global warming had continued.
It was important to ask the question. Seems we have the answers from vukcevic and rooter. Even if CET data are reliable, can we say the same for NOAA and others? The discussion should continue.
Hadcrut V18.23 will show warming between 1850 and 2014 was indeed 2.0C just as the climate models had predicted.
The data will also show that there were immense impacts on the climate, as severe hurricanes tripled in number and rainfall extremes increased by 137%, while there were over 50 million climate refugees by 2010. And cats and dogs went extinct in 2023.
Bill,
You forgot to mention that the adjusted summer Arctic ice extent fell below 1 million km^2 in 2013, and the last polar bears drowned in 2015.
Thanks for all the comments.
Whilst vukcevic provides an explanation for the changes, I would rather have a file showing raw (unadjusted) data. From these data it would be possible for anyone to provide their own interpretation of situations such as UHI, mainly because I disagree with P. Jones’ figure for UHI and consider it 0.8 Deg C after an analysis comparing London Heathrow and London Gatwick (using 15 years of data).
I can see there is no trend in the adjustments, however I asked the questions “how often and by how much are these adjustments done?” It was a bit sarcastic of me to relate the adjustment to a /decade figure not knowing the answers to the questions. If these adjustments happen frequently then the whole record warms and there is a greater potential for the latest year to be warmer.
As far as the warmest year is concerned, I have been collecting 24 hourly data on a daily basis for 27 locations in the UK since October 1998 as part of my work. One of the locations is Birmingham, right in the middle of the CET. When I analysed the average daily/monthly/yearly data I found in 1999 the average temperature was 11.1, which is 0.6 Deg C higher than the CET for 201.
[“higher than the CET for 201.” ? .mod]
I alerted the Met Office to the error in early August 2014 and suggested a method of recalculation, which they appear to have adopted, correcting the annual values.
Just ask for the raw unadjusted data then.
No big deal.
For this series they apply a UHI adjustment of 0.1 – 0.3 degrees (depending on the month – the biggest adjustment is in summer). That of course gives a lower adjusted temperature than the raw unadjusted data.
Like in Birmingham….
But if you think that is wrong you can of course suggest another adjustment. Just be sure to give good arguments for changing the adjustments.
Or perhaps you would keep the raw unadjusted data?
I would keep both adjusted and raw data side by side. That way everyone can see what has been done.
If vukcevic’s explanation is correct and they did correct their annual mean calculation then the original data from which it is derived will still be there and will be unchanged only the derived annual data will have changed. Should be easy to check.
An analogy exists between average global temperature resulting from forcings and the level of water in a bucket which has a hole in the bottom and is being filled with a hose. If the inflow or hole size is suddenly (or gradually) changed, the level of water slowly changes until equilibrium between inflow and outflow is re-established. The water level changes according to the time-integral of the difference between inflow and outflow. Likewise, average global temperature depends on the time-integral of the net effect of forcings.
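A minimal numerical sketch of that bucket analogy (arbitrary units; the linear-outflow assumption and all numbers are illustrative, not physical values):

```python
# Sketch of the bucket analogy in arbitrary units. The level responds to the
# running (time-integrated) imbalance between inflow and outflow, not to the
# instantaneous inflow alone. Outflow is assumed proportional to the level.
def simulate_bucket(inflow, steps=200, dt=0.1, k=0.5, level0=2.0):
    level = level0
    history = []
    for n in range(steps):
        outflow = k * level                       # assumed hole behaviour
        level += (inflow(n * dt) - outflow) * dt  # integrate the imbalance
        history.append(level)
    return history

# Step change in inflow at t = 5: the level drifts slowly toward the new
# equilibrium (inflow / k) instead of jumping to it.
levels = simulate_bucket(lambda t: 1.0 if t < 5 else 1.4)
print(levels[0], levels[50], levels[-1])
```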
If global warming were caused by CO2 (which it isn’t), the warming rate (the rate of change of average global temperature), rather than the temperature itself (as usually presented), would vary with the CO2 level. To be valid, the comparison should be between the temperature and the time-integral of the CO2 level and/or the time-integral of any other factor(s) proportional to energy rate.
Thus any co-plot of CO2 level and temperature or any other implication that average global temperature depends directly on CO2 level is misleading and physically and mathematically wrong.
An analysis at http://agwunveiled.blogspot.com derives a physics-based equation which, using the time-integral of forcings, accurately calculates the uptrends and down trends of average global temperatures irrespective of whether CO2 change is included or not. The paper at this link discloses:
1. A reference which provides historical evidence that CO2 change does not cause climate change.
2. The two factors that do explain climate change. The correlation is 95% with average global temperatures since before 1900; including the current plateau. The analysis also predicts the ongoing down trend of average global temperature.
3. An explanation of why any credible CO2 change does not cause significant climate change.
The two factors are also identified in a peer reviewed paper published in Energy and Environment, vol. 25, No. 8, 1455-1471.
phillipbratby
January 8, 2015 at 10:43 pm
Phillipbratby, think about this (I mean it sincerely). If the world is going to warm 2 to 5C and this is very alarming, there is no need to do all this adjusting of temperature records because:
A) The warming WILL overwhelm any errors. Thermometers show (roughly?) a global 0.6-0.7C per century increase, and this is supported in most cases regionally with different thermometer sets on different continents. It does not matter if this is out by 0.1-0.2C, an order of magnitude less than the signal we are worried about.
B) If a modest network of ideally located thermometers is used for monitoring the warming and assessing its seriousness, they can be left raw. The magical-seeming Central Limit Theorem [CLT] does the work: a large number of measurements of a metric averages out to a normal distribution [bell curve], with the temperature we seek indicated by the peak of the bell. The application of the CLT is a legitimate one for this use because, over the several hundred years since the thermometer’s invention, it has been calibrated at 0C (or the earlier 32F) using a mix of ice and water, at the boiling point of 100C (212F) at sea level, later at human body temperature for interpolation, and probably today at a number of other known constant temperatures. Each of these calibrations is essentially one iteration of hundreds of thousands of measurements of freezing, boiling and body temperatures. Their averages will be precise for the task at hand. Individual instruments will read somewhat too warm or too cool, but the averages you can rely on.
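A minimal simulation of that averaging argument (the numbers are made up, and it assumes, as the argument above does, that individual calibration errors carry no systematic bias):

```python
import random

random.seed(0)

# Made-up sketch: many thermometers, each with its own small fixed calibration
# error, all reading the same true temperature. Averaging pulls the estimate
# toward the true value; the spread of the mean shrinks roughly as 1/sqrt(N).
TRUE_TEMP = 10.0            # hypothetical true temperature, deg C
N_THERMOMETERS = 500

readings = [TRUE_TEMP + random.gauss(0.0, 0.3) for _ in range(N_THERMOMETERS)]
estimate = sum(readings) / len(readings)

print(f"mean of {N_THERMOMETERS} readings: {estimate:.3f} deg C "
      f"(true value {TRUE_TEMP})")
```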
I’m surprised that this hasn’t been prominently presented in any of the thousands of posts I’ve read on warming and the data sets. Indeed, a dozen pristinely located thermometers distributed around the world would be fully adequate to give us a signal of any worrying trends UP or Down. I think the problem is modern hubris, an idea that our quaint predecessors’ efforts at instrumentation were crude in their manufacture. Had I time, I would be happy to expand on this. Suffice it to say for now that hand-crafted instrumentation of several hundred years ago was remarkably intricate.
I have a ‘recipe’ for making a reflecting telescope’s parabolic mirror that is that of Sir Isaac Newton himself (the book is not an original, but the recipe is). It uses two flat round plates of glass – the tool (on top) and the reflector – with grit between them to grind down the reflector (ever finer grit as you progress until you get to jewelers rouge). I won’t get into the regular turning of the assembly and the walking around it in the opposite direction to grind back and forth from all directions (the CLT works admirably here, too!!).
When this stage is complete, you have a spherical indentation from the edge inwards in the reflector. Now you have to change this to a flattish parabolic shape so that the lines of light converge at a focus. This is done by coating the tool with beeswax, grooving the beeswax in a reticular pattern and resuming the same grinding motions with jewelers rouge. The beeswax, not being rigid, deforms slightly with each rub, and in time you have a parabolic, polished surface.
To check it out, you set the mirror on its edge, find the focus using a light passing through a slit, and adjust the distance until the entire reflector is illuminated (remember Newton was also a seminal researcher in optics). To check for “bumps” you slowly slide a razor blade in guides into the slit and look for topographic irregularities (usually smooth, polished, rounded bumps) as the slit of light sweeps the reflector. This method will permit you to map bumps greater than 2 to 3 millionths of an inch on your reflector. You remove these bumps by putting a thin coating of jewelers rouge on your scrubbed thumb and giving the bump a few light rubs. Repeat until the deflection from the bumps can no longer be detected with the advancing razor blade.
I don’t have a link, and the out-of-print book is a few thousand kilometres away from me in my home, but I’m sure you can find it or another by googling Newton’s method. Oh, and 2 millionths of an inch is 0.05 microns (cigarette smoke particles are 2 microns). How’s that for precision in the 17th century? You may also not know that modern telescope mirrors still get their final finishing by human hand!! They use the same recipe, I’m sure.
I don’t see any problem here. I don’t see any trend in the difference graph, it just adds approximately 0.03 degree celsius to the whole record, making both present and past look veeeery slightly warmer. Magnitude of warming is difference between now and then, and this difference is in general untouched. So are any temperature trends.
As to why are these changes made, my honest guess is they come from improved methodology, perhaps slight changes in weights given to individual stations, or changes to interpolation for missing data.
I don’t see any reason to assume conspiracy.
‘I don’t see any reason to assume conspiracy.’
I think you only have to look at the track record of those making these adjustments, often carried out in a poor way with little or no justification, to see why people tend to ask questions of them. After all, if someone spends 90% of their time lying to you, you are likely to question what they say for the other 10% too.
Typo near the end. It says 2.0 C/decade.
Oh and apropos of my comment above, many of the historic thermometers are still available for testing.
http://www.britishpathe.com/video/thermometers-issue-title-is-england-expects
This is a complete nothing. Slight change in the 1961-90 monthly means. Doesn’t change trends.
Nor is the pre-anomaly data hard to find. It’s here
http://www.metoffice.gov.uk/hadobs/hadcet/cetml1659on.dat
Completely different animal than GISS two-legged adjustments.
Given your reputation, I am surprised you would accept any change, regardless of how benign. Perhaps you’re only interested in the large changes. Perhaps you have become numb to these things these days. Unfortunately I think that one finger instead of three is still rape.
Reply to SMc ==> absolutely correct. The reported “anomaly” is hundredths of a degree — if this were a medical study, it would fail the “Minimal Clinically Important Difference” test. — kh
There is of course a marked difference between temperature readings in towns and cities and the countryside but even if we took all readings in rural areas there could still be a marked difference. My readings are taken in my garden at 400ft above sea level. On a still clear night readings taken at the bottom of the valley 300ft below will regularly be 1 to 2C below my own.
Dan Pangburn.
Thanks for your input to the discussion. I am an Electrical Engineer M.Sc. and have been working with process control systems in the geothermal industry for decades. I was also a lecturer in Modern Control Engineering (Ogata) at the local university for several years, so I have a fair understanding of thermodynamics and mathematical modeling of physical systems.
I agree with you that the correct method is to use the time-integral of the forcing signal. It is simply wrong to compare directly the instant value of CO2 or variable TSI and the Earth’s temperature.
Therefore I consider your analysis at http://agwunveiled.blogspot.com very valuable.
Best regards
Agust
What is the reason for these data adjustments?
Zealots changing the record.
How often and by how much are these data adjusted?
Every chance they get.
Is this anthropogenic warming caused by man-made adjustments?
Yes.
There you go. My best guesses.
I got a copy of the file from December 2014. The annual column in that file was identical to the file from 2013, and it therefore of course showed the same difference in comparison to the latest file.
That supports the glitch hypothesis.
http://wattsupwiththat.com/2014/09/19/laki-caused-1783-could-icelands-bardarbunga-volcano-trigger-another-year-without-a-summer/#comment-1742472
Mount Tambora
[Excerpt from wiki]
With an estimated ejecta volume of 160 km3 (38 cu mi), Tambora’s 1815 outburst was the largest volcanic eruption in recorded history. The explosion was heard on Sumatra island more than 2,000 km (1,200 mi) away. Heavy volcanic ash falls were observed as far away as Borneo, Sulawesi, Java, and Maluku Islands. Most deaths from the eruption were from starvation and disease, as the eruptive fallout ruined agricultural productivity in the local region. The death toll was at least 71,000 people, of whom 11,000–12,000 were killed directly by the eruption;[6] the often-cited figure of 92,000 people killed is believed to be overestimated.[7]
The eruption caused global climate anomalies that included the phenomenon known as “volcanic winter”: 1816 became known as the “Year Without a Summer” because of the effect on North American and European weather. Crops failed and livestock died in much of the Northern Hemisphere, resulting in the worst famine of the 19th century.[6]
[end of excerpt]
[Excerpt from my above post]
The Dalton Minimum had 2 back-to-back low SC’s with SSNmax of 48 in 1804 and 46 in 1816. Tambora erupted in 1815.
Two of the coldest years in the Dalton were 1814 (7.75C year avg CET) and 1816 (7.87C year avg CET).
[end of excerpt]
Commentary:
So, for CET’s, it appears that the 1815 eruption of Tambora had minimal effect, since CET’s in 1814 were slightly lower than CET’s for 1816.
However, the anecdotal evidence suggests that 1816 was a much harder year for humanity than 1814.
What to believe?
Best to all, Allan