A way of calculating local climate trends without the need for a government supercomputer

This method may or may not have merit – readers are invited to test the merit of it themselves; the method is provided – Anthony

Guest essay by Ron Clutz

People in different places are wondering: What are temperatures doing in my area? Are they trending up, down or sideways? Of course, from official quarters, the answer is: The globe is warming, so it is safe to assume that your area is warming also.

But what if you don’t want to assume and don’t want to take someone else’s word for it? You can answer the question yourself if you take on board one simplifying concept:

“If you want to understand temperature changes, you should analyze temperature changes, not the temperatures.”

Analyzing temperature change is in fact much simpler and avoids data manipulations like anomalies, averaging, gridding, adjusting and homogenizing. Temperature Trend Analysis starts with recognizing that each microclimate is distinct, with its own unique climate patterns. So you work on the raw, unadjusted data produced, validated and submitted by local meteorologists. This is accessed in the HADCRUT3 dataset made public in July 2011.

The dataset includes 5000+ stations around the world, and only someone adept with statistical software running on a robust computer could deal with all of it. But the Met Office provides it in folders that cluster stations according to their WMO codes.

http://www.metoffice.gov.uk/research/climate/climate-monitoring/land-and-atmosphere/surface-station-records

 

Anyone with modest spreadsheet skills and a notebook computer can deal with a set of stations of interest. Of course, there are missing datapoints which cause much work for temperature analysts. Those are not a big deal for trend analysis.

The method involves creating for each station a spreadsheet that calculates a trend for each month for all of the years recorded. Then the monthly trends are averaged together for a lifetime trend for that station. To be comparable to others, the station trend is normalized to 100 years. A summary sheet collects all the trends from all the sheets to provide trend analysis for the geographical area of interest.
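For readers who would rather script the same calculation than build the spreadsheet by hand, here is a rough sketch in Python of the method just described. It assumes each station record has been exported to a CSV with a Year column and one column per month; the column names, file layout and the station_files dictionary are illustrative placeholders, not the actual HADCRUT3 file format.

import numpy as np
import pandas as pd

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def station_trend(csv_path):
    # average of the 12 monthly trends, normalized to degC per century
    df = pd.read_csv(csv_path)
    monthly_slopes = []
    for m in MONTHS:
        valid = df[["Year", m]].dropna()          # missing datapoints are simply skipped
        if len(valid) < 2:
            continue
        slope = np.polyfit(valid["Year"], valid[m], 1)[0]   # degC per year for this month
        monthly_slopes.append(slope)
    return 100.0 * np.mean(monthly_slopes)        # lifetime trend, normalized to 100 years

def summarize(station_files):
    # station_files: dict of station name -> CSV path, supplied by the user
    trends = np.array([station_trend(path) for path in station_files.values()])
    print("Stations:", len(trends))
    print("Average Trend: %.2f degC/Century" % trends.mean())
    print("Standard Deviation: %.2f degC/Century" % trends.std(ddof=1))
    print("Max Trend: %.2f   Min Trend: %.2f" % (trends.max(), trends.min()))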

I have built an Excel workbook to do this analysis, and as a proof of concept, I have loaded in temperature data for Kansas. Kansas is an interesting choice for several reasons:

1) It’s exactly in the middle of the US with no (significant) changes in elevation;

2) It has a manageable number of HCN stations;

3) It has been the subject lately of discussion about temperature processing effects;

4) Kansas legislators are concerned and looking for the facts; and

5) As a lad, my first awareness of extreme weather was the tornado in Oz, after which Dorothy famously said: “We’re not in Kansas anymore, Toto.”

I am not the first one to think of this. Richard Wakefield did similar analyses in Ontario years ago, and Lubos Motl did trend analysis on the entire HADCRUT3 in July 2011. With this simplifying concept and a template, it is possible for anyone with modest spreadsheet skills and a notebook computer to answer how area temperatures are trending. I don’t claim this analysis is better than those done with multimillion dollar computers, but it does serve as a “sanity check” against exaggerated claims and hype.

For the Kansas example, we see that BEST shows on its climate page that the State has warmed 1.98 +/-0.14°C since 1960. That looks like temperatures will be another 2°C higher in the next 50 years, and we should be alarmed.

Well, the results from temperature trend analysis tell a different story.

From the summary page of the workbook:

 

Area: State of Kansas, USA
History: 1843 to 2011
Stations: 26
Average Length: 115 Years
Average Trend: 0.70 °C/Century
Standard Deviation: 0.45 °C/Century
Max Trend: 1.89 °C/Century
Min Trend: -0.04 °C/Century

 

So in the last century the average Kansas station has warmed 0.70 +/- 0.45°C, with at least one site cooling over that time. The +/- 0.45 deviation shows that climate is different from site to site even when all are located on the same prairie.

And the variability over the seasons is also considerable:

 

Month °C/century Std Dev
Jan 0.59 1.30
Feb 1.53 0.73
Mar 1.59 2.07
Apr 0.76 0.79
May 0.73 0.76
June 0.66 0.66
July 0.92 0.63
Aug 0.58 0.65
Sep -0.01 0.72
Oct 0.43 0.94
Nov 0.82 0.66
Dec 0.39 0.50

 

Note that February and March are warming strongly, while September is sideways. That’s good news for farming, I think.

Temperature change depends on your location and time of the year. The rate of warming is not extreme and if the next 100 years is anything like the last 100, in Kansas there will likely be less than a degree C added.

 

Final point:

When you look behind the summary page at BEST, it reports that the Kansas warming trend since 1910 is 0.75°C +/-0.08, close to what my analysis showed. So the alarming number at the top was not the accumulated rise in temperatures, it was the rate for a century projected from 1960. The actual observed century rate is far less disturbing. And the variability across the state is considerable and is much more evident in the trend analysis. I had wanted to use raw data from BEST in this study, because some stations showed longer records there, but for comparable years, the numbers didn’t match with HADCRUT3.

Not only does this approach maintain the integrity of the historical record, it also facilitates what policy makers desperately need: climate outlooks based on observations for specific jurisdictions. Since the analysis is bottom-up, microclimate trends can be compiled together for any desired scope: municipal, district, region, province, nation, continent.

If there is sufficient interest in using this method and tool, I can provide some procedural instructions along with the template.

The Kansas Excel workbook is provided as an example: HADCRUT3 Kansas.xls

Tom J

My understanding of the AGW theory (whoops; hypothesis) is that the most pronounced warming should be present in the coldest and driest air masses, and during winter. While that’s interpreted to be polar regions I think Kansas during the winter is in a position to compete. It’s interesting, therefore, that while February and March exhibit the strongest upward seasonal trend (seemingly validating the hypothesis; oops, theory), December and January exhibit the smallest upward trend (contradicting the … oh, forget it) excepting September. And following February and March the next highest trend was in July which is where it shouldn’t be.
Another example of the mysterious AGW thingy that also backflips when it comes to Arctic and Antarctic regions.

NikFromNYC

September is when briefly neither heating nor air conditioning are used much in buildings so the urban heat island effect on thermometer stations is lessened.

vukcevic

CET is the best known and the longest regional data set. It may come as a surprise to many that its 360 year long record conclusively shows that increased insolation suppresses long term temperature up-trend ( see link )

tjfolkerts

“For the Kansas example, we see that BEST shows on its climate page that the State has warmed 1.98 +/-0.14°C since 1960. That looks like temperatures will be another 2°C higher in the next 50 years, and we should be alarmed.”
This is misinterpreting the BEST results. The page clearly labels the result as “Warming since 1960 (°C / century)”. Ron gets it right later, but there is no reason to say BEST is misleading anyone. It does NOT look like it will be warming 2C in 50 years; it looks exactly like the warming will be 2C in the next 100 years.
“Since the analysis is bottom-up, microclimate trends can be compiled together for any desired scope: municipal, district, region, province, nation, continent.”
But what you would really want is some sort of weighted average. If the eastern half of Kansas had 20 stations and the western half (with far fewer people and cities) had 6, simply averaging these will not give a good estimate of the warming for the state. Similar problems continue when going to national or global estimates.
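A rough sketch of one way to do that weighting, assuming each station is reduced to (latitude, longitude, trend); the one-degree cell size and the numbers in the example are made up for illustration. Trends are averaged within grid cells first, then across the occupied cells, so a cluster of stations in one corner counts only once.

import numpy as np
from collections import defaultdict

def gridded_mean(stations, cell_deg=1.0):
    # stations: iterable of (lat, lon, trend in degC/century)
    cells = defaultdict(list)
    for lat, lon, trend in stations:
        cells[(round(lat / cell_deg), round(lon / cell_deg))].append(trend)
    # each occupied cell counts once, however many stations it contains
    return float(np.mean([np.mean(v) for v in cells.values()]))

# hypothetical example: three clustered eastern stations and one western station
example = [(39.1, -94.8, 1.2), (39.0, -94.9, 1.1), (38.9, -94.7, 1.3), (38.9, -101.7, 0.1)]
print(gridded_mean(example))   # about 0.65, versus a simple mean of about 0.93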
“So in the last century the average Kansas station has warmed 0.70 +/- 0.45°C …”
One rather interesting thing here is that this means there was COOLING for the first half of the century, and warming for the second half. If the stations warmed ~ 1 C in the last 50 years (ie 2 C/century), then simple subtraction says the stations must have cooled by ~ 0.3 C in the previous 50 years to get the 0.7 C figure for the entire 100 years.
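Writing that subtraction out with the round figures quoted here (about 0.7 °C over the full century, roughly 2 °C/century since 1960):

\Delta T_{\text{century}} \approx \Delta T_{\text{first 50 yr}} + \left( 2\,^{\circ}\mathrm{C/century} \times 0.5\,\text{century} \right)
\;\Rightarrow\;
\Delta T_{\text{first 50 yr}} \approx 0.7 - 1.0 = -0.3\,^{\circ}\mathrm{C}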

If anyone is interested in looking a little deeper into the weather station trends, here are a few pointers.
First, the weather station data is the air temperature measured in an enclosure placed at eye level 1.5 to 2 m above the ground. The minimum temperature is an approximate measure of the bulk air temperature of the base of the local weather systems as they pass through. The maximum temperature is a measure of the convective mixing of the warm air rising from the solar heated ground below with the cooler air at the thermometer level. This depends on cloud cover and the surface evaporation.
In many regions, the prevailing weather systems are formed over the oceans. The trend in the minimum temperature could have a strong signal from the trend in ocean surface temperature in the region of origin. For example, in California, the PDO sticks out like a sore thumb. In the UK and surrounding regions it is the AMO.
To start, download the relevant ocean trend (AMO, PDO, etc.), do a 5 year rolling average for both the ocean and station data, do an XY plot and look for the ‘fingerprint’. Then run a linear fit to the 5 year averages and get the slope from the Excel ‘Trendline’ function. The differences in slope should give information on the local urban heat island effects.
Now subtract the maximum temperatures from the minimum and look at the trend in the delta T. The delta T will increase with sunlight and low rainfall (less surface evaporation).
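For anyone doing this outside Excel, here is a rough Python sketch of those steps. It assumes the annual series are already loaded as pandas Series indexed by year: an ocean index (AMO, PDO, etc.) and the station minimum and maximum temperatures; the variable names are placeholders, not any particular data source.

import numpy as np

def smooth(series, window=5):
    # 5 year rolling average, as suggested above
    return series.rolling(window, center=True).mean().dropna()

def slope_per_century(series):
    # slope of a linear fit to the smoothed series, expressed per 100 years
    return 100.0 * np.polyfit(series.index, series.values, 1)[0]

# ocean_index, station_min, station_max: pandas Series indexed by year (user-supplied)
idx5, tmin5 = smooth(ocean_index), smooth(station_min)

# 'fingerprint': compare the smoothed station minima with the ocean index
common = tmin5.index.intersection(idx5.index)
print("correlation with ocean index:", np.corrcoef(idx5[common], tmin5[common])[0, 1])

# the difference in slope is the suggested hint of local urban heat island effects
print("ocean index slope :", slope_per_century(idx5))
print("station Tmin slope:", slope_per_century(tmin5))

# delta T = Tmax - Tmin; its trend responds to sunshine and surface drying
print("delta T slope     :", slope_per_century(smooth(station_max - station_min)))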
By the way, none of this has anything to do with CO2 ……..
Further details can be found at:
http://scienceandpublicpolicy.org/originals/pacific_decadal.html
http://www.venturaphotonics.com

Gamecock

Unless you use some funky, made up definition of “climate,” a few degrees change does NOT constitute climate change. My Cwa would still be Cwa with a few degrees change.
http://en.wikipedia.org/wiki/K%C3%B6ppen_climate_classification

Sweet Old Bob

Feb. and Mar. this year, in “my” part of KS, were 8.37 and 2.75 deg F below avg. (high temps)
(per AccuWeather records). And if Big Joe B. is right, this trend may continue … Brrrr …

Jeff D.

Since I have spreadsheet skills on par with a “Climate Scientist”, could someone tell me if possibly one of the station records is a reliable one and would pass Mr. Watts’ station siting criteria for minimal UHI? And if so, what is its trend all by itself?

Latitude

When you look behind the summary page at BEST, it reports that the Kansas warming trend since 1910 is 0.75°C +/-0.08
===============
Paul Homewood:USHCN Adjustments In Kansas
In addition to recent temperatures being adjusted upwards, we also find that historical ones have been adjusted down. So, for instance we find that the January 1934 mean temperature at Ashland has been adjusted from 3.78C to 3.10C, whilst at Columbus there is a reduction from 4.00C to 3.52C.
In total, therefore, there has been a warming trend of about 1C added since 1934
http://notalotofpeopleknowthat.wordpress.com/2014/06/28/ushcn-adjustments-in-kansas/

F. Ross

Or there is The Old Farmer’s Almanac (also cheaper than a super computer).
Here is the prediction for Topeka, KS for 2014
>> http://www.almanac.com/weather/longrange/KS/Topeka
For August about 3°F above average
It would be interesting to see how the Almanac compares with the author’s spreadsheet.

I did a similar analysis in February 2010, using HADCRUT3 data, for 87 US cities in the database. Results are sorted by state and posted at
http://sowellslawblog.blogspot.com/2010/02/usa-cities-hadcrut3-temperatures.html

Sweet Old Bob

@ F Ross at 9:21 am…The AccuWeather record and forecast has July being 1.77 deg below avg.
TOFA says a couple deg. above..will be interesting to see which is closest to actual temps.

Greg Goodman

“For the Kansas example, we see that BEST shows on its climate page that the State has warmed 1.98 +/-0.14°C since 1960. That looks like temperatures will be another 2°C higher in the next 50 years, and we should be alarmed.”
STOP. This is just the stupid simplistic kind of thing that AGW is based on.
First there is no reason to fit a linear model to climate data because nothing about it is linear. It’s a fallacy before you do the calculation. Second and probably worst is the idea that the average straight line through all the ups and downs of climate will be the same for the next 100 years as it was for the last 100 (or 50 or 35 or whatever amount of data you happen to have for one area) . What on earth is that idea based on?
The whole problem is this idea of a “trend”. Climate does not “trend”, it changes rather irratically. Somewhere in this word “trend” is an unspoken assumption that once a “trend” is measured we can assume that it will be “likely” to continue. THIS IS A FALLACY.
“If you want to understand temperature changes, you should analyze temperature changes, not the temperatures.”
That’s great. I’ve been saying this for years but stick to looking at rate of change of temperature and stop thinking of “trends”.
BTW the land data you are looking at is not HadCrut3, that is a land and sea dataset made from combining HadSST2 and CRUTem3. I think you mean CRUTem3. It does wonders for cred. if you at least know the name of the dataset you are using and know the difference between land and sea.
You should probably correct this in the article.

Greg Goodman

Another problem you have with this approach is what I call “cosine warming”.
http://climategrog.wordpress.com/?attachment_id=209
It depends upon where you start and finish in the data. This has been heavily discussed, usually with reference to “cherry picking”. Even if you do not cherry pick but use the data you have, if each station you look at has a different time span, your idea of “normalising” them all to a per-century “trend” will be heavily biased by where any particular record starts and ends.
You make the erroneous assumption that they are all somehow comparable once you have “normalised” them.
This is a major problem since you can easily double a rate of change found in a cosine, which has zero long term trend anyway. The whole thing is a fiction.
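To see this numerically, fit a straight line to different-length windows of a pure ~60 year oscillation with zero long-term trend, then "normalise" each slope to a century as the workbook does. The 60 year period and 0.3 degree amplitude below are arbitrary, for illustration only.

import numpy as np

years = np.arange(1850, 2012)
cycle = 0.3 * np.cos(2 * np.pi * (years - 1880) / 60.0)    # zero long-term trend

for start in (1850, 1900, 1945, 1975):
    m = years >= start
    slope = 100.0 * np.polyfit(years[m], cycle[m], 1)[0]   # "normalised" to degC/century
    print("%d-2011: %+.2f degC/century" % (start, slope))
# the "century trends" differ markedly even though the true long-term trend is zero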

Greg Goodman

http://www.metoffice.gov.uk/hadobs/hadcrut3/
“HadCRUT3 dataset
The gridded data are a blend of the CRUTEM3 land-surface air temperature dataset and the HadSST2 sea-surface temperature dataset. ”
However, having read the page you linked to I can see why you were confused. It certainly reads like HadCrut3 is the land station data you used. This is very poorly presented.

Wayne Delbeke

For the last several months I have been downloading Environment Canada data for sites that I have interest in, sometimes because people have said it is warming and I really haven’t noticed much.
The data says some places are warming, some places are not. Some show that there is significant MEAN warming, but virtually every data set I have looked at shows NO INCREASE in Extreme Maximum or Mean Maximum temperatures. What they do show is that it is getting LESS COLD. That is, the Extreme Minimum and the Mean Minimum Temperatures are not as low as they were 60, 80 or 100 years ago. That increases the Mean Temperature and provides a Global Warming “Signal” even though it isn’t getting hotter – at least in my terms.
And in those locations where the Extreme Maximum Annual temperatures and the Maximum Mean Annual temperatures do show an upward trend, it is because seasonally low maximums (winter) have increased, not because summer temperatures have increased. In fact in several cases I have looked at (Winnipeg, Manitoba for example) the summer maximum temperatures show a slight decline in trend (cooling) while the winter maximum temperatures show a slight rise in trend (warming). In fact, I see a number of CONVERGING trend lines in the data when plotting sites with 80 to over 100 years of temperature data.
I have seen some papers on WUWT suggesting that Average global warming is more due to LESS cold than actual warming. Of course, that is net warming but I think it is important to know what is causing that warming in the data because it isn’t because it is getting so hot we are all going to have to move towards the poles. It seems that weather is simply getting less extreme, with low mean and low extreme temperatures getting LESS cold.
With the exception of our desire in my region to see extreme cold weather to kill the western pine beetle, this would seem to be a benefit. It isn’t any hotter in the summer (the hot years were back in the 30’s and 40’s) and you don’t have to use so much heat in the winter (as opposed to the 30’s and 40’s and the turn of the century, when where I live temperatures regularly hit -45 C and even lower – Dec 1924 & Jan 1935: -53.9, Feb 1936: -55.6; Environment Canada data, Rocky Mountain House, AB).
I am not a statistician, climatologist or mathematician, just an old engineer/rancher interested in comparing data to what I have experienced and heard from my prairie ancestors.
I have no idea if the Environment Canada data has been adjusted, but I draw my speculation from what I have downloaded from the EC site. I also admit that I have at times spliced data due to changes in the local stations, since one will have a record from say 1917 to 1940 and a second from 1942 to 2007. A lot of stations stop at 2007 for some reason. And a lot of stations have missing data and breaks, especially some of the really old and rural sites.
And finally, since what I have provided is just anecdotal so far, I have posted two sites I just started some work on yesterday for people to examine if they wish.
Rocky Mountain House, AB: https://www.dropbox.com/s/j2bu889j037q6j6/Rocky%20Mountain%20House.ods
Winnipeg, MB:
https://www.dropbox.com/s/98i6yrkf94o8zpx/Winnipeg.ods

ecowan

The local rag ran an article about the warming of the Northeast and Southwest and a companion article, in which they claim that the average summer temps in Carson City, NV have risen by 6.8 degrees F since 1984.
The last couple of summers we have not had more than 1-2, or NO days of 100+ temps.
This June seems to be warmer than usual. The incompetent idiots who supply the weather data to the NV Appeal, always have highs 1-4 degrees higher than anything the TV weather stations report.
In any event, an increase of 6.8 degrees seems over the top!
Maybe the temps are recorded beside a compost pile? Since they are all in for AGW, I wouldn’t put it past them.

Rob

I like this approach because whenever you take any kind of average you are losing information, and in this case the averaging is being done AFTER the trend analysis rather than before, making each of them easier to both understand and address. Thus, if someone wants to argue that the mean for the state needs weighting, they can do that to their heart’s content separately from the trend analysis, and if someone wants to complain that the trends are not calculated properly, they can do that separately from the averaging.
The bad thing about it is that it gives people lots of scope to do exactly that – keep playing around with weighting and trend analysis until they get an answer that suits them. This is the crux of the matter with land temperature records – they are full of so many little problems (TOBS bias, UHI effects, station site changes) that you can pretty much justify almost any manipulation you like until you come to the answer you are looking for.
At the end of the day, historical surface records are only ever going to be a battle-ground in the climate change argument and one in which the equivalent of trench warfare ca. 1914-1918 is being undertaken: Lots of effort and many casualties for little overall gain.

ecowan says:
July 12, 2014 at 11:26 am
Take a look at the minimum temperatures in Carson. Per my post above, I suspect the minimums have risen, thus causing the “Average” temperature to rise. I have looked at several sites and most of them show this warming bias due to the low temperatures becoming LESS cold over time.

I would not trust HadCRUT
rather use tutiempo.net
I looked at the average change per annum, see here for my procedure
http://blogs.24.com/henryp/2013/02/21/henrys-pool-tables-on-global-warmingcooling/
Again folks
there is no man made warming
whatsoever
There simply is no room for it in my last [established] equation
(at the end of the minima table)

Richard Mallett

Thank you so much for this. I have been collecting data from the longest records in the world in CRUTem4. I will probably end up with 28 stations with good full-year coverage both before and after 1850 (which people seem to use as the start of the industrial era).

Andrew

Gee, warming very little in summer and strongly in winter?? Quelle horreur!

FrankK

vukcevic says:
July 12, 2014 at 8:37 am
CET is the best known and the longest regional data set. It may come as a surprise to many that its 360 year long record conclusively shows that increased insolation suppresses long term temperature up-trend ( see link )
——————————————————————————————-
Explain what you mean by “insolation” please.

FrankK says: July 12, 2014 at 2:06 pm
Explain what you mean by “insolation”please.
See here
http://www.eoearth.org/view/article/151844/

John F. Hultquist

Gamecock says:
July 12, 2014 at 8:52 am
“funky”

I’ve been making statements such as this for almost 6 years since getting a fast internet connection in fall of 2008. That funky thing – average global temperature – seems to have stuck in people’s minds regardless of how useless it is. I am now in a BSk zone, and today at the local air station (KELN) the temp just reached 102°F. Six months from now we will have a night or two (or weeks) when it will go to -10°F; or maybe colder. Our precip is mostly controlled by the rain shadow effect of the Cascade Mountains, averaging about 8 inches per year.

Lies, damn lies and statistics. I wonder what is the “rate” since, say, 1990?

Nick Stokes

You can look up the corresponding GHCN unadjusted trends on this globe map if you want. Just click on stations for details.
“The +/- 0.45 deviation shows that climate is different from site to site even when all are located on the same prairie.”
It doesn’t show that. It shows that the raw readings, which are actually reports of max/min markers at certain times and places, yield a different calculated monthly average from station to station. You choose to disregard adjustments, but they are what are needed to see whether the deviations are indeed climate.
A station changing its time of observation can easily make a difference of 0.8°C to the average monthly temperature calculated from the readings. That’s not a difference in climate.

pat

what a difference six months can make!!!!
13 July: Khaleej Times: Unmasking the chilling conspiracy of global warming
A tragedy turned an Indian chemistry graduate into a filmmaker. Now he wants to expose what he says is the global warming myth, reports Nivriti Butalia…
Nine countries in the past four months. Not a bad record, especially when you consider that Zanane Rajsingh, a 26-year old filmmaker from Gujarat, first went abroad in January…
His new film is a full-length feature, an American-Indian venture on climate change produced by US-based company Nanoland. The script is by Dr Rajeshkumar Acharya, Nanoland’s corporate advisor.
The preparation for the movie took time: Six months of reading to be convinced that global warming was a farce, as claimed by the script writer, Dr Acharya…
“People think climate change is (due to) global warming. But global warming is a myth.”
That’s what the film talks about, the global warming ‘conspiracy’.
The first part of the film releases in January 2015. Currently, the director is on a packed schedule and constantly travelling…
http://www.khaleejtimes.com/kt-article-display-1.asp?xfile=data/nationgeneral/2014/July/nationgeneral_July55.xml&section=nationgeneral
8 Feb: Times of India: Ankur Tewari: ‘I am Earth, ravaged by climate change’
All these and other mysteries about the environment will be dealt with in India’s first ‘feature film’ on climate change – It’s Tomorrow’ – written by non-resident Gujarati, Dr Rajeshkumar Acharya, who currently lives in the US. It is directed by Zanane Rajsingh, who was born in Nagpur but studied in Ahmedabad. The director said they had initially planned a documentary but had finally decided to make a feature-length film on the subject. Interestingly, the movie has the Earth talking to the audience directly and telling them about its plight caused by climate change.
Will human beings become extinct in another 100 years? Startling questions like this one about the environment and life on earth will be answered in the film which is about 20% ready. It will explain climate change, its impact and the factors that influence climate on earth, Rajsingh said.
“‘It’s Tomorrow’ aims to sensitize the audience about climate change and global warming. It is a wake-up call to the world. The one-and-a-half-hour film explores what a worst-case scenario might look like and gives a glimpse of the kind of floods and other natural disasters that might visit us in future…
Talking about the crew of the film, Acharya said that ‘It’s Tomorrow’ is a political drama that revolves around the ongoing climate change phenomenon around the world. “Hence we’ve political giants, renowned scientists and actors playing roles in it. Actor Letiana Bohlke from Colombia will play the role of a strong political leader of Argentina, while Xenia Henriquez will essay the role of a former president of Philippines. Kiyoshi Kuzuwa, a noted Japanese scientist, will play himself in the film,” said Acharya…
The film is jointly produced by a US-based firm Nanoland Inc and an Indian firm, Nanoland Ltd…
http://timesofindia.indiatimes.com/city/ahmedabad/I-am-Earth-ravaged-by-climate-change/articleshow/30026768.cms

John Mann

Changing climates? Forget temperature; watch the vegetation. If you do not want to use Koeppen, use the USDA growing regions map (plant and seed catalogs) and see if the plants in a higher numbered region are starting to grow naturally and thrive in your (lower numbered?) region. Crepe Myrtles grow well in zone 7-9, for instance. I live in Maryland and CAGW becomes real for me when palm trees grow naturally in my back yard, moving zone 9 vegetation up to the north end of zone 7. How long will that take do you think?

skorrent1

Agree with Gamecock. The author deals with temperature and talks of “climate”. The climate of a site or region cannot be described by talking of temperature alone. Equally important are factors such as humidity, altitude, prevailing winds, precipitation, and daily and seasonal variations in them. The CAGW alarmists “jumped the shark” when they latched on to “climate change” so that they could latch onto every extreme weather event. We should call them out consistently on this, not follow them in their bastardization of the meaning of “climate”.

Leigh

We in Australia know exactly what they, who control our temperature data, are doing.
As do you in America.
But that is not the problem.
It’s holding these fraudsters to account that is the problem.
Yes, fraudsters is a very strong accusation but these adjustments did not happen by accident.
They were deliberately done to project a specific result.
Every adjustment was upward, to further enhance the fraud of global warming.
Over at Nova an independent check of only 100 stations in Australia has exposed 84 of those station’s with some significant abnomylys.
The following being but one example.
“The most extreme example that Ken found of data corruption was at Amberley, near Brisbane, Queensland, where a cooling minima trend was effectively reversed, ” 
We already know what they are doing now we have to bring it to the attention of the wider public.
Where it directly affects who sits on the gravy train of politics in Australia.
I don’t know about America, but that’s how it’s done in Australia.
The current government gives us a glimmer of hope, but it is battling to get anything done against a tide of opposition corruption from those who wholeheartedly support the fraud of global warming.

Brian H

GG;
“irratically” ain’t not no word, nohow.

Brian H

Leigh;
stations
anomalies
Do you have a good link for following this Australian saga? It’s very interesting, and important.

Leigh

Yes, thanks for the correction Brian, but spell check can be a bitch when you’re typing angry.
The link is just one of many over at Jo Nova.
If you do a search there under “BOM audit” many more will pop up.
Like a broken record I continue to post these links so others that may have missed them get to read them.
Unlike the global warmists, they are not fairytales.
They are evidence of crimes committed.
People need to understand the keepers of temperature records around the world are systematically altering history to further the global warming fraud.
Those data changes are not happening by accident.
Once or twice might be an accident but hundreds of times is not.
Somebody in every government controlled temperature record keeping place around the world is consciously altering temperature records.
They call them adjustments to raw data (without explanation).
I call it fraud.
We know they’re doing it, but how do we stop them?
http://joannenova.com.au/2014/06/australian-bom-neutral-adjustments-increase-minima-trends-up-60/

gregole

This post is timely for me. I was just thinking the exact thing – what is my local climate doing? I had downloaded USHCN data for a couple of stations and actually drove out in the desert (Arizona) looking for them to make sure they were really rural. Both were, one nestled in a farm and one on an Indian reservation. Tomorrow, out to Ajo. Great tips. Thanks much.
I have also measured UHI out here in Arizona and it is wicked – 5 to 7 degrees F delta between city and field.
Great tip, I just have to say Thanks!! again. You saved me SOOoooo much time and effort.

CET-record adjustments?
I am not a statistician, but it seems to me that the CET annual distribution has at least two suspect areas; of course I could be wrong. It appears that there is an unexpected boost in the number of years with cooler temperatures (around 8.6C), which is below the 9.2C average, and some above the average (around 10.5C). One possible effect of this (depending on the actual years concerned) would be to make the cooler past a bit colder and the more recent warm period even warmer. However, these are likely to be natural outliers in the normally expected annual distribution of some 650 data points.

It looks like “data dancing” to me.

Leigh

“It appears that there is an unexpected boost in the number of years with cooler temperatures”
In Australia those downward “adjustments” are pre-1950.
Upward “adjustments” are after 1950, to give an appearance of rapid temperature rises to coincide with increased industrialization after the war.
It is a simple “trick” being applied worldwide to temperature records.
Commonly known as catastrophic anthropogenic global warming.
We know they’re doing it and we know it’s no accident.
But how do average people in the street stop them?
It is going to take a greater combined effort by those who can academically challenge them.
Back in March Eric Worrall touched on that in this post.
http://wattsupwiththat.com/2014/03/22/occams-razor-and-climate-change/

Ron Clutz

Interesting to read the responses here. Thanks for reading and commenting.
Some say the climate is much more than temperature, and I agree. A proper climate outlook would cover additional variables important to life, especially precipitation. A corollary principle: If you want to understand precipitation changes, study the changes, not the rainfall itself.
Preoccupation with surface temperatures comes from the global warming theory that posits temperatures are/will rise with rising CO2. I thought people should have a tool to look at the trends as they are.
It is said that a line is too simple a depiction of a complex phenomenon. True, but here’s the thing: The attempts to extract other signals from the noise are fraught with difficulty and statistical controversy. Just look at the long discussion at Bishop Hill (with some insightful comments from Nullis in Verba).
http://www.bishop-hill.net/blog/2014/7/3/where-there-is-harmony-let-us-create-discord.html#comments
It is common practice to start looking for the relation between two variables (here, time and temperature) by drawing a linear regression. That applies here, and it is rough, but useful. The deviations suggest that much more is going on and needs to be understood. But it’s a start to have a frame of reference.
I object to the BEST summary description of Kansas temperatures as rising 1.98 +/-0.14°C. Why? Because the number is partly fact and partly projection, a mixup of observed and not yet observed. It’s another indication of the malaise in climate science, where expectation and observation are repeatedly confused.
Examples in the thread above are Nick Stokes’s comments. He basically is saying: These stations should not have readings so different from each other. There must be mistakes, and once we adjust for them, the records will be more in line. Well, yeah! BEST got the deviation down from 0.45 to 0.14. More about this later.
In keeping with what I said about BEST, I have to drop from my workbook the one station with a history of less than 100 years. I didn’t like its shortness, but didn’t want to be accused of cherry picking (i.e. excluding available data from analysis). It happens to be only 64 years long with a century trend of 1.89°C, and removing it makes little difference to the results. Now the station ages range from 96 to 148 years old, so century observations are covered.
The point: Trustworthy people, including scientists, make it clear when they are reporting facts that have occurred, and when they are predicting future facts. Knowledge is proven ex post facto. That is why I said in the post, “if the next 100 years is anything like the last 100, then . . .” The conditional clause tells you that what follows depends on assuming the long term trend continues. If temperatures are much above or below trend, then we know something different is happening.

Ron Clutz

BTW, there is something valuable I left hidden in the workbook.
Think of it as an egg. “Egg” = something that is a hidden like to more information.
Has anyone found a peculiar thing about the workbook?

Ron Clutz

Bah. like should be link

ferdberple

Results are sorted by state and posted at
http://sowellslawblog.blogspot.com/2010/02/usa-cities-hadcrut3-temperatures.html
===========
what is interesting about these graphs is that they show how much variation there is in the data as compared to the average.
most of the stations show a small increase or decrease in average temperatures. but the change itself is miniscule in relation to the natural variability. thus, it could be that temperatures are going up or down simply as a result of chance, due to the large variability in the underlying data.
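a quick way to put a number on that chance effect: simulate trendless 100-year records whose year-to-year scatter is 1 degC (a made-up figure, with years assumed independent, purely for illustration), fit the same straight line to each, and see how large a “century trend” randomness alone can produce.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(100)
trends = []
for _ in range(10000):
    anomalies = rng.normal(0.0, 1.0, size=100)             # trendless years, 1 degC scatter (assumed)
    trends.append(100.0 * np.polyfit(years, anomalies, 1)[0])
print("95% of purely random records show a 'trend' below",
      round(float(np.percentile(np.abs(trends), 95)), 2), "degC/century")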

have you ever turned on the TV only to see your favorite team get scored on? have you ever thought that maybe it was you watching the game that somehow caused your team to get scored on? so you turned off the TV, and heard the next day that your team immediately rallied after you turned off the TV, and came back to win the game. have you ever thought as a result, that maybe somehow you can help your team win by not watching them?
is it possible that climate change, global warming is no different? that temperature is going up and down largely by chance, and we think we are the cause, so we try and figure out what we are doing that affects temperature. but in this case, instead of turning the TV on/off to help our team, we think that turning industrial production on/off will help temperatures.

Ron Clutz

Wayne Delbeke says:
July 12, 2014 at 11:25 am
That’s interesting. My next project is Canadian long term stations. Can you provide a link to raw station data from EC? I have data from HADCRUT3, and comparing could be interesting, also maybe EC records are longer.

An example in the thread above are Nick Stoke’s comments. He basically is saying: These stations should not have readings so different from each other. There must be mistakes, and once we adjust for them, the records will be more in line.
===========
the problem is that there is no allowance for station quality. maybe two stations are different because there is a quality difference. as a result you may end up adjusting the good data to match the bad data. especially when confounding factors such as urbanization and land clearing affect the majority of the stations and their data quality. in the end, the large number of poor data quality stations will overwhelm the good data quality stations, so that all you have left is poor data quality.
the problem is that the temperature data was never intended to be used to measure climate change. it was intended to provide short term weather forecasting. as such it didn’t matter if there was drift in the station data year to year, the short term weather forecasting is largely immune to this sort of change. however, even the slightest drift invalidates the data for climate purposes, because the climate signal over 100 years is so small as compared to the daily temperature signal.
Daily temperatures vary by 10-20C or more. And from this we are trying to tease out a climate signal of what, about 1C or less over 100 years. This is the nonsense of using weather station data. This data could easily drift 1C over 100 years and it would make no difference to short term weather forecasting, because the daily cycle is so large in comparison. So there was no reason to make the weather stations accurate enough for climate analysis. They were accurate enough for their purpose, which was short term weather forecasting.
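here is a toy illustration of that drift point: add a 1 degC-per-century instrument drift to a synthetic record with a 15 degC annual swing and a few degrees of day-to-day noise. all the numbers are made up; the point is that the drift is negligible for forecasting, yet it becomes the entire recovered “climate trend”.

import numpy as np

days = np.arange(365 * 100)                                      # 100 years of daily readings
annual  = 15.0 * np.sin(2 * np.pi * days / 365.25)               # +/-15 degC annual cycle
weather = np.random.default_rng(1).normal(0.0, 3.0, days.size)   # day-to-day noise
drift   = 1.0 * days / (365.25 * 100)                            # 1 degC per century drift

record = 10.0 + annual + weather + drift
trend = 100.0 * np.polyfit(days / 365.25, record, 1)[0]          # recovered degC per century
print("day-to-day standard deviation:", round(float(record.std()), 1), "degC")
print("recovered century trend:", round(float(trend), 2), "degC/century - all of it is drift")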

Latitude

ferdberple says:
July 13, 2014 at 8:12 am
thus, it could be that temperatures are going up or down simply as a result of…..”adjustments”
http://stevengoddard.wordpress.com/2014/07/13/nasas-impressive-rate-of-data-tampering/

Ron Clutz

Latitude, ferdberple
I concur with many issues you are raising. It is why I wanted data as close to the station sources as possible. At least with HADCRUT3, the raw data is distinguishable from their gridded product. As Greg Goodman points out, there is some nomenclature confusion. In the link below, the land station data is referred to as a HADCRUT3 “subset” without naming it specifically; ocean data being another “subset.”
The FAQs say that this raw data is cleaned observation, prior to any gridding, anomalies or averaging.
“The station data that we are providing are from the database used to produce the global temperature series. Some of these data are the original underlying observations and some are observations adjusted to account for non climatic influences, for example changes in observations methods or site location.
The database, therefore, consists of the ‘value added’ product that has been quality-controlled and adjusted to account for identified non-climatic influences. Adjustments were only applied to a subset of the stations, so in many cases the data provided are the underlying data minus any obviously erroneous values removed by quality control. The Met Office do not hold information as to adjustments that were applied and, so, cannot advise as to which stations are underlying data only and which contain adjustments.
Underlying data are held by the National Meteorological Services (NMSs) and other data providers. Such data have in certain cases been released for research purposes under specific licences that govern their usage and distribution.
It is important to distinguish between the data released by the NMSs and the truly raw data, e.g. the temperature readings noted by the observer. The data may have been adjusted to take account of non-climatic influences, for example changes in observing methods, and in some cases this adjustment may not have been recorded, so it may not be possible to recreate the original data as recorded by the observer.”
http://www.metoffice.gov.uk/research/climate/climate-monitoring/land-and-atmosphere/surface-station-records/faq
Interesting that the adjustments Nick wants to make have already been done by HADCRU, and possibly also by the NMSs. Too many cooks could spoil this broth.
Someone (our host, perhaps) said recently: “Raw data ain’t what it used to be.”
But I persist because I know that there will continue to be claims of rising temperatures.