A way of calculating local climate trends without the need for a government supercomputer

This method may or may not have merit – readers are invited to test it for themselves; the method is provided. – Anthony

Guest essay by Ron Clutz

People in different places are wondering: What are temperatures doing in my area? Are they trending up, down or sideways? Of course, from official quarters, the answer is: The globe is warming, so it is safe to assume that your area is warming also.

But what if you don’t want to assume, and don’t want to take someone else’s word for it? You can answer the question yourself if you take on board one simplifying concept:

“If you want to understand temperature changes, you should analyze temperature changes, not the temperatures.”

Analyzing temperature change is in fact much simpler and avoids data manipulations like anomalies, averaging, gridding, adjusting and homogenizing. Temperature trend analysis starts with recognizing that each microclimate is distinct, with its own climate patterns. So you work on the raw, unadjusted data produced, validated and submitted by local meteorologists. These data are available in the HADCRUT3 dataset, made public in July 2011.

The dataset includes 5000+ stations around the world, and only someone adept with statistical software running on a robust computer could deal with all of it. But the Met Office provides it in folders that cluster stations according to their WMO codes.

http://www.metoffice.gov.uk/research/climate/climate-monitoring/land-and-atmosphere/surface-station-records


Anyone with modest spreadsheet skills and a notebook computer can deal with a set of stations of interest. Of course, there are missing datapoints which cause much work for temperature analysts. Those are not a big deal for trend analysis.
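
For readers who would rather script this than spreadsheet it, here is a minimal sketch in Python of reading one station file. It assumes a CRUTEM3-style layout (a short header, an “Obs:” marker, then one row per year with twelve monthly values, and -99.0 as the missing-value code); that layout is my assumption about the Met Office files, so verify it against their documentation before trusting the output.

def read_station(path):
    """Parse one station file into {year: list of 12 monthly temps}."""
    records = {}
    in_obs = False
    with open(path) as f:
        for line in f:
            if not in_obs:
                # header lines precede the observations; "Obs:" marks the data
                if line.strip().startswith("Obs:"):
                    in_obs = True
                continue
            parts = line.split()
            if len(parts) < 13:
                continue  # skip malformed or trailing lines
            year = int(parts[0])
            values = [float(v) for v in parts[1:13]]
            # -99.0 is assumed to be the missing-value code; store None instead
            records[year] = [None if v <= -99.0 else v for v in values]
    return records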

The method involves creating for each station a spreadsheet that calculates a trend for each month for all of the years recorded. Then the monthly trends are averaged together for a lifetime trend for that station. To be comparable to others, the station trend is normalized to 100 years. A summary sheet collects all the trends from all the sheets to provide trend analysis for the geographical area of interest.
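
As a sketch of the same arithmetic the workbook performs, continuing from read_station above: fit an ordinary least-squares slope to each calendar month’s series across the years (missing months simply drop out of the fit), average the twelve slopes, and scale to °C per century. This illustrates the method as described; it is not the author’s actual workbook.

def monthly_slopes(records):
    """Ordinary least-squares slope (deg C per year) for each calendar month."""
    slopes = []
    for m in range(12):
        pts = [(yr, vals[m]) for yr, vals in records.items() if vals[m] is not None]
        if len(pts) < 2:
            slopes.append(None)  # too few readings to fit a line
            continue
        n = len(pts)
        sx = sum(x for x, _ in pts)
        sy = sum(y for _, y in pts)
        sxx = sum(x * x for x, _ in pts)
        sxy = sum(x * y for x, y in pts)
        d = n * sxx - sx * sx
        slopes.append((n * sxy - sx * sy) / d if d else None)
    return slopes

def station_trend(records):
    """Average the twelve monthly slopes and normalize to deg C per century."""
    fitted = [s for s in monthly_slopes(records) if s is not None]
    return 100.0 * sum(fitted) / len(fitted) if fitted else None

For one station, station_trend(read_station("station.txt")) gives the normalized lifetime trend; because each month is fitted separately, gaps in the record drop out of the fit rather than needing to be filled in.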

I have built an Excel workbook to do this analysis, and as a proof of concept, I have loaded in temperature data for Kansas. Kansas is an interesting choice for several reasons:

1) It’s exactly in the middle of the US with no (significant) changes in elevation;

2) It has a manageable number of HCN stations;

3) It has been the subject lately of discussion about temperature processing effects;

4) Kansas legislators are concerned and looking for the facts; and

5) As a lad, my first awareness of extreme weather was the tornado in The Wizard of Oz, after which Dorothy famously said: “Toto, I’ve a feeling we’re not in Kansas anymore.”

I am not the first one to think of this. Richard Wakefield did similar analyses in Ontario years ago, and Lubos Motl did trend analysis on the entire HADCRUT3 in July 2011. With this simplifying concept and a template, it is possible for anyone with modest spreadsheet skills and a notebook computer to answer how area temperatures are trending. I don’t claim this analysis is better than those done with multimillion-dollar computers, but it does serve as a “sanity check” against exaggerated claims and hype.

For the Kansas example, we see that BEST shows on its climate page that the state has warmed 1.98 +/- 0.14°C since 1960. That looks like temperatures will be another 2°C higher in the next 50 years, and we should be alarmed.

Well, the results from temperature trend analysis tell a different story.

From the summary page of the workbook:


Area:                State of Kansas, USA
History:             1843 to 2011
Stations:            26
Average Length:      115 years
Average Trend:       0.70 °C/century
Standard Deviation:  0.45 °C/century
Max Trend:           1.89 °C/century
Min Trend:           -0.04 °C/century
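
Given one such trend per station, the summary figures above are ordinary descriptive statistics. A minimal sketch, assuming the 26 per-station trends have already been collected into a list:

import statistics

def summarize(trends):
    """Descriptive statistics over the per-station century trends."""
    return {
        "stations": len(trends),
        "average trend": round(statistics.mean(trends), 2),
        "std deviation": round(statistics.stdev(trends), 2),
        "max trend": round(max(trends), 2),
        "min trend": round(min(trends), 2),
    }

# e.g. summarize([station_trend(read_station(p)) for p in station_files])
# where station_files is your own list of downloaded station file paths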


So in the last century the average Kansas station has warmed 0.70 +/- 0.45°C, with at least one site cooling over that time. The +/- 0.45 deviation shows that climate differs from site to site even when all are located on the same prairie.

And the variability over the seasons is also considerable:


Month   °C/century   Std Dev
Jan        0.59        1.30
Feb        1.53        0.73
Mar        1.59        2.07
Apr        0.76        0.79
May        0.73        0.76
Jun        0.66        0.66
Jul        0.92        0.63
Aug        0.58        0.65
Sep       -0.01        0.72
Oct        0.43        0.94
Nov        0.82        0.66
Dec        0.39        0.50
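
The seasonal table is the same idea taken month by month: collect each calendar month’s century trend from every station, then average across stations. A sketch reusing monthly_slopes from the earlier code:

import statistics

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def seasonal_table(all_records):
    """Cross-station mean and standard deviation of each month's trend."""
    rows = [monthly_slopes(r) for r in all_records]  # 12 slopes per station
    for m, name in enumerate(MONTHS):
        col = [100.0 * row[m] for row in rows if row[m] is not None]
        print(f"{name:4s} {statistics.mean(col):6.2f} {statistics.stdev(col):6.2f}")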


Note that February and March are warming strongly, while September is sideways. That’s good news for farming, I think.

Temperature change depends on your location and the time of year. The rate of warming is not extreme, and if the next 100 years is anything like the last 100, Kansas will likely see less than a degree C added.


Final point:

When you look behind the summary page at BEST, it reports that the Kansas warming trend since 1910 is 0.75°C +/- 0.08, close to what my analysis showed. So the alarming number at the top was not the accumulated rise in temperatures; it was the rate for a century, projected from 1960. (A rate of 1.98°C per century applied to the roughly half-century since 1960 corresponds to an observed rise of about 1°C, not 2°C.) The actual observed century rate is far less disturbing. And the variability across the state is considerable, and much more evident in the trend analysis. I had wanted to use raw data from BEST in this study, because some stations showed longer records there, but for comparable years the numbers didn’t match HADCRUT3.

Not only does this approach maintain the integrity of the historical record, it also facilitates what policy makers desperately need: climate outlooks based on observations for specific jurisdictions. Since the analysis is bottom-up, microclimate trends can be compiled together for any desired scope: municipal, district, region, province, nation, continent.

If there is sufficient interest in using this method and tool, I can provide some procedural instructions along with the template.

The Kansas Excel workbook is provided as an example: HADCRUT3 Kansas.xls

73 Comments
FrankK
July 12, 2014 2:06 pm

vukcevic says:
July 12, 2014 at 8:37 am
CET is the best known and the longest regional data set. It may come as a surprise to many that its 360-year-long record conclusively shows that increased insolation suppresses the long-term temperature up-trend (see link).
——————————————————————————————-
Explain what you mean by “insolation” please.

July 12, 2014 3:21 pm

FrankK says: July 12, 2014 at 2:06 pm
Explain what you mean by “insolation” please.
See here
http://www.eoearth.org/view/article/151844/

John F. Hultquist
July 12, 2014 3:42 pm

Gamecock says:
July 12, 2014 at 8:52 am
“funky …”

I’ve been making statements such as this for almost 6 years, since getting a fast internet connection in fall of 2008. That funky thing – average global temperature – seems to have stuck in people’s minds regardless of how useless it is. Now, in a BSk climate: today at the local air station (KELN) the temp just reached 102°F. Six months from now we will have a night or two (or weeks) when it will go to -10°F, or maybe colder. Our precip is mostly controlled by the rain shadow effect of the Cascade Mountains, averaging about 8 inches per year.

AP
July 12, 2014 4:13 pm

Lies, damn lies and statistics. I wonder what the “rate” is since, say, 1990?

Nick Stokes
July 12, 2014 4:40 pm

You can look up the corresponding GHCN unadjusted trends on this globe map if you want. Just click on stations for details.
“The +/- 0.45 deviation shows that climate is different from site to site even when all are located on the same prairie.”
It doesn’t show that. It shows that the raw readings, which are actually reports of max/min markers at certain times and places, yield a different calculated monthly average from station to station. You choose to disregard adjustments, but they are what are needed to see whether the deviations are indeed climate.
A station changing its time of observation can easily make a difference of 0.8°C to the average monthly temperature calculated from the readings. That’s not a difference in climate.

pat
July 12, 2014 4:43 pm

what a difference six months can make!!!!
13 July: Khaleej Times: Unmasking the chilling conspiracy of global warming
A tragedy turned an Indian chemistry graduate into a filmmaker. Now he wants to expose what he says is the global warming myth, reports Nivriti Butalia…
Nine countries in the past four months. Not a bad record, especially when you consider that Zanane Rajsingh, a 26-year old filmmaker from Gujarat, first went abroad in January…
His new film is a full-length feature, an American-Indian venture on climate change produced by US-based company Nanoland. The script is by Dr Rajeshkumar Acharya, Nanoland’s corporate advisor.
The preparation for the movie took time: Six months of reading to be convinced that global warming was a farce, as claimed by the script writer, Dr Acharya…
“People think climate change is (due to) global warming. But global warming is a myth.”
That’s what the film talks about, the global warming ‘conspiracy’.
The first part of the film releases in January 2015. Currently, the director is on a packed schedule and constantly travelling…
http://www.khaleejtimes.com/kt-article-display-1.asp?xfile=data/nationgeneral/2014/July/nationgeneral_July55.xml&section=nationgeneral
8 Feb: Times of India: Ankur Tewari: ‘I am Earth, ravaged by climate change’
All these and other mysteries about the environment will be dealt with in India’s first ‘feature film’ on climate change – It’s Tomorrow’ – written by non-resident Gujarati, Dr Rajeshkumar Acharya, who currently lives in the US. It is directed by Zanane Rajsingh, who was born in Nagpur but studied in Ahmedabad. The director said they had initially planned a documentary but had finally decided to make a feature-length film on the subject. Interestingly, the movie has the Earth talking to the audience directly and telling them about its plight caused by climate change.
Will human beings become extinct in another 100 years? Startling questions like this one about the environment and life on earth will be answered in the film which is about 20% ready. It will explain climate change, its impact and the factors that influence climate on earth, Rajsingh said.
“‘It’s Tomorrow’ aims to sensitize the audience about climate change and global warming. It is a wake-up call to the world. The one-and-a-half-hour film explores what a worst-case scenario might look like and gives a glimpse of the kind of floods and other natural disasters that might visit us in future…
Talking about the crew of the film, Acharya said that ‘It’s Tomorrow’ is a political drama that revolves around the ongoing climate change phenomenon around the world. “Hence we’ve political giants, renowned scientists and actors playing roles in it. Actor Letiana Bohlke from Colombia will play the role of a strong political leader of Argentina, while Xenia Henriquez will essay the role of a former president of Philippines. Kiyoshi Kuzuwa, a noted Japanese scientist, will play himself in the film,” said Acharya…
The film is jointly produced by a US-based firm Nanoland Inc and an Indian firm, Nanoland Ltd…
http://timesofindia.indiatimes.com/city/ahmedabad/I-am-Earth-ravaged-by-climate-change/articleshow/30026768.cms

John Mann
July 12, 2014 5:46 pm

Changing climates? Forget temperature; watch the vegetation. If you do not want to use Koeppen, use the USDA growing regions map (plant and seed catalogs) and see if the plants in a higher-numbered region are starting to grow naturally and thrive in your (lower-numbered?) region. Crepe myrtles grow well in zones 7-9, for instance. I live in Maryland, and CAGW becomes real for me when palm trees grow naturally in my back yard, moving zone 9 vegetation up to the north end of zone 7. How long will that take, do you think?

skorrent1
July 12, 2014 6:18 pm

Agree with Gamecock. The author deals with temperature and talks of “climate”. The climate of a site or region cannot be described by talking of temperature alone. Equally important are factors such as humidity, altitude, prevailing winds, precipitation, and daily and seasonal variations in them. The CAGW alarmists “jumped the shark” when they latched on to “climate change” so that they could latch onto every extreme weather event. We should call them out consistently on this, not follow them in their bastardization of the meaning of “climate”.

Leigh
July 12, 2014 6:33 pm

We in Australia know exactly what those who control our temperature data are doing.
As do you in America.
But that is not the problem.
It’s holding these fraudsters to account that is the problem.
Yes, fraudsters is a very strong accusation, but these adjustments did not happen by accident.
They were deliberately done to project a specific result.
Every adjustment was upward, to further enhance the fraud of global warming.
Over at Nova, an independent check of only 100 stations in Australia has exposed 84 of those station’s with some significant abnomylys.
The following being but one example.
“The most extreme example that Ken found of data corruption was at Amberley, near Brisbane, Queensland, where a cooling minima trend was effectively reversed.”
We already know what they are doing; now we have to bring it to the attention of the wider public,
where it directly affects who sits on the gravy train of politics in Australia.
I don’t know about America, but that’s how it’s done in Australia.
The current government gives us a glimmer of hope but is battling to get anything done against a tide of opposition corruption
from those who wholeheartedly support the fraud of global warming.

Brian H
July 12, 2014 9:45 pm

GG;
“irratically” ain’t not no word, nohow.

Brian H
July 12, 2014 9:49 pm

Leigh;
stations
anomalies
Do you have a good link for following this Australian saga? It’s very interesting, and important.

Leigh
July 12, 2014 10:26 pm

Yes, thanks for the correction, Brian, but spell check can be a bitch when you’re typing angry.
The link is just one of many over at Jo Nova.
If you do a search there under “BOM audit” many more will pop up.
Like a broken record I continue to post these links so others that may have missed them get to read them.
Unlike the global warmists, they are not fairytales.
They are evidence of crimes committed.
People need to understand the keepers of temperature records around the world are systematically altering history to further the global warming fraud.
Those data changes are not happening by accident.
Once or twice might be an accident but hundreds of times is not.
Somebody in every government controlled temperature record keeping place around the world is consciously altering temperature records.
They call them adjustments to raw data (without explanation).
I call it fraud.
We know they’re doing it, but how do we stop them?
http://joannenova.com.au/2014/06/australian-bom-neutral-adjustments-increase-minima-trends-up-60/

gregole
July 12, 2014 10:28 pm

This post is timely for me. I was just thinking the exact thing – what is my local climate doing? I had downloaded USHCN data for a couple of stations and actually drove out in the desert (Arizona) looking for them to make sure they were really rural. Both were, one nestled in a farm and one on an Indian reservation. Tomorrow, out to Ajo. Great tips. Thanks much.
I have also measured UHI out here in Arizona and it is wicked – 5 to 7 degrees F delta between city and field.
Great tip, I just have to say Thanks!! again. You saved me SOOoooo much time and effort.

July 13, 2014 12:37 am

CET-record adjustments?
I am not a statistician, but it seems to me that the CET annual distribution has at least two suspect areas; of course, I could be wrong. It appears that there is an unexpected boost in the number of years with cooler temperatures (around 8.6°C), which is below the 9.2°C average, and some above the average (around 10.5°C). One possible effect of this (depending on the actual years concerned) would be to make the cooler past a bit colder and the more recent warm period even warmer. However, these are likely to be natural outliers in the normally expected annual distribution of some 650 data points.

July 13, 2014 2:40 am

It looks like “data dancing” to me.

Leigh
July 13, 2014 2:58 am

“It appears that there is a unexpected boost in numbers of years with cooler temperatures”
In Australia those downward “adjustments” are pre 1950.
Upward “Adjustments” are after 1950 to give an appearance of rapid temperature rises to coincide with increased industrialization after the war.
It is a simple “trick” being applied world wide to temperature records.
Commonly known as catastrophic anthropogenic global warming.
We know they’re doing it and we know it’s no accident.
But how do average people in the street stop them?
It is going to take a greater combined effort by those who can academically challenge them.
Back in March Eric Worrall touched on that in this post.
http://wattsupwiththat.com/2014/03/22/occams-razor-and-climate-change/

July 13, 2014 3:49 am

Interesting to read the responses here. Thanks for reading and commenting.
Some say the climate is much more than temperature, and I agree. A proper climate outlook would cover additional variables important to life, especially precipitation. A corollary principle: If you want to understand precipitation changes, study the changes, not the rainfall itself.
Preoccupation with surface temperatures comes from the global warming theory that posits temperatures are/will rise with rising CO2. I thought people should have a tool to look at the trends as they are.
It is said that a line is too simple a depiction of a complex phenomenon. True, but here’s the thing: The attempts to extract other signals from the noise are fraught with difficulty and statistical controversy. Just look at the long discussion at Bishop Hill (with some insightful comments from Nullis in Verba).
http://www.bishop-hill.net/blog/2014/7/3/where-there-is-harmony-let-us-create-discord.html#comments
It is common practice to start looking for the relation between two variables (here, time and temperature) by drawing a linear regression. That applies here, and it is rough, but useful. The deviations suggest that much more is going on and needs to be understood. But it’s a start to have a frame of reference.
I object to the BEST summary description of Kansas temperatures as rising 1.98 +/-0.14°C. Why? Because the number is partly fact and partly projection, a mixup of observed and not yet observed. It’s another indication of the malaise in climate science, where expectation and observation are repeatedly confused.
An example in the thread above is Nick Stokes’ comments. He basically is saying: These stations should not have readings so different from each other. There must be mistakes, and once we adjust for them, the records will be more in line. Well, yeah! BEST got the deviation down from 0.45 to 0.14. More about this later.
In keeping with what I said about BEST, I have to drop from my workbook the one station with a history well short of 100 years. I didn’t like its shortness, but didn’t want to be accused of cherry-picking (i.e. excluding available data from analysis). It happens to be only 64 years long, with a century trend of 1.89°C, and removing it makes little difference to the results. Now the station ages range from 96 to 148 years, so century observations are covered.
The point: Trustworthy people, including scientists, make it clear when they are reporting facts that have occurred, and when they are predicting future facts. Knowledge is proven ex post facto. That is why I said in the post, “if the next 100 years is anything like the last 100, then . . .” The conditional clause tells you that what follows depends on assuming the long term trend continues. If temperatures are much above or below trend, then we know something different is happening.

July 13, 2014 3:55 am

BTW, there is something valuable I left hidden in the workbook.
Think of it as an egg. “Egg” = something that is a hidden like to more information.
Has anyone found a peculiar thing about the workbook?

July 13, 2014 3:56 am

Bah. like should be link

ferdberple
July 13, 2014 8:12 am

Results are sorted by state and posted at
http://sowellslawblog.blogspot.com/2010/02/usa-cities-hadcrut3-temperatures.html
===========
What is interesting about these graphs is that they show how much variation there is in the data as compared to the average.
Most of the stations show a small increase or decrease in average temperatures, but the change itself is minuscule in relation to the natural variability. Thus, it could be that temperatures are going up or down simply as a result of chance, due to the large variability in the underlying data.

ferdberple
July 13, 2014 8:17 am

Have you ever turned on the TV only to see your favorite team get scored on? Have you ever thought that maybe it was you watching the game that somehow caused your team to get scored on? So you turned off the TV, and heard the next day that your team immediately rallied after you turned off the TV and came back to win the game. Have you ever thought, as a result, that maybe somehow you can help your team win by not watching them?
Is it possible that climate change, global warming, is no different? That temperature is going up and down largely by chance, and we think we are the cause, so we try to figure out what we are doing that affects temperature. But in this case, instead of turning the TV on/off to help our team, we think that turning industrial production on/off will help temperatures.

July 13, 2014 8:32 am

Wayne Delbeke says:
July 12, 2014 at 11:25 am
That’s interesting. My next project is Canadian long-term stations. Can you provide a link to raw station data from EC? I have data from HADCRUT3, and comparing could be interesting; also, maybe EC records are longer.

ferdberple
July 13, 2014 8:49 am

An example in the thread above is Nick Stokes’ comments. He basically is saying: These stations should not have readings so different from each other. There must be mistakes, and once we adjust for them, the records will be more in line.
===========
The problem is that there is no allowance for station quality. Maybe two stations are different because there is a quality difference. As a result, you may end up adjusting the good data to match the bad data, especially when confounding factors such as urbanization and land clearing affect the majority of the stations and their data quality. In the end, the large number of poor-quality stations will overwhelm the good-quality stations, so that all you have left is poor-quality data.
The problem is that the temperature data was never intended to be used to measure climate change. It was intended to provide short-term weather forecasting. As such, it didn’t matter if there was drift in the station data year to year; short-term weather forecasting is largely immune to this sort of change. However, even the slightest drift invalidates the data for climate purposes, because the climate signal over 100 years is so small compared to the daily temperature signal.
Daily temperatures vary by 10-20C or more. And from this we are trying to tease out a climate signal of what, about 1C or less over 100 years. This is the nonsense of using weather station data. This data could easily drift 1C over 100 years and it would make no difference to short term weather forecasting, because the daily cycle is so large in comparison. So there was no reason to make the weather stations accurate enough for climate analysis. They were accurate enough for their purpose, which was short term weather forecasting.
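
To put a number on the drift problem ferdberple describes, here is a small Python sketch with synthetic figures of my own choosing, not real station data: a record with no climate trend at all, just a single +1°C shift mid-record (a site move or instrument change, say), fits to a spurious trend of roughly 1.5°C per century over that window.

# Synthetic illustration: no climate trend at all, just a one-time +1 deg C
# shift in 1950 (a site move, say). All numbers are invented for the demo.
years = list(range(1900, 2000))
temps = [12.0 + (1.0 if y >= 1950 else 0.0) for y in years]

# naive ordinary least-squares fit over the whole record
n = len(years)
sx = sum(years)
sy = sum(temps)
sxx = sum(y * y for y in years)
sxy = sum(y * t for y, t in zip(years, temps))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(f"spurious trend: {100 * slope:.2f} deg C/century")  # about 1.50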

Latitude
July 13, 2014 8:50 am

ferdberple says:
July 13, 2014 at 8:12 am
thus, it could be that temperatures are going up or down simply as a result of…..”adjustments”
http://stevengoddard.wordpress.com/2014/07/13/nasas-impressive-rate-of-data-tampering/

July 13, 2014 10:02 am

Latitude, ferdberple
I concur with many of the issues you are raising. It is why I wanted data as close to the station sources as possible. At least with HADCRUT3, the raw data is distinguishable from their gridded product. As Greg Goodman points out, there is some nomenclature confusion. In the link below, the land station data is referred to as a HADCRUT3 “subset” without naming it specifically; ocean data being another “subset.”
The FAQs say that this raw data is cleaned observation, prior to any gridding, anomalies or averaging.
“The station data that we are providing are from the database used to produce the global temperature series. Some of these data are the original underlying observations and some are observations adjusted to account for non climatic influences, for example changes in observations methods or site location.
The database, therefore, consists of the ‘value added’ product that has been quality-controlled and adjusted to account for identified non-climatic influences. Adjustments were only applied to a subset of the stations, so in many cases the data provided are the underlying data minus any obviously erroneous values removed by quality control. The Met Office do not hold information as to adjustments that were applied and, so, cannot advise as to which stations are underlying data only and which contain adjustments.
Underlying data are held by the National Meteorological Services (NMSs) and other data providers. Such data have in certain cases been released for research purposes under specific licences that govern their usage and distribution.
It is important to distinguish between the data released by the NMSs and the truly raw data, e.g. the temperature readings noted by the observer. The data may have been adjusted to take account of non-climatic influences, for example changes in observing methods, and in some cases this adjustment may not have been recorded, so it may not be possible to recreate the original data as recorded by the observer.”
http://www.metoffice.gov.uk/research/climate/climate-monitoring/land-and-atmosphere/surface-station-records/faq
Interesting that the adjustments Nick wants to make have already been done by HADCRU, and possibly also by the NMSs. Too many cooks could spoil this broth.
Someone (our host, perhaps) said recently: “Raw data ain’t what it used to be.”
But I persist because I know that there will continue to be claims of rising temperatures.