More on the Beeville, TX weather station

More about Beeville

by Ecotretas

The Beeville story just keeps getting better.

In the comments section of yesterday’s WUWT post, I got a couple of ideas. First, there is a very interesting site where we can graph adjusted vs. unadjusted GHCN temperatures. The first graph above is the result for the Beeville station. A clear difference is visible between adjusted and unadjusted temperatures, especially during the first half of the 20th century. And looking at the blue line does give the impression that global warming might not be happening in Beeville.

Being a skeptic, I searched for the raw data. The monthly data is available at the NOAA site. I got the data for Beeville and plotted the second graph above (click the graphs for better detail). Does anyone see any warming going on? Fitting a linear trendline to the monthly data gives “y = -0.0637x + 829.59” (y in tenths of °F, x in months), which means that temperatures have gone down! And now consider what the 20 hottest months at Beeville have been, over the last 113 years:

Month    Temperature (°F × 10)
1951/8 888
2009/7 880
1998/7 879
1952/8 878
2009/8 877
1953/7 876
1902/8 875
1998/6 872
1897/7 871
1915/7 871
1980/7 871
1914/7 869
1915/8 869
1916/6 869
1938/7 869
1951/7 869
1958/8 869
1911/8 868
1954/8 867
1927/8 866
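The linear trendline quoted above can be reproduced with numpy’s `polyfit`; the monthly series below is a synthetic stand-in (the real values come from the NOAA monthly file), so only the shape of the computation is meant to match, not the exact coefficients:

```python
import numpy as np

# Sketch of the trend fit described in the post. GHCN monthly values are
# integers in tenths of a degree F; this series is invented stand-in data
# with a slight downward trend plus noise -- substitute the real Beeville file.
months = np.arange(1, 113 * 12 + 1)          # 113 years of monthly indices
rng = np.random.default_rng(0)
temps = 830 - 0.06 * months + rng.normal(0, 40, months.size)

slope, intercept = np.polyfit(months, temps, 1)
print(f"y = {slope:.4f}x + {intercept:.2f}")
```

A negative slope means the raw series trends downward over the record, which is the post’s point.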

Might Julisa Castillo deserve a prize, after all?

65 Comments
June 9, 2010 10:44 am

Monthly data can also be plotted at the Appinsys climate grapher referenced in the post (as can individual month data or multi-month averages).
Here is the monthly data for Beeville: http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100MJanDecI188020090900110AR42572251007x

markinaustin
June 9, 2010 10:55 am

that’s a few miles from us here in austin!

Peter Miller
June 9, 2010 11:10 am

Obvious question – just how many more Beevilles are there?
Can someone do a random audit of about 100 stations to try and get an idea of just how systematic this data manipulation has been?

Grant Hillemeyer
June 9, 2010 11:22 am

It looks like the temperatures were adjusted down 2 degrees for the first half of the century. Who makes that adjustment? Is there an explanation for it?
Grant

Enneagram
June 9, 2010 11:29 am

This clearly shows what the weather is ☺. But seriously, what makes all the difference is this: if the Y axis is scaled one degree per division, all warmings (and warmers too) disappear and become straight lines.

Edvin
June 9, 2010 11:36 am

You’re doing it wrong, obviously. What you need to plot is “value added”, where the value added is the narrative of CAGW.

kwik
June 9, 2010 11:53 am

It’s time to show young Peter’s project again:

phys_hack
June 9, 2010 11:55 am

The alarmist sites hail these adjustments as a good thing. An improvement. But as a data pro, I would say that if the adjustments are big enough to be the big story, then their methodology is what ought to be disclosed. It’s not good enough to call it an improvement, programmed by top experts. Precisely what improvement are we talking about? What was wrong with the original data? What was done to it? If for no other reason, what happens if we discover some new factor that ought to be added to the mix? Can’t do it if we don’t know what the mix is.

Mike McMillan
June 9, 2010 12:00 pm

markinaustin says:June 9, 2010 at 10:55 am
that’s a few miles from us here in austin!

That’s Texas miles, of course, which are biggern’ regular miles.
This is just the regular fiddling with the data that Dr. Hansen’s been doing with his GISS homogenization for years. GISS warming is bogus, and now they’ve migrated it into the USHCN data set. Doing all the statistical analyses in the world to tease trends out of the new and improved data won’t do a bit of good, because you’re only teasing out what the adjustment algorithms have programmed into it.

Buffoon
June 9, 2010 12:02 pm

Why the factor of ten? Is this a traditional data-handling method?

Jacob
June 9, 2010 12:06 pm

What is needed is for some auditor to run through the adjustment process, step after step, and try to replicate the results. I understand that these adjustment algorithms have been published, and also the computer code (in ancient Fortran). Replication should be possible.

Robert M
June 9, 2010 12:11 pm

That fraudulent shill for big oil fourth grader cherry picked the station! There are TONS of stations out there that have not been corrupted yet!
But you can rest assured AGW supporters are on the job! And they will not rest, until all of the station data proves what is already known to be true. The science is settled, and soon the data will match. Sure the adjustments create warming where there does not appear to be any, and yes, adjusting older temps down seems weird, but it solves some pressing problems with our effort.
We used to have a terrible time with uneducated people calling to complain, saying things like, “Dude, it wasn’t that hot at my house yesterday! Watts up with that?”. Adjusting older temps down gives us the results we need to prove AGW exists without the sheeple complaining. No one (except for pesky fourth graders) notices when we adjust hundred year old data down to make the current period look warmer. The people who still complain about this don’t understand that in the past it was colder than it was, so the records have to be adjusted for that. Anyone who cannot accept such simple, irrefutable facts needs to be silent and let the experts handle it.
/sarc

Enneagram
June 9, 2010 12:11 pm

Does anybody know how many degrees of temperature we humans are able to discriminate? One degree? Half a degree? Two degrees?

Mark Wagner
June 9, 2010 12:22 pm

I still say this looks like a sign error in their code.
Instead of adjusting future years for UHI, they’ve got a minus in the year counter and it’s instead applying the cooling adjustment to prior years. And each year the focal point moves forward and the adjustment to year 1 gets even cooler. This would account for why adjustments to earlier years are constantly “changing.”
Run it out 100 years and year 1 would fall to below zero.
It’s a simple error to make, but difficult to catch without very exacting tests, and perhaps difficult to locate and correct. Once the program is “done” they just run it every month and never look at the data in this fashion to even be aware that there’s a problem.

Earle Williams
June 9, 2010 12:22 pm

The factor of 10 allows them to store and process the temp as an integer. That in itself has some implications within the processing steps.
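A minimal sketch of that integer convention (the helper names here are mine, not NOAA’s):

```python
# GHCN-style files store temperatures as integers in tenths of a degree,
# e.g. 88.8 deg F becomes 888, so processing can stay in integer arithmetic.
def encode_tenths(temp: float) -> int:
    return round(temp * 10)

def decode_tenths(raw: int) -> float:
    return raw / 10.0

assert encode_tenths(88.8) == 888   # the 1951/8 value in the table above
assert decode_tenths(888) == 88.8
```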

Anthony Scalzi
June 9, 2010 12:24 pm

That’s a very handy tool for comparing unadjusted and adjusted records. Here’s Southern New England:
Groton, CT: 1.5 degree downward adjustment at beginning of record, 0 adjustment at present.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572507002x
Stamford, CT: adjustment adds .5 degree cooling
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572504001x
Storrs, CT: nearly 1 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572508002x
3 out of 8 stations in Connecticut had trends changed by adjustments.
—-
Block Island, RI: over .5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572507001x
Providence, RI: 1.5 degree downward adjustment at beginning of record.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572507006x
Providence is also a HADCRU station.
http://www.appinsys.com/GlobalWarming/climapgr.aspx?statid=NH:42572507000
2 out of 4 stations in Rhode Island had trends changed by adjustments.

Amherst, MA: 1.5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574491001x
Framingham, MA: .5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574490002x
Lawrence, MA: .5 degree downward adjustment at beginning of record and slight upward adjustment at present.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574490007x
Bedford, MA: .2 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574490006x
Chestnut Hill, MA: 1 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572509001x
Taunton, MA: 2 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572507007x
Plymouth, MA: .5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574492001x
New Bedford, MA: a rare 1 degree upward adjustment at the beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572507005x
8 out of 19 stations in Massachusetts had trends changed by adjustments.

Paul
June 9, 2010 12:32 pm

When it comes to the timing of the adjustments, i.e. downwards until the 50s then upwards after, if you really wanted to fit the temp data to changes in carbon emissions, in particular those from fossil fuels (i.e. anthropogenic), that’s pretty much what you would have to do. If I look at graphs estimating carbon emissions, for example this one,
http://cdiac.ornl.gov/trends/emis/glo.html
the carbon emissions really take off around the 50s, and the adjustments to the temperature records make the temp graph mirror the carbon graph in a way the unadjusted temps don’t. Would they be so crude and so obvious? As a subscriber to Occam’s razor and its corollary the KISS principle, I would have to say yes until it was demonstrated to be otherwise.

Anthony Scalzi
June 9, 2010 12:35 pm

I’ve been going through the New England stations using the tool from the post.
So far in southern New England (Connecticut, Rhode Island, and Massachusetts) 13 out of 31 stations have adjustments that change the trend.

Mike M
June 9, 2010 12:36 pm

Well, as purely anecdotal evidence, I can tell you there is a BIG difference here where I am sitting when my AC is set to 78 compared to when it is set to 77.

RHS
June 9, 2010 12:40 pm

Don’t forget we cannot use the US as an example because it only represents about 2% of the earth’s surface.
Guess we can’t use Beeville as an example because it is significantly smaller than even 2% of Texas…

Alan S. Blue
June 9, 2010 12:43 pm

Can we have the nearest available satellite cell’s data as well? (Not the hemispheric or global mean.)
This pattern of “adjusting the past” is recurrent. All three of microsite, UHI, and any true warming affect the ground station. The same is not true for a satellite based estimate of the same gridcell. Thus any time there’s a “known” effect, the adjustment appears to be made to the pre-satellite data to minimize any discrepancy during the overlap period.
The problem is: A microsite issue should cause a discrepancy. UHI for cities that are significantly smaller than a gridcell should also turn up as a discrepancy.
The adjustments seem designed to avoid that.

Steve Oregon
June 9, 2010 1:13 pm

“Might Julisa Castillo deserve a prize, after-all?”
Or the beginning of a college fund started by WUWT contributors?
That might also attract more attention.

Rhoda R
June 9, 2010 1:15 pm

Could someone explain to me how you can convert a ‘text’ file to a useable spreadsheet please? I’d like to play with some of these numbers also, but I get defeated trying to re-enter all the numbers. thanks.

Tim Clark
June 9, 2010 1:15 pm

Jacob says: June 9, 2010 at 12:06 pm
What is needed is for some auditor to run through the adjustment process, step after step, and try to replicate the results. I understand that these adjustment algorithms have been published, and also the computer code (in ancient Fortran). Replication should be possible.

It’s been done; read all about it. This site is listed in the column to your immediate right.
http://chiefio.wordpress.com/

stephen richards
June 9, 2010 1:21 pm

Paul
Even if I wasn’t a screamingly cynical scientist I would still have to agree with you. It is too much of a coincidence that temps before 1950 were adjusted down after 1980.

Nuke
June 9, 2010 1:21 pm

Let’s see:
Take away the adjustments and the warming goes away. Take away the UHI effect and the warming goes away. Factor in the loss of reporting stations and the warming goes away.
So far, I’ve accounted for 300% of the warming. Are we sure we’re not in a cooling trend right now?
But seriously, folks, don’t forget it’s the AGW proponents who must show the warming is occurring, it’s the AGW proponents who must show the warming is unprecedented, and it’s the AGW proponents who must show the warming is not natural. So far, when the evidence is viewed in detail, the hypothesis looks to be all hat and no cattle.

Bill Yarber
June 9, 2010 1:49 pm

Interesting that none of the years in the 30’s, which were warmer than the 90’s in the NH, are in the top twenty warmest years at Beeville. Not sure what, if any, significance this has. But it sure is obvious, as we have seen from Hansen and GISS before, when the 1997 US temperature trend is compared with the 2007 trend. The period from 1880 to 1950 is adjusted down just as this was, but Hansen made sure the later years were adjusted up. Why is this fraud still working for NASA?

noaaprogrammer
June 9, 2010 2:12 pm

It would be interesting to start correlating all of the temperature adjustments that are being unearthed. This might allow us to see whether the adjustments are being made based on the “need to do it” for a given location, or whether a few fixed sets of adjustments are being used across the board.

markA
June 9, 2010 2:13 pm

At Beeville 5 NE an adjustment for change of observation time needs to be applied for the period 1955-1964 since the observation time was changed from 5 or 6 PM to 8 AM in 1964. The period 1955-64 needs to be adjusted lower by perhaps 1.5 deg C to be properly compared with the post 1964 period. So at least some of the adjustment you see in the graph appears justified. Prior to 1955 the observation time was not noted until 1953 when it was given as 8 AM. The time series prior to 1953 was adjusted in such a way (red line) to imply the observation time was 5 or 6 PM, as it was from 1955-64. A check of the Texas Climatological Data publication for May 1949 indicates an obs time of 7 PM. Maybe someone at the Texas AgriLife Res Station (http://ccag.tamu.edu/locations/beeville/index.php) has a record of observation times prior to 1955. Without reliable knowledge of observation times prior to 1955, it is difficult to judge whether the adjustments are justified.
Using one of NCDC’s metadata sources I was able to come up with this timeline of important events at Beeville 5 NE since station establishment back in 1895:
Aug 1895 – Station begins.
Jun 1897 – Thermometer in shelter at regulation height
May 1914 – Shelter over sod, 30 ft from one-story building,
door opens to south, height 3 ft from ground
Feb 1918 – Shelter over garden soil and some grass, faces north,
bottom 4.5 ft above ground
May 1921 – Shelter over grass covered ground, faces north,
bottom 4 ft above ground
May 1949 – CD shows obs time of 7 PM for both temperature and precip
Aug 1953 – Time of Obs 8 AM, CRS with Max/Min Thermometers
Mar 1955 – Time of Obs 5 PM, station moved to better site
and reconditioned
May 1955 – station 450 ft NE of Experiment Farm office
Evaporation observation taken at 8 AM
Aug 1962 – Obs times changed: Temperature time of obs 6 PM, precip 7 AM
Apr 1964 – Obs times changed: 8 AM for both precip and temperature
Mar 1965 – New fence 30′ x 30′ installed by USWB; CRS, SRG, and
RRG relocated
Jan 1968 – Station relocated, moved 480 ft NW,
obs time still 8 AM
Mar 1972 – Cleaned CRS. Replaced min thermometer.
Jan 1978 – Palmer dial soil thermometer placed at 4 inch depth
Mar 1985 – MMTS installed
May 1993 – Equipment moved 900 ft SE

Anthony Scalzi
June 9, 2010 2:25 pm

Northern New England:
Cavendish, VT: .5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574484002x
Cornwall, VT: net .4 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572617001x
Chelsea, VT: somewhere between .5 and 1 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572614001x
Northfield, VT: 1.5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572617002x
4 out of 8 stations in Vermont had trends changed by adjustments.

First Connecticut Lake(New Hampshire): .8 degree downward adjustment at beginning of record and .2 upward adjustment at present
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42572612002x
Keene, NH: cooling adjustments
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42574484001x
2 out of 6 stations in New Hampshire had trends changed by adjustments.

Portland, ME: 1.5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42572606000x
Lewiston, ME: .5 degree downward adjustment at beginning of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42574389001x
Gardiner, ME: .2 degree upward adjustment at end of record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42574392002x
Farmington, ME: 1.5 degree downward adjustment at beginning of record. Changes a cooling trend to a warming trend as dramatically as Beeville.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42572618001x
Orono, ME: .5 degree downward adjustment to the middle of the record
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42572619003x
Eastport, ME: .4 degree downward adjustment at end of record. Another rare station where warming is reduced by adjustments.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42572608000x
Millinocket, ME: .7 degree downward adjustment at beginning of record and .2 degree upward adjustment at end of record.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI185020090900111AR42572619005x
Ripogenus Dam, ME: .5 degree downward adjustment at beginning of record and .5 degree upward adjustment at end of record.
http://www.appinsys.com/GlobalWarming/climgraph.aspx?pltparms=GHCNT100AJanDecI188020090900111AR42572619006x
8 out of 19 stations in Maine had trends changed by adjustments.

14 out of 33 stations in northern New England had adjustments that affected trends. For all of New England, 27 out of 64 stations had trend-affecting adjustments. Keep in mind that most of the stations unaffected by adjustments have records too short to get a trend from anyway.

Mark Wagner
June 9, 2010 2:28 pm

To convert text to Excel:
File/Open – change the file type to “text”.
Next, you’ll have to know whether the data are separated by something (comma, tab, etc.) or whether it’s just plain old text with spaces.
Delimited data are pretty straightforward. Flat text files will require you to place the separator between data yourself; this can be tricky if there are no visual clues.
Hope that helps.
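For anyone who would rather script the conversion, here is a hedged Python sketch of the same idea; the inline string stands in for the downloaded NOAA text file, and the column layout is assumed to be whitespace-delimited:

```python
import csv, io

# Split whitespace-delimited lines into fields, then write CSV that any
# spreadsheet program opens directly. Replace `raw` with the real file, e.g.
# raw = open("beeville.txt").read()
raw = "1951 8 888\n2009 7 880\n"
rows = [line.split() for line in raw.splitlines() if line.strip()]

out = io.StringIO()
csv.writer(out).writerows(rows)
print(out.getvalue())
```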

James Sexton
June 9, 2010 2:39 pm

Rhoda R says:
June 9, 2010 at 1:15 pm
“Could someone explain to me how you can convert a ‘text’ file to a useable spreadsheet please? I’d like to play with some of these numbers also, but I get defeated trying to re-enter all the numbers. thanks.”
What spreadsheet program are you using, how is the txt file delimited, and what OS are you using?
On a typical Windows PC, just right-click and use “Open with” to tell Windows which spreadsheet program (I use Excel) you want to open the txt file with. Then tell the spreadsheet program how the file is delimited (comma, space, etc.). If you have more questions, or more details, I’d be happy to help.

Steven mosher
June 9, 2010 2:54 pm

markA says:
MarkA does a fine job of explaining why stations get adjusted. It is a well-known fact that changing the Time of Observation (TOB) will change the min/max recorded. We covered this issue at ClimateAudit in some detail back in 2007, so people should realize that this ground is well traveled. You can see the effect of changing TOB for yourself by downloading hourly data from any ASOS. On the ClimateAudit TOBS thread I provide a link to a large sample (190 stations or so) of data where you can see how changing the TOB from any hour of the day to any other hour of the day has a small but PREDICTABLE impact. The seminal work here is Karl’s 1986 paper, which has recently been updated (err, can’t recall that one off the top of my head, but we discussed it over at Lucia’s).
Let me sum up in a General way what the situation is.
1. There is a known effect from changing the TOB
2. Historically those changes tend to require an adjustment that “cools” the past
3. The model for TOBS adjustment is an empirical model that has been tested in a double-blind-like fashion. That is, building a model while holding a subsample out, then testing the accuracy of the model. The model considers a variety of factors such as lat/lon of the station, season, etc. The model works, although I’d like to see the code.
The issue with TOBS as some of us have pointed out for the past two years is NOT the direction of the adjustment. That is well established. The issue is the failure to account for the standard error of prediction in the final calculations of uncertainty.
In short, when you adjust a record with a model (see Wm Briggs) you must carry forward the error of prediction. So, when one adjusts a monthly temp from 1 C to .8 C, there is an error of prediction associated with this adjustment. That error of prediction is greater than the underlying measurement uncertainty.
Simply, adjustments are required. So merely pointing out that adjustments are made is not a very strong point. The issue is the math underlying the adjustments and the unaccounted-for uncertainties. The more people focus on the real issue the better.
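That error-propagation point can be made concrete with a toy calculation; both sigma values below are invented for illustration, and independent errors are assumed so they combine in quadrature:

```python
import math

measurement_sigma = 0.1   # deg C instrument uncertainty (assumed value)
prediction_sigma = 0.2    # deg C TOBS-model error of prediction (assumed value)

adjusted = 1.0 - 0.2      # a raw 1.0 C monthly value adjusted down to 0.8 C
total_sigma = math.sqrt(measurement_sigma**2 + prediction_sigma**2)
print(f"{adjusted:.1f} C +/- {total_sigma:.2f} C")
```

The combined uncertainty (about 0.22 C here) exceeds either input alone, which is the point above: the adjustment’s own error must ride along with the adjusted value.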

Ian George
June 9, 2010 2:58 pm

I have seen this raw v adjusted data for many sites on GISS and Australia’s BOM where adjusted data have been ‘dumbed down’ prior to the 50’s and maintained after that to show an exaggerated warming trend.
I took 50 long-term weather stations around Australia and compared the raw 1914 yearly average maximum temps with 2008’s, and found that 1914 was a degree warmer. Yet the official graph shows 2008 warmer than 1914. Go figure.
I have also noticed that just recently, GISS have taken down their adjusted data and reverted back to the raw data for some sites. Does anyone know why?

Mike G
June 9, 2010 3:01 pm

kwik says:
June 9, 2010 at 11:53 am
Its time to show young Peters project again;
I wonder, if young Peter’s dad repeated their project now, whether the rural temperature trend slope would still be zero. Or have the inconvenient data they found for the rural sites on the GISS site since been updated to remove any inconvenient truths?

strawbale
June 9, 2010 3:04 pm

from: Michael Hammer, June 27th, 2009
http://jennifermarohasy.com/blog/2009/06/how-the-us-temperature-record-is-adjusted/
“It is obvious that the only adjustment which reduces the reported warming is UHI which is a linear correction of 0.1F or about 0.06C per century, Figure 2. Note also that the latest indications are that even this minimal UHI adjustment has now been removed in the latest round of revisions to the historical record. To put this in perspective, in my previous article on this site I presented bureau of meteorology data which shows that the UHI impact for Melbourne Australia was 1.5C over the last 40 years equivalent to 3.75C per century and highly non linear.
Compare the treatment of UHI with the adjustments made for measuring stations that have moved out of the city centre, typically to the airport. These show lower temperatures at their new location and the later readings have been adjusted upwards so as to match the earlier readings. The airport readings are lower because the station has moved away from the city UHI. Raising the airport readings, while not adding downwards compensation for UHI, results in an overstatement of the amount of warming. This would seem to be clear evidence of bias. It would be more accurate to lower the earlier city readings to match the airport readings rather than vice versa.
Note also the similarity between the shape of the time of observation adjustment and the claimed global warming record over the 20th century especially the steep rise since 1970. This is even more pronounced if one looks at the total adjustment shown in Figure 3 (again from the same site as Figure 1). As a comparison, a recent version of the claimed 20th century global temperature record downloaded from http://www.giss.nasa.gov is shown in Figure 4. “

Mike G
June 9, 2010 3:16 pm

Why would it matter what time of the day you read a max/min thermometer?

June 9, 2010 3:24 pm

Ian, provide an example. I’ll take a look

June 9, 2010 3:27 pm

I’m sending the little girl $100 to help her with the trip she reportedly wants to take to Huntsville. What do you guys think? Are you in too?
Wisdom from Beeville

Mike G
June 9, 2010 3:54 pm

@Ken Coffman
Linked to your post and enjoyed it. Then I found your post “Undeniable Proof Al Gore is a Moron” and LMAO. Your engineer did make an Al Gore-caliber mistake when he said maybe the surface of the sun was that temperature, though.
http://www.gather.com/viewArticle.action?articleId=281474977943986

Ian George
June 9, 2010 4:10 pm

Steve,
One example – raw data shows 1915 at 27.4C at Lismore (Centre St) – now closed.
http://www.bom.gov.au/jsp/ncc/cdio/weatherData/av?p_nccObsCode=36&p_display_type=dataFile&p_startYear=&p_stn_num=058037
Trend data shows 1915 at 26.7C.
http://reg.bom.gov.au/cgi-bin/climate/hqsites/site_data.cgi?variable=maxT&area=nsw&station=058037&period=annual
Many more examples at BOM and GISS.
Off-line now for 2 days. If you need more, let me know.

James Sexton
June 9, 2010 4:36 pm

Mike G says:
June 9, 2010 at 3:16 pm
“Why would it matter what time of the day you read a max/min thermometer?”
Just guessing, but you may not have reached the max temp or min temp if you are careless about what time of day the thermometer is read. I would have thought the standard would be midnight every 24 hrs., but I’m weird that way.

June 9, 2010 5:13 pm

Mike G says:
Start here:
http://climateaudit.org/2007/09/24/tobs/
The issue has been discussed to death, both there and at other places.

KenB
June 9, 2010 5:14 pm

Sadly, it seems we need a whistleblower to release a few Harry-read-me files and inconvenient emails from within this web of intrigue. There must be a lot of honest and dedicated people working in those organisations who were naive enough to believe the media-grabbing scientists’ claims of absolute need and urgency (alerting the world to possible evil consequences, gaining extra taxpayer funding in a competitive environment), and who now realise that all the adjustments did was skew climate science in the wrong direction. It is inevitable that errors will be exposed and, in the process, become grist for the MSM’s mill, tarnishing the reputations not just of the media-hound scientists promoting the scares, but also the day-to-day reputations of those who either chose not to speak out or were too frightened to do so.
Better to do the right thing now, step forward, and expose what you know, than be dragged down with those who are desperately defending the indefensible.
Let’s do the “cap and trade” of honesty in science, before it is too late.
Just my two cents!!
Just my two cents!!

SezaGeoff
June 9, 2010 5:15 pm

James Sexton and Mike G:-
But if you had left the Min/Max for 24 hours it would have experienced the min and max during that time. It would make a difference to the day that it was recorded against, but it will have experienced the extremes of the past 24 hours.

Mike G
June 9, 2010 7:56 pm

Who knows what those dummies back in the day would have done. Me, if the boss told me to read it at 0800, I’d put the min in today’s spot on the form and the max in yesterday’s spot. It might not be correct for every situation, but would it really matter in the long run?

Ray Boorman
June 9, 2010 8:08 pm

I agree with those who say TOBS does NOT matter. If you use a max/min thermometer, you will always record the lowest & highest temp each 24 hrs. Can anyone prove that changing TOBS using max/min thermometers from morning to night or night to morning makes the slightest difference? Naturally, if all you are recording is the current temp once a day, then the records produced are USELESS for determining averages anyway, no matter what time you take them.

Mark C
June 9, 2010 8:11 pm

I just read the recommended thread at Climate Audit and finally had the clue light come on regarding the time-of-observation bias. In simple terms, if the min or max for the day tends to occur near the observation time, that leads to a bias. For example, if the obs time is 7am and the low for the day tends to be around that time, an extreme low temp at 7am may count as the min for two days instead of one, pulling the average down.
But do read that thread to get a fuller understanding of the TOB issue.

Mike G
June 9, 2010 8:16 pm

Mosh.
I read your link. Looks like you have to assume the data taker never establishes any sort of routine for TOB to matter. I rather suspect most people do establish some kind of routine, which might get perturbed from time to time, which might introduce some small bias. Looks to me like maybe it has been convenient to assume old-timers were constantly having their routine perturbed in just the optimum pattern to introduce maximum bias. That way we can conveniently subtract large amounts of perceived bias from the old records.

Rhoda R
June 9, 2010 9:18 pm

Mark Wagner, James Sexton — thanks to both of you.

James Sexton
June 9, 2010 9:49 pm

SezaGeoff says:
June 9, 2010 at 5:15 pm
James Sexton and Mike G:-
But if you had left the Min/Max for 24 hours it would have experienced the min and max during that time. It would make a difference to the day that it was recorded against, but it will have experienced the extremes of the past 24 hours.
Yes, that’s true, but one would have to redefine the dates and be consistent about when the thermometer is read, and, of course, to be truly accurate, consistent worldwide. It’s not. As Steven Mosher said, “The issue has been discussed to death, both there and at other places.” So I’ll just leave us with this: one can get the median of the highs and lows, but one cannot get a true average temperature for a day, because two readings can’t be integrated over time.

Gail Combs
June 9, 2010 10:01 pm

As I mentioned on the other Beeville, TX thread, I looked at the North Carolina data, and I found three interesting things.
First, there is no data for the mountainous area. None, zip, zilch, despite the fact that Asheville, NC is a big city in the mountains and home of the Biltmore Estate (1895).
Second, Norfolk City and Norfolk International Airport show the influence of an airport very nicely.
Third, all the cities, oceanside, and rural areas showed the sine-wave pattern seen in the Atlantic Multidecadal Oscillation.
The rest of you might want to check whether the mountainous areas in your states are conspicuous by their absence too.

Evan Jones
Editor
June 9, 2010 10:06 pm

Agh! TOBS is a very real problem. (I didn’t believe it myself until I set up an artificial sample and noted the effects.)
Let me give a hypothetical example:
If you take your readings at, say, 4PM (’way too near typical Tmax), and say it hits 90 on Tuesday afternoon, the 90-degree reading will show up as Tmax on Tuesday (at 3:59PM) AND Wednesday (at 4:01PM), even though 24 hours later at 3:59PM on Wednesday the temperature is a mere 70. The 70 reading is lost entirely and 90 goes into the books twice!
The way to avoid this problem is to take your readings when Tmax and Tmin are unlikely to occur. Such as, say, 10AM or 10PM. (And even then you will get an occasional glitch.)
BTW, the infamous Mohonk Lake (which I surveyed last year) station observation time is (drumroll) 4PM . . .

James Sexton
June 9, 2010 10:09 pm

lol, just got back from a ….thing….I pushed “enter” …..there’s no reason to be redundant, so if you guys wish, go ahead and delete. Beer is really cool…………..sometimes.

James Sexton
June 9, 2010 10:15 pm

@ Rhoda R ……no problem! By your last post, you have seemed to be able to move forward with data and spreadsheets. I hope I was at least a small part. I tell my grandkids, ” I knew her when.”……..good luck!

Editor
June 9, 2010 10:30 pm

Rather than referring unquestioningly to that 1986 paper discussing theoretical TOBS “corrections”, please show me – using the actual numbers for 100 actual sites – exactly what TOBS “correction” was made in exactly what years, and why the so-called TOBS “correction” affected actual max-min temperatures for every remaining year of the record.
We see that Hansen uses his own 1987 paper to justify smoothing temperature data across up to 1200 km from a single point – why (other than 1.3 trillion dollars in tax revenue) should every TOBS “correction” made lower the early temperatures by such large amounts?

AlanG
June 9, 2010 10:46 pm

If I could adjust my numbers like this, I wouldn’t be paying any income tax.

Steven mosher
June 9, 2010 11:29 pm

“Rather than referring unquestioningly to that 1986 paper discussing theoretical TOBS “corrections”, please show me – using the actual numbers for 100 actual sites – exactly what TOBS “correction” was made in exactly what years, and why the so-called TOBS “correction” affected actual max-min temperatures for every remaining year of the record.”
I don’t think anyone refers unquestioningly to the 1986 paper. The need for a TOBS adjustment is based in fact. Those facts are open and available to you if you want to study them. That was the approach I took over 2 years ago when I questioned Karl’s work.
Question it. Work through the math. Move on to the real problems.
You can go download JerryB’s data, independent of Karl’s data, or you can go download CRN data. When you do this and spend a couple weeks looking at the problem, you will see that changing the TOB does change the min/max recorded.
The data you want to look at is here; that’s easy for anybody to read.
http://www.john-daly.com/tob/TOBSUM.HTM
You should recognize John Daly’s name. This analysis was performed by JerryB and is independent of Karl’s work:
http://www.john-daly.com/tob/TOBSUMC.HTM
here are the data files
http://www.john-daly.com/tob/SUMDATC.HTM
Now, WRT the TOBS correction: to understand how it is made you can probably write to NOAA and request the code, or you can write your own regression. I would suggest R. The code is an empirical model, as I explained. Your question suggests that you may not know what that is, so I’d suggest reading the Karl paper to start. But I can give you a little cartoon sketch of how it works.
You take a state, say Iowa. You create a dataset of hourly temperature readings for
say 100 stations within a 500km radius. lets say you have 20 years of hourly data.
You take 50 of the 100 stations and hold that data aside. This is your verification
dataset.
Then with the 50 remaining stations you build a model.
If you collect the min/max at midnight, you call that the 0000 min, 0000 max.
Now, since you have the data for every HOUR… you just look! If we collected it
at 1am, 2am, 3am, 4am, etc., every one of these hours will give you a different min/max reading (from 1/10s to full degrees). This BIAS is location dependent,
SEASON dependent, position-of-the-sun dependent, etc. You construct a function that says Deltatemp = f(lat, lon, season, etc.)
Now, you use that function to make predictions on the 50 stations you held apart.
So station one: you look at the ACTUAL min/max recorded at 6am. You predict the min/max
at midnight using the model. The model predicts 14C/8C. You check the actual data.
The actual is 14.1C/8.1C.
This gives you your SE, the standard error of prediction.
There are other approaches to TOBS adjustments, but if you spend any time with hourly data you will understand better. Since I was highly skeptical of this adjustment and since I took the time to download data and work through the problem myself, I will suggest that is your best path to enlightenment. My sense has always been that if the data is freely available and I have doubts then I should put the work in. My sense was that asking others to do my bidding was a bit precious.
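A minimal mock-up of that hold-out procedure, with everything invented for illustration (synthetic hourly series, 20 “stations”, and a constant-offset “model” instead of the f(lat, lon, season) regression described above):

```python
import math
import random
import statistics

random.seed(0)

def hourly_series(days, base):
    """Synthetic hourly temps: daily weather offset + diurnal sine + noise."""
    out = []
    for _ in range(days):
        daily = base + random.gauss(0, 3)          # day-to-day weather
        for h in range(24):
            out.append(daily + 8 * math.sin(math.pi * (h - 6) / 12)
                       + random.gauss(0, 0.5))
    return out

def mean_tmax(temps, obs_hour):
    """Mean daily Tmax when the max thermometer is reset at obs_hour:
    each observation day is the 24 hours ending at that reset."""
    maxes = [max(temps[i:i + 24])
             for i in range(obs_hour, len(temps) - 23, 24)]
    return statistics.mean(maxes)

# Bias of a 1 PM reading (near the diurnal peak) vs. a midnight reading,
# for 10 fitting stations and 10 held-out verification stations.
biases = []
for s in range(20):
    temps = hourly_series(60, base=60 + s)
    biases.append(mean_tmax(temps, 13) - mean_tmax(temps, 0))

correction = statistics.mean(biases[:10])               # the "model"
verification_error = statistics.mean(biases[10:]) - correction
print(round(correction, 2), round(verification_error, 2))
```

With these made-up numbers the early-afternoon reset inflates mean Tmax by roughly a degree relative to a midnight reset, and the held-out stations come in close to the fitted offset; the real adjustment does the analogous thing with actual hourly networks and a richer model.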

Geoff Sherrington
June 10, 2010 12:46 am

evanmjones says:
June 9, 2010 at 10:06 pm
“If you take your readings at, say, 4PM (’way too near typical Tmax), and say it hits 90 on Tuesday afternoon, the 90-degree reading will show up as Tmax on Tuesday (at 3:59PM) AND Wednesday (at 4:01PM), even though 24 hours later at 3:59PM on Wednesday the temperature is a mere 70. The 70 reading is lost entirely and 90 goes into the books twice!”
But are you not forgetting that after you read the temperature at 4pm, you reset the pins back to where the mercury ends and so (most often) remove the 90 degree reading from earlier in the day so it does not carry over to the next day?

Grumbler
June 10, 2010 3:31 am

“Enneagram says:
June 9, 2010 at 12:11 pm
Does anybody know how many degrees of temperature are we, humans, able to discriminate? One degree?, Half a degree?, two degrees?”
You need to look at psychRometrics. It’s a comfort envelope bounded by temperature, air velocity and humidity. Interestingly as the air gets warmer from AGW ;-), the air movement would increase and we’d feel the same. Could be a good research paper?
By the way, with noise, sound pressure is measured in decibels, and a doubling is about +3 dB. However, we only say we notice a doubling when it’s about +10 dB. The unit is the ‘sone’. Not sure if there is an equivalent measure for temperature.
cheers David

899
June 10, 2010 5:23 am

Anthony Scalzi says:
June 9, 2010 at 12:24 pm
That’s a very handy tool for comparing unadjusted and adjusted records. Here’s Southern New England:
Well you see, Anthony? It’s the ‘devil in the details.’
Where’s old Daniel Webster when you need him?
http://tarlton.law.utexas.edu/lpop/etext/devil/devil.htm

A C Osborn
June 10, 2010 5:55 am

Steven mosher says:
June 9, 2010 at 11:29 pm
“Now, you use that function to make predictions on the 50 stations you held apart.
So station one: you look at the ACTUAL min/max recorded at 6am. You predict the min/max
at midnight using the model. The model predicts 14C/8C. You check the actual data.
The actual is 14.1C/8.1C.
This gives you your SE, the standard error of prediction.”
One major problem appears straight away with that technique: we have seen from many other posters’ data analysis of local measuring sites that you can get a real 10-degree difference in temperature in just a few tens of miles. How do you handle that to get the SE?

A C Osborn
June 10, 2010 5:56 am

Steven mosher says:
June 9, 2010 at 11:29 pm ( … )
Or are you only talking Anomalies, not actual readings?

schumpeter
June 10, 2010 6:57 am

I hope you guys only ever measure your temperatures in millimetres of mercury. Because if you ever calibrated (adjusted, corrected) your garden thermometer into some kind of common unit of measurement (let’s say for argument, degrees fahrenheit) so that you can compare your weather with your mate on the other coast, or the temperature this year with the same time last year, that would be cheating, right?

Evan Jones
Editor
June 10, 2010 1:24 pm

But are you not forgetting that after you read the temperature at 4pm, you reset the pins back to where the mercury ends and so (most often) remove the 90 degree reading from earlier in the day so it does not carry over to the next day?
Of course they are reset. But as soon as you have done that, the Tmax goes right back up to the 90-degree level. So it does carry over. And your Tmax is 90 on Wednesday and then (a mere few minutes after resetting) 90 degrees for Thursday.
Work it out yourself. Assume the reading time is 4 PM and temps are 90 from 3 to 5 on Monday, Wednesday, and Friday. Assume temps are 70 from 3 to 5 on Tuesday, Thursday, and Saturday.
Your result will be that all 6 days had a Tmax of 90: a TOBS error of 10 degrees.
Then assume readings are taken at 10 AM. Tmax for 3 days will be 90 and for 3 days will be 70. No TOBS error.
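That six-day exercise can be worked mechanically (a sketch with square-wave temperatures; `daily_tmax` is a made-up helper, not anything from the adjustment code):

```python
def daily_tmax(obs_hour, peaks):
    """Daily Tmax from a max thermometer reset at obs_hour.

    Each day sits at 60F except the 3-5 PM afternoon slot (hours 15-16),
    which sits at that day's peak; each observation day is the 24 hours
    ending at the reset. The final partial day falls off the end."""
    temps = []
    for p in peaks:
        temps += [60] * 15 + [p, p] + [60] * 7   # one 24-hour day
    return [max(temps[i:i + 24])
            for i in range(obs_hour, len(temps) - 23, 24)]

peaks = [90, 70, 90, 70, 90, 70]      # Mon..Sat afternoon peaks
print(daily_tmax(16, peaks))          # 4 PM reset -> [90, 90, 90, 90, 90]
print(daily_tmax(10, peaks))          # 10 AM reset -> [90, 70, 90, 70, 90]
```

The 4 PM reset books every day at 90, the 10-degree error described above, because each hot afternoon straddles the reset and lands in two observation days; the 10 AM reset recovers the alternating 90/70 pattern.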
