by Ecotretas
The Beeville story just keeps getting better.
In the comments section of yesterday’s WUWT post, I got a couple of ideas. First, there is a very interesting site where we can graph adjusted vs. unadjusted GHCN temperatures. The first graph above shows the result for the Beeville station. A clear difference is visible between the adjusted and unadjusted temperatures, especially during the first half of the 20th century. And the blue line gives the impression that global warming might not be happening in Beeville.
Being a skeptic, I searched for the raw data. The monthly data is available at the NOAA site. I downloaded the Beeville data and plotted the second graph above (click the graphs for better detail). Does anyone see any warming going on? A linear trendline on the monthly data gives “y = -0.0637x + 829.59”, which means that temperatures have gone down! Now imagine which were the 20 hottest months at Beeville over the last 113 years (a sketch of the trend calculation follows the table):
| Month | Temperature (tenths of °F; e.g. 888 = 88.8 °F) |
|---|---|
| 1951/8 | 888 |
| 2009/7 | 880 |
| 1998/7 | 879 |
| 1952/8 | 878 |
| 2009/8 | 877 |
| 1953/7 | 876 |
| 1902/8 | 875 |
| 1998/6 | 872 |
| 1897/7 | 871 |
| 1915/7 | 871 |
| 1980/7 | 871 |
| 1914/7 | 869 |
| 1915/8 | 869 |
| 1916/6 | 869 |
| 1938/7 | 869 |
| 1951/7 | 869 |
| 1958/8 | 869 |
| 1911/8 | 868 |
| 1954/8 | 867 |
| 1927/8 | 866 |
Might Julisa Castillo deserve a prize, after all?
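For anyone who wants to reproduce that trendline, here is a minimal sketch, assuming the NOAA monthly means have been saved to a two-column CSV (the file name and column names below are hypothetical):

```python
# Minimal sketch: fit a linear trend to monthly mean temperatures.
# Assumes a hypothetical CSV "beeville_monthly.csv" with columns
# "yearmonth" (e.g. 195108) and "temp" (tenths of deg F, as in the table).
import numpy as np
import pandas as pd

df = pd.read_csv("beeville_monthly.csv")

x = np.arange(len(df))          # month index, 0..N-1
y = df["temp"].to_numpy()       # monthly means in tenths of deg F

slope, intercept = np.polyfit(x, y, 1)  # least-squares straight line
print(f"y = {slope:.4f}x + {intercept:.2f}")

# The 20 hottest months, mirroring the table above:
print(df.nlargest(20, "temp"))
```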

SezaGeoff says:
June 9, 2010 at 5:15 pm
James Sexton and Mike G:-
But if you had left the Min/Max for 24 hours, it would have experienced the min and max during that time. It would make a difference to the day that it was recorded against, but it would have experienced the extremes of the past 24 hours.
Yes, that’s true, but one would have to redefine the dates and be consistent about when the readings are taken, and, of course, to be truly accurate, consistent worldwide. It’s not. As Steven Mosher said, “The issue has been discussed to death, both there and at other places.” So I’ll just leave us with this: one can take the midpoint of the highs and lows, but one cannot get a true average temperature for a day, because two readings don’t let you divide by time — the true daily mean is an integral over all 24 hours.
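To see the “can’t divide by time” point numerically, here is a minimal sketch with an invented, deliberately asymmetric daily temperature curve; the midpoint of min and max is not the time-averaged temperature:

```python
# Minimal sketch: the (Tmin + Tmax) / 2 midpoint vs. the true time average.
# Uses an invented, deliberately asymmetric 24-hour temperature profile:
# a short warm spike in the afternoon, long cool hours otherwise.
import numpy as np

t = np.linspace(0.0, 24.0, 24 * 60)                     # one reading per minute
temp = 60.0 + 30.0 * np.exp(-((t - 15.0) / 2.0) ** 2)   # brief 90 F spike at 3 PM

midpoint = (temp.min() + temp.max()) / 2.0   # what a min/max thermometer gives
true_mean = temp.mean()                      # the time integral over 24 h

print(f"midpoint  = {midpoint:.1f} F")       # ~75 F
print(f"true mean = {true_mean:.1f} F")      # ~64 F: the brief spike barely matters
```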
As I mentioned on the other Beeville, TX thread, I looked at the North Carolina data, and I found three interesting things.
First, there is no data for the mountainous area. None, zip, zilch, despite the fact that Asheville, NC is a big city in the mountains and home of the Biltmore Estate (1895).
Second, Norfolk City and Norfolk International Airport show the influence of an airport very nicely.
Third, all the cities, oceanside, and rural areas showed the sine-wave pattern seen in the Atlantic Multidecadal Oscillation.
The rest of you might want to check and see if the mountainous areas in your states are conspicuous by their absence too.
Agh! TOBS is a very real problem. (I didn’t believe it myself until I set up an artificial sample and noted the effects.)
Let me give a hypothetical example:
If you take your readings at, say, 4 PM (’way too near the typical time of Tmax), and say it hits 90 on Tuesday afternoon, the 90-degree reading will show up as Tmax on Tuesday (at 3:59 PM) AND Wednesday (at 4:01 PM), even though 24 hours later, at 3:59 PM on Wednesday, the temperature is a mere 70. The 70 reading is lost entirely and 90 goes into the books twice!
The way to avoid this problem is to take your readings when Tmax and Tmin are unlikely to occur, such as, say, 10 AM or 10 PM. (And even then you will get an occasional glitch.)
BTW, the infamous Mohonk Lake (which I surveyed last year) station observation time is (drumroll) 4PM . . .
lol, just got back from a ….thing….I pushed “enter” …..there’s no reason to be redundant, so if you guys wish, go ahead and delete. Beer is really cool…………..sometimes.
@Rhoda R ……no problem! Judging by your last post, you seem to be able to move forward with data and spreadsheets. I hope I was at least a small part of that. I tell my grandkids, “I knew her when.”……..good luck!
Rather than referring unquestioningly to that 1986 paper discussing theoretical TOBS “corrections,” please show me, using the actual numbers for 100 actual sites, exactly what TOBS “correction” was made, in exactly what years, and why the so-called TOBS “correction” affected actual max-min temperatures for every remaining year of the record.
We see that Hansen uses his own 1987 paper to justify smoothing temperature data across up to 1200 km from a single point. Why (other than 1.3 trillion dollars in tax revenue) should every TOBS “correction” made lower the early temperatures by such large amounts?
If I could adjust my numbers like this, I wouldn’t be paying any income tax.
“Rather than referring unquestioningly to that 1986 paper discussing theoretical TOBS “corrections,” please show me, using the actual numbers for 100 actual sites, exactly what TOBS “correction” was made, in exactly what years, and why the so-called TOBS “correction” affected actual max-min temperatures for every remaining year of the record.”
I don’t think anyone refers unquestioningly to the 1986 paper. The need for a TOBS adjustment is based in fact. Those facts are open and available to you if you want to study them. That was the approach I took over 2 years ago when I questioned Karl’s work.
Question it, work through the math, and move on to the real problems.
You can go download JerryB’s data, which is independent of Karl’s data, or you can go download CRN data. When you do this and spend a couple of weeks looking at the problem, you will see that changing the TOB does change the min/max recorded.
The data you want to look at is here; it’s easy for anybody to read:
http://www.john-daly.com/tob/TOBSUM.HTM
You should recognize John Daly’s name. This analysis was performed by JerryB and is independent of Karl’s work:
http://www.john-daly.com/tob/TOBSUMC.HTM
Here are the data files:
http://www.john-daly.com/tob/SUMDATC.HTM
Now, WRT the TOBS correction: to understand how it is made, you can probably write to NOAA and request the code, or you can write your own regression (I would suggest R). The code is an empirical model, as I explained. Your question suggests that you may not know what that is, so I’d suggest reading the Karl paper to start. But I can give you a little cartoon sketch of how it works.
You take a state, say Iowa. You create a dataset of hourly temperature readings for, say, 100 stations within a 500 km radius. Let’s say you have 20 years of hourly data. You take 50 of the 100 stations and hold that data aside. This is your verification dataset.
Then with the 50 remaining stations you build a model. If you collect the min/max at midnight, you call that 1200 min, 1200 max. Now, since you have the data for every HOUR, you just look! If we had collected it at 1 AM, 2 AM, 3 AM, 4 AM, etc., every one of these hours would give you a different min/max reading (from 1/10s to full degrees). This BIAS is location dependent, season dependent, position-of-the-sun dependent, etc. You construct a function that says DeltaTemp = f(lat, lon, season, etc.).
Now you use that function to make predictions on the 50 stations you held apart. So for station one, you look at the ACTUAL min/max recorded at 6 AM. You predict the min/max at midnight using the model. The model predicts 14C/8C. You check the actual data: the actual is 14.1C/8.1C. This gives you your SE, the standard error of prediction.
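As an illustration only, here is a rough sketch of that train/hold-out idea in Python, with synthetic hourly data standing in for the real stations; every number and name below is invented, and the real adjustment uses the regression described in Karl’s paper rather than this toy:

```python
# Toy sketch of the TOBS train/hold-out idea: estimate how the recorded daily
# max depends on observation hour, then check the estimate on held-out stations.
# All data here is synthetic; it only mimics the shape of the problem.
import numpy as np

rng = np.random.default_rng(0)

def hourly_temps(n_days):
    """Invented hourly series: a daily sine cycle plus weather noise."""
    hours = np.arange(n_days * 24)
    daily = 15 + 10 * np.sin(2 * np.pi * ((hours % 24) - 9) / 24)
    return daily + rng.normal(0, 2, size=hours.size)

def recorded_max(temps, obs_hour):
    """Daily Tmax as a min/max thermometer read at obs_hour would record it:
    each 'day' runs from one observation time to the next."""
    shifted = np.roll(temps, -obs_hour)
    n = len(shifted) // 24 * 24
    return shifted[:n].reshape(-1, 24).max(axis=1)

def mean_bias(n_stations, obs_hour):
    """Per-station mean difference between obs_hour readings and midnight readings."""
    biases = []
    for _ in range(n_stations):
        t = hourly_temps(365)
        biases.append(np.mean(recorded_max(t, obs_hour) - recorded_max(t, 0)))
    return np.array(biases)

train = mean_bias(50, obs_hour=16)   # "build the model" on 50 stations (4 PM reads)
test = mean_bias(50, obs_hour=16)    # independent held-out stations

correction = train.mean()            # the empirical "TOBS adjustment"
residual = test - correction         # how well it transfers to new stations
print(f"estimated 4 PM bias: {correction:+.2f} C")
print(f"hold-out SE of prediction: {residual.std(ddof=1):.2f} C")
```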
There are other approaches to TOBS adjustments, but if you spend any time with hourly data, you will understand better. Since I was highly skeptical of this adjustment, and since I took the time to download data and work through the problem myself, I will suggest that this is your best path to enlightenment. My sense has always been that if the data is freely available and I have doubts, then I should put the work in. My sense was that asking others to do my bidding was a bit precious.
evanmjones says:
June 9, 2010 at 10:06 pm
“If you take your readings at, say, 4 PM (’way too near the typical time of Tmax), and say it hits 90 on Tuesday afternoon, the 90-degree reading will show up as Tmax on Tuesday (at 3:59 PM) AND Wednesday (at 4:01 PM), even though 24 hours later, at 3:59 PM on Wednesday, the temperature is a mere 70. The 70 reading is lost entirely and 90 goes into the books twice!”
But are you not forgetting that after you read the temperature at 4 PM, you reset the pins back to where the mercury ends, and so (most often) remove the 90-degree reading from earlier in the day, so it does not carry over to the next day?
“Enneagram says:
June 9, 2010 at 12:11 pm
Does anybody know how many degrees of temperature we humans are able to discriminate? One degree? Half a degree? Two degrees?”
You need to look at psychRometrics. It’s a comfort envelope bounded by temperature, air velocity, and humidity. Interestingly, as the air gets warmer from AGW ;-), the air movement would increase and we’d feel the same. Could be a good research paper?
By the way, with noise, the sound pressure level is measured in decibels, and a doubling of power is about +3 dB. However, we only say we notice a doubling in loudness when it’s about +10 dB; the unit of perceived loudness is the ‘sone’. Not sure if there is an equivalent measure for temperature.
cheers David
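For anyone who wants to check that decibel arithmetic, here is a minimal sketch; the sone relation used is the standard rule of thumb that loudness doubles for every 10 phons above 40:

```python
# Quick check of the decibel arithmetic in the comment above.
import math

# Doubling the sound power raises the level by 10*log10(2), i.e. about +3 dB.
print(f"doubling of power: +{10 * math.log10(2):.2f} dB")

# Perceived loudness (sones) doubles per +10 phon above 40 phon, so a
# perceived "twice as loud" needs roughly +10 dB, not +3 dB.
def sones(phon):
    return 2 ** ((phon - 40) / 10)

print(f"{sones(60) / sones(50):.1f}x louder for +10 phon")  # 2.0x
```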
Anthony Scalzi says:
June 9, 2010 at 12:24 pm
That’s a very handy tool for comparing unadjusted and adjusted records. Here’s Southern New England:
Well you see, Anthony? It’s the ‘devil in the details.’
Where’s old Daniel Webster when you need him?
http://tarlton.law.utexas.edu/lpop/etext/devil/devil.htm
Steven mosher says:
June 9, 2010 at 11:29 pm
“Now you use that function to make predictions on the 50 stations you held apart. So for station one, you look at the ACTUAL min/max recorded at 6 AM. You predict the min/max at midnight using the model. The model predicts 14C/8C. You check the actual data: the actual is 14.1C/8.1C. This gives you your SE, the standard error of prediction.”
One major problem appears straight away with that technique. We have seen from many other posters’ data analysis of local measuring sites that you can get real 10-degree differences in temperature within just a few tens of miles. How do you handle that to get the SE?
Steven mosher says:
June 9, 2010 at 11:29 pm ( … )
Or are you only talking Anomalies, not actual readings?
I hope you guys only ever measure your temperatures in millimetres of mercury. Because if you ever calibrated (adjusted, corrected) your garden thermometer into some kind of common unit of measurement (let’s say for argument, degrees fahrenheit) so that you can compare your weather with your mate on the other coast, or the temperature this year with the same time last year, that would be cheating, right?
But are you not forgetting that after you read the temperature at 4 PM, you reset the pins back to where the mercury ends, and so (most often) remove the 90-degree reading from earlier in the day, so it does not carry over to the next day?
Of course they are reset. But as soon as you have done that, the Tmax goes right back up to the 90-degree level. So it does carry over. And your Tmax is 90 on Wednesday and then, a mere few minutes after resetting, 90 degrees for Thursday.
Work it out yourself. Assume the reading time is 4 PM and temps are 90 from 3 to 5 on Monday, Wednesday, and Friday. Assume temps are 70 from 3 to 5 on Tuesday, Thursday, and Saturday.
Your result will be that all six days had a Tmax of 90: a TOBS error of 10 degrees.
Then assume readings are taken at 10 AM. Tmax for three days will be 90 and for three days will be 70. No TOBS error.
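That worked example is easy to simulate. Here is a minimal sketch, with the hours and temperatures exactly as stated above and everything else a stand-in:

```python
# Simulate the six-day example: a max/min thermometer reset at a fixed
# observation hour, with afternoon highs of 90/70 on alternating days.
import numpy as np

hours = np.arange(7 * 24)              # Sunday 00:00 through Saturday 23:00
temp = np.full(hours.size, 70.0)       # baseline temperature

highs = {1: 90, 2: 70, 3: 90, 4: 70, 5: 90, 6: 70}   # Mon..Sat afternoon highs
for day, high in highs.items():
    temp[day * 24 + 15 : day * 24 + 17] = high       # 3 PM - 5 PM each day

def recorded_tmax(obs_hour):
    """Max over each observation-to-observation window, as the pins record it."""
    reads = np.arange(obs_hour, hours.size, 24)
    return [temp[a:b].max() for a, b in zip(reads[:-1], reads[1:])]

print("read at 4 PM:", recorded_tmax(16))   # six 90s: the 70 days vanish
print("read at 10 AM:", recorded_tmax(10))  # three 90s, three 70s: no TOBS error
```

Note that the 10 AM windows attribute each afternoon high to the following day’s record, which is the date-shifting issue raised earlier in the thread; the averages come out right even though the labels move.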