Guest essay by Jim Steele, Director emeritus Sierra Nevada Field Campus, San Francisco State University
For researchers like me who examine the effect of local microclimates on the ecology of local wildlife, the change in the global average is a useless measure. Although it is wise to think globally, wildlife responds only to local climate change. To understand how local climate change has affected wildlife in California’s Sierra Nevada and Cascade Mountains, I examined data from stations that make up the US Historical Climatology Network (USHCN).
I was quickly faced with a huge dilemma that began my personal journey toward climate skepticism. Do I trust the raw data, or do I trust the USHCN’s adjusted data?
For example, the raw data for minimum temperatures at Mt. Shasta suggested a slight cooling trend since the 1930s. In contrast, the adjusted data suggested a 1 to 2°F warming trend. What to believe? The confusion caused by such skewed trends is summarized in a recent study which concluded that its “results cast some doubts in the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.”13
I began exploring data from other USHCN stations around the country and realized that a very large percentage of the stations had been adjusted in much the same way. The warm peaks of the 1930s and 40s had been adjusted downward by 3 to 4°F, and those adjustments created dubious local warming trends, as seen in examples from other USHCN stations at Reading, Massachusetts and Socorro, New Mexico.
Because these adjustments were so widespread, many skeptics have suspected some sort of conspiracy. Although scientific papers are often retracted for fraudulent data, I found it very hard to believe climate scientists would allow such blatant falsification. Data correction is often needed and well justified in all scientific disciplines: wherever there are documented changes to a weather station, such as a change in instrumentation, an adjustment is justified. However, unwitting systematic biases in the adjustment procedure could readily fabricate such a trend, and these dramatic adjustments were typically based on “undocumented changes” detected when climate scientists attempted to “homogenize” the regional data. The rationale for homogenization rests on the dubious assumption that all neighboring weather stations should display the same climate trends. Yet due to the effects of landscape changes and differently vegetated surfaces,1,2 local temperatures often respond very differently, and minimum temperatures are especially sensitive to surface conditions.
For example, even in relatively undisturbed regions, Yosemite’s varied landscapes respond in contrary ways to a weakening of the westerly winds. Over a 10-year period, one section of Yosemite National Park cooled by 1.1°F, another warmed by 0.72°F, and a third did not change at all.16 Depending on a weather station’s location, very different trends are generated. The homogenization process blends neighboring data, obliterating those local differences and fabricating an artificial trend.
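To make the effect of blending concrete, here is a minimal Python sketch using synthetic series patterned on the Yosemite numbers above (not the actual station data):

```python
import numpy as np

# Synthetic illustration only -- not the actual Yosemite series.
years = np.arange(2000, 2010)
rng = np.random.default_rng(42)

def site(total_change_f):
    """A site with a linear change over the decade plus small noise."""
    trend = np.linspace(0.0, total_change_f, len(years))
    return 50.0 + trend + rng.normal(0.0, 0.2, len(years))

cooling = site(-1.1)    # one section cooled by ~1.1F
warming = site(+0.72)   # another warmed by ~0.72F
flat    = site(0.0)     # a third did not change

composite = (cooling + warming + flat) / 3.0   # the "blended" average

for name, series in [("cooling", cooling), ("warming", warming),
                     ("flat", flat), ("composite", composite)]:
    slope = np.polyfit(years, series, 1)[0] * 10   # F per decade
    print(f"{name:9s} {slope:+.2f} F/decade")

# The composite shows a mild trend (~ -0.1 F/decade) that was measured
# at no real location; the genuine local differences are gone.
```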
Ecologists and other scientists who assess regional climate variability should use only data that has been quality-controlled but not homogenized. In one climate variability study, scientists computed the non-homogenized changes in maximum and minimum temperatures for the contiguous United States.12 The results, seen in Figure A (their figure 1b), suggest recent climate change has been more cyclical, and those cycles parallel the Pacific Decadal Oscillation (PDO). When climate scientists first began homogenizing temperature data, the PDO had not yet been named. So instead of a deliberate climate science conspiracy, I would suggest it was ignorance of the PDO, coupled with overwhelming urbanization effects, that produced “natural change points” climate scientists had yet to comprehend and thereby drove the unwarranted adjustments. Let me explain.
Homogenizing Contrasting Urban and Natural Landscape Trends
The closest USHCN weather station to my research was Tahoe City (below). Based on the trend in maximum temperatures, the region was neither overheating nor accumulating heat; otherwise the annual maximum temperature would now be higher than in the 1930s. My first question was: why such a contrasting rise in minimum temperature? Changing cloud cover was not an issue here. Dr. Thomas Karl, who now serves as director of NOAA’s National Climatic Data Center, partially answered the question when he reported that in over half of North America “the rise of the minimum temperature has occurred at a rate three times that of the maximum temperature during the period 1951-90 (1.5°F versus 0.5°F).”3 Rising minimum temperatures were driving the average, but Karl never addressed the higher temperatures of the 1930s. Karl simply demonstrated that as populations increased, so did minimum temperatures, even though the maximums did not. A city of two million people experienced a whopping increase of 4.5°F in the minimum, which was the sole cause of its 2.25°F increase in average temperature.4
Although urban heat islands are undeniable, many CO2 advocates argue that growing urbanization has not contributed to recent climate trends because urban and rural communities have experienced similar warming. However, those studies fail to account for the fact that even small population increases in designated rural areas generate high rates of warming. For example, in 1967 Columbia, Maryland was a newly established planned community designed to end racial and social segregation. Climate researchers following the city’s development found that over a period of just three years, a heat island of up to 8.1°F appeared as the land filled with 10,000 residents.5 Although Columbia would be classified as a rural town, that small population raised local temperatures by five times a century’s worth of global warming. If we extrapolated that trend, as so many climate studies do, growing populations in rural areas would produce a whopping warming of roughly 27°F per decade (8.1°F over three years works out to 2.7°F per year).
CO2 advocates also downplay urbanization by arguing that it covers only a small fraction of the earth’s land surface and therefore contributes very little to the overall warming. However, arbitrary designations of urban versus rural do not address the effects of a growing population on the landscape. California climatologist James Goodridge found that the average rate of 20th-century warming for weather stations in counties with more than one million people was 3.14°F per century, twice the global average. In contrast, the average warming rate for stations in counties with fewer than 100,000 people was a paltry 0.04°F per century,6 roughly one thirty-fifth of the global average.
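Goodridge’s comparison is simple to sketch. The snippet below is a hypothetical reconstruction with made-up station records; the numbers are invented solely to mirror his reported pattern, while the population thresholds are his:

```python
import numpy as np

# Hypothetical station records: (county_population, trend_F_per_century).
# Synthetic values chosen only to mirror Goodridge's reported pattern.
stations = [
    (2_500_000, 3.4), (1_800_000, 2.9), (1_200_000, 3.1),   # big counties
    (450_000, 1.2),   (300_000, 0.9),                       # mid-size
    (80_000, 0.1),    (40_000, -0.1),  (15_000, 0.1),       # small
]

def mean_trend(rows):
    """Average warming trend across a group of stations."""
    return np.mean([trend for _, trend in rows])

big   = [s for s in stations if s[0] > 1_000_000]
small = [s for s in stations if s[0] < 100_000]

print(f"counties > 1M people:   {mean_trend(big):+.2f} F/century")
print(f"counties < 100k people: {mean_trend(small):+.2f} F/century")
```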
Furthermore, results similar to Goodridge’s have been suggested by tree-ring studies far from urban areas. Tree-ring temperatures are better indicators of “natural climate trends” and can help disentangle the distortions caused by increasing human populations. Not surprisingly, most tree-ring studies reveal lower temperatures than the urbanized instrumental data. A 2007 paper by 10 leading tree-ring scientists reported, “No current tree ring based reconstruction of extratropical Northern Hemisphere temperatures that extends into the 1990s captures the full range of late 20th century warming observed in the instrumental record.”8
Because tree-ring temperatures disagree with the sharply rising instrumental average, climate scientists officially dubbed this the “divergence problem.”9 However, when studies compared tree-ring temperatures with maximum temperatures only (instead of average temperatures, which are typically inflated by urbanized minimums), they found no disagreement and no divergence.10 Similarly, a collaboration of German, Swiss, and Finnish scientists found that in remote rural stations of northern Scandinavia, where average instrumental temperatures were minimally affected by population growth, tree-ring temperatures agreed with the instrumental averages.11 As illustrated in Figure B, the 20th-century temperature trend in the wilds of northern Scandinavia is strikingly similar to the maximum temperature trends of the Sierra Nevada and the contiguous 48 states. All those regions experienced peak temperatures around the 1940s, and the recent rise since the 1990s has never exceeded that peak.

How Homogenizing Urbanized Warming Has Obliterated Natural Oscillations
It soon became obvious that the homogenization process was unwittingly blending rising minimum temperatures caused by population growth with temperatures from more natural landscapes. Climate scientists cloistered in their offices have no way of knowing to what degree urbanization or other landscape factors have distorted each weather station’s data. So they developed an armchair statistical method that blended trends among several neighboring stations,17 using what I term the “blind majority rules” method. The trend most commonly shared among neighboring stations became the computer’s reference, and temperatures from “deviant” stations were adjusted to create a chimeric climate smoothie. Wherever population grew, this unintentionally allowed urbanization warming effects to alter the adjusted trend.
Climate computers had been programmed to seek unusual “change-points” as signs of “undocumented” station modifications. Any natural change-points caused by cycles like the Pacific Decadal Oscillation looked like deviations relative to the steadily rising trends of increasingly populated places like Columbia, Maryland or Tahoe City. The widespread adjustments to minimum temperatures reveal this erroneous process.
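The procedure can be caricatured in a few lines of Python. The sketch below is my simplified rendering of the “blind majority rules” idea, not NOAA’s actual pairwise homogenization code: a station is compared to the average of its neighbors, the largest step in the difference series is flagged as an “undocumented change,” and the station is shifted to agree with the reference, whether the step came from a new air conditioner or a natural PDO reversal:

```python
import numpy as np

def largest_step(diff, margin=5):
    """Find the split that maximizes the jump between segment means --
    a crude stand-in for formal change-point tests such as SNHT."""
    best_i, best_step = None, 0.0
    for i in range(margin, len(diff) - margin):
        step = abs(diff[:i].mean() - diff[i:].mean())
        if step > best_step:
            best_i, best_step = i, step
    return best_i, best_step

def homogenize(station, neighbors, threshold=0.5):
    """Shift 'station' toward the neighbor-average reference wherever
    the difference series shows a large undocumented step."""
    reference = np.mean(neighbors, axis=0)   # "majority rules" reference
    diff = station - reference
    i, step = largest_step(diff)
    adjusted = station.copy()
    if i is not None and step > threshold:
        # Align the earlier segment with the later one -- even when the
        # step was a natural inflection such as a PDO phase change.
        adjusted[:i] += diff[i:].mean() - diff[:i].mean()
    return adjusted

# Toy case: a rural station with a natural 1F downward step in 1946
# (think PDO reversal) amid steadily warming, urbanizing neighbors.
years = np.arange(1930, 1990)
rural = 55.0 + np.where(years < 1946, 1.0, 0.0)
urban = [55.0 + 0.03 * (years - 1930) for _ in range(3)]

print(rural[:3], homogenize(rural, urban)[:3])
# The warm 1930s values at the rural station get written down to match
# the urban reference -- exactly the pattern described in this essay.
```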
I first stumbled onto Anthony Watts’ surface station efforts when investigating climate factors that controlled the upslope migration of birds in the Sierra Nevada. To understand the population declines in high-elevation meadows on the Tahoe National Forest, I surveyed birds at several low-elevation breeding sites and examined the climate data from foothill weather stations.
Marysville, CA was one of those stations, and its warming trend sparked my curiosity because it was one of the few stations whose minimum had not been markedly adjusted. I later found a picture of the Marysville weather station at the SurfaceStations.org website. The Marysville station was Watts’ poster child for a badly sited station; he compared it to the less-disturbed surface conditions at a neighboring weather station in Orland, CA. The Marysville station sat on an asphalt parking lot just a few feet from air-conditioning exhaust fans.
The proximity to buildings also altered the winds and added heat radiating from the walls. These urbanization effects at Marysville created exactly the rising trend CO2 advocates expect. In contrast, the minimum temperatures at nearby Orland showed the cyclic behavior we would expect the Pacific Decadal Oscillation (PDO) to cause. Orland’s data was not overwhelmed by urbanization and was thus more sensitive to the cyclical temperature changes brought by the PDO. Yet it was Orland’s data that was markedly adjusted, not Marysville’s! (Figure C)

Several scientists have warned against homogenization for just this reason. Dr. Xiaolan Wang of the Meteorological Service of Canada wrote, “a trend-type change in climate data series should only be adjusted if there is sufficient evidence showing that it is related to a change at the observing station, such as a change in the exposure or location of the station, or in its instrumentation or observing procedures.”14
That warning went unheeded. In the good old days, a weather station such as the one in Orland, CA (pictured above) would have been a perfect candidate for a reference station. It was well sited, away from pavement and buildings, and its location and thermometers had not changed throughout its history. Clearly Orland did not warrant an adjustment, but its data revealed several “change points.” Although those change points were natural, caused by the Pacific Decadal Oscillation (PDO), they drew the computer’s attention as signs that an “undocumented change” had occurred.
To understand the PDO’s effect, it is useful to see the PDO as a period of more frequent El Niños, which ventilate heat and raise the global average temperature, alternating with a period of more frequent La Niñas, which absorb heat and lower global temperatures. For example, heat ventilated during the 1997 El Niño raised global temperatures by ~1.6°F; during the following La Niña, temperatures dropped by ~1.6°F. California’s climate is extremely sensitive to El Niño and the PDO. Reversals in the Pacific Decadal Oscillation caused natural temperature change-points around the 1940s and 1970s. The rural station at Orland was minimally affected by urbanization, and thus more sensitive to the rise and fall of the PDO. Similarly, the raw data for other well-sited rural stations, like Cuyamaca in southern California, also exhibited the cyclical temperatures predicted by the PDO (see Figure D, lower panel). But in each case those cyclical temperature trends were homogenized to look like the linear urbanized trend at Marysville.

Marysville, however, was overwhelmed by California’s growing urbanization and less sensitive to the PDO, so it exhibited a steadily rising trend. Ironically, a computer program seeking any and all change-points dramatically adjusted the natural variations of rural stations to make them conform to the steady trends of more urbanized stations. Around the country, very similar adjustments lowered the peak warming of the 1930s and 1940s in the original data. Those homogenization adjustments now distort our perceptions and affect our interpretations of climate change. Cyclical temperature trends were unwittingly transformed into rapidly rising warming trends, suggesting a climate on “CO2 steroids.” However, the unadjusted average for the United States suggests the natural climate is much more sensitive to cycles such as the PDO. Climate fears have been exaggerated by urbanization and homogenization adjustments on steroids.
Skeptics have highlighted the climate effects of the PDO for over a decade, but CO2 advocates dismissed this alternative climate viewpoint. As recently as 2009, Kevin Trenberth emailed Michael Mann and other advocates regarding the PDO’s effect on natural climate variability, writing: “there is a LOT of nonsense about the PDO. People like CPC are tracking PDO on a monthly basis but it is highly correlated with ENSO. Most of what they are seeing is the change in ENSO not real PDO. It surely isn’t decadal. The PDO is already reversing with the switch to El Nino. The PDO index became positive in September for first time since Sept 2007.”
However, contrary to Trenberth’s email rant, the PDO continued trending toward its cool phase and global warming continued its “hiatus.” Now, forced to explain the hiatus, Trenberth has flip-flopped on the PDO’s importance, writing: “One of the things emerging from several lines is that the IPCC has not paid enough attention to natural variability, on several time scales,” “especially El Niños and La Niñas, the Pacific Ocean phenomena that are not yet captured by climate models, and the longer term Pacific Decadal Oscillation (PDO) and Atlantic Multidecadal Oscillation (AMO) which have cycle lengths of about 60 years.”18 No longer is CO2 overwhelming natural systems; now they must argue natural systems are overwhelming CO2 warming. Will they also rethink their unwarranted homogenization adjustments?
Skeptics highlighting natural cycles were ahead of the climate science curve and provided a much-needed alternative viewpoint. Still, to keep the focus on CO2, Al Gore is stepping up his attacks on all skeptical thinking. In a recent speech, he rightfully took pride that we no longer accept intolerance and abuse against people of different races or with different sexual preferences. Then, totally contradicting his own examples of tolerance and open-mindedness, he asked his audience to make people “pay a price for denial.”
Instead of promoting more respectful public debate, he in essence suggests Americans should hate “deniers” for thinking differently than Gore and his fellow CO2 advocates. He and his ilk are fomenting a new intellectual tyranny. Yet his “hockey stick beliefs” rest on adjusted data that is supported neither by the raw temperature data nor by natural tree-ring data. So who is in denial? Whether or not Gore’s orchestrated call to quash all skeptical thought is based solely on ignorance of natural cycles, his rant against skeptics is far more frightening than the climate change evidenced by the unadjusted data and the trees.
Literature cited
1. Mildrexler, D.J., et al. (2011) Satellite Finds Highest Land Skin Temperatures on Earth. Bulletin of the American Meteorological Society.
2. Lim, Y.-K., et al. (2012) Observational evidence of sensitivity of surface climate changes to land types and urbanization.
3. Karl, T.R., et al. (1993) Asymmetric Trends of Daily Maximum and Minimum Temperature. Bulletin of the American Meteorological Society, vol. 74.
4. Karl, T.R., et al. (1988) Urbanization: Its Detection and Effect in the United States Climate Record. Journal of Climate, vol. 1, p. 1099–1123.
5. Erell, E., and Williamson, T. (2007) Intra-urban differences in canopy layer air temperature at a mid-latitude city. International Journal of Climatology, vol. 27, p. 1243–1255.
6. Goodridge, J. (1996) Comments on Regional Simulations of Greenhouse Warming Including Natural Variability. Bulletin of the American Meteorological Society, vol. 77, p. 188.
7. Fall, S., et al. (2011) Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends. Journal of Geophysical Research, vol. 116.
8. Wilson, R., et al. (2007) Matter of divergence: tracking recent warming at hemispheric scales using tree-ring data. Journal of Geophysical Research, vol. 112, D17103, doi:10.1029/2006JD008318.
9. D’Arrigo, R., et al. (2008) On the ‘Divergence Problem’ in Northern Forests: A review of the tree-ring evidence and possible causes. Global and Planetary Change, vol. 60, p. 289–305.
10. Youngblut, D., and Luckman, B. (2008) Maximum June–July temperatures in the southwest Yukon region over the last three hundred years reconstructed from tree-rings. Dendrochronologia, vol. 25, p. 153–166.
11. Esper, J., et al. (2012) Variability and extremes of northern Scandinavian summer temperatures over the past two millennia. Global and Planetary Change, vol. 88–89, p. 1–9.
12. Shen, S., et al. (2011) The twentieth century contiguous US temperature changes indicated by daily data and higher statistical moments. Climatic Change, vol. 109, issue 3–4, p. 287–317.
13. Steirou, E., and Koutsoyiannis, D. (2012) Investigation of methods for hydroclimatic data homogenization. Geophysical Research Abstracts, vol. 14, EGU2012-956-1.
14. Wang, X. (2003) Comments on “Detection of Undocumented Changepoints: A Revision of the Two-Phase Regression Model”. Journal of Climate, vol. 16, issue 20, p. 3383–3385.
15. Nelson, T. (2011) Email conversations between climate scientists. ClimateGate 2.0: This is a very large pile of “smoking guns.” http://tomnelson.blogspot.com/
16. Lundquist, J., and Cayan, D. (2007) Surface temperature patterns in complex terrain: Daily variations and long-term change in the central Sierra Nevada, California. Journal of Geophysical Research, vol. 112, D11124, doi:10.1029/2006JD007561.
17. Menne, M., et al. (2009) The U.S. Historical Climatology Network Monthly Temperature Data, Version 2. Bulletin of the American Meteorological Society, p. 993–1007.
18. Appell, D. (2013) Whither Global Warming? Has It Slowed Down? The Yale Forum on Climate Change and the Media. http://www.yaleclimatemediaforum.org/2013/05/wither-global-warming-has-it-slowed-down/
Adapted from the chapter “Why Average Isn’t Good Enough” in Landscapes & Cycles: An Environmentalist’s Journey to Climate Skepticism.
Read previous essays at landscapesandcycles.net



Re: Making the past colder
My understanding is that this is partly due to the process of “infilling.” Where NASA/GISS has missing data for a given year in a given grid cell, but does have data for enough other years (over 50%, I think), they draw a linear trend through the data they have and calculate the missing years from it. Since current temps actually ARE warmer than, say, 100 years ago, the linear slope increases as we go forward in time adding new data, so the older the missing data is, the more likely it will be “infilled” with a calculated value that is cooler than the previous calculation. In addition, as the record lengthens, more previously blank grid cells pass the 50% coverage threshold, and they too get “infilled” with temps calculated from that linear trend. As we add new data, the “infilling” process, as I understand it, continues to cool the past.
You can actually see this for yourself at the NASA/GISS web site:
http://data.giss.nasa.gov/gistemp/maps/
Using Land/Ocean, Annual Trends, and a 1200 km radius, I got the following results:
1900 – 1949 trend = 0.45
1950 – 2000 trend = 0.53
1900 – 2000 trend = 0.72
Now it doesn’t take a genius to figure out that the trend from 1900–2000 should not normally exceed both the 1900–1949 and 1950–2000 trends, but that is exactly what the tool claims. The reason is “infilling.” With the additional spatial and temporal coverage of the current, warmer period, the 1900–2000 span provides a much larger number of older grid cells that can be “infilled” via a linear trend. The problem, of course, is that climate is cyclical, not linear, and it is utterly daft to assume that collecting more warm data in the current era justifies adjusting grid cells for which we have no data increasingly downward.
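A toy script makes the mechanism concrete. This is a sketch of the process as described above, with made-up numbers; it is not NASA/GISS’s actual infilling algorithm:

```python
import numpy as np

def infill(years, temps, missing_years):
    """Fill missing years from a straight-line fit through the data we
    do have -- a caricature of trend-based infilling."""
    slope, intercept = np.polyfit(years, temps, 1)
    return {y: slope * y + intercept for y in missing_years}

rng = np.random.default_rng(0)
known_years = np.arange(1950, 2000)
known_temps = 14.0 + 0.01 * (known_years - 1950) + rng.normal(0, 0.1, 50)
missing = [1900, 1910, 1920]          # early years with no observations

before = infill(known_years, known_temps, missing)

# A decade of new, warmer observations arrives...
new_years = np.arange(2000, 2010)
new_temps = 14.8 + 0.03 * (new_years - 2000) + rng.normal(0, 0.1, 10)
all_years = np.concatenate([known_years, new_years])
all_temps = np.concatenate([known_temps, new_temps])

after = infill(all_years, all_temps, missing)

for y in missing:
    print(f"{y}: infilled {before[y]:.2f} -> {after[y]:.2f} "
          f"({after[y] - before[y]:+.2f})")
# Adding warm recent data steepens the fitted line, so the calculated
# values for the missing early years come out cooler than before.
```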
But the joke may be on them in the end. If we go through a cooling cycle, and it is looking like we will, then as temps decline the linear trend fitted across all the data will flatten, causing the infilling process to calculate increasingly warmer temps for the missing data from the past.
I’m betting that when that starts to become apparent they will declare an update to the methodology.
======================================================================
I wouldn’t say “anecdotal”. You didn’t say where you live or lived but you remember what it was like and realize that it doesn’t match up to what’s being pushed. Not that age is a guarantee of wisdom or honesty (There are a lot of old snakes out there.), but wisdom with age and honesty is not something to be ignored.
there is no possible rationale for TOBS adjustment of min/max data.
get real, nick. it doesn’t matter what anybody writes about it.
Nick Stokes says:
If that’s a real question, then it can only be answered by finding out why they made the adjustments. They have said plenty about it.
—-
Really?
What have they said specifically about the adjustments made to stations mentioned in this post? And why exactly they were made?
Please enlighten us, Nick.
David Hoffer, thanks for that lucid and helpful (to a definite non-genius) explanation of what I had only a moderately firm grasp on till now. LOL, imagine if the early sea-chart drawers or map makers had used that kind of “thinking.” Taking a bearing from, say, the Horn of Africa, then a point or two northward, ending with one at Gibraltar, they would have had ships sailing full ahead right into Côte d’Ivoire! And they would have been out of business in a hurry.
IPCC (Intn’l Pseudoscientists and Climate Crooks) — your days are numbered.
“Mene, mene, tekel, upharsin… .”
=======================================================================
How would TOBS produce 7 new record highs for 2010 where I live? At most it would change which day the record was “set”.
http://wattsupwiththat.com/2013/09/25/unwarranted-temperature-adjustments-and-al-gores-unwarranted-call-for-intellectual-tyranny/#comment-1426755
Gunga Din, right on! You are so cool to support Allen, and so correct. Anecdotal evidence is not “no evidence.” I think that you, Allen, are likely a more reliable data source than the above-discussed “data” sets. And I KNOW my great-grandfather’s daily weather journal (admitted into evidence in a court case, even), jotted down faithfully and honestly in his careful geological surveyor’s manner, was FAR more accurate.
As has been said in other contexts, data infilling is “precision without accuracy.”
An excellent and compelling post. Thank you, Jim Steele. There is interrogating the data, and then there is torturing it!
I would be more reluctant than some here to attribute this process to fraud. We know from numerous areas of science that preconceptions can colour the way in which research is conducted. As Spencer Weart said, climate science is a collective undertaking, requiring large teams, so it is necessarily socially constructed; but that doesn’t mean it is only socially constructed.
While there are undoubtedly some instances of fraudulent behaviour in climate science, it is the ‘overriding’ of the usual quality assurance processes that has allowed mistakes to go uncorrected. Many of these institutional factors have become politicised: the undermining of anonymous peer review by ‘virtual’ communities of self-appointed gatekeepers, and the empowering of consensus through the IPCC, have all played a part. There are others (reliance on models, noble causes, funding availability, etc.) that have corrupted the scientific process and created a ‘value slope’ (to use a term of the philosopher of science Iain Boal).
It’s all coming unstuck, as models tuned to errant data sets go awry. It has been a costly lesson, not just because of the billions spent on research that is no better in its predictive efficacy than 30 years ago, but because of the costs of the wrong policies. The best example here in Australia is probably the Wonthaggi desalination plant, built as a public-private partnership at a cost now north of $5 billion, helped (ironically) by delays caused by rain: the state of Victoria will force consumers to pay each year for water they don’t need.
People and their governments will not be kind, but we might hope that from the wreckage we can extract the valuable knowledge on the PDO, ENSO, etc. that might allow better decisions in future (by governments, farmers and other risk managers).
AND…. Gunga Din, to continue, #(:)) (just saw your 8:28pm post) I’ve not complimented you on this before, but WAY TO GO with that historic research and detective work. That you kept such fine records of all that is impressive. Thanks for the several times (and keep it up, too — lots of new readers all the time) you have shared on WUWT about how those historic temps just — WHOA! — changed…. . (cue: Twilight Zone music… nuh, nuh, NUH, nuh — nuh, nuh, NUH, nuh…) {0.0}
The data is the data, until it is adjusted. Then it’s no longer absolute data, but climate model modelled data.
By specifying a climate model, you have established theoretical relationships between actual observing stations. By adjusting the raw data, you can decrease your model hindcast uncertainty. If your model forecast blows up after 17 additional years of new collected raw data, then your model either sucks, or you adjusted the data too much, trying to squeeze a spurious postulated signal from too much uncertainty in the raw data.
So Professor Steele is right. GISS output (and its ilk) is useless for many/most/all real-world applications. Real-world applications using actual observations have more uncertainty in their forecasts, leading to more modest claims of new scientific knowledge. Well, in the military-industrial-educational-scientific complex, funded by We the People’s taxes, modest claims don’t bring home the bacon.
I agree with Nick Stokes too. There was nothing wrong with the underpinnings of GISS and its ilk. Until, of course, this d*** pesky warming pause occurred, which the models didn’t predict. I.e., it’s a beautiful theory; it’s a shame it doesn’t predict reality.
Live and learn. Try some stuff, and if it doesn’t work, then it means you learned something and you’re smarter now. Time to try something else. But if you keep doing it, then it means you’re stupid.
Nick Stokes says:
September 25, 2013 at 6:27 pm
“I’m saying is that the adjustments were made, by stated algorithm, for reasons that have been extensively described, in numerous papers, and those reasons should be looked at.”
Mr. Steele did just that, looking at it in a very clear and understandable way. Not only did he identify the reasons given for the homogenization adjustment algorithm and the problem with it, he described exactly why it happens and verified it with two different methods (tree rings and unaltered data from unaltered rural stations).
He even goes out of his way to argue that it was an ‘honest’ mistake. Perhaps it was made honestly, but the net results are too convenient and very suspicious. After honest mistakes are made, honest people try to correct them rather than remain silently complicit in the falsehoods those mistakes generate.
So the issue has been looked at, and all you can say is that it should be looked at. Is Jim Steele the wrong person to look at this? Do we need some high muckity-mucks to work on this before we can say there is a problem here? It seems we have been down this road before:
“Instead of promoting more respectful public debate, he in essence suggests Americans should hate “deniers” for thinking differently than Gore and his fellow CO2 advocates. He and his ilk are fomenting a new intellectual tyranny”.
Gore’s fomenting tyranny isn’t new and it certainly isn’t “intellectual”.
He’s like Hitler making the case for the Aryan Race.
His semi-science is as crooked as can be, just like the Agenda 21 behind this hoax.
We all know what happened to Hitler and his Third Reich, so giving it a second try by participating in the creation of a tyrannical One World Government is just plain stupid.
The day will come when he will be charged with “the initiation of and participation in a criminal conspiracy against humankind,” treason against the US constitution, fraud, financial crimes and racketeering.
Present at his trial will be the entire staff and board of the BBC and the Church of England, whose money was burned in failed green-tech speculations and the Chicago Carbon Exchange, and his movie “An Inconvenient Truth” will be shown as evidence of the biggest fraud in history after the banking scheme.
Nick Stokes says:
September 25, 2013 at 7:43 pm
The experts merely explaining their homogenization rationale is not enough. Nick, the post tells us that they use a computer algorithm to make the adjustments and that these aren’t always justified. An inflection caused by the PDO should not be adjusted, for example. I can see why this might have been done, given that Trenberth, an expert in these matters, didn’t even acknowledge the PDO to be a real issue, despite, or perhaps because of, skeptics trying to point out its importance for a decade at least. You know full well that WUWT and other quality science-based blogs don’t just pick up their debating points from the news media. They delve into the science, and this work has contributed much of the worthwhile climate science of the last decade. Skeptics aren’t being suitably acknowledged, but the chopping of climate sensitivity by two-thirds or more, the restoration of the LIA, the MWP and all the other WPs, and the return of natural variability to its prominent place, vindicated by nature’s own reassertion on the subject, are now becoming the new “discoveries” claimed by IPCC scientists. In the process, standards in the science have been raised beyond the competence of many proponents of CO2 warming, and papers with shoddy logic, statistics and unsupported conclusions have had to be reluctantly withdrawn because not doing so went beyond the bounds of decency. Nick, the highly regarded Trenberth argued that the PDO was skeptic baloney in 2009, but now, needing to save something after 17 years of flat temps, he is putting it forward to explain the phenomenon.
Tell me, Nick, why did the CO2 scientists accept shovelfuls of terrible papers without stepping in and criticizing them? Even Greenpeace and WWF grey literature by political scientists got accepted. Why? Because they supported the meme. This tells me why you accept, holus-bolus, the methods of altering the historical temperature series. Even you know that the 1998 “high” was manufactured by pushing the hotter 1930s temperatures down; it was even done in 1998 by an underling of Hansen (I believe it was an FOI revelation covered in a post on WUWT) because of the desperation to get the hottest, driest period of the century submerged below 1998. Nick, are you going to support the dwindling position of the CO2 control knobs until the end? Are you as strong and confident a CO2 climate-control guy today as you were, say, before 2009? Whether I get a straight answer regarding these things will tell me all I need to know about you.
Nick Stokes says:
September 25, 2013 at 6:27 pm
‘I don’t have any particular knowledge of those two stations. I’m saying is that the adjustments were made, by stated algorithm, for reasons that have been extensively described, in numerous papers, and those reasons should be looked at.’
Any links to these numerous papers, please? Thank you.
===========================================================================
Thank you, but I’m just a peon who got curious about all the Global Warming talk and thought to copy/paste a list of record temps for my area from my local NWS site into Excel back in 2007. As time went on, and after I found WUWT, I did it again every few years. If you saw my spreadsheet I doubt you’d call it a “fine record” (none of the years’ data paste cleanly into columns), but it is a record, and some of the “scratches” stand out enough that even I could see them.
When you look through the window into the sausage factory that is “climate science”, as bad as you are afraid it is going to be, it’s worse, every time. People are rightfully cautious about suggesting a conspiracy, but it is becoming difficult to explain it otherwise.
Nick Stokes:”I don’t have any particular knowledge of those two stations. I’m saying is that the adjustments were made, by stated algorithm, for reasons that have been extensively described, in numerous papers, and those reasons should be looked at.”
Short form of above: “I refuse to answer the question. I demand you admire my handwaving!”
Yep, David H., that dodgy methodology is already backfiring big time on them. Heh, heh.
Jim, congratulations on a post which explains in plain English, with easily comprehended visual aids, some concepts that are not always understood by non-scientists. I note that the thrust of your post supports our host’s forthcoming paper, which I hope to see before I die. 🙂
My knowledge of statistics and data homogenisation, such as it is, comes more from areas like human surveys and economic models. These fields are quite different to temperature data evaluation in many ways. But I learned very early on to ask the most searching, irritating, “dumb” questions about data collection and anything that was subsequently done to it. The tendency of lazy statisticians to assume that if your head was in the oven and your feet were in the freezer, your body temperature was “normal” came to light surprisingly often.
Like you with your microclimate studies, what we really wanted to know was what was happening in a specific subset of the data, as well as in the big picture. Delving into these similarities and differences can produce startling insights into crappy methodology, as in this case. Mixing good data with rubbish to spit out a number is endemic, sadly.
More power to your arm, and I hope that you will continue to post here!
Nick Stokes: “It doesn’t say that it’s GHCN, and I don’t think it is. But it does say that it is non-homogenized.”
True, my essay did not mention it was GHCN data but the paper I referenced does. You might read it instead of simply pronouncing “I don’t think it is.” In the future I encourage you to delve more deeply into the facts before simply choosing what is convenient for your beliefs.
“and thought to” {Gunga Din}
And that, my brother in the battle for Truth, is what makes you exceptional. SO WHAT if your data presentation isn’t professional looking — who cares? You can find it; that’s all that matters.
Wasn’t it Einstein who said, “… leave elegance to my tailor”?
You go, Raj (you’ve been promoted from “peon” #(:)).
Hey, Johanna,
I hope you see this — here, the nights are getting quite chilly. The swallows have flown south and the Canada geese have arrived. Soon, the Trumpeter Swans will fly in from the NW, plaintively crying, “We’re here! We’re here!” As we head into fall here (about Lat. 45 in NW corner of U.S.A.), I have often thought of you folks, esp. you, in Australia as spring is getting underway. Tell me what you are seeing (or hearing or smelling) that tells Australians that spring has begun? Sorry for the off-topic, but I so enjoyed talking about birds with you awhile back.
Hope those next-door dogs aren’t a problem!
Take care, down there,
Janice
UHI effects can be quite dramatic, as I found yesterday evening while driving along Interstate 20. In eight minutes I crossed from the somewhat built-up downtown of Shreveport to the outskirts of the suburbs east of town in Bossier City, a distance of about 7 miles. The ambient air temperature fell a whopping 10 degrees!
Excellent.
Bill H says: “Think about it. The CRU (supposed to be a repository for world climate records) adjusts the data and then dumps the raw data which is forever lost. Climate-gate exposed just what the game is.”
Bill, every political movement has many strange bedfellows. The dumping of the CRU data does indeed raise great suspicion. We taught students to keep their lab notebooks as evidence to support their results if ever challenged. That CRU experts dumped raw data flies in the face of all professional logic, and in that case I too suspect the raw data was dumped to hide other wrongdoing.
That said, I believe such fraudulent acts are carried out by a minority. The adjustments are not always similar to what I have shown, so I think the problem was a one-size-fits-all method that was tainted wherever urbanization effects were strong. However, there is a legitimate criticism that their advocacy encouraged them to blindly accept adjustments that make no sense unless you believed CO2 is the root of all climate change. Such blindness will always happen because we are all blinded by our beliefs. Only respectful debate can free us of our illusions. For me the real crime is the attempt to stifle that debate and the varied suggestions to criminalize skepticism. Science is a process of suggesting good and bad ideas that must then be rigorously tested. Trenberth, Mann and Gore and their followers are actively trying to stifle scientific debate by calling everyone who disagrees “deniers.” Gore’s call to make people “pay a price for denial” and David Suzuki’s article urging us to deny the deniers the right to deny smack of nascent totalitarian politics. They defile good science, and such tactics should be condemned by the scientific community. I suspect most scientists are troubled by such tactics but remain quiet so as not to attract the advocates’ wrath and threaten their funding.
IBSMDAHARCOWATPTIHAEATV
.
.
.
(in before Steve Mosher drops a hit and run comment of why altering the past temperatures is honest and ethical and then vanishes)