The NOAA USHCN RAW Data from Boulder, Colorado Restore the Beginning and End of the Modern Warming Regime

Guest post by Samuel I. Outcalt, Professor Emeritus of Physical Geography, The University of Michigan, Ann Arbor

Abstract: After noticing the strange time-dependent behavior of the difference between the adjusted NOAA USHCN Average Annual Temperature data and the unadjusted RAW data, the RAW data were plotted for the 1965-2011 time period. The RAW data plot revealed the start and end of the Modern Warm Regime, which lasted from 1976 (the base of the “hockey stick”) until the onset of the 21st Century. The footprint of the regime in the area is validated by mean annual geothermal data from three boreholes along Trail Ridge Road in Rocky Mountain National Park.

Background: The strange behavior of the difference between the adjusted NOAA USHCN average annual temperature [AAT] and the raw data [AATRAW] at Boulder, Colorado is presented as Figure 1.


Figure 1. The trace of the difference between the RAW and adjusted Average Annual Temperatures at Boulder, Colorado displays sectors of high-amplitude variation interlaced with long runs of uniform adjustments.

Previous investigations of the adjusted Boulder average annual temperature data indicated that the Modern Warm Regime was not visible in the integral trace of that record. This was considered unusual, as major regime events are known to produce inflections and extreme values in the integral trace of serial data (Outcalt et al., 1997). The integral trace of the USHCN Average Annual Temperature data is presented as Figure 2.


Figure 2. The Integral Trace of the Average Annual Temperature does not indicate the onset of the Modern Warming Regime.
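For readers who wish to reproduce such a trace, the integral trace is essentially a cumulative sum of deviations from the whole-series mean (the form described in section 2c of Runnalls and Oke, 2006). Below is a minimal Python sketch; the series used is a placeholder standing in for the Boulder annual means, not the actual station values.

import numpy as np

def integral_trace(series):
    """Cumulative sum of deviations from the series mean.

    Years below the whole-series mean drag the trace down, years above it
    push the trace up, so regime transitions appear as inflections
    (minima/maxima) rather than as subtle slope changes.
    """
    x = np.asarray(series, dtype=float)
    return np.cumsum(x - x.mean())

# Placeholder series standing in for the Boulder annual means, 1898-2012;
# the real values would come from the USHCN station files.
rng = np.random.default_rng(1)
years = np.arange(1898, 2013)
aat = 10.0 + 0.5 * rng.standard_normal(years.size)
trace = integral_trace(aat)
print(years[np.argmin(trace)])  # year of the deepest inflection in the trace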

In summary, the RAW and adjusted Average Annual Temperature [AAT] data over the 1898-2012 record length display interlaced sequences of large-amplitude and uniform adjustments rather than a pattern of smooth transitions, and the adjusted data do not even hint at the widespread 1976 climate regime change, which is a major feature of global climate data (see Figure 3).


Figure 3. The integral traces of three global climate series all show major inflections near 1976.

The missing 1976 onset of the Modern Warming might still be present in the integral trace of the RAW data for the 1965-2011 time period if rather heavy-handed adjustments to the AAT data had masked or attenuated the inflection.

The 1965-2011 AATRAW Data: The 1965-2011 RAW data are displayed as Figure 4.


Figure 4. The RAW data for 1965-2011.

In Figure 4, the integral trace displays a minimum in 1979, near the 1976 global climate transition, and a minimum in 1999, near the end of the Modern Warming Regime. Further, the 5-year running mean and the integral trace display downturns in 2005 and 2008. These results appear to support the hypothesis that heavy-handed adjustment had masked or attenuated the signature of the beginning and end of the Period of Modern Warming in the USHCN AAT data.
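As a concrete illustration, the turning points described above can be located with a few lines of Python; the series below is again a placeholder for the 1965-2011 RAW Boulder values, not the actual data.

import numpy as np

def running_mean(series, window=5):
    """Centered moving average over an odd-length window."""
    x = np.asarray(series, dtype=float)
    return np.convolve(x, np.ones(window) / window, mode="same")

def local_minima(values, years):
    """Years where a trace turns from falling to rising."""
    v = np.asarray(values, dtype=float)
    idx = np.where((v[1:-1] < v[:-2]) & (v[1:-1] < v[2:]))[0] + 1
    return years[idx]

# Placeholder series for 1965-2011; the trace is the cumulative sum of
# deviations from the series mean, as in the earlier sketch.
rng = np.random.default_rng(2)
years = np.arange(1965, 2012)
aat_raw = 10.0 + 0.5 * rng.standard_normal(years.size)
trace = np.cumsum(aat_raw - aat_raw.mean())

print("integral-trace minima:", local_minima(trace, years))
print("5-year running-mean minima:", local_minima(running_mean(aat_raw), years))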

It should be noted that the method of Hurst Rescaling introduced in a paper by Outcalt et al. (1997) is exceedingly robust and has been used by Runnalls and Oke (2006) to detect weather station site moves.

The method unfortunately also detects transitions introduced by data adjustment, which seems to be true in the case of the Boulder Average Annual Temperature [AAT] data.

A further indication of the footprint of the Modern Warming Regime can be found in the thermal profiles of mean annual temperatures derived from hourly observations at probes placed at 1 m intervals in three 6 m boreholes along Trail Ridge Road in Rocky Mountain National Park. The boreholes are approximately 39 miles northwest of the Boulder Weather Station at the Boulder Municipal Airport. These mean annual temperatures are extremely robust, as they were derived from probes recording at an hourly interval for a calendar year (Janke, 2011). In addition, the annual mean temperatures were calculated at all the measurement levels by the author and dated using the thermal disturbance methods published by Terzaghi (1970). These data are presented as Figure 5.


Figure 5. The Trail Ridge Mean Annual Temperature Geothermal Profiles.

Using the inflection produced by the 1976 global regime transition at 6 meters in BH2, the overlying inflections were dated at 9.3, 2.2 and 0.3 years BP. As the data were taken during a calendar year from the summer of 2010 to the summer of 2011, the inflections date from the turn of the century to 2011. The range of inflection dates may reflect artifacts introduced by local site effects (slope, exposure, winter snow cover, groundwater migration, etc.) as well as the assumption of the same apparent thermal diffusivity at all the sites.
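As an illustration of the dating step, the sketch below assumes the simple conductive scaling in which the age assigned to an inflection grows with the square of its depth, anchored by pinning the 6 m inflection to the 1976 transition. The shallow depths used are purely hypothetical, and the actual procedure of Terzaghi (1970) may differ in detail.

import numpy as np

def date_inflections(depths_m, anchor_depth_m, anchor_age_yr):
    """Assign ages to inflection depths, assuming age scales as depth**2.

    A single anchor (here the 6 m inflection pinned to the 1976 regime
    transition) fixes the proportionality constant; the same apparent
    thermal diffusivity is assumed at every level and site.
    """
    z = np.asarray(depths_m, dtype=float)
    return anchor_age_yr * (z / anchor_depth_m) ** 2

# Purely illustrative depths for the shallower BH2 inflections; the anchor
# age is roughly 2010 - 1976 = 34 years before the logging year.
print(date_inflections([3.0, 1.5, 0.5], anchor_depth_m=6.0, anchor_age_yr=34.0))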

Conclusions: The footprint of the Modern Warming Regime, absent from the adjusted record but recovered in the RAW data from the Boulder Weather Station, was most probably obscured by the heavy-handed, uncritical adjustment of the AAT data set. Hurst Rescaling is a powerful method for detecting regime transitions in climatic data. However, the method is also sensitive to uncorrected site moves and adjustment artifacts. The alteration of the Boulder RAW data set was so severe that it masked and attenuated both the onset and end of the Modern Warming Regime.

Another aspect of Hurst Rescaling is the difficulty of introducing false regime inflections into serial climate data. One would first have to alter the data integral and then differentiate the altered integral to obtain a synthetic data set. However, the differenced series would probably not remotely resemble the initial data. The perpetrator would then be faced with nearly infinite iterations to fine-tune the data.
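The inversion relied on here is straightforward: because each point of the integral trace is a running sum of deviations from the mean, first differences of the trace plus the series mean recover the data exactly. A minimal sketch with placeholder data:

import numpy as np

def series_from_trace(trace, series_mean):
    """Invert an integral trace: first differences plus the series mean."""
    q = np.asarray(trace, dtype=float)
    return np.diff(q, prepend=0.0) + series_mean

# Placeholder data: the inversion is exact, so the round trip recovers
# the original series to machine precision.
rng = np.random.default_rng(3)
data = 10.0 + 0.5 * rng.standard_normal(47)
trace = np.cumsum(data - data.mean())
print(np.allclose(series_from_trace(trace, data.mean()), data))  # True

Any localized tampering with the trace therefore reappears as anomalous year-to-year steps when the altered trace is differenced back into a synthetic series.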

 

Acknowledgments:

The author is indebted to Dr. Jason Janke of Metropolitan State College in Denver for giving him access to the rather unique borehole data from Trail Ridge Road.

References:

Janke, J. (2011) Personal communication.

Outcalt, S.I., Hinkel, K.M., Meyer, E. and Brazel, A.J. (1997) The application of Hurst rescaling to serial geophysical data. Geographical Analysis 29: 72-87.

Runnalls, K.E. and Oke, T.R. (2006) A technique to detect micro-climatic inhomogeneities in historical records of screen-level air temperature. Journal of Climate 19: 959-978.

Terzaghi, K. (1970) Permafrost. J. Boston Soc. Civil Eng. 39(1): 319-368.

Comments:
John Brookes
February 27, 2013 4:44 am

Could someone please rewrite this so that it makes sense.

tadchem
February 27, 2013 4:48 am

My grandmother (a big fan of Agatha Christie novels) taught me that liars often have to continue to lie to cover up their previous lies, much as Ms. Christie’s murderers often killed again to prevent discovery of their previous killings. In her novels, the third iteration was usually sufficient to reveal both the cover-up and the initial crime.
Regarding the detection process, Sir Arthur Conan Doyle is said to have remarked “What one man can devise, another can discover.”
Regarding the futility of mendacity, Sir Walter Scott wrote “Oh, what tangled webs we weave, When we first practice to deceive.”

JPS
February 27, 2013 4:53 am

I’m sorry but this is not terribly coherent. Fig 5 is particularly confusing - how can one data point have both a date AND a depth associated with it? Is the probe moving 6m very slowly over 35 years? If someone can explain please do.

Philip Bradley
February 27, 2013 5:12 am

1976 was when catalytic converters were mandated in the USA in all new vehicles. Other factors contributed to the decline in aerosols, and hence decreased clouds and increased solar insolation around this time, but catalytic converters caused the ‘regime shift’.

skepticjoe
February 27, 2013 5:21 am

How does “3 boreholes along…” relate to air temperature?

Ebeni
February 27, 2013 6:34 am

Does this not cry out for an FOIA on adjustments, means, methods and rationale? Is it not a federal offense to falsify a document/record? If you are going to materially change a document for purported good and proper purposes, should there not be some Federal Register announcement and public response period?

Matt Skaggs
February 27, 2013 7:03 am

“The footprint of the Modern Warming Regime that was recovered in the RAW data from the Boulder Weather Station is the probable result of the heavy handed uncritical adjustment of the AAT data set.”
I think this should read:
“The [lack of a distinctive] footprint of the Modern Warming Regime [that would otherwise be present] in the RAW data from the Boulder Weather Station is the probable result of the heavy handed uncritical adjustment of the AAT data set.”

Claude Harvey
February 27, 2013 7:14 am

Pity the poor students.

geran
February 27, 2013 7:22 am

You just got to love professorial mumbo-jumbo
You could say:
Subsequent to the reliability of the transient depletion, an alternate mode was employed to comply with the issue of conveyance. Molecular actions substantially triggered the neural communications relevant to the motor functions. Choices were available and optimal iterations revealed a combination of processes resulting in the implementation of bovinal products.
Or, you could say:
I had a cheeseburger for lunch.

Jeff Westcott
February 27, 2013 7:23 am

Since the usual chorus of commentors seems slow to react to this piece, I will observe that it almost always takes an “emeritus” professor with an apparent command of differential calculus to be willing to throw a stink bomb like this at anything to do with an official (adjusted) temperature record. Only those in or near retirement can risk going directly against the grain.

Coalsoffire
February 27, 2013 7:26 am

Would it be worth the effort to translate this post into English?

David L. Hagen
February 27, 2013 7:28 am

“FALLACIES do not cease to be fallacies because they become fashions.”

~G.K. Chesterton (“Illustrated London News,” April, 19 1930)
Works of GK Chesterton

Steve (Paris)
February 27, 2013 7:29 am

A tough read but worth it

KevinM
February 27, 2013 7:33 am

Claude Harvey, that’s what I was thinking as I read. Professors left to their own devices lose contact. As transmit over receive approaches infinity, signal over noise approaches zero.

jayhd
February 27, 2013 7:35 am

I’m only an accountant, and though familiar with some statistical analysis, I am by no means an expert. While reading this post, something jumped right out at me that sent off alarm bells. The author made several references to “adjusted data”. In fact, he used the phrase ” heavy handed uncritical adjustment of the AAT data set” in his conclusion. To me, adjustment of data without clear explanation of the scientific reasons for the adjustments, and without critical review of those adjustments by disinterested experts in the field studied, makes everything worthless, garbage in – garbage out.

Nigel Harris
February 27, 2013 7:35 am

I paid careful attention to this, but from the first sentence of the abstract (which isn’t a sentence as it has no verb) to the incoherent concluding paragraph, it made little or no sense. I have really no idea what the author is trying to tell me, or what point he is trying to make. Can someone summarise for me?
My guess is: Adjusting historical temperature records is evil. But how borehole temperature traces show this I cannot fathom!

February 27, 2013 7:39 am

Seems like we should run Hurst rescaling like a virus scan on all data before taking the trouble to work with it. On the other hand, a few boreholes in Colorado…

Gary Pearse
February 27, 2013 7:54 am

I’m a bit puzzled (or stupid) re figure 1 between the title of the chart and the legend. From the title, In my simpleton mode, I would understand that the trace simply represents the adjustments to the raw data. The ordinate gives negative to positive degrees F so it is therefore a chart of the number of degrees F that were either added or subtracted from the raw data? If so, shaping the post 1950s to 2000s data to a hockey stick is clear. However the most obvious feature of the chart at a glance, is the trimming down of the 1930s record highs which had stubbornly refused to be broken, until we got a look at emails in which GISS was chipping off a degree or so to make 1998 the all time high.
Some observations to add:
1) Given that one is constrained in making too obvious an adjustment to recent temperatures with everyone looking on (Oh I’m sure they have been agonizing over the present flat temp period we have been in for 17 years), the desired result of a hockey stick to be prominent must be largely done by adjusting the older part of the record downwards – and this is clear from the chart and your discussion that that is what was done.
2) Looking at the adjustments for 2000+, I see how cunningly provident they are. They have left themselves ~ 2 F leeway to add on to the record of future readings by simply leaving them almost unadjusted.
3) The effort to kill that pesky 1930s all time high (1936) results in the most eye-catching part of the chart.
4) A forensic point: since one doesn’t know what the future records will show, the adjustments have to have been made in the recent past. There would have been no need for adjustments in 1950 – 1980 because the same record keepers were “aiming for” a new ice age. It did,however, warm up after 1979 and by the late 80s, AGW was being born. Indeed, the adjustments probably began after the El Nino high of 1998, when the hockey stick folks (and particularly Hansen) were waiting for a messianic new world record high. 1998 was too good to pass up! Now, with the usual suspects in a desperate state and all the hullaballoo over the 17 years of no warming, be on your guard for the last 2 F to be withdrawn from savings and some chiseling off of the earlier 2000s data. BTW, I think a complete forensic evaluation of the adjustments and even a prediction that the 2 F withdrawal will be made and the early 2000s record will be trimmed slightly …, perhaps this would be too cynical.

JPS
February 27, 2013 7:54 am

“The perpetrator would then be faced with nearly infinite iterations to fine tune the data.”
I think he is on to something here… if we can force them into infinite iterations they will never complete their work?

john robertson
February 27, 2013 8:19 am

From the above comments, Its not just me, I read the posting and was feeling real stupid, so I read the comments hoping someone else had unpacked the meaning.

Hmmmmm
February 27, 2013 8:37 am

I think he might just have picked up on what “Steve Goddard” has been banging on about for years…….
http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/
I think in an academic way the post is trying to state that the NOAA temperatures don’t match local temperatures and are adjusted upwards. Oddly.

Doug Jones
February 27, 2013 9:06 am

Please explain, what the ^@#(*^ is an “integral trace”?

harvey
February 27, 2013 9:12 am

john robertson says:
February 27, 2013 at 8:19 am
“From the above comments, Its not just me, I read the posting and was feeling real stupid, so I read the comments hoping someone else had unpacked the meaning.”
I have to agree with your comments. AS far i can understand it. They have screwed the data to make it look like something they wanted to see?

February 27, 2013 9:23 am

Look at the AAT-AATRAW graph in figure 1. Where are the points of station recalibration?
At least one time, and hopefully many times, during the 1895-2011 history of the station someone in technical authority visited the station to calibrate it. When that calibration activity was finished, then the day after the station should have a temperature reading requiring zero adjustment.
Where are the zeros?!?!
We may have lost the metadata on the station maintenance. In this case it is true that absence of evidence is not evidence of absence. Someone must have calibrated that station at least once from 1895 to 1935. Yet we are to believe that the station needs a -2.3 to -2.1 deg adjustment during this period.
Hypothetically, suppose the discontinuity in late 1919 is the result of such a recalibration visit. Shouldn’t then the adjustment for 1920 be zero? Perhaps evidence can be shown that in 1935 (before a 1942 recalibration) the station was indeed too hot by 2.1 deg. A logical adjustment chart would be a saw-tooth: zero in 1920, drifting to -2.1 in 1935, returning to zero in 1936 — linear drift to a 1935 pre-recalibration error estimate?

Let me nominate the occasional “painting of a Stevenson screen” as a member of a class of events called recalibration of the temperature sensor. Other members of the class might be: weeding around the enclosure, replacement of degrading sensors, trimming of nearby trees, removal of a bird’s nest, other actions that might fall under the name “maintenance”.
A property of this “recalibration class” is that there is slow buildup of instrument drift, then quick, discontinuous offset to restore calibration. ….
BEST [by slicing at the recalibration discontinuity] will … codify two episodes of instrument drift into real temperature trends. Not only will Instrument drift and climate signal be inseparable, we have multiplied the drift in the overall record by discarding the correcting recalibration at the discontinuities. (More from Rasey-1/23/13 11:30) in“Berkley Earth Finally Makes Peer Review” (in vol 1, No. 1 of a new journal!)

Matthew R Marler
February 27, 2013 9:23 am

A further indication of the footprint of the modern warming data can be found in the
The section beginning with that segment seems extraneous to the rest of the post.
Is it possible to define “integral trace” clearly and succinctly, or do we need to read the reference? In figures 2 and 3 why does it go down and then come back up? How is figure 3 more informative than the simple plot of the 3 temperature series versus time?

February 27, 2013 9:25 am

This needed an editor. Besides the problem with English, the references are hard to locate. Janke(1911) probably means Janke(2011), but it needs an entry at the end of the article with the journal name, volume and page number to properly specify the citation.
REPLY: I don’t disagree, but I have only so many hours in the day, and some days I just can’t do everything. Yesterday was one of those days. However, in the future I’ll ask Mr. Outcalt to consider his words carefully- Anthony

kim
February 27, 2013 9:46 am

Heh, if he’s on to something, it should be easy to show it elsewhere.
==============

PaulR
February 27, 2013 9:51 am

This whole thing reads like an attempt at a hoax by a purported Prof. Emeritus. Is he real?

Paul Dennis
February 27, 2013 9:51 am

This is very poorly presented and uses confusing language. From what I can discern Professor Outcalt has simply plotted a CUSUM (CUmulative SUM of differences) chart for the data. This is an absolutely standard quality control procedure in many laboratories or industrial situations and allows one to quickly identify when processes drift. To produce the charts as presented here one simply determines the mean of all the observations in the time series. The CUSUM for the nth point in the series represents the sum of differences from the mean between the first observation and the nth observation etc. It is an extraordinarily sensitive test for when process averages change. Thus on a CUSUM chart a negative slope of constant gradient represents a mean that is less than the overall series mean. A gradient of zero represents a mean that is the same as the overall series mean, and a positive gradient represents a mean greater than the series mean.
Professor Outcalt’s chart, Figure 3, implies that the time series of temperature has three intervals:
1880-1936, when it is below the whole series mean;
1936-1976, when it approximates the whole series mean;
1976-2008(?), when the mean temperature is greater than the whole series mean.
One might say ‘so what’, this is obvious from looking at the 20th century temperature record. What Professor Outcalt seems to be arguing is because of the obvious changes in adjustment of the Boulder record it is not possible to see this ‘3-phase’ pattern (footprint in his terms) of global temperature change. I would argue that why would you expect a single temperature station to show the global pattern?
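For anyone who wants to see this behaviour numerically, here is a minimal Python sketch on synthetic three-regime data; the regime means and interval lengths are invented for illustration, not taken from the Boulder record.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic three-regime series: cooler than, near, and warmer than the
# whole-series mean.
series = np.concatenate([
    rng.normal(9.5, 0.3, 56),   # "1880-1935": below the series mean
    rng.normal(10.0, 0.3, 40),  # "1936-1975": near the series mean
    rng.normal(10.6, 0.3, 33),  # "1976-2008": above the series mean
])
cusum = np.cumsum(series - series.mean())

# The CUSUM slope within each regime is roughly (regime mean - series mean):
# negative, near zero, then positive, exactly as described above.
for name, seg in [("early", slice(0, 56)), ("middle", slice(56, 96)), ("late", slice(96, 129))]:
    y = cusum[seg]
    print(name, round(np.polyfit(np.arange(y.size), y, 1)[0], 2))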

DaninVan
February 27, 2013 10:07 am

A number of commenters have queried the point of the boreholes. I’m no scientist but it seems to me that they make a great control measurement, especially at maximum depth.
I found the migration of the temperatures at the deepest sensors fascinating. Climbing upwards in contrast to the surface temperatures’ movement goes against the current climate argy bargy…unless (finally?!) one takes into account the fact we live on the surface of a 7K C. furnace.
Might just have a wee bit of influence?

Matt Skaggs
February 27, 2013 10:29 am

Thanks Paul Dennis!

February 27, 2013 10:36 am

Tried to run this through Google Translator, but it crashed after 30 seconds! 🙂

tobias
February 27, 2013 10:52 am

About screwing the data so they can make it look like something they wanted to see.
Don’t we do that with our loved ones as well?

Silver Ralph
February 27, 2013 11:13 am

John Brookes says: February 27, 2013 at 4:44 am
Could someone please rewrite this so that it makes sense.
_______________________________
Seconded. What language was it in??
.

Ian H
February 27, 2013 11:31 am

Previous investigations of the adjusted Boulder average annual temperature data indicated that the Modern Warm Regime was not viable on the integral trace of that record.

If you change “viable” to “visible” this makes a whole lot more sense.

Another aspect of Hurst Rescaling is the difficulty of introducing false regime inflections into serial climate data. One must first alter the data integral and differentiate the integral to a synthetic data set. However, the differential would probably not remotely resemble the initial data. The perpetrator would then be faced with nearly infinite iterations to fine tune the data.

I do love the use of the word perpetrator here.

Richard G
February 27, 2013 11:32 am

Paul Dennis says:
February 27, 2013 at 9:51 am
“I would argue that why would you expect a single temperature station to show the global pattern?”
_____
I would expand this to question: why expect the HADCRUT or GISS records to reflect a global pattern. They are biased to reflect human habitation.
Landforms are approximately 30% of the earth surface, some 60 million sq. mi.
70% of the human population lives on approximately 7% (4.25 million sq. mi.) of that land. Humans show a preference for riparian and maritime habitat. The climate/temperature records are heavily weighted toward North America.
Climate is a local phenomenon, with varying degrees of scale.
Global average? What does that mean? Very little.

Anton Eagle
February 27, 2013 11:33 am

Let’s make this really simple. STOP ADJUSTING DATA!
Adjusting bad data does NOT make it good… it’s then just adjusted bad data. If this were done in my field, it would be dismissed on its face. If you don’t like the data you are working with… then get better data!

Downdraft
February 27, 2013 12:03 pm

This is probably not a fair comment, but the post reminds me of the books on primary education that my wife had to study to become a teacher. Whole volumes were filled with jargon written by professors that never set foot in an actual classroom, but which could be summed up in a couple of paragraphs of plain English.
I was unable to wade through the jargon in the post. Perhaps someone could re-write it in plain English using words and phrases in common usage by the average college graduate. It looked important and interesting, and it would be a shame if only people working in the specific field can understand it. Could it be summed up as “lots of dubious adjustments have been made to the raw data”?

February 27, 2013 12:22 pm

harvey says:
February 27, 2013 at 9:12 am
john robertson says:
February 27, 2013 at 8:19 am
“From the above comments, Its not just me, I read the posting and was feeling real stupid, so I read the comments hoping someone else had unpacked the meaning.”
I have to agree with your comments. AS far i can understand it. They have screwed the data to make it look like something they wanted to see?
*
That’s the way I read it.

February 27, 2013 12:43 pm

Downdraft says:
February 27, 2013 at 12:03 pm
This is probably not a fair comment, but the post reminds me of the books on primary education that my wife had to study to become a teacher. Whole volumes were filled with jargon written by professors that never set foot in an actual classroom, but which could be summed up in a couple of paragraphs of plain English.

Oh, it’s very fair. After twenty years being out in the work force, I decided to take on the challenge and become a teacher. I got my teaching credential four years ago (two creds actually). Some of the things we studied can sometimes be useful in the classroom, but the vast majority was nothing more than fluffed-up busy work, having no practical use, except to show how incredibly deep and intellectual education scholars are.

R2Dtoo
February 27, 2013 12:49 pm

I have been wondering for a long time if the adjusting of original temperature records- especially in one direction for long periods of time- would make it impossible for the models to properly predict the future. Maybe these guys have set themselves up for a massive fail. I would think that they would want to run the models with the original data just to see the differences generated in the long term. Maybe they did do this- and didn’t like the results?

commieBob
February 27, 2013 1:24 pm

My attempt at an explanation:
They compared the RAW data with the NOAA USHCN Average Annual Temperature data. The difference between the two is the adjustments that had been applied to the data. The adjustments are shown on Figure 1.
Because they say they found the end of Modern Warm Regime, they are saying it is over. Any subsequent warming is due to adjustments.
The borehole data is used as an honest broker somewhat similar to satellite and balloon data.

John Brookes
February 27, 2013 5:19 pm

After long consideration, I think someone is pulling a Sokal here.
That is, the post makes no sense, and was written with that intent, to see if a completely nonsense post would be published. Would the author please confirm?

ferdberple
February 27, 2013 6:00 pm

jayhd says:
February 27, 2013 at 7:35 am
To me, adjustment of data without clear explanation of the scientific reasons for the adjustments, and without critical review of those adjustments by disinterested experts in the field studied, makes everything worthless
=========
Hansen and GISS is no different than having the Director of Finance adjusting the books to make them “more correct”. Reduce the past, increase the present. The bank balance remains the same, but profits are at record levels. Collect your bonus for stellar performance.
Like Enron, things are rosy so long as everyone is making money. Gore saw the light and got out while the getting was good.

ferdberple
February 27, 2013 6:07 pm

What the author is saying is that heavy handed adjustments to try and improve climate records have the opposite effect. They mask what is really happening with climate.
Sort of like average temperature paints a false picture of the climate. A desert and a rain forest may have near identical average temperatures, but completely different climates. By processing the data, climate science has hidden the climate, not revealed it.

ferdberple
February 27, 2013 6:14 pm

DaninVan says:
February 27, 2013 at 10:07 am
…unless (finally?!) one takes into account the fact we live on the surface of a 7K C. furnace.
========
hotter than the surface of the sun and a lot closer! In scale the crust of the earth is much thinner than the skin of an apple. Imagine for a moment that the core of the apple is molten. Try and hold that in your hand.

Ian George
February 27, 2013 6:18 pm

Which one is correct for Boulder?
Suspicious records removed (no warming apparent)
http://data.giss.nasa.gov/cgi-bin/gistemp/show_station.cgi?id=425000508480&dt=1&ds=13
OR
GISS adjusted data (warming apparent)
http://data.giss.nasa.gov/cgi-bin/gistemp/show_station.cgi?id=425000508480&dt=1&ds=14
Any reason for the two seemingly different data bases?

February 27, 2013 7:09 pm

February 27, 2013 at 5:19 pm | John Brookes says:
————————–
C’mon, jb, WE know that YOU would be incapable of understanding this article whatever language it was written in … WE know that YOU merely parrot what you hear in the lunch room ‘academic’ chat at UWA. YOU don’t fool US, WE know that UWA output is synonymous with that of UEA, YOU remember the University of Easy Access where ‘Climategate’ originated.

Paul Vaughan
February 27, 2013 7:32 pm

Section 2c, p.961 = concise, efficient explanation of Hurst Rescaling:
Runnalls, K.E.; & Oke, T.R.(2006). A technique to detect micro-climatic inhomogeneities in historical records of screen-level air temperature. Journal of Climate 19, 959-978.
http://journals.ametsoc.org/doi/pdf/10.1175/JCLI3663.1
As Paul Dennis says (February 27, 2013 at 9:51 am), center the data and do a running sum.
This method probably goes back a lot further than recorded human history. It’s useful. That’s why it has endured and why people can reinvent it intuitively without ever having seen or heard of it.
Sensible serial data exploration always looks at integrals & derivatives. There’s a good reason why differential equations usually have x, x’, & x” terms. It’s called orthogonality.
It’s not always enough to look at unsorted integrals. For example terrestrial circulation in January has a totally different configuration than in July. Why ignore something so fundamental? Just because of bad conventions? That’s not sensible. Have a look at some seasonal integrals in the appendices here:
Solar-Terrestrial Volatility Weaves
There are plenty of other examples where sorting matters. The explorer who ignores fundamentally important conditional dependencies — particularly if they are nonlinear — is guaranteed to fall victim to paradoxical misinterpretation.
common sense about paradox = rare & valuable

February 27, 2013 10:42 pm

John Brookes says:
February 27, 2013 at 4:44 am
Could someone please rewrite this so that it makes sense.
john robertson says:
February 27, 2013 at 8:19 am
From the above comments, Its not just me, I read the posting and was feeling real stupid, so I read the comments hoping someone else had unpacked the meaning.

Whew! Thought it was me.

February 27, 2013 11:16 pm

RE: Hurst Rescaling
What would a saw-tooth time series look like after Hurst Rescaling?
Let’s say that it is a drift towards warmer (Stevenson Screen paint getting old) over a couple of years, then it gets repainted.
The Hurst transform (Q) would create a scalloped shape with sharp points upward and rounded U shapes between repaintings. The bottom of the U is NOT a transition!
The first derivative (Q’) is just the time series translated by the mean, i.e. the saw-tooth.
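A minimal numerical sketch of that thought experiment, assuming the centred running-sum form of the transform and idealized drift numbers, bears this out:

import numpy as np

# Idealized drift/repaint series: the sensor reads progressively warmer for
# four years, then maintenance resets the error to zero and the cycle repeats.
cycle = np.array([0.0, 0.5, 1.0, 1.5])
sawtooth = np.tile(cycle, 10)               # 40 "years" of drift and reset
q = np.cumsum(sawtooth - sawtooth.mean())   # the centred running sum (Q)

# Sharp peaks of Q sit at the drift/reset boundaries, while the rounded
# minima sit inside each drift interval, so the bottom of the "U" does not
# mark a real transition, as noted above.
peaks = np.where((q[1:-1] > q[:-2]) & (q[1:-1] > q[2:]))[0] + 1
dips = np.where((q[1:-1] < q[:-2]) & (q[1:-1] < q[2:]))[0] + 1
print("reset years:", np.arange(0, sawtooth.size, cycle.size))
print("Q peaks    :", peaks)
print("Q minima   :", dips)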

richard verney
February 28, 2013 12:23 am

Philip Bradley says:
February 27, 2013 at 5:12 am
1976 was when catalytic converters were mandated in the USA in all new vehicles. Other factors contributed to the decline in aerosols, and hence decreased clouds and increased solar insolation around this time, but catalytic converters caused the ‘regime shift’.
//////////////////////////////////////////////////
The introduction of catalytic converters may have been a factor, but also power generation was being cleaned up as from the 1970s and that may well have been more of a factor. Generally, because of a combination of factors, there were clearer skies and less solar dimming as from the 1970s onwards and these clear skies may have played a part in the observed late 1970s to late 1990s warming.
But what now of the pollution put out by China, India and Brazil? What evidence is there of a return to solar dimming?

richard verney
February 28, 2013 12:33 am

ferdberple says:
February 27, 2013 at 6:07 pm
What the author is saying is that heavy handed adjustments to try and improve climate records have the opposite effect. They mask what is really happening with climate.
Sort of like average temperature paints a false picture of the climate. A desert and a rain forest may have near identical average temperatures, but completely different climates. By processing the data, climate science has hidden the climate, not revealed it.
///////////////////////////////////////////////////////////////////////////////
The concluding sentence is a very perceptive comment.
I frequently hark on about the inappropriateness of using averages and how this hides and distorts the real position. One thing that you can be fairly sure about is that in the real world, the average condition is rarely encountered.
It is often said that models do not do well when making predictions/projections on a regional basis but, it is further claimed, they are quite OK for an assessment of the global position. One reason why models do not perform well on a regional basis is that far too often averaged data is fed in. Madness, since climate is local not global (and that is why there are different climatic belts throughout the globe, not just one). The only global climate is that planet Earth is presently in an interglacial period.

wayne Job
February 28, 2013 4:11 am

This may be easier to understand if it were translated into pidgin English and rewritten by Willis into a commentary on the vagaries of scientific English. I feel a little lost, like Alice in Wonderland.

phlogiston
February 28, 2013 3:49 pm

This is a very important study. It points to key features of recent climate history – e.g. a recent termination of a warming phase, and establishes a useful method for detecting adjustment of climate records, an issue in which many here affect to have a strong interest. So folks should not attack it too much for using dense scientific language. Some scientists find it natural to convey scientific information as it is, rather than dressed up in pink lace.
Hurst is emerging as quite an important dude in climate – he gets his name on fractal analysis of climate, and now on a method to sniff out fraudulent tampering with the temperature record. Respect.