Abrupt changes in GHCN station-level temperature records contradict the anthropogenic global warming (AGW) claims.

Guest post by Jens Raunsø Jensen

Preamble

Inspired by a statement by Dr. Kevin Trenberth in the e-mails referred to as Climategate 2.0 (#3946 discussed here), it is hoped that climate scientists will have “an open enough mind to even consider” that the global warming of the 20th century could have occurred mainly as abrupt changes in mean temperature linked with natural events. Observational data supports that claim, at variance with the AGW “consensus view”.

Summary

Abrupt or step changes in temperature regime have been the subject of many discussions on this and other blogs and in the peer-reviewed literature. The issue is not only statistical. More importantly, any presence of major step changes in mean temperature regime may contradict the claims of AGW theory and models, i.e. the claims of increasing and accelerating temperature and of human emissions of GHGs being the major cause of the relatively high temperatures in the second half of the 20th century.

In this post, 232 complete and unadjusted GHCN station records are analysed for step changes in the period 1960-2010, and it is argued that:

  • Abrupt changes in temperature linked with natural climate events may be widely responsible for the “global warming” during the second half of the 20th century.
  • 50% of sample stations have not experienced increased mean temperature (“warming”) for more than 18 years.
  • 70% of Europe stations have not experienced warming for more than 20 years.
  • The relative role of natural processes in global warming is very likely underestimated by the IPCC.
  • The global average temperature curve is “apples and oranges” and is widely misinterpreted using linear trend and smoothing techniques as indicating a pattern of widespread uniformly increasing temperature.

Objective and methodology

This post continues my earlier post on the subject (http://wattsupwiththat.com/2011/08/11/global-warming-%e2%80%93-step-changes-driven-by-enso/ ), now extending to a near-global station-level analysis. It is based on a ppt presentation, with additional details, given at a researchers’ workshop at the University of Copenhagen, 15th November 2011 (http://www.danishwaterforum.dk/activities/Researchers_Day_Climate_Change_Impact_2011.html ).

The objective of this analysis has been (i) to examine the land-based temperature records at station and higher levels for the presence of step changes during the period 1960-2010, and (ii) to assess the implications for our understanding of global warming during that period. Please note that the objective has not been to dismiss a (likely) presence of an anthropogenic warming signal, or to establish a climate model, or to make projections for the future. The issue is step changes in observational data during 1960-2010.

I have used the documented Regime Shift Detection tool of Rodionov (2004, 2006; www.beringclimate.noaa.gov/ ). The results are considered to be statistically robust (ref. the ppt presentation for details on parameter settings and a verification of the assumptions of constant variance and a likely negligible influence of autocorrelation).
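
To make the core idea concrete, here is a minimal sketch (in Python; this is not Rodionov's actual STARS code, which uses sequential updating, outlier weighting and red-noise corrections): at each candidate change point, compare the means of a fixed window before and after with a two-sample t-statistic and flag the strongest shift. The window length of 10 here is an assumed stand-in for the STARS "cut-off length" parameter, and the data are synthetic.

```python
import math

def mean_shift_t(series, window=10):
    """At each candidate change point, compare the `window` values before
    and after with an equal-variance two-sample t-statistic and return
    the point with the largest |t|.  A drastic simplification of STARS."""
    best_idx, best_t = None, 0.0
    for c in range(window, len(series) - window + 1):
        a = series[c - window:c]
        b = series[c:c + window]
        ma, mb = sum(a) / window, sum(b) / window
        va = sum((x - ma) ** 2 for x in a) / (window - 1)
        vb = sum((x - mb) ** 2 for x in b) / (window - 1)
        se = math.sqrt((va + vb) / 2) * math.sqrt(2.0 / window)
        if se == 0.0:
            continue
        t = (mb - ma) / se
        if abs(t) > abs(best_t):
            best_idx, best_t = c, t
    return best_idx, best_t

# Synthetic record: a small deterministic wiggle plus a 1-degree step at index 30
series = [0.1 * ((i * 7) % 5 - 2) + (1.0 if i >= 30 else 0.0) for i in range(60)]
idx, t = mean_shift_t(series)
print(idx, round(t, 1))  # the detector recovers the step at index 30
```

The point of the illustration is only that a step in the mean produces an extreme t-statistic at the true change point; the statistical caveats (normality, independence) discussed in the comments below apply to this sketch just as much as to the real tool.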

The station-level data are from GHCN (“after combine”, http://data.giss.nasa.gov/gistemp/station_data/ ) and include ALL stations with a complete record in the period 1960-2010 in broadly defined sampling regions (ref. Fig. 1).

A total of 232 stations were identified, with 54% located in Europe and Russia. The sampling criteria result in wide differences between the “regions” in terms of station number, density and distribution. Also, the “regions” are only more or less homogeneous climatologically. However, this is not of material importance for the following discussion and conclusions.

Fig. 1. Distribution of sample stations according to sampling criteria.

Results

Significant step changes are widely found in the T-records, and representative examples from three “regions” are shown in Fig. 2a-c. The temperature increase in the steps is typically comparable in size to the often-quoted global warming of the 20th century.

Fig. 2a. Alaska T-anomaly (n=9). Step, 1977; T-change = 1.5 °C; significance 0.000001

Fig. 2b. Fichtelberg, Europe. Step, 1988; T-change = 1.0 °C; significance 0.00009

Fig. 2c. Malacca, South-East Asia. Steps: 1978, 1990 and 1998; T-change = 0.4+0.3+0.4 = 1.1 °C; significance, 0.0004, 0.0007 and 0.003.

Warming during 1960-2010 was clearly a non-linear process at station level, with the step pattern differing among the “regions”. The global average T-anomaly curve, constructed by averaging across station-level T-anomaly curves, is therefore highly deceptive in propagating a message of near-linearly increasing temperatures, contrary to the actual processes at station level. Thus, the global T-anomaly curve is inherently “apples and oranges” and cannot be used to identify a meaningful global AGW trend if the step changes are neglected; otherwise the apparent AGW trend will in reality mainly capture the aggregated effect of the sudden step changes (as e.g. in Foster and Rahmstorf, 2011).
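
The “apples and oranges” point can be illustrated numerically: averaging many hypothetical station records that each warm in a single abrupt step, with the step years spread across the record, produces an aggregate curve that a linear fit describes almost perfectly, even though no individual station warms gradually. A minimal sketch (entirely synthetic data; the deterministic spreading of step years is just a convenience):

```python
# 100 hypothetical stations, each flat at 0 until a single 1-degree step,
# with step years spread (deterministically here) across 1960-2010.
years = list(range(1960, 2011))
step_years = [1960 + (s * 37) % 51 for s in range(100)]
avg = [sum(1.0 for sy in step_years if y >= sy) / 100 for y in years]

# Ordinary least-squares slope of the *averaged* curve
xm = sum(years) / len(years)
ym = sum(avg) / len(avg)
slope = sum((x - xm) * (v - ym) for x, v in zip(years, avg)) \
        / sum((x - xm) ** 2 for x in years)
print(round(slope * 50, 2))  # roughly 1 degree of apparent "linear warming" over 50 years
```

The averaged curve rises smoothly at about 0.02 degrees per year, so a trend analysis of the aggregate alone cannot distinguish "all stations warm linearly" from "each station takes one step at a different time".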

The steps are concentrated in few short periods. Disregarding 39 steps after 2005 (considered highly uncertain and “in progress”; 2/3 ups and 1/3 downs), it is found that:

  • The steps occur predominantly (58%) in three 3-year periods: 1977/79, 1987/89 and 1997/99 (Fig. 3).
  • 72% of all stations, and more than 50% of stations in each “region” (except Arctic), have one or more steps during these periods (e.g. 89%, 56% and 93% of Europe, Russia and South-East Asia stations, respectively; Fig. 4).
  • 78% of Europe stations have a step change in 1987/89, during which the major part of the entire warming of the 2nd half of the 20th century apparently took place.
  • 2 or 3 steps are common in South-East Asia (especially 1987/89 and 1997/99), but one step only is common in records from Alaska (1977/79), Europe (1987/89) and Russia (1987/89).

Fig. 3. Distribution of step changes by year of change.

Fig. 4. Percent of stations with one or more steps in indicated 3 periods.

Similar step changes are identified in national average records (ref. link to presentation above): US contiguous 48 states (GISS): 1986 and 1998; Australia (BOM): 1979 and 2002; and Denmark (DMI): 1988. The steps in the Global T-records are: Crutem3gl: 1977, 1987 and 1998; GISS L/O: 1977, 1987 and 1998; and Hadcrut3: 1977, 1990 and 1997.

The steps are statistically highly significant. But are they supported by a probable physical cause? The answer must be yes for the majority of steps. The steps occur in a temporal and spatial pattern coinciding with well-documented events and regime changes in the ocean-atmosphere system:

  • 1976/77: the great pacific shift from a “cold” to a “warm” mode (e.g. Trenberth, 1990; Hartmann and Wendler, 2005).
  • 1987/89 and 1997/99: the two clearly most intense El Niños of the period, 1986/88 and 1997/98, with the intensity here defined as event-accumulated nino3.4 anomalies (NOAA’s ONI index); there were two less intense events in 1982 and 1991, whose impact was probably masked by the major eruptions of El Chichón and Mt. Pinatubo.
  • A regime shift in NH SST in 1988/89 (Yasunaka and Hanawa, 2005).
  • A new regime of constant temperature after the 1997/98 El Niño, i.e. the now widely accepted “hiatus” in global warming.
  • Documented step changes and regime shifts in marine ecosystems, e.g. the late 1980s in Europe and in the Japan/East Sea.
  • The short-term regionally diverse global impact of ENSO events is generally well-known.

The empirical evidence, from this station level analysis and other sources, is unequivocal: the step changes in mean temperature are likely real and associated with natural events. The physical mechanisms remain to be understood, and this is certainly not to claim that ENSO events are the only elements of the natural cause-effect chain.

It is therefore concluded that the major part of the temperature change (global warming) in the 2nd half of the 20th century occurred as abrupt changes in mean temperature associated with natural events in the ocean-atmosphere system. Still, a warming/cooling trend – albeit relatively small compared with the step changes – could of course be hidden by the regime change model. But it seems inconceivable that steadily increasing CO2 levels could be responsible for the major sudden changes observed e.g. in Alaska in 1977, Europe in 1988 and South-East Asia in 1998. In principle, the natural events and step changes could have been amplified by human-caused warming, but this is currently pure speculation.

Implications when accepting the presence of steps

“Increasing temperature and accelerated warming”: this study does not support general statements like that. The bulk of the “global warming” has likely taken place in abrupt steps, and 50% of the stations analysed have not experienced any significant warming for more than 18 years (Fig. 5). In Europe, 70% of the stations have not experienced significant change in mean temperature for more than 20 years.

In South-East Asia, the median value is 13 years as many stations here also experienced a step change in 1997/98 (Fig. 4).

Fig. 5. Years of constant T-mean prior to 2010. Box-Whisker plot, 1st and 3rd quartiles. (note: uncertain up and down step changes during 2006-2010 are disregarded).

This challenges the IPCC consensus view, i.e.: “Most of the observed increase in global average temperature since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse-gas concentrations”. The finding above, that abrupt changes linked with natural processes likely account for most of the increase in temperature during 1960-2010, contradicts the IPCC claim regarding the relative importance of natural and human causes. Thus, when the IPCC (AR4) can only reproduce the T-curve by including GHG effects, then logically

  • either the IPCC GCM models do not adequately model the natural processes of high significance for the temperature variations (there is still low confidence in the projection of changes in the ENSO variability and frequency of El Niños, ref. the recent SREX-SPM IPCC report),
  • and/or the IPCC has overestimated the climate sensitivity to CO2 changes by e.g. attributing natural temperature increases to CO2-induced feedback processes.

In either case, the relative importance of natural processes for the T-changes has likely been underestimated by the IPCC.

Conclusion

This study has established that step changes in land-based temperature records during 1960-2010 are common and very likely real and linked with natural climate events. The step changes are statistically highly significant and with a systematic yet regionally diverse pattern of occurrence coinciding with major climate events and regime shifts. This finding has far reaching consequences for our analysis of climate records and for our assessment of global warming.

Thus, although many different statistical models can be applied to explore the pattern of T-change, the presence of step changes invalidates the widely used statistical techniques of linear trend and smoothing as means of identifying the pattern of temperature variation during 1960-2010.

Furthermore, the step changes account for the main part of the temperature changes during the 2nd half of the 20th century. The logical consequence is that natural processes have been the major cause for the temperature change during this period, leaving a secondary role to other causes such as the anthropogenic greenhouse effect.

January 5, 2012 1:28 pm

Some of the ‘steps’ will also be splice artifacts from the ‘homogenizing’ process.
There are assumptions made when segments from different thermometers (some with the same name but slightly different locations or hardware) are merged to make a single “record” of a “place”.
I demonstrated how this can happen in the data in:
http://chiefio.wordpress.com/2010/04/03/mysterious-marble-bar/
which takes the individual station data and, one step at a time, merges the (individually flat) segments into a longer-term rising “trend” (that matches the “trend” found by GIStemp). I then developed a method to look at the data without that splice artifact and found much different results from GISTemp. (The dT/dt method, to the right of that link in categories.)
Since then they have eliminated individual station data from GHCN and moved the “splicing” out of GIStemp and “upstream” to inside NOAA / NCDC (where, one presumes, they hope it will not be as visible…)
IMHO, some of the timing of when particular thermometers come into and leave the system is curiously synchronized with some of those natural cycles and can help facilitate the occurrence of such “splice artifacts” from the homogenizing process.
So while I’m pretty sure a lot of the step function is from natural causes, I suspect that is then reflected (perhaps with added ‘vigor’) in the temperature record via the way different thermometer segments are spliced to create a fictional record of longer duration.
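
The splice-artifact mechanism described in this comment can be shown with toy numbers: two hypothetical instrument segments, each reading a flat climate but with different calibration offsets, manufacture a spurious warming trend when naively concatenated, while a first-difference (dT/dt-style) approach that never differences across the splice recovers the flat climate. This is a sketch of the concept only, not of GIStemp's or NCDC's actual processing:

```python
# Two hypothetical instrument segments at the "same" site, each reading a
# flat climate (mean offsets 10.0 and 10.6 from different calibrations).
wig = lambda i: 0.05 * ((i * 3) % 4 - 1.5)          # small deterministic wiggle
seg_a = [10.0 + wig(i) for i in range(25)]
seg_b = [10.6 + wig(i) for i in range(25, 50)]
spliced = seg_a + seg_b

# Naive OLS trend over the spliced record
n = len(spliced)
xm, ym = (n - 1) / 2, sum(spliced) / n
sxx = sum((i - xm) ** 2 for i in range(n))
naive_slope = sum((i - xm) * (v - ym) for i, v in enumerate(spliced)) / sxx

# dT/dt-style: average year-over-year differences, never across the splice
diffs = [seg_a[i] - seg_a[i - 1] for i in range(1, 25)] + \
        [seg_b[i] - seg_b[i - 1] for i in range(1, 25)]
dtdt_slope = sum(diffs) / len(diffs)
# naive splice shows ~0.9 degrees of spurious "warming"; dT/dt shows none
print(round(naive_slope * 50, 2), round(abs(dtdt_slope) * 50, 2))
```

The splice both adds a trend and plants a step exactly at the segment boundary, which is why a step detector cannot by itself distinguish a natural regime shift from an instrument change.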

Richard M
January 5, 2012 1:34 pm

One reason we might only see step ups has to do with how the planet manages heat. If the method is absorption into the oceans and infrequent releases of that energy, then you would expect the increases to be in steps. If the cooling is just the lack of heat injections then it would be slow/steady and we would not see steps down.
Not to say there couldn’t be a stepwise cooling mechanism, it just may not be active in our current climate.

clipe
January 5, 2012 1:40 pm

Urederra says:
January 5, 2012 at 8:07 am
“I recall reading one article written by Ross McKitrick where he shows a temperature graph with a big step just when the thermometers in the canadian surface stations were changed from mercury/alcohol to thermocouples. I may be recalling the whole thing wrong though.”
Is this the one?
http://www.uoguelph.ca/~rmckitri/research/nvst.html

January 5, 2012 2:10 pm

Interesting approach and it addresses one of my major problems with the whole process of deriving a “global average temperature” — you lose so much potentially interesting information in creating the average. I believe it was on this blog someone commented ” global average temperature fits the earth’s climate like a burlap sack fits Paris Hilton”, or words to that effect. In point of fact a burlap sack will fit Paris Hilton, but it will also hide all the most interesting regional departures from the average shape.

Jim G
January 5, 2012 2:12 pm

If this sediment core data is at all accurate then temperature has been trending down and becoming more chaotic for the last several million years. How anyone can say anything about a trace gas having any recent effect is patently ridiculous. (Sorry, the chart was found on wikipedia.) Seems I saw a similar graphic of ice core data in Unscientific American last year or so that showed the opposite, with temperature more stable over the last 3 million or so years. I argue that climate is chaotic with too many variables to predict, particularly over the short (thousands of years) term.
http://en.wikipedia.org/wiki/File:Five_Myr_Climate_Change.svg

Jan de Ruiter
January 5, 2012 2:21 pm

I have a small technical question: your post keeps mentioning “significance”, but how do you define significance here? What probability is your p-value exactly representing? I ask, because the use of “traditional” significance measures does not make sense in this context, and these measures cannot be used to evaluate “model fits”, which I assume is what you are after. Any background information or reference to papers that contain this information are welcome. Thanks in advance.

londo
January 5, 2012 3:16 pm

Have to agree with JT here. The homogeneous solutions to the earth climate system may have discrete structure that is being probed by the forcing caused by CO2. However, if the climate models are worth anything, they should have seen this structure. The question is only how long it takes before they do and how ugly the fix will be to make them do so.

R. Gates
January 5, 2012 3:18 pm

Justin K says:
January 5, 2012 at 12:45 pm
@R. Gates said:
And in respect to my going to the same “mantra”…it is the nice thing about the laws of physics that they are so consistent.
——————————–
Ahhh, but that is not technically accurate as the laws of physics are not consistent as they are applied differently depending on the scale, i.e. large bodied objects’ behavior can be predicted by classical Newtonian physics and small bodied objects’ behavior can be predicted by quantum physics.
____
Though we’ve not yet found the perfect unification in the laws of quantum effects and gravity, we certainly will someday as such a unification certainly does exist, and it will be as consistent as the very existence of stars, galaxies, and all the rest of the wonderful things that make up this universe. It takes both quantum effects and gravitational effects to allow you to even exist here on earth, so feel glad for that very stable consistency.

Theo Goodwin
January 5, 2012 3:49 pm

Alan Statham says:
January 5, 2012 at 6:17 am
“This is laughable. Your starting premise – “any presence of major step changes in mean temperature regime may contradict the claims of the AGW theory and models” – is wrong. You’ve used an arbitrary subset of the available data. Your methodology consists of drawing arbitrary lines on that data. Your conclusions, as a result, are wildly erroneous.”
Apparently, you are nominating the author for a lead author position with the IPCC. Good judgement!

R. Gates
January 5, 2012 3:51 pm

Philip Bradley says:
January 5, 2012 at 1:15 pm
In recent decades global aerosol levels have declined, which means the aerosol forcing has been in the same direction as GHGs (ie warming).
_____
Actually, this is not true for the period after 2000. Stratospheric Aerosols have shown an increase over the past 10 years or so, at least partially off-setting some of the forcing caused by the increases in greenhouse gases. See:
http://www.noaanews.noaa.gov/stories2011/20110721_particles.html

Theo Goodwin
January 5, 2012 3:54 pm

JT says:
January 5, 2012 at 6:17 am
“Lets not forget that a complex, chaotic, non-linear system like the climate system might respond to a steady anthropogenic forcing with discrete (stepwise) state changes.”
Warmists bring up this idea of a chaotic system constantly but only in defense of their other theories. They never manage to explicate it or show that it is well confirmed. The modelers do not address a chaotic system. The paleoclimatologists do not address a chaotic system.
Who among climate scientists addresses a chaotic system? What confirming evidence do they provide?

Theo Goodwin
January 5, 2012 4:02 pm

thepompousgit says:
January 5, 2012 at 8:17 am
Thanks for the info. Please give us a reference to get us started on this particular topic.

A physicist
January 5, 2012 4:13 pm

Jan de Ruiter says: I have a small technical question: your [Jensen’s] post keeps mentioning “significance”, but how do you define significance here? What probability is your p-value exactly representing? I ask, because the use of “traditional” significance measures does not make sense in this context, and these measures cannot be used to evaluate “model fits”, which I assume is what you are after. Any background information or reference to papers that contain this information are welcome. Thanks in advance.

Jan makes a good point. The tool that Jensen is using (the Regime Shift Detection tool) tests the null hypothesis that the data points are: (1) normal, (2) stationary, and (3) independent.
However, even in the absence of regime shifts (“steps”) we have reason to anticipate that the data-set being analyzed is non-normal, non-stationary, and non-independent, and so we expect that the statistical significance values assigned by the Regime Shift Detection tool will be fragile (not robust).
As a common-sense example, in Figure 2a the claimed significance is “0.000001”, and yet the unaided eye has great difficulty seeing any step changes in the data at all.

January 5, 2012 4:28 pm

R. Gates said January 5, 2012 at 3:18 pm
“Though we’ve not yet found the perfect unification in the laws of quantum effects and gravity, we certainly will someday as such a unification certainly does exist, and it will be as consistent as the very existence of stars, galaxies, and all the rest of the wonderful things that make up this universe.”
R Gates, you seem to know more about these things than Stephen Hawking. When did you receive your Nobel prize in Physics?
Quoting Hawking:
“But I think that quantum theory and gravity together, introduces a new element into the discussion that wasn’t present with classical Newtonian theory. In the standard positivist approach to the philosophy of science, physical theories live rent free in a Platonic heaven of ideal mathematical models. That is, a model can be arbitrarily detailed and can contain an arbitrary amount of information without affecting the universes they describe. But we are not angels, who view the universe from the outside. Instead, we and our models are both part of the universe we are describing. Thus a physical theory is self referencing, like in Godel’s theorem. One might therefore expect it to be either inconsistent or incomplete. The theories we have so far are both inconsistent and incomplete.
Quantum gravity is essential to the argument. The information in the model can be represented by an arrangement of particles. According to quantum theory, a particle in a region of a given size has a certain minimum amount of energy. Thus, as I said earlier, models don’t live rent free. They cost energy. By Einstein’s famous equation, E = mc squared, energy is equivalent to mass. And mass causes systems to collapse under gravity. It is like getting too many books together in a library. The floor would give way and create a black hole that would swallow the information. Remarkably enough, Jacob Bekenstein and I found that the amount of information in a black hole is proportional to the area of the boundary of the hole, rather than the volume of the hole, as one might have expected. The black hole limit on the concentration of information is fundamental, but it has not been properly incorporated into any of the formulations of M theory that we have so far. They all assume that one can define the wave function at each point of space. But that would be an infinite density of information which is not allowed. On the other hand, if one can’t define the wave function point wise, one can’t predict the future to arbitrary accuracy, even in the reduced determinism of quantum theory. What we need is a formulation of M theory that takes account of the black hole information limit. But then our experience with supergravity and string theory, and the analogy of Godel’s theorem, suggest that even this formulation will be incomplete.
Some people will be very disappointed if there is not an ultimate theory that can be formulated as a finite number of principles. I used to belong to that camp, but I have changed my mind. I’m now glad that our search for understanding will never come to an end, and that we will always have the challenge of new discovery. Without it, we would stagnate. Godel’s theorem ensured there would always be a job for mathematicians. I think M theory will do the same for physicists. I’m sure Dirac would have approved.”
http://www.hawking.org.uk/index.php/lectures/91

Rex
January 5, 2012 4:34 pm

>> Rob R says:
>> January 5, 2012 at 1:24 pm
>> Step changes in average surface temperature are present in regions
>> that are not highlighted in this post. For instance there was a major
>> upward step change in New Zealand in the mid 1950′s that was
>> followed by 60 years of relatively flat trend, particularly in the
>> South Island.
I hope these trends weren’t derived from NIWA’s ridiculous Seven Station
Series, which they skite about on their website as being “representative of
New Zealand”. The four in the South Island are all (near-)coastal.

Caleb
January 5, 2012 4:37 pm

re: Leif Svalgaard says:
January 5, 2012 at 7:33 am
Why are the “steps” always up?
One Viking asked another, in Greenland in the year 1300, “Why are the steps always down?”
Answer that question and you’ll answer your current one.

Ian H
January 5, 2012 4:40 pm

I am unconvinced.
On some of these graphs a step looks like maybe it might be a better fit than a straight line. But others – especially the ones you have kitted out with multiple steps – look more like straight lines than a bunch of steps to me. You can model a straight line with steps if you use enough steps. But why should you? Where is the analysis to show that steps actually do a better job than straight lines at fitting these curves? The Mk1 eyeball just doesn’t cut the mustard in this situation.
Anthony sometimes fits a sine wave to his temperature series. But he always includes the disclaimer that this is for amusement only. Your analysis needs a similar disclaimer. Instead however you make wild claims which massively overreach the extent of your analysis.
So you fitted steps to some generally increasing graphs. It is a very long way from there to the claim that “Abrupt changes … contradict the anthropogenic global warming (AGW) claims”.
I am … highly sceptical … of your claims.
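
Ian H's question (where is the analysis showing steps beat straight lines?) has a standard answer: compare the two fits with an information criterion such as BIC, which penalizes the step model for its extra change-point parameter. A minimal sketch on synthetic data, one series with a genuine step and one with a genuine ramp, showing the criterion picks the right model in each case (the parameter counts k = 2 and k = 3 are the usual convention, and nothing here uses Jensen's actual data):

```python
import math

def rss_linear(y):
    """Residual sum of squares of the best-fit straight line."""
    n = len(y)
    xm, ym = (n - 1) / 2, sum(y) / n
    sxx = sum((i - xm) ** 2 for i in range(n))
    b = sum((i - xm) * (v - ym) for i, v in enumerate(y)) / sxx
    return sum((v - (ym + b * (i - xm))) ** 2 for i, v in enumerate(y))

def rss_step(y):
    """Residual sum of squares of the best single-step (two-mean) fit."""
    n, best = len(y), float("inf")
    for c in range(2, n - 1):
        m1, m2 = sum(y[:c]) / c, sum(y[c:]) / (n - c)
        rss = sum((v - m1) ** 2 for v in y[:c]) + sum((v - m2) ** 2 for v in y[c:])
        best = min(best, rss)
    return best

def bic(rss, n, k):
    # Bayesian information criterion for a Gaussian model with k parameters
    return n * math.log(rss / n) + k * math.log(n)

wiggle = [0.2 * math.sin(3.7 * i) for i in range(50)]   # deterministic "weather"
step_data = [(1.0 if i >= 25 else 0.0) + w for i, w in enumerate(wiggle)]
ramp_data = [0.02 * i + w for i, w in enumerate(wiggle)]

n = 50
step_wins = bic(rss_step(step_data), n, 3) < bic(rss_linear(step_data), n, 2)
line_wins = bic(rss_linear(ramp_data), n, 2) < bic(rss_step(ramp_data), n, 3)
print(step_wins, line_wins)  # True True: BIC picks the right model in each case
```

This is the kind of penalized comparison the post would need to show that its fitted steps are preferred over a trend line on the actual station records, rather than relying on the eyeball.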

January 5, 2012 4:44 pm

Theo Goodwin said January 5, 2012 at 4:02 pm
“thepompousgit says:
January 5, 2012 at 8:17 am
Thanks for the info. Please give us a reference to get us started on this particular topic.”
Horizontal gene transfer
From Wikipedia, the free encyclopedia
“HGT” redirects here. For other uses, see HGT (disambiguation).
Horizontal gene transfer (HGT), also lateral gene transfer (LGT), is any process in which an organism incorporates genetic material from another organism without being the offspring of that organism. By contrast, vertical transfer occurs when an organism receives genetic material from its ancestor, e.g., its parent or a species from which it has evolved.
Horizontal gene transfer is also the primary reason for bacterial antibiotic resistance [1][2][3][4] and this often involves plasmids.[5]. Genes that are responsible for antibiotic resistance in one species of bacteria can be transferred to another species of bacteria through various mechanisms (e.g., via F-pilus), subsequently arming the antibiotic resistant genes’ recipient against antibiotics, which is becoming a medical challenge to deal with. This is the most critical reason that antibiotics must not be consumed and administered to patients without appropriate prescription from a medical physician.[6] Most thinking in genetics has focused upon vertical transfer, but there is a growing awareness that horizontal gene transfer is a highly significant phenomenon and amongst single-celled organisms perhaps the dominant form of genetic transfer.[citation needed] Artificial horizontal gene transfer is a form of genetic engineering.
http://en.wikipedia.org/wiki/Horizontal_gene_transfer
—————————————————————————————–
There’s a bunch of references at the foot of the article. It’s also a rather neater explanation for some 30+ species acquiring the C4 process in a very short space of time, rather than invoking coincidence.

Ian W
January 5, 2012 5:48 pm

PRD says:
January 5, 2012 at 9:18 am
I’m sick and tired of being sick and tired.

I fully agree with your points on enthalpy. It seems that professional meteorologists and many on these blogs have been drawn into simplistic thinking, disregarding humidity and using ‘average atmospheric temperatures’. The post is a good one but makes the wrong argument. We should be using your argument:
“Examples: 105 F and 15% R.H. = 33 BTU/cu.ft. of air vs. 85 F and 70% R.H. = 38 BTU/cu.ft. of air”
And keep repeating the enthalpy argument back to people who can only talk in atmospheric temperatures
As I posted in the ‘Big Picture’ thread:
“Let us take a cool humid afternoon in Louisiana after a rainstorm has just stopped and the air temperature is a relatively cool 25C (77F) but the humidity is close to 100% at the same time in the Arizona desert after several weeks of drought the temperature is a really hot 38C (100F) but the air is almost zero humidity. It may come as a surprise to some that the 25C atmosphere in Louisiana holds twice the energy 76.9KJ/Kg, as the dry 38C atmosphere in Arizona 38.3KJ/Kg . If there are actually droplets of water for example a post shower mist in the Bayou then the energy content of the 25C Louisiana atmosphere is considerably greater. This is due to the enthalpy difference between saturated and dry air. (see http://www.engineeringtoolbox.com/enthalpy-moist-air-d_683.html )”
If meteorologists were to really understand enthalpy they would be pointing out that ‘averaging’ temperatures from the just post dawn minimum with high humidity with the late afternoon maximum in lower humidity is nonsensical and only exhibits ignorance of basic meteorology and atmospheric physics. The accuracy of reading the temperatures, the mathematics of the averaging of temperatures and the statistical analyses of the extracted values may be faultless – but the results are based on a totally invalid assumption: atmospheric temperature is not a metric for atmospheric heat content. It is the heat energy budget that the AGW proponents claim is the problem so they should be measuring atmospheric heat content. We should not join the ignorant climate ‘scientists’ under their lamppost – we should be pointing out their ignorance and correcting them every time they say ‘atmospheric temperature’.
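
Ian W's Louisiana/Arizona numbers can be checked with standard psychrometrics. A sketch using the Magnus approximation for saturation vapour pressure and textbook constants (cp of dry air about 1.006 kJ/kg·K, cp of vapour about 1.86 kJ/kg·K, latent heat about 2501 kJ/kg; enthalpy expressed per kg of dry air) reproduces the quoted comparison to within about 1 kJ/kg:

```python
import math

def moist_air_enthalpy(t_c, rh, p_kpa=101.325):
    """Specific enthalpy of moist air in kJ per kg of dry air.
    Magnus formula for saturation vapour pressure; standard
    psychrometric constants."""
    p_ws = 0.6112 * math.exp(17.62 * t_c / (243.12 + t_c))  # kPa
    p_v = rh * p_ws
    w = 0.622 * p_v / (p_kpa - p_v)      # kg water vapour per kg dry air
    return 1.006 * t_c + w * (2501.0 + 1.86 * t_c)

h_louisiana = moist_air_enthalpy(25.0, 1.00)  # 25 C, ~100% RH
h_arizona = moist_air_enthalpy(38.0, 0.0)     # 38 C, ~0% RH
# the cooler, humid air carries roughly twice the energy of the hot, dry air
print(round(h_louisiana, 1), round(h_arizona, 1))
```

The humidity ratio term dominates: at 25 °C and saturation, roughly 20 g of vapour per kg of dry air contributes about 51 kJ/kg of latent enthalpy, which is why the cooler air ends up with the larger heat content.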

January 5, 2012 6:15 pm

A physicist says:

Jens, the folks on SkepticalScience are questioning whether step-change models make any testable predictions. When we strip away the pointlessly tendentious SkepticalScience rhetoric (which I condemn!) we are left with reasonable questions like “If El Niños cause abrupt temperature step changes upward, why wouldn’t La Niñas cause equivalent abrupt temperature step changes downward?”
More broadly, step-change models have an unbounded number of independently adjustable steps. So why should we embrace arbitrarily-complicated step-change models, when non-step models like Foster & Rahmstorf (2011) — which are mathematically simpler and physically well-motivated — describe the overall climate-change data impressively well?

I too have my doubts about step models. Clearly, if you have any trending dataset, there will be a step model that approximates it better than a flat line. A step detector will find that model. However, there is a reason why steps might be expected. In chaos theory, there is the concept of an attractor. Briefly, a chaotic system can ‘orbit’ an attractor (a set of values for variables, such as temperature, etc.) and if the system receives a severe ‘bump’ it can be jogged into a regime where it orbits a completely different attractor. The book “The Death of Economics” by Paul Ormerod contains a very readable discussion in the context of economic variables such as inflation, employment figures, boom/bust cycles etc.
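
The attractor idea in the comment above can be made concrete with the simplest bistable toy model, dx/dt = x - x^3, which has stable attractors at -1 and +1: a single transient perturbation (standing in for a severe natural "bump") moves the state permanently from one regime to the other, and the record of x shows a step change. A deterministic sketch, with no claim that the climate actually obeys this equation:

```python
# Double-well toy system dx/dt = x - x**3 with stable attractors at -1 and +1.
dt, steps, kick_at = 0.01, 4000, 2000
x, traj = -1.0, []
for i in range(steps):
    x += dt * (x - x ** 3)        # Euler step: relax toward the nearest attractor
    if i == kick_at:
        x += 2.5                  # one transient "bump" knocks the state over the hill
    traj.append(x)
before = sum(traj[:kick_at]) / kick_at
after = sum(traj[kick_at + 500:]) / (steps - kick_at - 500)
print(round(before, 2), round(after, 2))  # a clean step from -1.0 to +1.0
```

Note the forcing is a single impulse, yet the response is a permanent shift in the mean, which is the qualitative signature the post attributes to events like the 1976/77 Pacific shift.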

Pamela Gray
January 5, 2012 7:22 pm

R Gates, the very fact that you keep referring to rather strong solar influences on these ups and downs of temperatures, and that you appear to be able to “see” it in these swings, makes me question your contention that CO2 has an effect as well.

wayne
January 5, 2012 7:28 pm

Stacey says:
January 5, 2012 at 6:41 am
Sorry if I am being stupid but I can’t see Trenberth’s comment in the link to email 3046?
Regards
S
REPLY: Yes, I don’t see it either, perhaps the number for the email he gave is a typo (I added the link) – So I’ve removed the reference to Dr. Trenberth for now until Jens can clear up the discrepancy – Anthony
>>>>
I have already noted a month ago that certain Climategate email databases use different numbering schemes. That is probably the problem. An email number needs a specific site also.

goldie
January 5, 2012 7:48 pm

I don’t really have any doubt that step changes are occurring and if the steps were attributed to natural changes there would be little else left. But I don’t see how steps need only be of natural origin.
Also I have seen others argue quite plausibly that after each ENSO event there is a residual that takes some time to dissipate.
Perhaps there would be some profit in considering the hypothesis that the “steps” are instead changes in the rate of change.

Tim Minchin
January 5, 2012 8:21 pm

How much ‘heat’ does a nuclear bomb release into the atmosphere?

January 5, 2012 8:43 pm

So, the dispute comes down to this:
Is there a continuous trend that artifactually looks like steps,
or
Are there steps which have been smoothed statistically into the appearance of a continuous trend?
Or a small trend with overlaying steps? Or a negative trend which has been overwhelmed by a few upward steps? Or …
