New paper makes a hockey sticky wicket of Mann et al 98/99/08

NOTE: This has been running two weeks at the top of WUWT, discussion has slowed, so I’m placing it back in the regular queue.  – Anthony

UPDATES:

Statistician William Briggs weighs in here

Eduardo Zorita weighs in here

Anonymous blogger “Deep Climate” weighs in with what he/she calls a “deeply flawed study” here

After a week of being “preoccupied” Real Climate finally breaks radio silence here. It appears to be a prelude to a dismissal with a “wave of the hand”

Supplementary Info now available: All data and code used in this paper are available at the Annals of Applied Statistics supplementary materials website:

http://www.imstat.org/aoas/supplements/default.htm

=========================================

Sticky Wicket – phrase, meaning: “A difficult situation”.

Oh, my. There is a new and important study on temperature proxy reconstructions (McShane and Wyner 2010), submitted to the Annals of Applied Statistics and listed for publication in the next issue. According to Steve McIntyre, this is one of the “top statistical journals”. This paper is a direct and serious rebuttal to the proxy reconstructions of Mann. It seems watertight on the surface because, instead of attacking the proxy data quality issues, the authors assumed the proxy data were accurate for their purpose, then created a Bayesian backcast method. Using the proxy data, they then demonstrate that it fails to reproduce the sharp 20th century uptick.

Now, there’s a new look to the familiar “hockey stick”.

Before:

Multiproxy reconstruction of Northern Hemisphere surface temperature variations over the past millennium (blue), along with 50-year average (black), a measure of the statistical uncertainty associated with the reconstruction (gray), and instrumental surface temperature data for the last 150 years (red), based on the work by Mann et al. (1999). This figure has sometimes been referred to as the hockey stick. Source: IPCC (2001).

After:

FIG 16. Backcast from Bayesian Model of Section 5. CRU Northern Hemisphere annual mean land temperature is given by the thin black line and a smoothed version is given by the thick black line. The forecast is given by the thin red line and a smoothed version is given by the thick red line. The model is fit on 1850-1998 AD and backcasts 998-1849 AD. The cyan region indicates uncertainty due to t, the green region indicates uncertainty due to β, and the gray region indicates total uncertainty.

Not only are the results stunning, but the paper is highly readable, written in a sensible style that most laymen can absorb, even if they don’t understand some of the finer points of Bayesian methods, loess filters, or principal components. Moreover, this paper is a confirmation of McIntyre and McKitrick’s work, with a strong nod to Wegman. I highly recommend reading this and distributing this story widely.

Here’s the submitted paper:

A Statistical Analysis of Multiple Temperature Proxies: Are Reconstructions of Surface Temperatures Over the Last 1000 Years Reliable?

(PDF, 2.5 MB. Backup download available here: McShane and Wyner 2010 )

It states in its abstract:

We find that the proxies do not predict temperature significantly better than random series generated independently of temperature. Furthermore, various model specifications that perform similarly at predicting temperature produce extremely different historical backcasts. Finally, the proxies seem unable to forecast the high levels of and sharp run-up in temperature in the 1990s either in-sample or from contiguous holdout blocks, thus casting doubt on their ability to predict such phenomena if in fact they occurred several hundred years ago.

Here are some excerpts from the paper (emphasis in paragraphs mine):

This one shows that M&M hit the mark, because it is independent validation:

In other words, our model performs better when using highly autocorrelated noise rather than proxies to “predict” temperature. The real proxies are less predictive than our “fake” data. While the Lasso generated reconstructions using the proxies are highly statistically significant compared to simple null models, they do not achieve statistical significance against sophisticated null models.

We are not the first to observe this effect. It was shown, in McIntyre and McKitrick (2005a,c), that random sequences with complex local dependence structures can predict temperatures. Their approach has been roundly dismissed in the climate science literature:

To generate “random” noise series, MM05c apply the full autoregressive structure of the real world proxy series. In this way, they in fact train their stochastic engine with significant (if not dominant) low frequency climate signal rather than purely non-climatic noise and its persistence. [Emphasis in original]

Ammann and Wahl (2007)
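The spurious-skill effect the excerpt describes is easy to reproduce in miniature. Below is a toy sketch, not the paper’s actual Lasso procedure or its proxy data: it simply shows that among many independent AR(1) noise series, the best one correlates far more strongly with a trending “temperature” series than the best white-noise series does. All series here are synthetic and the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 149  # roughly the length of the 1850-1998 instrumental period

# A toy "temperature" series: a slow warming trend plus observation noise.
t = np.linspace(0.0, 1.0, n)
temp = 0.5 * t + 0.1 * rng.standard_normal(n)

def ar1(phi, n, rng):
    """AR(1) noise: x_t = phi * x_{t-1} + e_t."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.standard_normal()
    return x

def best_abs_corr(phi, n_series=1000):
    """Largest |correlation| with temp over many independent noise series."""
    best = 0.0
    for _ in range(n_series):
        r = abs(np.corrcoef(ar1(phi, n, rng), temp)[0, 1])
        best = max(best, r)
    return best

# Highly autocorrelated noise "predicts" the trend far better than white noise,
# even though neither contains any climate information at all.
print("best white-noise corr:", round(best_abs_corr(0.0), 2))
print("best AR(1) 0.9 corr  :", round(best_abs_corr(0.9), 2))
```

This is the sense in which “sophisticated null models” are a much harder benchmark than simple ones: persistent noise wanders, and some wandering paths will happen to track a trend.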

On the power of the proxy data to actually detect climate change:

This is disturbing: if a model cannot predict the occurrence of a sharp run-up in an out-of-sample block which is contiguous with the in-sample training set, then it seems highly unlikely that it has power to detect such levels or run-ups in the more distant past. It is even more discouraging when one recalls Figure 15: the model cannot capture the sharp run-up even in-sample. In sum, these results suggest that the ninety-three sequences that comprise the 1,000 year old proxy record simply lack power to detect a sharp increase in temperature. See Footnote 12.

Footnote 12:

On the other hand, perhaps our model is unable to detect the high level of and sharp run-up in recent temperatures because anthropogenic factors have, for example, caused a regime change in the relation between temperatures and proxies. While this is certainly a consistent line of reasoning, it is also fraught with peril for, once one admits the possibility of regime changes in the instrumental period, it raises the question of whether such changes exist elsewhere over the past 1,000 years. Furthermore, it implies that up to half of the already short instrumental record is corrupted by anthropogenic factors, thus undermining paleoclimatology as a statistical enterprise.

FIG 15. In-sample Backcast from Bayesian Model of Section 5. CRU Northern Hemisphere annual mean land temperature is given by the thin black line and a smoothed version is given by the thick black line. The forecast is given by the thin red line and a smoothed version is given by the thick red line. The model is fit on 1850-1998 AD.

We plot the in-sample portion of this backcast (1850-1998 AD) in Figure 15. Not surprisingly, the model tracks CRU reasonably well because it is in-sample. However, despite the fact that the backcast is both in-sample and initialized with the high true temperatures from 1999 AD and 2000 AD, it still cannot capture either the high level of or the sharp run-up in temperatures of the 1990s. It is substantially biased low. That the model cannot capture the run-up even in-sample does not portend well for its ability to capture similar levels and run-ups if they exist out-of-sample.

Conclusion.

Research on multi-proxy temperature reconstructions of the earth’s temperature is now entering its second decade. While the literature is large, there has been very little collaboration with university-level, professional statisticians (Wegman et al., 2006; Wegman, 2006). Our paper is an effort to apply some modern statistical methods to these problems. While our results agree with the climate scientists’ findings in some respects, our methods of estimating model uncertainty and accuracy are in sharp disagreement.

On the one hand, we conclude unequivocally that the evidence for a “long-handled” hockey stick (where the shaft of the hockey stick extends to the year 1000 AD) is lacking in the data. The fundamental problem is that there is a limited amount of proxy data which dates back to 1000 AD; what is available is weakly predictive of global annual temperature. Our backcasting methods, which track quite closely the methods applied most recently in Mann (2008) to the same data, are unable to catch the sharp run-up in temperatures recorded in the 1990s, even in-sample.

As can be seen in Figure 15, our estimate of the run-up in temperature in the 1990s has a much smaller slope than the actual temperature series. Furthermore, the lower frame of Figure 18 clearly reveals that the proxy model is not at all able to track the high gradient segment. Consequently, the long flat handle of the hockey stick is best understood to be a feature of regression and less a reflection of our knowledge of the truth. Nevertheless, the temperatures of the last few decades have been relatively warm compared to many of the thousand year temperature curves sampled from the posterior distribution of our model.

Our main contribution is our efforts to seriously grapple with the uncertainty involved in paleoclimatological reconstructions. Regression of high dimensional time series is always a complex problem with many traps. In our case, the particular challenges include (i) a short sequence of training data, (ii) more predictors than observations, (iii) a very weak signal, and (iv) response and predictor variables which are both strongly autocorrelated.

The final point is particularly troublesome: since the data is not easily modeled by a simple autoregressive process it follows that the number of truly independent observations (i.e., the effective sample size) may be just too small for accurate reconstruction.
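The effective-sample-size point above can be made concrete with the standard AR(1) correction, n_eff = n(1 − ρ)/(1 + ρ), where ρ is the lag-1 autocorrelation. This is a textbook approximation, not a calculation from the paper, but it shows how quickly strong autocorrelation eats the ~150 annual observations of the instrumental period:

```python
import numpy as np

def effective_sample_size(x):
    """AR(1) approximation: n_eff = n * (1 - rho) / (1 + rho),
    where rho is the lag-1 autocorrelation of the series."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    rho = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)
    return len(x) * (1 - rho) / (1 + rho)

rng = np.random.default_rng(1)

# White noise: nearly every one of the 149 observations is independent.
white = rng.standard_normal(149)

# Strongly autocorrelated series (AR(1), phi = 0.9): far fewer
# effectively independent observations hide inside the same 149 values.
ar = np.zeros(149)
for i in range(1, 149):
    ar[i] = 0.9 * ar[i - 1] + rng.standard_normal()

print(round(effective_sample_size(white)))  # close to 149
print(round(effective_sample_size(ar)))     # a small fraction of 149
```

With ρ around 0.9, a 149-year record behaves statistically like roughly a dozen independent observations, which is why the authors worry the effective sample size “may be just too small for accurate reconstruction.”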

Climate scientists have greatly underestimated the uncertainty of proxy based reconstructions and hence have been overconfident in their models. We have shown that time dependence in the temperature series is sufficiently strong to permit complex sequences of random numbers to forecast out-of-sample reasonably well fairly frequently (see, for example, Figure 9). Furthermore, even proxy based models with approximately the same amount of reconstructive skill (Figures 11,12, and 13), produce strikingly dissimilar historical backcasts: some of these look like hockey sticks but most do not (Figure 14).

Natural climate variability is not well understood and is probably quite large. It is not clear that the proxies currently used to predict temperature are even predictive of it at the scale of several decades let alone over many centuries. Nonetheless, paleoclimatological reconstructions constitute only one source of evidence in the AGW debate. Our work stands entirely on the shoulders of those environmental scientists who labored untold years to assemble the vast network of natural proxies. Although we assume the reliability of their data for our purposes here, there still remains a considerable number of outstanding questions that can only be answered with a free and open inquiry and a great deal of replication.

===============================================================

Commenters on WUWT report that Tamino and Romm are deleting comments even mentioning this paper on their blog comment forum. Their refusal to even acknowledge it tells you it has squarely hit the target, and the fat lady has sung – loudly.

(h/t to WUWT reader “thechuckr”)

barry
August 16, 2010 9:58 pm

Hi James,

COMPUTER MODELS DO NOTHING OTHER THAN WHAT THEY ARE TOLD TO DO

The theory of AGW does not rest on global Climate models, either. The original calculations were done half a century before electronic computers were invented (Arrhenius).
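Arrhenius’s hand calculation rested on the logarithmic dependence of warming on CO2 concentration. A minimal sketch using the modern simplified forcing expression, ΔF = 5.35 ln(C/C0) W/m² (the 5.35 coefficient is a later curve fit from the 1990s literature, not Arrhenius’s own figure, and is used here purely for illustration):

```python
import math

def co2_forcing_w_m2(c_ppm, c0_ppm=280.0):
    """Simplified logarithmic CO2 radiative forcing: dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# The logarithm means each *doubling* of CO2 adds the same forcing (~3.7 W/m^2),
# which is why calculations are quoted "per doubling":
print(round(co2_forcing_w_m2(560.0), 2))   # 2 x preindustrial
print(round(co2_forcing_w_m2(1120.0), 2))  # 4 x preindustrial: twice the forcing
```

No electronic computer is needed for this arithmetic, which is the point being made about the theory predating GCMs.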

in my view, detection of GW would come from basic READING OF THERMOMETERS. With the caveat of not mucking with the reading after the read. At least not without a published (for public dissemination and discernment.) and accepted reasoning

You may be unaware that such reasoning is documented. GISS, CRU and NCDC have many papers outlining their reasoning and methods. GISS has all their documentation online, as does CRU. GISS has all their data online, while CRU is hamstrung by agreements with the national meteorological services of a number of countries – ie, they don’t have permission to release that data – yet.
The GHCN data is also accessible online, in the raw and adjusted form. There has been a flurry of activity over the last year comparing raw time series with the institutional products (GISS, HadCRU etc), rural, airport, urban, pre and post station drop out. The results will probably surprise you – considering that some of them come from popular skeptical websites. Again, if you are genuinely interested, I will provide references that require only a click of your mouse to explore.

given that your beloved GCMs in part are generated by accepting historical data created by paleo-climatology

Which “historical data created by paleo-climatology” is ‘accepted’ to help generate GCMs? Please provide a reference.
I can assure you you have not mischaracterised my assertions, just introduced new ones from what is probably popular media. If you mean to say that the press exaggerates, I completely agree.

Gaylon
August 16, 2010 10:15 pm

Barry,
I don’t understand how you can end with, “There is no impact on the greater body of AGW theory and projections.”
Really?
I understood the paper to say that the statistical methods used were, at best, inadequate. Does this not invalidate the MBH result? In addition, the paper also invalidates the use of the proxies due to the “weak signal” issue. Does this not also invalidate the MBH product (graph)? Invalidation meaning that the anthropogenic theory is now no longer known.
I may be way off base here, but wasn’t it the use of this graph, the misuse of the data, and the subsequent foaming at the mouth of the CAGW crowd that brought us to this juncture? IMO this paper strikes a fatal blow at the very foundation of the AGW theory and its predictions simply because this is where it all started, “patient zero”, as it were.
All subsequent graphs foisted on the public have used the same, or similar, methodologies and the same data (Yamal, Bristlecones, Boreholes, etc). They continue to push this on us as though invalidation never occurred, starting with M&M, and I predict they will/are going to continue more of the same with the M&W paper.
Listen, I’m no scientist (duh) but at what point do people responsible for advising governments start standing up and tell the truth, “we don’t know, our GCM’s say one thing but observation is telling us we don’t yet understand how the whole picture fits together…we’ll keep you posted.” Why is that so hard?
sTv says:
August 16, 2010 at 5:30 pm
I apologize, I was posting in response to your earlier post and lapsed into a generalization that should not have been directed at you. I stand by the generalization, on general terms anyway.
I stand by what I posted: Stu said he believes in ‘climate change’; so do we all, that’s what it does. He never says in his post that it is caused by anthropogenic forcings. In fact it is not a far leap to infer that he is still undecided as to man’s role in CC based on his opening comments which I posted, unless of course he has stated exactly that and I am unaware of it, which is completely plausible. As I said, I thought he did a good job.
You failed to answer my question: Do you prefer scepticism or consensus in science?

Gaylon
August 16, 2010 10:17 pm

Add to paragraph immediately under, Really? Last sentence…is not known with any certainty.

James Sexton
August 16, 2010 10:19 pm

“You’re thinking of the 99 paper, and the MWP wasn’t ‘disappeared’. It was qualified.”
Are you looking at the same graph the rest of the world is?
BTW, yes, “disappeared” is indeed a jargon-ization; of an Orwellian nature, I doubt it. As I recollect, the word, in the context being used, was a reference to the internet and its ability to change historical data at a whim. For instance, if a statement posted on a website at one time was later removed, it was deemed “disappeared”, as in “it never happened.” Later, during the Bush administration, shortly after 9/11, the paranoid group of bloggers applied the term to people who may have been removed from U.S. society under the Patriot Act. If someone critical of the Bush admin. went missing, they were deemed “disappeared”. Currently, we see this phenomenon in climate science. For instance, once, the GISS database listed a year in the 1930s (1934, if memory serves me) as the warmest year on record. Sometime after or during 1998 the recorded temps of 1934 were lowered and 1998 became the warmest year on record even though the previously recorded temps were higher. However, since the data had been removed and thus “disappeared”, to date 1998 is the warmest year on record. I believe it is here where it may be appropriate to apply the Orwellian analogy. Down the memory hole it went! I screen shot a lot of stuff now.

Dave Springer
August 16, 2010 10:22 pm

@Anthony
I suspect right now you’re busy looking at the Arctic Sea Ice extent record going back to 1979 and have now noted that right around 1998 it began melting faster and accelerated for a few years until it peaked in 2006. You’re probably also thinking about how many years it might take for a pulse of warm water to travel from the tropical Pacific to the Arctic ocean and there start getting soaked up in latent heat of fusion in the sea ice. I think you’ll find that a few years for the ocean currents to make the several thousand mile journey up there is reasonable and then took a similar amount of time to be completely absorbed and then the ice extent stabilized at about a million square kilometers less than before the 1998 El Nino.
You’re probably also wondering why this never dawned on you before. Don’t feel bad. You weather guys only look at thermometers. Pretty much everyone ignores the latent heat of fusion and vaporization. The latent heat of vaporization is why there isn’t any positive feedback associated with CO2 driven warming. The so-called missing heat is carried right through the densest layer of CO2 as latent heat of vaporization and released high in the atmosphere where it’s much easier to radiate out into space than to wend its way downward through a lower layer of CO2 that’s now serving to insulate the warm clouds from the surface.
I have a minor fascination with heat pumps and if you don’t constantly keep in mind the latent heat involved in phase changes of your working fluids you’ll never come close to understanding how they work or how to improve one. I’m an engineer to the core and I can’t look at a damn thing without wondering how it works and how it might be improved so these heat transfer mechanisms like the El Nino energy going into Arctic sea ice melt jump right out at me.

Pamela Gray
August 16, 2010 10:27 pm

Dave Springer, I imagine the ’98 surge in SST’s has a mathematical equation. First, the trade winds that blow East to West died down, allowing the Sun to do its thing without the constant mixing of the thermocline that the trade winds churn up. That allowed water vapor and whatever CO2 was in it (outgassed from the oceans or put there by human activity), to build up. That in turn would, according to AGW theory, heat up the surface. The Sun’s SW infra-red energy can be fairly well calculated in its ability to penetrate a large body of water and warm it. That part is fairly easy.
For CO2/water vapor to make an El Nino worse (or cause one), one would have to up the amount of water vapor and CO2 by a huge amount to make even a tiny difference in ocean temps. Why? Because CO2/water vapor does not emit SW infra red. They emit LW, a very weak source of heat when it comes to heating a large body of water to several measures of depth. Besides, when the sea surface is being heated by the Sun, it immediately starts to evaporate. What LW warming there is at the surface is immediately evaporated away.
The bottom line is that El Ninos are caused by a steady Sun allowed to beam down on a calm ocean and a lack of trade winds, which normally bring cooler water to the surface. CO2/water vapor does not have the capacity to increase SSTs to El Nino levels. Plain and simple.

August 16, 2010 10:38 pm

Doesn’t all this go to show what a big heap of steaming $hi7 these computer projections really are. This is what Prof Tim Patterson of Carleton University was saying in his recent radio broadcast, which is available at my site.
Professor Tim Patterson – CKCU Radio July 2010

August 16, 2010 10:47 pm

[off topic and rude as well. bye ~ ctm]

Dave Springer
August 16, 2010 10:47 pm

@Anthony (con’t)
You might also note that when the Arctic ice extent stabilized it now appears as a step change to a lower extent. At the same time in the satellite temp record you’ll also note a step change to a higher average surface temperature. This is indicative of the El Nino energy pulse being completely absorbed as latent heat of fusion, and the thermometers are now registering normal again. The Arctic temps are now steady or in decline as they should be because in 2000 we completed the upside of the 60 year cycle and there hasn’t been any significant warming indicated on the satellite record since then (it’s a travesty that Trenberth can’t explain it while I can). I’ll have to step out on a limb now and predict we won’t see any significant warming for another 20 years but we won’t see any significant cooling either. Then about in the year 2030 (if mankind is still alive) we’ll see a 30 year warming trend begin. And if during the 60 year period from 2000 to 2060 we see a concomitant increase in CO2 of 100 ppm (which seems rather certain, because no one nation is really going to slow down fossil fuel consumption and wreck their economic growth in the process – it’s just big talk and no action), then we’ll see a global average temperature increase of 0.4C again.
Of course a big volcano blowing its top could muck up my prediction big time!

James Sexton
August 16, 2010 10:54 pm

Dave Springer says:
August 16, 2010 at 9:44 pm
@Anthony
“It takes 337 kilojoules per kilogram to turn ice at 32F into water at 32F. This latent heat is called insensible heat because it doesn’t register on a thermometer.”
Dave, I clicked on the link and found a wonderful treasure chest! Thanks. However, I found no reference to your “insensible heat”. Your statement is counter-intuitive. While I could probably google the answer, I think it bears more explanation here. This could be because I’m tired and/or the gross amount of beer I’ve had (in terms of ounces). I get the joules thing. And I understand mercury doesn’t move the same in different pressures. But joules do convert to heat and joules, to my knowledge, do not convert or exert pressure…..????
Probably the beer and I’m going to call it a night, but I will check back if you have a better explanation, I’d be more than grateful.
Thanks,
James

donald penman
August 16, 2010 11:01 pm

The hockey stick model is wrong then, but we already knew that. This has not and will not stop the politically motivated scientists in the UK citing the hockey stick model of past temperatures as evidence supporting their political agendas. The people in the UK are being “trained” to accept the doctrine of anthropogenic global warming. Pedestrians are being encouraged to walk out in front of you, if you drive a car, by car-hating politicians. More accidents are caused also because there are more bicycles on the congested roads, and given the behaviour of cyclists that we observe, they need to pass a test before being allowed to ride a cycle on the road. We are building more speed humps and installing more speed cameras, but most drivers including the police regularly ignore all the regulations on UK roads. The idea is that if we can brainwash everyone into believing that anthropogenic global warming is true then it becomes true; you do not have to prove what you are claiming, you do not need evidence, and government can then regulate every aspect of our life. Science should be objective and should not start with preconceived ideas of truth or what should be true. Who are scientists to tell us what should be true?

August 16, 2010 11:06 pm

Henry@DaveSpringer/Evan/Bryan
Dave, the idea that CO2 is completely transparent to UV, visible and (near) IR is not correct.
It seems you did not catch the questions I posted here:
http://wattsupwiththat.com/2010/08/14/breaking-new-paper-makes-a-hockey-sticky-wicket-of-mann-et-al-99/#comment-458246
and comment here:
http://wattsupwiththat.com/2010/08/14/breaking-new-paper-makes-a-hockey-sticky-wicket-of-mann-et-al-99/#comment-458382

Dave Springer
August 16, 2010 11:27 pm

@Pamela
Sorry, I’m not buying it. SST oscillations are known only through statistical analysis of history. There is no theory of SST oscillations to explain them. Sort of like climate science in general, actually. Or perhaps I should say actuarially…
You seem to have ruled out it was energy accumulated over many years of CO2 driven temperature surface temp increase. If it didn’t come from that then where did it come from? Regardless of the source it was a god-awful big lot of heat that appeared virtually overnight. Had to come from somewhere – energy is neither created nor destroyed. Account for it. The books have to balance.

barry
August 16, 2010 11:35 pm

Hi Gaylon,

Barry,
I don’t understand how you can end with, “There is no impact on the greater body of AGW theory and projections.”

I’m simply paraphrasing what the authors say themselves.

This effort to reconstruct our planet’s climate history has become linked
to the topic of Anthropogenic Global Warming (AGW). On the one hand, this is peculiar since paleoclimatological reconstructions can provide evidence only for the detection of AGW and even then they constitute only one such source of evidence. The principal sources of evidence for the detection of global warming and in particular the attribution of it to anthropogenic factors come from basic science as well as General Circulation Models (GCMs) that have been fit to data accumulated during the instrumental period (IPCC, 2007). These models show that carbon dioxide, when
released into the atmosphere in sufficient concentration, can force temperature
increases.

I think that’s pretty clear, and no need to go into greater detail.
Like you, the authors are motivated by popular and policy considerations – this is what they discuss in the paragraph following the one I just quoted. This has nothing to do with the scientific underpinnings of AGW, which were in place a century before MBH 98/99 etc, but how certain ideas have been projected. I believe this distinction is blurred in much of the commentary upthread.
It’s obvious enough that the paper tends to support M&M criticisms of Mann’s paleo-reconstruction techniques. It might be tempting to draw a broader conclusion about the science behind AGW theory from this, but the authors clearly state that millennial temperature reconstructions are somewhat of a side-issue WRT AGW theory and projections. Climate sensitivity, for example, does not rest on this branch of paleoclimatology. Climate models don’t either (nor do they form the underpinnings of the theory of global warming from increasing ‘greenhouse’ gases).
There is a tendency to try to discredit the whole of climate science from a dispute over this or that component. It’s a simple narrative, and seems to be effective, but it is a stranger to reason.

Dave Springer
August 16, 2010 11:40 pm

Henry Pool says:
August 16, 2010 at 11:06 pm
Henry@DaveSpringer/Evan/Bryan
Dave, the idea that CO2 is completely transparent to UV, visible and (near) IR is not correct.

It’s correct for all practical purposes in this context.

Dave Springer
August 16, 2010 11:46 pm


http://en.wikipedia.org/wiki/Latent_heat
http://en.wikipedia.org/wiki/Sensible_heat
It’s really true. Not all heat registers on a thermometer. The above articles hopefully will explain it for you.
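The sensible/latent distinction is easy to put in numbers. A small sketch using textbook constants (note the standard latent heat of fusion of water is about 334 kJ/kg, slightly below the 337 figure quoted upthread; both values are approximations for illustration):

```python
# Textbook constants (approximate, for illustration only).
L_FUSION = 334.0   # kJ/kg: melt ice at 0 C into water at 0 C, no temperature change
C_WATER = 4.186    # kJ/(kg*K): specific (sensible) heat of liquid water

def melt_energy_kj(mass_kg):
    """Latent ("insensible") heat absorbed while melting ice:
    the energy goes into the phase change, so no thermometer registers it."""
    return mass_kg * L_FUSION

def warm_energy_kj(mass_kg, delta_k):
    """Sensible heat: energy that raises liquid water temperature by delta_k kelvin."""
    return mass_kg * C_WATER * delta_k

# Melting 1 kg of ice soaks up as much energy as warming 1 kg of water by ~80 K:
print(round(melt_energy_kj(1.0) / warm_energy_kj(1.0, 1.0), 1))  # ~79.8
```

That roughly 80:1 ratio is why a large pulse of ocean heat can vanish into sea-ice melt without showing up on any thermometer.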

barry
August 16, 2010 11:50 pm

Are you looking at the same graph the rest of the world is?

Yes, and I’ve read the papers that spawn them.
Regarding MBH 1999, for which the reconstruction does cover the putative MWP, they say;

“While warmth early in the millennium approaches mean 20th century levels, the late 20th century still appears anomalous”

They also mention the “Medieval Warm Epoch.”
The new paper says:

“Nevertheless, the temperatures of the last few decades have been relatively warm compared to many of the thousand year temperature curves sampled from the posterior distribution of our model.”

There’s not a hell of a lot of daylight between the two quotes. The main difference between the conclusions is the likelihood of 1998 being the warmest year in the last 1000, and in the level of confidence attached to the other conclusions.
It seems like a good paper, though. I’ve learned that it was done with no input from paleoclimatologists, which seems unfortunate when both sides agree (M&M and Gavin Schmidt, for example) that there should be more collaboration between statisticians and paleoclimatologists on the issue. I’ll be interested to see how it pans out after publication.

James Sexton
August 16, 2010 11:59 pm

barry says:
August 16, 2010 at 9:58 pm
“Hi James,…….”
Hi Barry! Barry, as I alluded to in an earlier post, I’ve gotta call it. Wish I could stay and play, but….work is calling in just a very short few hours. I’ll leave you with this, I know it is incomplete, my apologies. I will check back in the morning.
You should check Arrhenius out the second time around.
And GCMs ain’t done by hand. The models are computer generated. I’d cut and paste, but ….
Peace to all.
James

Dave Springer
August 17, 2010 12:08 am

@Pamela
Sorry if I was a little short in my previous reply. Been busy here tonight.
Sure, CO2 isn’t going to directly warm a localized mass of water more than some other mass as CO2 is more or less evenly distributed.
Can the stratification it causes around the globe of warmer surface air and colder stratosphere perhaps influence the trade winds that in turn influence the mix rate of surface and deep water that in turn allows the ocean surface to heat and cool in localized cyclic patterns?
Your explanation of SST oscillations just pushes the question back to a different point. What drives cyclic changes in trade winds which in turn drive cyclic changes in localized SSTs?

August 17, 2010 12:13 am

Henry@DaveSpringer
Again: The idea that CO2 is transparent in the sun’s radiation range of 0-5 um is not correct, in any context! How else could they measure CO2 coming back from the moon, showing its exact spectral fingerprint data?

John Mason
August 17, 2010 12:46 am

Thanks for the comments. Over here it was night-time so that’s why I didn’t respond – busy zzz-ing. I was not setting out to patronise anyone: my point was that, given the obvious complexity of the detailed issues set out in this paper, I just think it is best to wait for the specialists in this particular field to give their measured responses prior to coming to any quick conclusions. I guess we are looking at just a few weeks from now to publication-time.

We have an idea as to the climate of the past thousand years – a warm period in Medieval times that transitioned to colder conditions by the middle of the last millennium and then gave way to the warmer period we are currently within. These things we know – regionally – not only via proxies but from historical accounts in some countries. Some appear to have been global, others more regional; e.g. the MWP is historically well-documented in NW Europe, but documented history from that time is largely to wholly absent in e.g. the USA, Australia etc. Thus to determine whether such things were global or not, we need a valid global proxy record, and it is important to ascertain how accurately that may be constructed. Anything that improves its reliability is to be welcomed on that basis.

Let’s wait and see how others working in this specialist field interpret the findings when they have spent time going through the paper and the package of supporting information.
Cheers for now – John

duckster
August 17, 2010 1:47 am

@two moon says:
August 16, 2010 at 7:03 am
duckster: M&W are not scientists and their point is not scientific. They are statisticians and their point is statistical. They do not claim to present a new, “valid” reconstruction. Their point is that proxies will not support any reconstruction. In other words, the Hockey Stick is not so much broken as it is a castle in the air.

See, this is exactly what I am saying. If you accept MW (2010), then it also undermines every argument made here that relies on proxies. Either the argument presented here – that the use of proxies is fundamentally flawed – is true, and you throw out many of your previous arguments, or it is not, and you get to keep your incredible moving MWP at 1200–1400 attested to by various proxies.
Say ‘yes’ to this paper and WUWT’s MWP reconstruction is broken too! Back to the drawing board, guys.
And this isn’t a straw-man argument – it’s about logical consistency. Simple, really.

August 17, 2010 2:26 am

I think the tide is turning. My New Scientist (cancelled, but paid in advance) has an article on recent weather events such as the fires in Russia, among many others. It is about 2/3 of the way through before Climate Change is mentioned, and even then, it is NOT blamed! “Impossible to say” is the answer!
This is a first!

Richard S Courtney
August 17, 2010 2:52 am

Dave Springer:
At August 16, 2010 at 10:47 pm you say:
“I’ll have to step out on a limb now and predict we won’t see any significant warming for another 20 years but we won’t see any significant cooling either. Then about in the year 2030 (if mankind is still alive) we’ll see a 30 year warming trend begin.”
Well, I stepped out on to that limb 10 years ago, and I am still there.
In the past there was an idea that one observed the world, looked for patterns, made predictions on the basis of those patterns, then checked the predictions against real-world outcomes
(a) to gain confidence in the obtained understanding of real-world behaviour
or
(b) to reject the understanding of the real world that was inferred from the observations.
If (b), then start again.
This idea was called the scientific method. And it utilised mathematics and statistics to model the observed patterns and to help make the predictions from those observations.
On the basis of that idea, over the last 10 years I have repeatedly said the following in several places including in threads of this blog.
The global temperature seems to vary in cycles that are overlaid on each other. The causes of these cycles are not known, but some are associated with known phenomena (e.g. ENSO, NAO and PDO), although the causes of those phenomena are not known either.
There is an apparent ~900 year oscillation that provided
the Roman Warm Period (RWP),
then the Dark Age Cool Period (DACP),
then the Medieval Warm Period (MWP),
then the Little Ice Age (LIA), and
the present warm period (PWP).
And there is an apparent ~60 year oscillation that provided
cooling from ~1880 to ~1910,
then warming from ~1910 to ~1940,
then cooling from ~1940 to ~1970,
then warming from about ~1970 to ~2000,
then cooling since.
These oscillations form a pattern of climate change over time.
And if this pattern continues then either
(A) cooling will continue until ~2020, when the ~60 year oscillation changes phase and warming will resume until global temperature reaches the levels it had in the RWP and the MWP,
or
(B) the ~900 year oscillation will change phase and the globe will start to cool to the temperatures it had in the DACP and LIA.
There is no observation that indicates there has been any change to this pattern.
Richard
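[Editor's note: the overlaid-cycles pattern Richard describes can be sketched as a toy sum of two cosines. This is only an illustration of the idea, not a fitted model – the amplitudes (0.5 °C and 0.2 °C) and the assumption that both cycles peak near the year 2000 are arbitrary choices made for the sketch.]

```python
import math

def toy_temperature(year, amp900=0.5, amp60=0.2, peak_year=2000.0):
    """Toy anomaly (deg C): a ~900-year cycle overlaid on a ~60-year cycle.

    All parameters are illustrative guesses, not fitted values; both
    cycles are assumed to peak near the year 2000 for simplicity.
    """
    c900 = amp900 * math.cos(2 * math.pi * (year - peak_year) / 900.0)
    c60 = amp60 * math.cos(2 * math.pi * (year - peak_year) / 60.0)
    return c900 + c60

# The ~60-year cycle bottoms out ~30 years after its peak, so this toy
# model dips around 2030 before warming resumes, matching outcome (A).
print(round(toy_temperature(2000), 3))  # -> 0.7 (both cycles at peak)
print(round(toy_temperature(2030), 3))  # -> 0.289 (60-yr cycle at trough)
```

In this sketch, outcome (B) would correspond to the ~900-year term passing its peak, pulling the sum down toward the values it takes around the DACP/LIA troughs.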

Stephen Wilde
August 17, 2010 3:33 am

Dave Springer asked:
“Your explanation of SST oscillations just pushes the question back to a different point. What drives cyclic changes in trade winds which in turn drive cyclic changes in localized SSTs?”
Try latitudinal shifts in the air circulation systems driven by the oceans below and the sun above in a complex interplay.
