Circular Logic not worth a Millikelvin

Guest post by Mike Jonas

A few days ago, on Judith Curry’s excellent ClimateEtc blog, Vaughan Pratt wrote a post, “Multidecadal climate to within a millikelvin”, which provided the content and underlying spreadsheet calculations for a poster presentation at the AGU Fall Conference. I will refer to the work as “VPmK”.

VPmK was a stunningly unconvincing exercise in circular logic – a remarkably unscientific attempt to (presumably) provide support for the IPCC model[s] of climate – and should be retracted.

Background

The background to VPmK was outlined as “Global warming of some kind is clearly visible in HadCRUT3 [] for the three decades 1970-2000. However the three decades 1910-1940 show a similar rate of global warming. This can’t all be due to CO2 []”.

The aim of VPmK was to support the hypothesis that “multidecadal climate has only two significant components: the sawtooth, whatever its origins, and warming that can be accounted for 99.98% by the AHH law []”,

where

· the sawtooth is a collection of “all the so-called multidecadal ocean oscillations into one phenomenon”, and

· AHH law [Arrhenius-Hofmann-Hansen] is the logarithmic formula for CO2 radiative forcing, with an oceanic heat sink delay (a code sketch follows).
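For readers who want to see the shape of the AHH idea in code, here is a minimal sketch of a logarithmic CO2 response with a crude lag standing in for the oceanic heat sink. The sensitivity, delay and CO2 series below are illustrative assumptions of mine, not the fitted values in VPmK’s spreadsheet:

```python
import numpy as np

def ahh_warming(co2_ppm, co2_base=280.0, sens_per_doubling=3.0, delay=15):
    """Sketch of an AHH-style response: warming logarithmic in CO2
    (Arrhenius), shifted by `delay` samples to mimic an oceanic
    heat-sink lag. Assumes delay >= 1 and yearly samples."""
    co2 = np.asarray(co2_ppm, dtype=float)
    warming = sens_per_doubling * np.log2(co2 / co2_base)
    # Crude delay: hold the initial value, then shift the whole curve
    return np.concatenate([np.full(delay, warming[0]), warming[:-delay]])

# Example: an assumed exponential CO2 rise over 1870-1999
years = np.arange(1870, 2000)
co2 = 280.0 * np.exp(0.004 * (years - 1870))
agw = ahh_warming(co2)   # the “AGW” curve of Fig.1, in spirit only
```

The point to note is that once the sensitivity is chosen, the shape of this curve is fixed by the CO2 record; everything else in VPmK is fitted around it.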

The end result of VPmK was shown in the following graph


Fig.1 – VPmK end result.

where

· MUL is multidecadal climate (ie, global temperature),

· SAW is the sawtooth,

· AGW is the AHH law, and

· MRES is the residue MUL-SAW-AGW.

Millikelvins

As you can see, and as stated in VPmK’s title, the residue was just a few millikelvins over the whole of the period. The smoothness of the residue, though not its absolute value, was entirely due to the three box filters used to remove all of the “22-year and 11-year solar cycles and all faster phenomena”.
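For concreteness, a box filter is just a centered moving average, and a box of width N samples exactly nulls any cycle whose period is N samples (and its harmonics). A minimal sketch of cascading three such filters, with window widths that are my own placeholders rather than Pratt’s actual choices:

```python
import numpy as np

def box_filter(x, width):
    """Centered moving average; an exact null for period == width."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def triple_box(x, widths=(11, 15, 21)):
    """Cascade of three box filters, in the spirit of VPmK's smoothing.
    Widths are in samples (years here) and are illustrative only."""
    for w in widths:
        x = box_filter(x, w)
    return x

# Example: an 11-year “solar cycle” riding on a slow trend
t = np.arange(200.0)
series = 0.01 * t + 0.2 * np.sin(2 * np.pi * t / 11.0)
smooth = triple_box(series)   # trend survives; the 11-year wiggle is
                              # nulled (edge samples remain distorted)
```

Cascading three boxes also steepens the roll-off, so everything faster than the widest window is heavily attenuated; a smooth-looking residue is therefore guaranteed by the filtering itself.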

If the aim of VPmK is to provide support for the IPCC model of climate, naturally it would remove all of those things that the IPCC model cannot handle. Regardless, the astonishing level of claimed accuracy shows that the result is almost certainly worthless – it is, after all, about climate.

The process

VPmK takes AGW as a given from the IPCC model – complete with the so-called “positive feedbacks”, which for the purposes of VPmK are assumed to bear a simple linear relationship to the underlying formula for CO2 itself.

VPmK then takes the difference (the “sawtooth”) between MUL and AGW, and fits four sinewaves to it (there is provision in the spreadsheet for five, but only four were needed). Thanks to the box filters, a good fit was obtained.
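A minimal sketch of that fitting step, using scipy on placeholder data (the synthetic sawtooth, parameter layout and starting guesses below are mine, not the spreadsheet’s):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_sines(t, *p):
    """Sum of sinewaves; p is flattened (amplitude, period, phase) triples."""
    y = np.zeros_like(t)
    for a, period, phase in zip(p[0::3], p[1::3], p[2::3]):
        y += a * np.sin(2.0 * np.pi * t / period + phase)
    return y

# Placeholder “sawtooth” = MUL - AGW after the box filtering
t = np.arange(1880.0, 1991.0)
sawtooth = 0.10 * np.sin(2 * np.pi * t / 75.0) + 0.05 * np.sin(2 * np.pi * t / 50.0)

# Four (amplitude, period, phase) triples: twelve free parameters
p0 = [0.1, 75, 0, 0.05, 50, 0, 0.02, 25, 0, 0.01, 15, 0]
popt, _ = curve_fit(four_sines, t, sawtooth, p0=p0, maxfev=20000)

SAW = four_sines(t, *popt)
MRES = sawtooth - SAW   # equivalent to MUL - SAW - AGW in Fig.1
```

Note the parameter count: four triples give twelve free constants, fitted to a series that has already been heavily smoothed.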

Given that four parameters can famously fit an elephant, absolutely nothing has been achieved, and it would be entirely reasonable to dismiss VPmK as completely worthless at this point. But, to be fair, we’ll look at the sawtooth (“The sinewaves”, below) and see if it could have a genuine climate meaning.

Note that in VPmK there is no attempt to find a climate meaning. The sawtooth which began life as “so-called multidecadal ocean oscillations” later becomes “whatever its origins“.

The sinewaves

The two main “sawtooth” sinewaves, SAW2 and SAW3, are:


Fig.2 – VPmK principal sawtooths.

(The y-axis is temperature.) The other two sinewaves, SAW4 and SAW5, are much smaller, merely “mopping up” what divergence remains.

It is surely completely impossible to support the notion that the “multidecadal ocean oscillations” are reasonably represented to within a few millikelvins by these perfect sinewaves (even after the filtering). This is what the PDO and AMO really look like:


Fig.3 – PDO.

(link) There is apparently no PDO data before 1950, but some information here.


Fig.4 – AMO.

(link)

Both the PDO and AMO trended upwards from the 1970s until well into the 1990s. Neither sawtooth is even close. The sum of the sawtooths (SAW in Fig.1) flattens out over this period when it should mostly rise quite strongly. This shows that the sawtooths have been carefully manipulated to “reserve” the 1970-2000 temperature increase for AGW.


Fig.5 – How the sawtooth “reserved” the 1980s and 90s warming for AGW.

 

Conclusion

VPmK aimed to show that “multidecadal climate has only two significant components”, AGW and something shaped like a sawtooth. But VPmK then simply assumed that AGW was a component, called the remainder the sawtooth, and had no clue as to what the sawtooth was but used some arbitrary sinewaves to represent it. VPmK then claimed to have shown that the climate was indeed made up of just these two components.

That is circular logic and appallingly unscientific. The poster presentation should be formally retracted.

[Blog commenter JCH claims that VPmK is described by AGU as “peer-reviewed”. If that is the case then retraction is important. VPmK should not be permitted to remain in any “peer-reviewed” literature.]

Footnotes:

1. Although VPmK was of so little value, I would nevertheless like to congratulate Vaughan Pratt for having the courage to provide all of the data and all of the calculations in a way that made it relatively easy to check them. If only this approach had been taken by other climate scientists from the start, virtually all of the heated and divisive climate debate could have been avoided.

2. I first approached Judith Curry, and asked her to give my analysis of Vaughan Pratt’s (“VP”) circular logic equal prominence to the original by accepting it as a ‘guest post’. She replied that it was sufficient for me to present it as a comment.

My feeling is that posts have much greater weight than comments, and that using only a comment would effectively let VP get away with a piece of absolute rubbish. Bear in mind that VPmK has been presented at the AGU Fall Conference, so it is already way ahead in public exposure anyway.

That is why this post now appears on WUWT instead of on ClimateEtc. (I have upgraded it a bit from the version sent to Judith Curry, but the essential argument is the same). There are many commenters on ClimateEtc who have been appalled by VPmK’s obvious errors. I do not claim that my effort here is in any way better than theirs, but my feeling is that someone has to get greater visibility for the errors and request retraction, and no-one else has yet done so.

 


Comments
James
December 13, 2012 8:45 am

Assume AGW is a flat line and repeat the analysis. When the fit remains near perfect, trumpet the good news that AGW is no more!

December 13, 2012 8:47 am

Another failed attempt to force a dynamic, poorly understood, under-sampled natural process into some kind of linear deterministic logic system. It simply will not work. This foolishness is not worth any further time or effort on the part of serious scientists.

Steveta_uk
December 13, 2012 8:55 am

What VP has said repeatedly on JC’s site is basically that if you can provide a better fit, please do.
Nobody that I’ve seen has yet done so. Now I’m not in any way suggesting that what VP has done is in any way useful science. But still, can you not simply alter the 4 or 5 sine waves to show that you can provide just as good a fit without the AHH curve?
If you can, please present it here.
And if you cannot, then VP remains uncontested.
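A minimal sketch of the test being proposed – fit the sinewaves with and without an AHH-style log-CO2 term and compare the residuals – on placeholder data (every value below is an illustrative assumption, not VPmK’s):

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(1880.0, 1991.0)
co2 = 280.0 * np.exp(0.004 * (t - 1880.0))   # assumed CO2 series
temp = 0.8 * np.log2(co2 / 280.0) + 0.1 * np.sin(2 * np.pi * t / 65.0)

def sines(t, *p):
    """Sum of sinewaves from flattened (amplitude, period, phase) triples."""
    y = np.zeros_like(t)
    for a, T, ph in zip(p[0::3], p[1::3], p[2::3]):
        y += a * np.sin(2 * np.pi * t / T + ph)
    return y

def sines_plus_ahh(t, k, *p):
    """The same sinewaves plus a log-CO2 term scaled by k."""
    return k * np.log2(co2 / 280.0) + sines(t, *p)

p0 = [0.1, 65, 0, 0.05, 40, 0, 0.02, 25, 0, 0.01, 15, 0]
fit_saw, _ = curve_fit(sines, t, temp, p0=p0, maxfev=50000)
fit_ahh, _ = curve_fit(sines_plus_ahh, t, temp, p0=[1.0] + p0, maxfev=50000)

print("sines only :", np.std(temp - sines(t, *fit_saw)))
print("sines + log:", np.std(temp - sines_plus_ahh(t, *fit_ahh)))
```

Whether the sine-only residual comes out comparably small is exactly the question James raises above; with this many free parameters over barely one long cycle of data, it may well.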

December 13, 2012 8:56 am

Why is it that the rationalizations of the Warmistas are beginning to remind me of Ptolemy and The Almagest?

December 13, 2012 9:01 am

I worked in the Banking Industry for most of my adult life. During that time, many people would be applying for finance for this business or that business – maybe a mortgage, maybe a loan.
All would arrive with their shiny spreadsheet proving their business model was viable and would soon show profitability.
I never saw any business plan proposed to the Bank that didn’t show remarkable profit – certainly none ever predicted a loss.
Nonetheless, the vast majority of those business plans would fail abysmally.
Just goes to show, any spreadsheet can be made to produce whatever results the author wants – just tweak here or tweak there.
Now, about millikelvins?
Andi

December 13, 2012 9:04 am

“VPmK was a stunningly unconvincing exercise in circular logic – a remarkably unscientific attempt to (presumably) provide support for the IPCC model[s] of climate – and should be retracted.”
1. its not circular.
2. its not a proof or support for models.
3. you cant retract a poster.
4. This is basically the same approach that many here praise when scafetta does it.
basically he is showing that GIVEN the truth of AGW, the temperature series can be explained by a few parameters. GIVEN is the key; you misunderstand the logic of his approach.

Taphonomic
December 13, 2012 9:11 am

“[Blog commenter JCH claims that VPmK is described by AGU as “peer-reviewed”. If that is the case then retraction is important. VPmK should not be permitted to remain in any “peer-reviewed” literature.]”
Describing VPmK as peer-reviewed is incorrect. Abstracts published by AGU for either poster sessions or presentations made at the meeting are not peer-reviewed. There are quite a few comments at the blog after JCH’s on this topic.

Matthew R Marler
December 13, 2012 9:13 am

VPmK was a stunningly unconvincing exercise in circular logic – a remarkably unscientific attempt to (presumably) provide support for the IPCC model[s] of climate – and should be retracted.
That is over-wrought. Vaughan Pratt described exactly what he did and found, and published the data that he used and his result. If the temperature evolution of the Earth over the next 20 years matches his model, then people will be motivated to find whatever physical process generates the sawtooth. If not, his model will be disconfirmed along with plenty of other models. Lots of model-building in scientific history has been circular over the short-term: in “The Structure of Scientific Revolutions” Thomas Kuhn mentions Ohm’s law as an example, and Einstein’s special relativity; lots of people have noted the tautology of F = dm/dt where m here stands for momentum.
Pratt merely showed that, with the data in hand, it is possible to recover the signal of the CO2 effect with a relatively low-dimensional filter. No doubt, the procedure is post hoc. The validity of the approach will be tested by data not used in fitting the functions that he found.

Matthew R Marler
December 13, 2012 9:14 am

Steven Mosher wrote: 4. This is basically the same approach that many here praise when scafetta does it.
I agree with that.

richardscourtney
December 13, 2012 9:23 am

Steven Mosher:
You enumerate four points in your post at December 13, 2012 at 9:04 am. I address each of them in turn.
1. its not circular.
(Clearly, it is “circular” in that it removes everything from the climate data except what the climate models emulate then says the result of the removal agrees with what the climate emulate when tuned to emulate it.)
2. its not a proof or support for models.
(Agreed, it is nonsense.)
3. you cant retract a poster.
(Of course you can! All you do is publish a statement saying it should not have been published, and you publish that statement in one or more of the places where the “poster” was published; e.g. in this case, on Judith Curry’s blog.)
4. This is basically the same approach that many here praise when scafetta does it.
(So what! Many others – including me – object when Scafetta does it. Of itself that indicates nothing.)
The poster by Vaughan Pratt only indicates that Pratt is a prat: live with it.
Richard

richardscourtney
December 13, 2012 9:27 am

OOOps! I wrote
(Clearly, it is “circular” in that it removes everything from the climate data except what the climate models emulate then says the result of the removal agrees with what the climate emulate when tuned to emulate it.)
Obviously I intended to write
(Clearly, it is “circular” in that it removes everything from the climate data except what the climate models emulate then says the result of the removal agrees with what the models emulate when tuned to emulate it.)
Sorry.
Richard

Arno Arrak
December 13, 2012 9:31 am

I agree. I commented about it on Curry’s blog and called it worthless. I was particularly annoyed that he used HadCRUT3, which is error-ridden and anthropogenically distorted. I could see that he was using his computer skills to create something out of nothing and did not understand why that sawtooth did not go away. That millikelvin claim is of course nonsense and was simply part of his applying his computer skills without comprehending the data he was working with. I suggested that he write a program to find and correct those anthropogenic spikes in HadCRUT and others.

December 13, 2012 9:33 am

On the sidelines of V. Pratt’s blog presentation there was a secondary discussion between myself and Dr. Svalgaard about far more realistic causes of climate change. Since Dr. S. often does peer review on articles relating to solar matters, I consider our exchanges, leaving the trivia out, an ‘unofficial peer review of my calculations’; no mechanism is considered in the article, just the calculations. This certainly was not a ‘friendly’ review, and although the result may not be conclusive, I consider it a great encouragement.
http://www.vukcevic.talktalk.net/PR.htm
If there are any scientists who are occasionally involved in ‘peer review’ type processes, I would welcome the opportunity to submit my calculations. My email is as in my blog id, followed by @yahoo.com.

Editor
December 13, 2012 9:35 am

Vaughan presented an interesting idea which has been roundly tested by many commenters in a spirit of science, with views hotly contested within a framework of courtesy as Vaughan defended his ideas. Personally I’m not convinced that CET demonstrates his theory – in fact I think it shows he is wrong – but if every post, whether here or at Climate Etc, were discussed in such a thorough manner, everyone would gain, whatever side of the fence they are on.
Tonyb

December 13, 2012 10:07 am

vukcevic says:
December 13, 2012 at 9:33 am
This certainly was not ‘friendly’ review, although result may not be conclusive, I consider it a great encouragement
It seems that in the twisted world of pseudo-science even a flat-out rejection is considered a great encouragement.

David L. Hagen
December 13, 2012 10:21 am

Steveta_uk
Re Pratt’s “if you can provide a better fit, please do.”
Science progresses by “kicking the tires”. Models are only as robust as the challenges put to them and their ability to provide better predictions when compared against hard data – not politics.
The proof of the pudding is in the eating. Currently the following two models show better predictive performance than the IPCC’s models, which average 0.2 C/decade warming:
Relationship of Multidecadal Global Temperatures to Multidecadal Oceanic Oscillations Joseph D’Aleo and Don Easterbrook, Evidence-Based Climate Science. Elsevier 2011, DOI: 10.1016/B978-0-12-385956-3.10005-1
Nicola Scafetta, Testing an astronomically based decadal-scale empirical harmonic climate model versus the IPCC (2007) general circulation climate models Journal of Atmospheric and Solar-Terrestrial Physics 80 (2012) 124–137
For other views on CO2, see Fred H. Haynie, The Future of Global Climate Change.
No amount of experimentation can ever prove me right; a single experiment can prove me wrong. Albert Einstein

Tim Clark
December 13, 2012 10:26 am

“22-year and 11-year solar cycles and all faster phenomena”.
What happened to longer term cycles?
Oh, they must be AGW./sarc

jorgekafkazar
December 13, 2012 10:27 am

Curve-fitting is not proof of anything, especially when the input data is filtered. That heat-sink delay also needs some scrutiny. Worse, the data time range is fairly short, in geological terms. On top of that, a four component wave function? Get real.
It’s wiggle-matching, with some post hoc logic thrown in. I’m underwhelmed.

P. Solar
December 13, 2012 10:34 am

Mosh: 4. This is basically the same approach that many here praise when scafetta does it.
Sorry, what N. Scafetta does is fit all parameters freely and see what results. What Pratt did was fit his exaggerated 3 K per doubling model, see what’s left, then make up a totally unfounded waveform to eliminate it. Having thus eliminated it, he screwed up his maths and managed to also eliminate the huge discrepancy that all 3 K sensitivity models have after 1998.
Had he got the maths correct it would have been circular logic. As presented to AGU it was AND STILL IS a shambles.
Attribution to AMO/PDO is fanciful. The whole work is total fiction intended to remove the early 20th c. warming that has always been a show-stopper for CO2-driven AGW.
At the current state of the discussion on Curry’s site, he has been asked to state whether he recognises there is an error or stands by the presentation as given to AGU and published on Climate etc.
At the time of this posting, no news from Emeritus Prof Vaughan Pratt.

Bill Illis
December 13, 2012 10:37 am

The IPCC says that the current total forcing (all sources – RCP 6.0 scenario) is supposed to be about 2.27 W/m2 in 2012.
On top of that, we should be getting some water vapour and reduced low cloud cover feedbacks from this direct forcing so that there should be a total of about 5.0 W/m2 right now.
The amount of warming, however, (the amount that is accumulating in the Land, Atmosphere, Oceans and Ice) is only about 0.5 W/m2.
Simple enough to me.
Climate Science is much like the study of Unicorns and their invisibility cloaks.

December 13, 2012 11:00 am

I want to draw people’s attention to the frequency content of the VPmK SAW2 and SAW3 wave forms. Just by eyeball, these appear to have periods of roughly 75 and 50 years. As Mike Jonas points out, early in the paper VP posits that they come from major natural ocean oscillations, but later this becomes the more flexible “whatever its origins”.
I am not going to debate the origins of the low frequency. Take from VPmK only that the temperature record contains significant very low frequency wave forms, with wavelengths greater than 25 years, needed to match even heavily filtered temperature records where

three box filters being used to remove all of the “22-year and 11-year solar cycles and all faster phenomena”.

All that is left in the VPmK data is very low frequency content, and there appears to be a lot of it.
My comment below takes the importance of low frequency in VPmK and focuses on BEST: Berkeley Earth, and what appears to me to be a minimally discussed wholesale decimation and counterfeiting of low frequency information happening within the BEST process. If you look at what is going on in the BEST process from the Fourier domain, there seem to me to be major losses of critical information content. I first wrote my theoretical objection to the BEST scalpel back on April 2, 2011 in “Expect the BEST, plan for the worst.” I expounded on it at Climate Audit, Nov. 1, 2011 and at some other sites.
My summary argument remains unchanged after 20 months:
1. The Natural climate and Global Warming (GW) signals are extremely low frequency, less than a cycle per decade.
2. A fundamental theorem of Fourier analysis is that the frequency resolution is df = 1/(N·dt) Hz, where dt is the sample interval and N·dt is the total length of the digitized signal. (A numerical sketch of this limit follows the list.)
3. The GW climate signal, therefore, is found in the very lowest frequencies, low multiples of dw, which can only come from the longest time series.
4. Any scalpel technique destroys the lowest frequencies in the original data.
5. Suture techniques recreate long term digital signals from the short splices.
6. Sutured signals have in them very low frequency data, low frequencies which could NOT exist in the splices. Therefore the low frequencies, the most important stuff for the climate analysis, must be derived totally from the suture and the surgeon wielding it. From where comes the low-frequency original data to control the results of the analysis?
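A minimal numerical sketch of the point-2 resolution limit, with illustrative sample values:

```python
import numpy as np

dt = 1.0 / 12.0                   # monthly samples, in years
n_full = int(round(100 / dt))     # a 100-year record
n_slice = int(round(20 / dt))     # a 20-year splice

df_full = 1.0 / (n_full * dt)     # ~0.01 cycles/year
df_slice = 1.0 / (n_slice * dt)   # ~0.05 cycles/year

# A 60-year oscillation (~0.017 cycles/year) sits above the full
# record's first Fourier bin but below the splice's first bin:
f_60yr = 1.0 / 60.0
print(df_full < f_60yr < df_slice)   # True: resolvable only unsliced
```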
Have I misunderstood the BEST process? Consider this from Muller (WSJ Eur 10/20/2011)

Many of the records were short in duration, … statisticians developed a new analytical approach that let us incorporate fragments of records. By using data from virtually all the available stations, we avoided data-selection bias. Rather than try to correct for the discontinuities in the records, we simply sliced the records where the data cut off, thereby creating two records from one.

“Simply sliced the data.” “Avoided data-selection bias” – and, by the theorems of Fourier, embraced high frequency selection bias and created a bias against low frequencies. There is no free lunch here. Look at what is happening in the Fourier domain. You are throwing away signal and keeping the noise. How can you possibly be improving the signal/noise ratio?
 
Somehow BEST takes all these fragments lacking low frequency and “glues” them back together to present a graph of temperatures from 1750 to 2010. That graph has low frequency data – but from where did it come? The low frequencies must be counterfeit – contamination from the gluing process, manufacturing what appears to be a low frequency signal by fitting high frequencies from slices. This seems so fundamentally wrong I’d sooner believe a violation of the 1st Law of Thermodynamics.
 
A beautiful example of the frequency content I expect to be found in century-scale un-sliced temperature records is found in Liu-2011 Fig. 2, reprinted in the WUWT post “In China there are no hockey sticks”, Dec. 7, 2011. The grey area on the left of the Fig. 2 chart is the area of low frequency, the climate signal. In the Liu study, a lot of the power is in that grey area. It is this portion of the spectrum that BEST’s scalpel removes! Fig. 4 of Liu-2011 is a great illustration of what happens to a signal as you first add the lowest frequency and then successively add higher frequencies.
 
Power vs phase & frequency is the dual formulation of amplitude vs time. There is a one-to-one correspondence. If you apply a filter to eliminate low frequencies in the Fourier domain, and a scalpel does that, where does it ever come back? If there is a process in the Berkeley glue that preserves low frequency from the original data, what is it? And where is the peer-review discussion of its validity?
If there is no preservation of the low frequencies the scalpel removes, results from BEST might predict the weather, but not explain climate.

December 13, 2012 11:07 am

The real shame here lies in The Academy. As the author did supply all details behind the work when asked, I must assume it was done in good faith. The problem is that PhDs are being awarded without the proper training/education in statistical wisdom. Anybody can build a model and run the numbers with a knowledge of mathematical nuts and bolts and get statistical “validation.”

But a key element that supports the foundation upon which any statistical work stands seems to be increasingly ignored. That element has a large qualitative side to it, which makes it more subtle and thus less visible. Of course I am speaking of the knowing, understanding, and verifying of all the ASSUMPTIONS (an exercise with a large component of verbal logic) demanded for any particular statistical work to be trustworthy. I had this drilled into me during my many statistics classes at Georgia Tech 30 years ago. Why this aspect seems to be increasingly ignored I can’t say, but I can say that taking assumptions into account can be a large hurdle for any legitimate study, and thus very inconvenient. I imagine publish-or-perish environments and increasing politicization may have much to do with it.

The resultant fallout, and the real crime, is that the population of scientists we are cultivating are becoming less and less able to discriminate between the different types of variation that need to be identified so that GOOD and not BAD decisions are more likely. Until the science community begins to take the rigors of statistics seriously, its output must be considered seriously flawed. To do otherwise risks the great scientific enterprise that has achieved so much.

December 13, 2012 11:11 am

lsvalgaard says:
December 13, 2012 at 10:07 am
…………
Currently I am only concerned with the calculations; no particular mechanism is considered in my article, just volumes of data: AMO, CET, N. Hemisphere, Arctic, atmospheric pressure, solar activity, the Earth’s magnetic variability, and comparisons against other known proxies and reconstructions.
Since you couldn’t fail my calculations, you insisted on steering discussion away from the subject (as shown in this condensed version, with all trivia from both sides excluded):
http://www.vukcevic.talktalk.net/PR.htm
Let’s remember:
Dr. L. Svalgaard: “If the correlation is really good, one can live with an as yet undiscovered mechanism.”
I do indeed consider it a great encouragement that you didn’t fail the calculations for
http://www.vukcevic.talktalk.net/GSC1.htm
One step at a time. Thanks for the effort; it’s appreciated. Soon I’ll email Excel data on
http://www.vukcevic.talktalk.net/SSN-NAP.htm
using 350 years of geological records instead of geomagnetic changes. The two reinforce each other.
We still don’t exactly understand how gravity works, but the maths is 350 years old.
I missed your usual ‘razor sharp dissection’ of Dr. Pratt’s hypothesis.

Rob Dawg
December 13, 2012 11:20 am

I don’t understand the objections to simplifying models until the correct outcome is achieved. After all if the sun really had anything to do with temperature it would get colder at night and warmer during the day.

December 13, 2012 11:21 am

vukcevic says:
December 13, 2012 at 11:11 am
Since you couldn’t fail my calculations
Of course, one cannot fail made-up ‘data’. What is wrong with your approach is to compute a new time series from two unrelated time series, and to call that ‘observed data’.
