Circular Logic not worth a Millikelvin

Guest post by Mike Jonas

A few days ago, on Judith Curry’s excellent ClimateEtc blog, Vaughan Pratt wrote a post “Multidecadal climate to within a millikelvin”, which provided the content and underlying spreadsheet calculations for a poster presentation at the AGU Fall Conference. I will refer to the work as “VPmK”.

VPmK was a stunningly unconvincing exercise in circular logic – a remarkably unscientific attempt to (presumably) provide support for the IPCC model[s] of climate – and should be retracted.

Background

The background to VPmK was outlined as “Global warming of some kind is clearly visible in HadCRUT3 [] for the three decades 1970-2000. However the three decades 1910-1940 show a similar rate of global warming. This can’t all be due to CO2 []“.

The aim of VPmK was to support the hypothesis that “multidecadal climate has only two significant components: the sawtooth, whatever its origins, and warming that can be accounted for 99.98% by the AHH law []“,

where

· the sawtooth is a collection of “all the so-called multidecadal ocean oscillations into one phenomenon“, and

· AHH law [Arrhenius-Hofmann-Hansen] is the logarithmic formula for CO2 radiative forcing with an oceanic heat sink delay.
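For readers unfamiliar with it, the AHH law combines Hofmann’s raised-exponential CO2 growth, Arrhenius’s logarithmic temperature response, and a Hansen-style oceanic delay. The following is only a minimal sketch of that shape, with illustrative parameter values of my own choosing; the constants actually used in VPmK’s spreadsheet differ.

```python
import numpy as np

def ahh_temperature(year, co2_pre=280.0, doubling_years=32.5, year_base=1790.0,
                    sensitivity=2.0, delay_years=15.0):
    """Illustrative AHH-style warming curve (all parameter values are
    assumptions for this sketch, not the constants in VPmK's spreadsheet).

    Hofmann:   CO2 = pre-industrial level plus an exponential that doubles
               every `doubling_years`.
    Hansen:    the temperature response lags the CO2 curve by `delay_years`
               (the oceanic heat-sink delay).
    Arrhenius: warming is proportional to log2 of the CO2 ratio.
    """
    t = np.asarray(year, dtype=float) - delay_years             # Hansen delay
    co2 = co2_pre + 2.0 ** ((t - year_base) / doubling_years)   # Hofmann growth
    return sensitivity * np.log2(co2 / co2_pre)                 # Arrhenius law

years = np.arange(1850, 2011)
agw = ahh_temperature(years)   # a smooth, accelerating warming curve (kelvins)
```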

The end result of VPmK was shown in the following graph


Fig.1 – VPmK end result.

where

· MUL is multidecadal climate (i.e., global temperature),

· SAW is the sawtooth,

· AGW is the AHH law, and

· MRES is the residue MUL-SAW-AGW.

Millikelvins

As you can see, and as stated in VPmK’s title, the residue was just a few millikelvins over the whole of the period. The smoothness of the residue, but not its absolute value, was entirely due to three box filters being used to remove all of the “22-year and 11-year solar cycles and all faster phenomena“.
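For context, a box filter is just a centred moving average: a window of a given width nulls any sinusoid whose period equals that width (and its harmonics). A minimal sketch follows, assuming annual data and nominal window widths; the actual widths and cascade used in VPmK may differ.

```python
import numpy as np

def box_filter(series, width):
    """Centred moving average of odd `width` (in years). It zeroes out any
    sinusoid whose period is exactly `width` years, plus its harmonics."""
    kernel = np.ones(width) / width
    return np.convolve(series, kernel, mode="same")   # edge effects ignored here

def remove_solar_cycles(series):
    """Cascade of box filters aimed at the ~11-year and ~22-year bands and
    everything faster (nominal widths, for illustration only)."""
    for width in (11, 23):
        series = box_filter(series, width)
    return series
```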

If the aim of VPmK is to provide support for the IPCC model of climate, naturally it would remove all of those things that the IPCC model cannot handle. Regardless, the astonishing level of claimed accuracy shows that the result is almost certainly worthless – it is, after all, about climate.

The process

What VPmK does is to take AGW as a given from the IPCC model – complete with the so-called “positive feedbacks” which for the purpose of VPmK are assumed to bear a simple linear relationship to the underlying formula for CO2 itself.

VPmK then takes the difference (the “sawtooth”) between MUL and AGW, and fits four sinewaves to it (there is provision in the spreadsheet for five, but only four were needed). Thanks to the box filters, a good fit was obtained.
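To make the nature of that exercise concrete, here is a minimal sketch of the same class of fit in scipy, using a synthetic series in place of the filtered MUL minus AGW residual and hypothetical starting guesses; it is not VPmK’s spreadsheet solver.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_sines(t, *p):
    """Sum of four sinewaves; p holds (amplitude, period in years, phase) triples."""
    total = np.zeros_like(t, dtype=float)
    for amp, period, phase in zip(p[0::3], p[1::3], p[2::3]):
        total += amp * np.sin(2.0 * np.pi * t / period + phase)
    return total

years = np.arange(1850, 2011, dtype=float)
rng = np.random.default_rng(0)
# Synthetic stand-in for the "sawtooth" (MUL minus the assumed AGW curve).
sawtooth = 0.10 * np.sin(2.0 * np.pi * years / 75) + 0.01 * rng.standard_normal(years.size)

guess = [0.10, 150, 0.0,  0.05, 75, 0.0,  0.02, 50, 0.0,  0.01, 25, 0.0]
params, _ = curve_fit(four_sines, years, sawtooth, p0=guess, maxfev=20000)
residue = sawtooth - four_sines(years, *params)   # the analogue of MRES
```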

Given that four parameters can fit an elephant (great link!), absolutely nothing has been achieved and it would be entirely reasonable to dismiss VPmK as completely worthless at this point. But, to be fair, we’ll look at the sawtooth (“The sinewaves”, below) and see if it could have a genuine climate meaning.

Note that in VPmK there is no attempt to find a climate meaning. The sawtooth which began life as “so-called multidecadal ocean oscillations” later becomes “whatever its origins“.

The sinewaves

The two main “sawtooth” sinewaves, SAW2 and SAW3, are:


Fig.2 – VPmK principal sawtooths.

(The y-axis is temperature.) The other two sinewaves, SAW4 and SAW5, are much smaller, just “mopping up” what divergence remains.

It is surely completely impossible to support the notion that the “multidecadal ocean oscillations” are reasonably represented to within a few millikelvins by these perfect sinewaves (even after the filtering). This is what the PDO and AMO really look like:


Fig.3 – PDO.

(link) There is apparently no PDO data before 1950, but some information here.


Fig.4 – AMO.

(link)

Both the PDO and AMO trended upwards from the 1970s until well into the 1990s. Neither sawtooth is even close. The sum of the sawtooths (SAW in Fig.1) flattens out over this period when it should mostly rise quite strongly. This shows that the sawtooths have been carefully manipulated to “reserve” the 1970-2000 temperature increase for AGW.


Fig.5 – How the sawtooth “reserved” the 1980s and 90s warming for AGW.

 

Conclusion

VPmK aimed to show that “multidecadal climate has only two significant components”, AGW and something shaped like a sawtooth. But VPmK then simply assumed that AGW was a component, called the remainder the sawtooth, and had no clue as to what the sawtooth was but used some arbitrary sinewaves to represent it. VPmK then claimed to have shown that the climate was indeed made up of just these two components.

That is circular logic and appallingly unscientific. The poster presentation should be formally retracted.

[Blog commenter JCH claims that VPmK is described by AGU as “peer-reviewed”. If that is the case then retraction is important. VPmK should not be permitted to remain in any “peer-reviewed” literature.]

Footnotes:

1. Although VPmK is of little value, I would nevertheless like to congratulate Vaughan Pratt for having the courage to provide all of the data and all of the calculations in a way that made them relatively easy to check. If only this approach had been taken by other climate scientists from the start, virtually all of the heated and divisive climate debate could have been avoided.

2. I first approached Judith Curry, and asked her to give my analysis of Vaughan Pratt’s (“VP”) circular logic equal prominence to the original by accepting it as a ‘guest post’. She replied that it was sufficient for me to present it as a comment.

My feeling is that posts have much greater weight than comments, and that using only a comment would effectively let VP get away with a piece of absolute rubbish. Bear in mind that VPmK has been presented at the AGU Fall Conference, so it is already way ahead in public exposure anyway.

That is why this post now appears on WUWT instead of on ClimateEtc. (I have upgraded it a bit from the version sent to Judith Curry, but the essential argument is the same). There are many commenters on ClimateEtc who have been appalled by VPmK’s obvious errors. I do not claim that my effort here is in any way better than theirs, but my feeling is that someone has to get greater visibility for the errors and request retraction, and no-one else has yet done so.

 

James

Assume AGW is a flat line and repeat the analysis. When the fit remains near perfect, trumpet the good news that AGW is no more!

Another failed attempt to force dynamic, poorly understood, under-sampled, natural process into some kind of linear deterministic logic system. It simply will not work. This foolishness is not worth any further time or effort on the part of serious scientists.

Steveta_uk

What VP has said repeatedly on JC’s site is basically that if you can provide a better fit, please do.
Nobody that I’ve seen has yet done so. Now I’m not in any way suggesting that what VP has done is in any way useful science. But still, can you not simply alter the 4 or 5 sine waves to show that you can provide just as good a fit without the AHH curve?
If you can, please present it here.
And if you cannot, then VP remains uncontested.

Why is it that the rationalizations of the Warmistas are beginning to remind me of Ptolemy and The Almagest?

I worked in the Banking Industry for most of my adult life. During that time, many people would be applying for finance for this business or that business – maybe a mortgage, maybe a loan.
All would arrive with their shiny spreadsheet proving their business model was viable and would soon show profitability.
I never saw any proposed business plan to the Bank that didn’t show remarkable profit – certainly none ever predicted a loss.
Nonetheless, the vast majority of those business plans would fail abysmally.
Just goes to show, any spreadsheet can be made to produce whatever results the author wants – just tweak here or tweak there.
Now about milli-kelvins?
Andi

“VPmK was a stunningly unconvincing exercise in circular logic – a remarkably unscientific attempt to (presumably) provide support for the IPCC model[s] of climate – and should be retracted.”
1. its not circular.
2. its not a proof or support for models.
3. you cant retract a poster.
4. This is basically the same approach that many here praise when scafetta does it.
basically he is showing that GIVEN the truth of AGW, the temperature series can be explained by a few parameters. GIVEN is the key; you misunderstand the logic of his approach.

Taphonomic

“[Blog commenter JCH claims that VPmK is described by AGU as “peer-reviewed”. If that is the case then retraction is important. VPmK should not be permitted to remain in any “peer-reviewed” literature.]”
Describing VPmK as peer-reviewed is incorrect. Abstracts published by AGU for either poster sessions or presentations made at the meeting are not peer-reviewed. There are quite a few comments at the blog after JCH on this topic.

Matthew R Marler

VPmK was a stunningly unconvincing exercise in circular logic – a remarkably unscientific attempt to (presumably) provide support for the IPCC model[s] of climate – and should be retracted.
That is over-wrought. Vaughan Pratt described exactly what he did and found, and published the data that he used and his result. If the temperature evolution of the Earth over the next 20 years matches his model, then people will be motivated to find whatever physical process generates the sawtooth. If not, his model will be disconfirmed along with plenty of other models. Lots of model-building in scientific history has been circular over the short-term: in “The Structure of Scientific Revolutions” Thomas Kuhn mentions Ohm’s law as an example, and Einstein’s special relativity; lots of people have noted the tautology of F = dm/dt where m here stands for momentum.
Pratt merely showed that, with the data in hand, it is possible to recover the signal of the CO2 effect with a relatively low-dimensional filter. No doubt, the procedure is post hoc. The validity of the approach will be tested by data not used in fitting the functions that he found.

Matthew R Marler

Steven Mosher wrote: 4. This is basically the same approach that many here praise when scafetta does it.
I agree with that.

richardscourtney

Steven Mosher:
You enumerate four points in your post at December 13, 2012 at 9:04 am. I address each of them in turn.
1. its not circular.
(Clearly, it is “circular” in that it removes everything from the climate data except what the climate models emulate then says the result of the removal agrees with what the climate emulate when tuned to emulate it.)
2. its not a proof or support for models.
(Agreed, it is nonsense.)
3. you cant retract a poster.
(Of course you can! All you do is publish a statement saying it should not have been published, and you publish that statement in one or more of the places where the “poster” was published; e.g. in this case, on Judith Curry’s blog.)
4. This is basically the same approach that many here praise when scafetta does it.
(So what! Many others – including me – object when Scafetta does it. Of itself that indicates nothing.)
The poster by Vaughan Pratt only indicates that Pratt is a prat: live with it.
Richard

richardscourtney

OOOps! I wrote
(Clearly, it is “circular” in that it removes everything from the climate data except what the climate models emulate then says the result of the removal agrees with what the climate emulate when tuned to emulate it.)
Obviously I intended to write
(Clearly, it is “circular” in that it removes everything from the climate data except what the climate models emulate then says the result of the removal agrees with what the models emulate when tuned to emulate it.)
Sorry.
Richard

Arno Arrak

I agree. I commented about it on Curry’s blog and called it worthless. I was particularly annoyed that he used HadCRUT3 which is error-ridden and anthropogenically distorted. I could see that he was using his computer skills to create something out of nothing and did not understand why that sawtooth did not go away. That millikelvin claim is of course nonsense and was simply part of his applying his computer skills without comprehending the data he was working with. Suggested that he write a program to find and correct those anthropogenic spikes in HadCRUT and others.

On the sidelines of the V.Pratt’s blog presentation there was a secondary discussion between myself and Dr. Svalgaard about the far more realistic causes of the climate change. Since Dr.S. often does peer review on the articles relating to solar matters, leaving the trivia out, I consider our exchanges as an ‘unofficial peer review of my calculations’, no mechanism is considered in the article, just the calculations. This certainly was not ‘friendly’ review, although result may not be conclusive, I consider it a great encouragement.
http://www.vukcevic.talktalk.net/PR.htm
If there are any scientists who are occasionally involved in the ‘peer review’ type processes, I would welcome the opportunity to submit my calculations. My email is as in my blog id followed by @yahoo.com.

Vaughan presented an interesting idea which has been roundly tested by many commenters in a spirit of science, with hotly contested views debated within a framework of courtesy as Vaughan defended his ideas. Personally I’m not convinced that CET demonstrates his theory; in fact I think it shows he is wrong. But if every post, whether here or at climate etc, was discussed in such a thorough manner, everyone would gain, whatever side of the fence they are on.
Tonyb

vukcevic says:
December 13, 2012 at 9:33 am
This certainly was not ‘friendly’ review, although result may not be conclusive, I consider it a great encouragement
It seems that in the twisted world of pseudo-science even a flat-out rejection is considered a great encouragement.

David L. Hagen

Steveta_uk
Re Pratt’s “if you can provide a better fit, please do.”
Science progresses by “kicking the tires”. Models are only as robust as the challenges put to them and their ability to provide better predictions when compared against hard data – not politics.
The “proof of the pudding is in the eating”. Currently the following two models show better predictive performance than IPCC’s models that average 0.2C/decade warming:
Relationship of Multidecadal Global Temperatures to Multidecadal Oceanic Oscillations Joseph D’Aleo and Don Easterbrook, Evidence-Based Climate Science. Elsevier 2011, DOI: 10.1016/B978-0-12-385956-3.10005-1
Nicola Scafetta, Testing an astronomically based decadal-scale empirical harmonic climate model versus the IPCC (2007) general circulation climate models Journal of Atmospheric and Solar-Terrestrial Physics 80 (2012) 124–137
For others views on CO2, see Fred H. Haynie The Future of Global Climate Change
No amount of experimentation can ever prove me right; a single experiment can prove me wrong. Albert Einstein

Tim Clark

“22-year and 11-year solar cycles and all faster phenomena“.
What happened to longer term cycles?
Oh, they must be AGW./sarc

jorgekafkazar

Curve-fitting is not proof of anything, especially when the input data is filtered. That heat-sink delay also needs some scrutiny. Worse, the data time range is fairly short, in geological terms. On top of that, a four component wave function? Get real.
It’s wiggle-matching, with some post hoc logic thrown in. I’m underwhelmed.

P. Solar

Mosh: 4. This is basically the same approach that many here praise when scafetta does it.
Sorry, what N. Scafetta does is fit all parameters freely and see what results. What Pratt did was fit his exaggerated 3K per doubling model; see what’s left, then make up a totally unfounded waveform to eliminate it. Having thus eliminated it, he screwed up his maths and managed to also eliminate the huge discrepancy that all 3K sensitivity models have after 1998.
Had he got the maths correct it would have been circular logic. As presented to AGU it was AND STILL IS a shambles.
Attribution to AMO PDO is fanciful. The whole work is total fiction intended to remove the early 20th c. warming that has always been a show-stopper for CO2 driven AGW.
At the current state of the discussion on Curry’s site, he has been asked to state whether he recognises there is an error or stands by the presentation as given to AGU and published on Climate etc.
At the time of this posting, no news from Emeritus Prof Vaughan Pratt.

Bill Illis

The IPCC says that the current total forcing (all sources – RCP 6.0 scenario) is supposed to be about 2.27 W/m2 in 2012.
On top of that, we should be getting some water vapour and reduced low cloud cover feedbacks from this direct forcing so that there should be a total of about 5.0 W/m2 right now.
The amount of warming, however, (the amount that is accumulating in the Land, Atmosphere, Oceans and Ice) is only about 0.5 W/m2.
Simple enough to me.
Climate Science is much like the study of Unicorns and their invisibility cloaks.

I want to draw people’s attention to the frequency content of the VPmK SAW2 and SAW3 wave forms. Just by eyeball, these appear to have periods of roughly 75 and 50 years. As Mike Jonas points out, early in the paper VP posits they come from major natural ocean oscillations, but later allows a more flexible “whatever its origins.”
I am not going to debate the origins of the low frequency. Take from VPmK only that the temperature record contains significant very low frequency wave forms, with periods greater than 25 years, needed to match even heavily filtered temperature records where

three box filters being used to remove all of the “22-year and 11-year solar cycles and all faster phenomena“.

All that is left in VPmK data is very low frequency content and there appears to be a lot of it.
My comment below takes the importance of low frequency in VPmK and focuses on BEST: Berkeley Earth and what to me appears to be minimally discussed wholesale decimation and counterfeiting of low frequency information happening within the BEST process. If you look at what is going on in the BEST process from the Fourier domain, there seems to me to be major losses of critical information content. I first wrote my theoretical objection to the BEST scalpel back on April 2, 2011 in “Expect the BEST, plan for the worst.” I expounded at Climate Audit, Nov. 1, 2011 and some other sites.
My summary argument remains unchanged after 20 months:
1. The Natural climate and Global Warming (GW) signals are extremely low frequency, less than a cycle per decade.
2. A fundamental theorem of Fourier analysis is that the frequency resolution is Δf = dω/2π = 1/(N·dt) Hz, where dt is the sample time and N·dt is the total length of the digitized signal.
3. The GW climate signal, therefore, is found in the very lowest frequencies, low multiples of Δf, which can only come from the longest time series.
4. Any scalpel technique destroys the lowest frequencies in the original data.
5. Suture techniques recreate long term digital signals from the short splices.
6. Sutured signals have in them very low frequency data, low frequencies which could NOT exist in the splices. Therefore the low frequencies, the most important stuff for the climate analysis, must be derived totally from the suture and the surgeon wielding it. From where comes the low-frequency original data to control the results of the analysis?
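(A minimal numerical illustration of point 2, assuming annual sampling: the lowest non-zero frequency an FFT can resolve is 1/(N·dt), so a short slice simply has no bins at multidecadal periods.)

```python
import numpy as np

dt = 1.0                                    # annual sampling (years)
for n_years in (200, 10):                   # a long record vs a 10-year slice
    freqs = np.fft.rfftfreq(n_years, d=dt)  # available frequency bins (cycles/yr)
    lowest = freqs[1]                       # first non-zero bin = 1/(N*dt)
    print(n_years, "yr record -> longest resolvable period:", 1.0 / lowest, "years")
# 200 yr record -> longest resolvable period: 200.0 years
#  10 yr record -> longest resolvable period:  10.0 years
# A 60-70 year oscillation has no bin at all in the short slice.
```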
Have I misunderstood the BEST process? Consider this from Muller (WSJ Eur 10/20/2011)

Many of the records were short in duration, … statisticians developed a new analytical approach that let us incorporate fragments of records. By using data from virtually all the available stations, we avoided data-selection bias. Rather than try to correct for the discontinuities in the records, we simply sliced the records where the data cut off, thereby creating two records from one.

 “Simply sliced the data.” “Avoided data-selection bias” – and by the theorems of Fourier embraced high frequency selection bias and created a bias against low frequencies. There is no free lunch here. Look at what is happening in the Fourier Domain. You are throwing away signal and keeping the noise. How can you possibly be improving signal/noise ratio?
 
Somehow BEST takes all these fragments lacking low frequency, and “glues” them back together to present a graph of temperatures from 1750 to 2010. That graph has low frequency data – but from where did it come? The low frequencies must be counterfeit – contamination from the gluing process, manufacturing what appears to be a low frequency signal from fitting high frequency from slices. This seems so fundamentally wrong I’d sooner believe a violation of the 1st Law of Thermodynamics.
 
A beautiful example of frequency content that I expect to be found in century scale un-sliced temperature records is found in Liu-2011 Fig. 2, reprinted in WUWT (“In China there are no hockey sticks”, Dec. 7, 2011). The grey area on the left of the Fig. 2 chart is the area of low frequency, the climate signal. In the Liu study, a lot of the power is in that grey area. It is this portion of the spectrum that BEST’s scalpel removes! Fig. 4 of Liu-2011 is a great illustration of what happens to a signal as you add first the lowest frequency and successively add higher frequencies.
 
Power vs Phase & Frequency is the dual formulation of Amplitude vs Time. There is a one to one correspondence. If you apply a filter to eliminate low frequencies in the Fourier Domain, and a scalpel does that, where does it ever come back? If there is a process in the Berkeley glue that preserves low frequency from the original data, what is it? And where is the peer-review discussion of its validity?
If there is no preservation of the low frequencies the scalpel removes, results from BEST might predict the weather, but not explain climate.

The real shame here lies in The Academy. As the author did supply all details behind the work when asked, I must assume it was done in good faith. The problem is that PhD’s are being awarded without the proper training/education in statistical wisdom. Anybody can build a model and run the numbers with a knowledge of mathematical nuts and bolts and get statistical “validation.” But a key element that supports the foundation upon which any statistical work stands seems to be increasingly ignored. That element has a large qualitative side to it which makes it more subtle thus less visible. Of course I am speaking of the knowing, understanding, and verifying of all the ASSUMPTIONS (an exercise with a large component of verbal logic) demanded by any particular statistical work to be trustworthy. I had this drilled into me during my many statistical classes at Georgia Tech 30 years ago. Why this aspect seems to be increasingly ignored I can’t say, but I can say, taking assumptions into account can be a large hurdle to any legitimate study, thus very inconvenient. I imagine publish or perish environments and increasing politicization may have much to do here. The resultant fallout and real crime is the population of scientists we are cultivating are becoming less and less able to discriminate between the different types of variation that need to be identified so that GOOD and not BAD decisions are more likely. Until the science community begins to take the rigors of statistics seriously, its output must be considered seriously flawed. To do otherwise risks the great scientific enterprise that has achieved so much.

lsvalgaard says:
December 13, 2012 at 10:07 am
…………
Currently I am only concerned with the calculations, no particular mechanism is considered in my article, just volumes of data, AMO, CET, N.Hemisphere, Arctic, atmospheric pressure, solar activity, the Earth’s magnetic variability, comparisons against other known proxies and reconstructions.
Since you couldn’t fail my calculations, you insisted on steering discussion away from the subject (as shown in this condensed version) with all trivia from both sides excluded:
http://www.vukcevic.talktalk.net/PR.htm
Let’s remember:
Dr. L. Svalgaard :If the correlation is really good, one can live with an as yet undiscovered mechanism.
I do indeed consider it a great encouragement that you didn’t fail the calculations for
http://www.vukcevic.talktalk.net/GSC1.htm
One step at a time. Thanks for the effort, it’s appreciated. Soon I’ll email Excel data on the
http://www.vukcevic.talktalk.net/SSN-NAP.htm
using 350 years of geological records instead of geomagnetic changes. The two reinforce each other.
We still don’t exactly understand how gravity works, but maths is 350 years old.
I missed your usually ‘razor sharp dissection’ of Dr. Pratt’s hypothesis

I don’t understand the objections to simplifying models until the correct outcome is achieved. After all if the sun really had anything to do with temperature it would get colder at night and warmer during the day.

vukcevic says:
December 13, 2012 at 11:11 am
Since you couldn’t fail my calculations
Of course, one cannot fail made-up ‘data’. What is wrong with your approach is to compute a new time series from two unrelated time series, and to call that ‘observed data’.

richardscourtney

Stephen Rasey:
re your post at December 13, 2012 at 11:00 am.
Every now and then one comes across a pearl shining on the sand of WUWT comments. The pearls come in many forms.
Your post is a pearl. Its argument is clear, elegant and cogent. Thankyou.
Richard

Jens Bagh

Should it not be milliKelvin?

Mooloo

Steveta_uk says:
What VP has said repeatedly on JC’s site is basically that if you can provide a better fit, please do.

A “better fit” is not useful. In cases like this the correct model will not provide a better fit, because the correct model has deviations between theory and reality due to noise.
If I have a 100% normally distributed population and take 100 samples then the result will never be an exact normal distribution. I could model a “distribution” that better matched my samples, but it would most certainly not tell me anything useful. In fact it would lead me to believe my actual population was not normal.
This is why I am highly suspicious of any model that is trained on old data. It is basically an exercise in wiggle matching, not an exercise in getting the underlying physics correct. The best climate models will have pretty poor fit to old temperatures.
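(A small numerical illustration of that point, using a Gaussian KDE as a hypothetical stand-in for an over-flexible model fitted to the same 100 samples.)

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
sample = rng.normal(loc=0.0, scale=1.0, size=100)   # the population really is N(0, 1)

# In-sample fit of the true model versus a flexible "wiggle-matching" model.
true_ll = stats.norm.logpdf(sample, loc=0.0, scale=1.0).sum()
kde = stats.gaussian_kde(sample)                     # flexible stand-in model
kde_ll = np.log(kde(sample)).sum()

print("log-likelihood of true N(0,1) model:", round(true_ll, 1))
print("log-likelihood of flexible fit:     ", round(kde_ll, 1))
# The flexible fit usually scores better in-sample, yet it is no closer to the truth.
```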

Substitute a 1300 year wave length sine with a max at MWP and min at LIA for the AGW function and you will get similar results.

lsvalgaard says:
December 13, 2012 at 11:21 am
vukcevic says:
December 13, 2012 at 11:11 am
Since you couldn’t fail my calculations
Of course, one cannot fail made-up ‘data’. What is wrong with your approach is to compute a new time series from two unrelated time series, and to call that ‘observed data’.
………………..
Wrong Doc.
The magnetometer at Tromso does it every single minute of the day and night.
http://www.vukcevic.talktalk.net/Tromso.htm
In red is the incoming variable solar magnetic field sitting on top of the variable Earth’s magnetic field.
Rudolf Wolf started it with a compass needle, Gauss did it with a bit more sophisticated apparatus, and today numerous geomagnetic stations do it as you listed dozens in your paper on IDV.
So it is OK for Svalgaard of Stanford to derive IDV from changes of two combined magnetic fields, but is not for Vukcevic.
Reason plain and obvious, it would show that the SUN DOES IT !
Here is how the geomagnetic field (Earth + solar) is measured and illustrated by our own Dr. Svalgaard
http://www.leif.org/research/Rudolf%20Wolf%20and%20the%20Sunspot%20Number.ppt#8
and he maintains they are not added together in his apparatus.
Can anyone spot 3 magnets?
Dr. S are you really serious to suggest that no changes in the Earth field are registered by your apparatus?
Case closed!

Gail Combs

Steveta_uk says:
December 13, 2012 at 8:55 am
What VP has said repeatedly on JC’s site is basically that if you can provide a better fit, please do.
Nobody that I’ve seen has yet done so. Now I’m not in any way suggesting that what VP has done is in any way useful science. But still, can you not simply alter the 4 or 5 sine waves to show that you can provide just as good a fit without the AHH curve?
If you can, please present it here.
And if you cannot, then VP remains uncontested.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
HUH?
Let me get this correct.
I publish a paper showing that the rise and fall of women’s skirts plus a saw tooth pattern provides a good fit to the curve. Since no one can provide a better ‘fit’ than that, the paper has to stand?

vukcevic says:
December 13, 2012 at 12:22 pm
So it is OK for Svalgaard of Stanford to derive IDV from changes of two combined magnetic fields, but is not for Vukcevic.
It is OK to derive two time series of the external field driven by the same source, but not to confuse and mix the external and internal fields that have different sources and don’t interact. Case is indeed closed, as you are incapable of learning.

RobertInAz

As mentioned by Steveta_uk and others ….
Rather than engage in histrionics, the way to refute the Vaughan Pratt poster is to create a similar spreadsheet (or modify his spreadsheet) to show a nominal AGW signal.
As nearly as I can tell, Dr. Pratt has done everything responsible skeptics ask:
– Formulated a hypothesis
– Presented all of the supporting data
– Published in an accessible forum
– Asked for feedback.
I have not dropped in on the thread for a couple of days. However, I suspect that if someone has published a spreadsheet model that refutes Dr. Pratt’s, then it would have been mentioned here.
I do not care whether four parameters can fit an elephant. I would like to see someone mathematically refute Dr. Pratt’s model. I took a look and realized I do not have the time to reacquire the expertise to do it. (I had the expertise years ago and even have the optimization code I wrote for my AI class that could be adapted to this problem).
In my case, I strongly believe there are contradictory cases (it is just math and there are a lot of variables), but until someone devotes the mental sweat to create one (maybe Nick Scafetta has per Steven Mosher @ 9:04 am), Dr. Pratt’s result stands as he has described it. He asks people to show the contradictions.
Finally re circularity. I agree that post hoc curve fitting can be described as circular. All of the GCMs do it to reproduce historical temperature. What Dr. Pratt has done is simplify the curve fitting to a spreadsheet we can all use.

Don Monfort

I am with Steveta, on this one. Unless someone comes up with better numerology, Professor Pratt’s numerology stands.

P. Solar

Steveta_uk says: “What VP has said repeatedly on JC’s site is basically that if you can provide a better fit, please do.”
Why would anyone want to spend time searching for a “better fit” of an exaggerated exponential, bent down by a broken filter, plus a non-physically-attributable wiggle, to ANYTHING?
Please explain the motivation and rewards of such an exercise.

P. Solar

RobertInAz: I have not dropped in on the thread for a couple of days. …. Dr. Pratt’s result stands as he has described it. He asks people to show the contradictions.
Then you ought to do so before commenting, no?
He asks for criticisms but it’s fake openness. He clearly has no intent of admitting even the most blatant errors in his pseudo-paper-poster.
Oops is not in the vocabulary of this great scientific authority.

Gail Combs

jack hudson says:
December 13, 2012 at 11:07 am
……………….
On statistics –
I have notice that since computers and statistical packages became readily available in the 1980’s there has been a shift away from using a trained statistician to do-it-yourself statistics. ‘Six Sigma’ in industry is an example.
The statistical training I got from the ‘Six Sigma’ program at work was absolute crap. All they taught was how to use the computer program with not even a basic explanation of different types of distribution to go with it or even the warning to PLOT THE DATA so you could see the shape of the distribution. They did not even get into attributes vs variables!
It reminds me of the shift from the use of well trained secretaries who would clean up a technonut’s English and pry the needed info out of him to having everyone write their own reports. My plant manager in desperation insisted EVERYONE in the plant take night courses in English composition.
Too bad Universities do not insist that anyone using statistics must take at least three semesters of Stat.

lsvalgaard says:
December 13, 2012 at 12:43 pm
It is OK to derive two time series of the external field driven by the same source, but not to confuse and mix the external and internal fields that have different sources and don’t interact.
As this apparatus does:
http://www.leif.org/research/Rudolf%20Wolf%20and%20the%20Sunspot%20Number.ppt#8
records combined solar and Earth’s fields
or as Vukcevic does in here:
http://www.vukcevic.talktalk.net/EarthNV.htm
calculates combined solar and Earth’s fields
Do you suggest that the combined field curve that happens to match temperature change in the N. Hemisphere is coincidental, that it just appeared by chance?

Gail Combs

fhhaynie says:
December 13, 2012 at 12:17 pm
Substitute a 1300 year wave length sine with a max at MWP and min at LIA for the AGW function and you will get similar results.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
That is pretty much what the Chinese, Liu Y, Cai Q F, Song H M, et al. did. They used 1324 years.

GRAPH: http://jonova.s3.amazonaws.com/graphs/china/liu-2011-cycles-climate-tibet-web.gif
Figure 4 Decomposition of the main cycles of the 2485-year temperature series on the Tibetan Plateau and periodic function simulation. Top: Gray line,original series; red line, 1324 a cycle; green line, 199 a cycle; blue line, 110 a cycle. Bottom: Three sine functions for different timescales. 1324 a, red dashed line (y = 0.848 sin(0.005 t + 0.23)); 199 a, green line (y = 1.40 sin(0.032 t – 0.369)); 110 a, blue line (y = 1.875 sin(0.057 t + 2.846)); time t is the year from 484 BC to 2000 AD.
http://wattsupwiththat.com/2011/12/07/in-china-there-are-no-hockey-sticks/

@Gail. anyone using statistics must take at least three semesters of Stat.
I agree. Three semesters of statistics would be more generally useful throughout adult life than three semesters of Calc. I’m 34 years out of my B.Sc, 31 from my Ph.D. The college texts I return to most often are my Stat books, Johnson & Leone 1977.
Not that calc isn’t useful. Not that it isn’t required for “Diff_E_Q”. But statistics is one of the only courses that by design gives you training in uncertainty, to quantify what you don’t know.

vukcevic says:
December 13, 2012 at 1:22 pm
records combined solar and Earth’s fields
It records the external and internal fields superposed by Nature and thus existing in Nature
or as Vukcevic does in here:
calculates combined solar and Earth’s fields

no, you calculate a field that does not exist in Nature by combining two that are not physically related
Do you suggest that the combined field curve that happens to match temperature change in the N. Hemisphere is coincidental, that it just appeared by chance?
I say that the quantity you calculate does not exist in Nature and therefore that any correlation is spurious or worse. But I thought your case was closed. Keep it that way, instead of carpet bombing every thread on every blog with it.

David L

Can’t you just take the Fourier transform of the raw data to find the frequency components and phase shifts and whatever else is left over?
I do applaud the guy’s work with sines and exponentials. At least it isn’t the standard linear regression garbage!!!
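(A minimal sketch of that idea on a synthetic stand-in series; on real, trending data spectral leakage complicates the picture, but the mechanics are as below.)

```python
import numpy as np

years = np.arange(1850, 2011, dtype=float)
# Hypothetical stand-in for a raw temperature series with two oscillations.
series = (0.10 * np.sin(2.0 * np.pi * years / 65 + 0.5)
          + 0.05 * np.sin(2.0 * np.pi * years / 21 + 1.0))

spectrum = np.fft.rfft(series - series.mean())
freqs = np.fft.rfftfreq(series.size, d=1.0)          # cycles per year
amplitudes = 2.0 * np.abs(spectrum) / series.size
phases = np.angle(spectrum)

for k in np.argsort(amplitudes)[::-1][:3]:           # three strongest components
    if freqs[k] > 0:
        print("period %.1f yr, amplitude %.3f, phase %.2f rad"
              % (1.0 / freqs[k], amplitudes[k], phases[k]))
```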

lsvalgaard says:
December 13, 2012 at 1:41 pm
………
I calculate combined effect of two variables, as you could calculate combined effect of wind and temperature on the evaporation, but in my case it happens that both variables are magnetic fields.
Are you happy now?
I post on other blogs so the readers also should be aware. You don’t need to follow me around if you think it is not worth your attention. Why are you so concerned?
In a way I am pleasantly surprised that you are devoting all your attention to my ‘nonsense’ rather than the ‘brilliant’ work of your Stanford colleague, discussed above. Either you think Dr. Pratt’s work is of a superb quality or utter rubbish, in either case no comment of yours is required.
Good night.

vukcevic says:
December 13, 2012 at 3:01 pm
I calculate combined effect of two variables, as you could calculate combined effect of wind and temperature on the evaporation, but in my case it happens that both variables are magnetic fields.
Are you happy now?

Wind and temperature and evaporation are physically related. Your inputs are not. That they are both magnetic fields is irrelevant; it makes as much sense to combine them as it would the fields of the Sun and Sirius.
I post on other blogs so the readers also should be aware
I think the readers are ill served with nonsense.
You don’t need to follow me around if you think it is not worth your attention. Why are you so concerned?
Because scientists have an obligation to combat pseudo-science and provide the public with correct scientific information. Even though not all do that.
In a way I am pleasantly surprised that you are devoting all your attention to my ‘nonsense’ rather than the ‘brilliant’ work of your Stanford colleague, discussed above.
You should be ashamed of peddling your nonsense, not pleased when found out.
Either you think Dr. Pratt’s work is of a superb quality or utter rubbish
Curve fitting is what it is. Whether one believes in it has little bearing on the mathematical validity of the fitting procedure. I asked him to make an experiment for me and the result was that what he called the ‘solar curve’ was different in solar data and in CET and HadCRUT3 temperature data and between the latter two as well. This settled the matter for me at least.

Matthew R Marler

Mike Jonas: But the result was still obtained by circular logic.
In filtering, there is a symmetry: if you know the signal, you can find a filter that will reveal it clearly; if you know the noise, you can design a filter to reveal the signal clearly. Pratt assumed a functional form for the signal (he said so at ClimateEtc), and worked until he had a filter that revealed it clearly.
The thought process becomes “circular” if you “complete the circle”, so to speak, and conclude that: since he found what he assumed, then it must be true. My only claim is that, given what he did, the result can be, and should be, tested on future data. I have written about the same regarding the modeling of Vukcevic and Scafetta. I would say the same regarding the curve-fitting of Liu et al cited by Gail Combs above. Elsewhere I have written the same of the modeling of Latif and Tsonis, and of the GCMs. I do not expect any extant model to survive the next 20 years’ worth of data collection, but I think that the data collected to date do not clearly rule out very much — though alarmist predictions made in 1988-1990 look less credible year by year.

Matthew R Marler

Mike Jonas: However, others have looked at NAT. eg, http://wattsupwiththat.com/2010/09/30/amopdo-temperature-variation-one-graph-says-it-all/
I haven’t investigated their workings, so I am not in a position to say whether their graph is worth anything, but at least it is using real data on the PDO and AMO. If they have got it right (NB. that’s an “If”), then they have nailed NAT, and HUM looks to be around a flat zero.

Well said.

I submit that a simpler and better fit of the unfiltered data is 0.573 - 0.973·sin(x/608 + 0.96) + 0.108·sin(x/63 + 1.21) + 0.038·sin(x/20 + 1.46), where x = 2π·year. AGW may be covariant with that 608-year cycle and contributes a little to the magnitude of the coefficient -0.973. Most of the residual looks like a three to five year cycle.
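(The same fit written out as code, taking the coefficients exactly as given above, so anyone can evaluate it against the unfiltered record.)

```python
import numpy as np

def three_sine_fit(year):
    """Evaluate the proposed fit exactly as stated above, with x = 2*pi*year."""
    x = 2.0 * np.pi * np.asarray(year, dtype=float)
    return (0.573
            - 0.973 * np.sin(x / 608 + 0.96)
            + 0.108 * np.sin(x / 63 + 1.21)
            + 0.038 * np.sin(x / 20 + 1.46))

years = np.arange(1850, 2011)
fitted = three_sine_fit(years)   # compare against the unfiltered HadCRUT3 anomalies
```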