Jean S writes at Climate Audit:
As most readers are aware, and as stated in my post a few hours after ClimateGate (CG) broke out, Mike’s Nature trick was first uncovered by UC here.
He was able to replicate (visually perfectly) the smooths of MBH9x, thereby showing that the smooths involved padding with the instrumental data. The filter used by UC was the zero-phase Butterworth filter (an IIR filter), which has been Mann’s favourite since at least 2003. However, there was something else that I felt was odd: UC’s emulation required a very long (100 samples or so) additional zero padding. So about two years ago, I decided to take an additional look at the topic with UC.
Indeed, after digitizing Mann’s smooths we discovered that UC’s emulation was very, very good but not perfect. After lengthy research, and countless hours of experimenting (I won’t bore you with the details), we managed to figure out the “filter” used by Mann in the pre-Mann (2004) era. Mann had made his own version of the Hamming filter (windowing method, an FIR filter)! Instead of using any kind of usual estimate for the filter order, which is normally derived from the transition bandwidth (see, e.g., Mitra: Digital Signal Processing) and typically runs to a few dozen coefficients at most, he used a filter length equal to the length of the signal to be filtered! As Mann’s PCA was apparently just a “modern” convention, this must be a “modern” filter design. Anyhow, no digital signal processing expert I consulted about the matter had ever seen anything like that.
JeanS adds in a comment:
the main issue here is not which filters/smoothers are “appropriate”, but the fact that Mann was using a method unknown to anyone else. This made it practically impossible to replicate his smoothings and, later, to show definitively, beyond reasonable doubt, that he indeed used the trick (i.e., padded with the instrumental data).
Bold mine. Read more here: http://climateaudit.org/2014/08/29/mannomatic-smoothing-technical-details/
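For readers who want a concrete picture of the design choice Jean S describes, here is a minimal sketch in Python using scipy.signal.firwin. It is not a reconstruction of Mann’s or UC’s actual code; the 40-year cutoff, annual sampling, and synthetic series are illustrative assumptions.

```python
# Minimal sketch, not a reconstruction of Mann's code: contrast a textbook
# Hamming-window FIR low-pass (a few dozen taps) with one whose length
# equals the signal, as described in the quoted post. Cutoff, sampling,
# and data are illustrative assumptions.
import numpy as np
from scipy.signal import firwin

rng = np.random.default_rng(0)
n_years = 581                                  # illustrative series length (odd, for a symmetric filter)
x = np.cumsum(rng.standard_normal(n_years))    # stand-in "proxy" series

fs = 1.0            # one sample per year
cutoff = 1 / 40.0   # assumed low-pass cutoff: 40-year smoothing

# Conventional design: a few dozen coefficients, as the textbooks suggest.
b_short = firwin(41, cutoff, window="hamming", fs=fs)

# The design choice described in the post: as many coefficients as data points.
b_long = firwin(n_years, cutoff, window="hamming", fs=fs)

# Apply each symmetrically; for a symmetric FIR this is effectively zero-phase.
# With the long filter every output value depends on every data point and on
# the implicit zero padding beyond both ends of the record.
smooth_short = np.convolve(x, b_short, mode="same")
smooth_long = np.convolve(x, b_long, mode="same")

print(len(b_short), "vs", len(b_long), "coefficients")
```

The point of the contrast: a textbook Hamming design keeps the filter short relative to the record, whereas a filter as long as the record makes every smoothed value depend on the whole series and on whatever padding lies beyond its ends.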
I suppose a man who designates himself a Nobel Laureate in court proceedings, even after being chastised by the IPCC, only to have to retract those claims later, would have no moral qualms at all about making a special version of an established filter to suit his own special purposes.
![diverg33[1]](http://wattsupwiththat.files.wordpress.com/2014/08/diverg331.gif)
I wonder who a certain Bill H. is.
One of Mann’s cronies finally getting smart?
He has the trick of picking on something you’ve thought of for years as thoroughly debunked, and coming up with a whole bunch of studies that you’ve never heard of, and that were never mentioned at the time (although to be honest global warmers usually depend more on pictures of polar bears than on reasoned argument when their backs are against the wall, and it is just about possible that the proof was there all along and they never thought of mentioning it!), and that you certainly haven’t got time to research.
And yet he claims to be a sceptic.
It is hard to see exactly where you are coming from, Gus.
This IS science… The pursuit of knowledge, the pursuit of explanations…
I, for one, am very impressed.
Any old thing you do that is smoothy is a low-pass filter. It’s a little weird that so much thought went into it, either by Mann or his critics. There’s nothing magic in the classical filter formulas except that they may optimize some measure of smoothing versus time extent or something. That is, your resolution may be worse than it could be with a better filter, but smoothing is smoothing.
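As an aside illustrating that point (my sketch, not the commenter’s): even a plain moving average is a low-pass filter; a designed filter mainly buys a cleaner roll-off. The lengths and cutoff below are arbitrary.

```python
# Sketch only: compare the frequency response of a crude 41-point moving
# average with a 41-tap Hamming-window low-pass. Both attenuate high
# frequencies; the designed filter simply leaks less in the stopband.
import numpy as np
from scipy.signal import firwin, freqz

b_boxcar = np.ones(41) / 41                          # plain moving average
b_hamming = firwin(41, 0.05, window="hamming", fs=1.0)

w, h_box = freqz(b_boxcar, worN=2048, fs=1.0)
_, h_ham = freqz(b_hamming, worN=2048, fs=1.0)

# Worst-case leakage above 0.1 cycles/sample for each filter.
stop = w > 0.1
print("moving average:", round(float(np.abs(h_box[stop]).max()), 4))
print("designed low-pass:", round(float(np.abs(h_ham[stop]).max()), 4))
```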
The data is so awful that it hardly matters what is done here.
As for Mann, he’s going to be drummed out of climate science as a scapegoat. Climate science hopes to purify itself through the ritual.
That doesn’t make it a science, however.
It’s just a sociological move.
If you wanted to do an intelligent analysis of the temperature data with proxies and everything, you wouldn’t do this kind of filtering at all. The classical stuff would just be to show you sort of what the data looked like, but wouldn’t prove anything. For one thing, it’s not a stationary process, so Fourier-oriented things (low pass filters among them) are not appropriate and give you mostly artifacts.
And even with a stationary process, who knows how PCA interacts with low pass filtering. Surely that gives you artifacts too.
The right way, or one right way, is a sort of Kalman filtering, based on models of how things might change.
That would give the most probable model from among the family you offered it, and it adjusts automatically and very explicitly to different kinds of data (proxy, instrumental).
So whatever your result, you’ll be able to say what choices were offered, and it could be criticized and improved from that.
As it is, you just reject the entire mess and hope that somebody with some first-principles competence turns up to do it over.
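For what it’s worth, here is a minimal sketch of the kind of model-based filtering the comment gestures at: a scalar Kalman filter for a random-walk “local level” model. It is a toy illustration, not any published reconstruction method; the process and observation variances and the synthetic data are assumptions.

```python
# A minimal scalar Kalman filter for a random-walk "local level" model.
# Illustrative only: q, r, and the synthetic data are assumptions.
import numpy as np

def kalman_level(y, q=0.01, r=1.0):
    """Filter a 1-D series with a random-walk level model.
    y: observations; q: level (process) variance; r: observation variance."""
    n = len(y)
    level = np.zeros(n)
    var = np.zeros(n)
    m, p = y[0], r                 # simple initialisation at the first observation
    for t in range(n):
        p = p + q                  # predict: the level is a random walk
        k = p / (p + r)            # Kalman gain
        m = m + k * (y[t] - m)     # update the level estimate with observation t
        p = (1 - k) * p            # update its variance
        level[t], var[t] = m, p
    return level, var

rng = np.random.default_rng(1)
truth = np.cumsum(0.05 * rng.standard_normal(300))   # slowly drifting "level"
obs = truth + rng.standard_normal(300)               # noisy "proxy" observations
est, _ = kalman_level(obs)
print(np.round(est[:5], 2))
```

A smoother built this way makes its model family and noise assumptions explicit, which is the commenter’s point: the choices on offer can then be criticized and improved.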
They haven’t even passed him the black spot yet!
Bear in mind how iconic the guy is in the climate industry, and also the fact that they *don’t do scapegoats*. Everything has to be true. Every last little thing. Mount Kilimanjaro? Got to be climate change. Polar bears? Practically extinct. Pause? What pause?
There must be not the slightest little crack or the whole wall comes tumbling down…
It’s true – they can’t let go of anything!
It is just like the famous Indian monkey trap – a coconut full of goodies with a hole big enough for the monkey’s open hand to go in, but not big enough for the monkey to withdraw his closed hand. The monkey won’t drop his prize even at the cost of being caught.
Love the new and improved (apparently) ads. Seem related to viewer and topic.
The previous article about Solar Energy on the Courthouse got me a local BC Hydro ad (touting something to do with their smart energy program, etc., etc.).
This article is even better, with some pictures inviting me to click “10 something…”. But the best one (a reference to M. Mann’s friends, I guess) is titled “10 Sexiest Hollywood Nerds”. Very appropriate!
Mann
Mann took man for a fool.
Now fate has Mann.
History came.
Man now knows Mann.
Dr Mann, Sir,
Can you sense the vultures beginning to circle above your head yet?
I am saddened by it all in deepest truth but I’m afraid it is deserved in your case.
You will be able to repent at leisure Sir.
I await the next couple of years with great interest.
Jones
Mann will likely now submit a research paper naming his filter after himself, of course, and will manage to get a herd of co-authors to sign on to the paper as well. We have already witnessed enthusiastic journals prostituting themselves at the feet of climate scientists feeding at the watermelon-targeted gravy train. After all, the journals deserve some of the hard-earned money rapidly leaking out of our wallets too.
Lucky for him, WUWT posters and commenters have done the outlining for him. All he has to do is scrub the well-deserved “what an idiot” color commentary out of the text and he is good to go with his first draft.
“As most readers are aware…”
===========
Think of the new readers, and the children, who might not be aware.
Question: What do the results look like if Mann’s “Special Filter” is replaced with a filter of conventional length (“a few dozen coefficients at most”)?
Guessing the results obliterate Mann’s Hockey Stick…
All the non-statisticians: whatever happened to simply looking at and interpreting the data? Ya can’t do it for climate science.
The fact that such fabulous statistical manipulations are needed says that the signal, i.e. the reality, being investigated is obscure to the point of insignificance. I say “insignificance” because it has to be teased out of the background noise. We can’t detect it otherwise. Which means we aren’t being affected by it.
What we are saying by all this is that the changes CAGW relies on lie in hidden changes in the max and min values, the regional changes. Nowhere is there ANY definite signal of CAGW; it is all in the mathematical grinding of numbers. Nobody can possibly see it. It is only visible as a construct. This is why the “science” is all about models. The public doesn’t understand this; the public thinks it is about data, stuff they can see and feel, like a temperature gauge or an anemometer. It isn’t.
Every time someone says you can see climate change around him, he is telling an untruth. The changes are all within normal variability but on a scale that people don’t experience, i.e. we experience only a portion of the curve. Every other portion of the curve is different from what we experience because we don’t live long enough. And that involves regional changes, not global. And this is why homogenization and adjustments/corrections are so important: the signal is equal to or less than the modifications, or at least invisible without them.
There is far too much math going on for the understanding of people like Al Gore or David Suzuki or the general public. Actually, both Al and David don’t give a crap: their ideas of social engineering, personal legacy and making a thousand bucks are the dominant factors in what they say and promote as “the truth”. But there is too much math going on for the understanding of regular people, even those who claim to have a college education and an intelligence that allows individual thinking.
When you need a microscope to see it, it ain’t really there, leastways when it comes to weather AND climate.
“All the non-statisticians: whatever happened to simply looking at and interpreting the data? Ya can’t do it for climate science.”
I’m not so sure that they don’t do it properly at first and then, when it doesn’t give them the expected results, keep adjusting, manipulating and torturing the data until it does.
Not the real Climate Scientists, but the ones that champion CAGW.
A general rule in signal processing is that the number of taps on an FIR filter shouldn’t exceed about 1/3 of the number of points in the input signal.
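One way to see the motivation for that rule of thumb (my reading, not the commenter’s derivation): a causal length-L FIR filter has an L-1 sample start-up transient that depends on the padding rather than the data, so the longer the filter relative to the record, the less of the output is supported by real samples. The signal length and cutoff below are arbitrary.

```python
# Rough illustration: a constant input should produce a constant output of 1.0,
# but the start-up transient of a causal FIR filter grows with the tap count.
import numpy as np
from scipy.signal import firwin, lfilter

n = 300                                     # illustrative signal length
x = np.ones(n)                              # constant input; the settled output is 1.0
for taps in (n // 10, n // 3, n):           # short, at-the-rule, and signal-length filters
    b = firwin(taps, 0.1, window="hamming", fs=1.0)
    y = lfilter(b, [1.0], x)                # causal filtering with implicit zero padding
    transient = int(np.sum(~np.isclose(y, 1.0, atol=1e-3)))
    print(f"{taps:4d} taps: roughly {transient}/{n} output samples still carry the start-up transient")
```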
I wonder what the results would be if the method used on Diederik Stapel were applied to Michael Mann?
http://www.realclearscience.com/journal_club/2014/08/28/can_a_scientists_writing_reveal_fraud_108809.html
Does Mann have any “honest” papers to compare results?
May the climate science version of the scientific method now be known as “climate stapel”; it has a certain ring to it.
Now do you believe in Mann made global warming?
This is science?
Crazy engineering and statistical math aside, the takeaway here is that many people are having to spend significant resources attempting to validate or verify methods the author will not voluntarily divulge. That alone should have been sufficient to consign Mann’s work to the dustbin of time and cost him his position…
+1
The competent researcher would leave the data files unchanged and create a makefile to do whatever processing he thinks needs to be done. It’s organized, faster, and reproducible.
Modulo changes made by overactive graduate students to the compilers, leading to bit rot, but that plagues everybody.
On my very first ISA bus card, a utility card for an OEM, I used a Hamming filter to encode the feature licenses. OK, I’m no genius, but the filter certainly returns deterministic outcomes.
“… he used a filter length equal to the length of the signal to be filtered!”
I think this deserves emphasis. If the filter is as long as the signal, then the filter can determine the output entirely, as selected by the choice of filter parameters. So the fact that Mann did not publish the filter details makes the paper utterly meaningless.
“Filter length” can be deceptive. Suppose we have a length-1000 FIR filter. Suppose further it is (as is commonly found) symmetric and clustered about the center, so that the first 450 taps are very tiny relative to the center taps, as are the last 450. Effectively it is now more like length 100. And so on. Using Jean S’s Matlab code, this is found because the “b” output, the impulse response, is a sinc function (already center clustered) multiplied by a Hamming window that further tapers the ends to just 8% of the center.
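A quick way to check that “effective length” point (a sketch; the comment does not give the actual cutoff, so 1/40 cycle per sample is an assumed value): design a 1000-tap Hamming-window low-pass with scipy.signal.firwin and tally how much of the coefficient energy the central taps carry.

```python
# Sketch under an assumed cutoff: with a 1000-tap Hamming-window low-pass,
# almost all of the coefficient energy sits in a much shorter central span.
import numpy as np
from scipy.signal import firwin

b = firwin(1000, 1 / 40.0, window="hamming", fs=1.0)
energy = b ** 2
total = energy.sum()

for width in (100, 200, 400):
    lo = (len(b) - width) // 2
    share = energy[lo:lo + width].sum() / total
    print(f"central {width} taps hold {share:.1%} of the coefficient energy")
```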
You are correct – if he filters, he should have fully described the processing.
Can someone, in layman’s terms, summarize the meaning and consequences of this finding? Broadly, what does it say about Mann and his methods? Is there any reasonable justification for doing what was done?
This airplane mechanic wants to know.
I think it does not tell us much about Mann that is new. Same pattern. Just as Mann should have consulted someone about the faulty normalization of his signals in his PCA, he should have asked more about DSP. One can reasonably conclude that one is using a less-familiar mathematical procedure correctly only when it gives “correct” answers to STANDARD exercises. It would of course be tempting to fool yourself that you have “finally gotten the math right” when you spot what you were HOPING to see come out of new data! Nope. And withholding the procedures and data just increases suspicion. So same pattern – at minimum it looks insouciant – but no new “smoking gun” from this – in my opinion.