Climate Models -vs- Climate Reality: diverging or just a dip?

Here’s something really interesting: two comparisons between model ensembles and three well-known global climate metrics, plotted together. The interesting part is what happens in the near present. While the climate models and the climate measurements start out in sync in 1979, they don’t stay that way as we approach the present.

Here are the trends from 1979 to “Year” for HadCRUT, NOAA, and GISTemp, compared with the trend from 16 AOGCMs driven by volcanic forcings:

Figure 1: Trends since 1979, ending in ‘Year’.

A second graph, showing 20-year trends, makes the divergence more pronounced. Lucia Liljegren of The Blackboard produced both of these plots, and she writes:

Note: I show models with volcanic forcings partly out of laziness and partly because the period shown is affected by eruptions of both Pinatubo and El Chichon.

Here are the 20 year trends as a function of end year:

Figure 2: Twenty-year trends as a function of end year.

One thing stands out clearly in both graphs: in the past few years the global climate models and the measured global climate reality have been diverging.
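For readers who want to see what the two trend conventions in the figures actually compute, here is a minimal sketch. It is illustrative only: the anomaly series is synthetic, and Lucia’s exact data handling (monthly resolution, any serial-correlation corrections, and so on) is not reproduced here.

```python
# Minimal sketch of the two trend series (illustrative; not Lucia's code):
# OLS slopes over (a) windows anchored at 1979 and (b) trailing 20-year
# windows, each as a function of end year.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2010)
# Synthetic annual anomalies standing in for HadCRUT/NOAA/GISTemp.
anoms = 0.015 * (years - 1979) + rng.normal(0.0, 0.1, years.size)

def slope_per_decade(x, y):
    return 10.0 * np.polyfit(x, y, 1)[0]  # least-squares trend, C/decade

# (a) Figure 1 style: trend from 1979 to each end year.
anchored = {end: slope_per_decade(years[years <= end], anoms[years <= end])
            for end in range(1999, 2010)}

# (b) Figure 2 style: trailing 20-year trend ending at each end year.
trailing = {end: slope_per_decade(years[(years > end - 20) & (years <= end)],
                                  anoms[(years > end - 20) & (years <= end)])
            for end in range(1999, 2010)}
```

The anchored trend (a) weights the whole record back to 1979, so recent years move it slowly; the trailing window (b) responds much faster, which is why the divergence is more pronounced in Figure 2.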

Lucia goes on to say:

I want to compare how the observed trends fit into the ±95% range of “all trends for all weather in all models”. For now I’ll stick with the volcano models. I’ll do that tomorrow. With any luck, HadCRUT will report, and I can show it with March data. NOAA reported today.

Coming to the rescue was Blackboard commenter Chad, who made his own plot showing ±95% confidence intervals using the model ensemble and HadCRUT. His results show a divergence very similar to Lucia’s plots, starting around 2006.

http://scientificprospective.files.wordpress.com/2009/04/hadcrut_models_01.png
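As an aside, a ±95% band like Chad’s is typically the spread of trends across the individual model runs. Here is a hedged sketch with synthetic numbers; Chad’s exact construction (for instance, pooling “all trends for all weather in all models”) may differ.

```python
# Sketch of a +/-95% trend band across an ensemble of model runs
# (synthetic numbers; Chad's actual construction may differ).
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 2010)
n_runs = 16  # e.g. one run per AOGCM
# Each run shares a common forced trend but carries its own "weather" noise.
runs = (0.02 * (years - 1979)[None, :]
        + rng.normal(0.0, 0.12, (n_runs, years.size)))

trends = np.array([10.0 * np.polyfit(years, run, 1)[0] for run in runs])
lo, hi = np.percentile(trends, [2.5, 97.5])  # central 95% of model trends
# "Divergence" here means the observed trend falling outside [lo, hi].
```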

So the question becomes: is this the beginning of a new trend, or just short-term climatic noise? Only time will tell for certain. In the meantime, it is interesting to watch.

Graeme Rodaughan
April 15, 2009 11:28 pm

Who would like to bet the following outcomes on the models being correct?
[1] Higher taxes.
[2] Increased energy costs.
[3] Greater Government control.
[4] Biofuel-induced food price increases.
[5] Intermittent electricity supplies.
[6] Increased export of manufacturing to other countries (e.g. China).
[7] Reduced jobs as business costs increase.
Of course, if the models have no basis in reality and AGW Catastrophism is an artefact of modelling – then the above outcomes can be avoided by pretty much doing nothing.

EricH
April 15, 2009 11:29 pm

Very interesting, and very easy to understand. Maybe dear Al and all the other AGW fanatics would like to comment.
Enjoy

Philip_B
April 15, 2009 11:37 pm

The climate models are just an exercise in trend fitting. They have no predictive power over and above the trend.
It’s clear that the warming trend of the late 20th century is over. And because CO2 levels (or at least CO2 emissions) continue to rise, it’s also clear CO2 wasn’t the cause of the late 20th century warming.
Excellent work Lucia.

Leon Brozyna
April 15, 2009 11:47 pm

Easily fixed — adjust the models to reflect the reality…
… which reality — HadCRUT/GISTemp/NOAA or UAH/RSS? …
Then, tout how the models have been improved and predicted the current climate all along and then go on about how the models forecast even greater warming in the future.
Isn’t this how science now is done? Cook the books till it tastes right?

Richard deSousa
April 15, 2009 11:50 pm

I think the models are demonstrational… 😉 and not based on facts. They’re based on the fantasy that more CO2 = steadily rising temperatures. In the real world, rising CO2 doesn’t = rising temperature… but just the opposite, as temperatures have been declining since 2004 while CO2 levels have increased.

MarkoL
April 15, 2009 11:59 pm

Have you seen this latest bunch of scaremongering that has spread across the web?
“Coral Fossils Suggest That Sea Level Can Rise Rapidly”… and yet again a claim that is stated as fact and reported as fact by most newspapers in the world. It makes me mad that there is NO room for discussion and NO room for climate-cooling news, yet one study gets over-the-top media attention. http://news.nationalgeographic.com/news/2009/04/090415-sea-levels-catastrophic.html
Scaring people sells.

Kohl Piersen
April 16, 2009 12:17 am

Philip B says –
It’s clear that the warming trend of the late 20th century is over
Well… OK, if all you mean is that it’s now the 21st century and therefore any warming in the 20th century is over (!)
I’m sure that’s not what you mean, but I don’t think that such observations are worth much. Yes, they state the fact. But that doesn’t mean a great deal all by itself.
I think that it is just not possible to draw general conclusions on the basis of just 9 years.
To get anything meaningful you need to have a look at records which are much more extensive. Then the maxima and minima of trends, and their sign, can be discerned accurately.
Of course, the same thing has to be pointed out to AGW proponents who point to a few years around 1998, or rabbit on about the Arctic ice losses in a particular year compared with the previous couple of years. For me that is all interesting but the excitement it seems to generate is rather vacuous.
Having said all of that, nevertheless it is true that the “global average” has not risen in the current decade and that is interesting when compared with the models.

Mike Bryant
April 16, 2009 1:10 am

I have a feeling that the Catlin sea ice thickness data will mesh perfectly with the climate models. Hansen’s adjustments have not been sufficient to mesh with model reality. NOAA, GISS, HadCRUT, UAH and RSS need to get together BEFORE releasing the monthly anomalies or we will never have a consensus.
We MUST catlin the temperature data.

Mike Bryant
April 16, 2009 1:18 am

It’s interesting that the divergence that began in 2006 fits perfectly with the flattening of the sea level graph at CU. I wonder what caused the March spike in Mauna Loa CO2. Since recent studies show simultaneous rises in CO2 in both the NH and SH, perhaps it is tied to recent volcanic activity above and below sea level.

DJ
April 16, 2009 1:20 am

Those forecasts are darn impressive – an error of just 25%.
Meanwhile, only two months to go before we get to verify this forecast:
http://anhonestclimatedebate.wordpress.com/2009/01/18/prediction-of-the-may-2009-uah-msu-global-temperature-result/

John Edmondson
April 16, 2009 1:29 am

I e-mailed the Met Office some time ago about their model. There were a couple of interesting points. 1. Their model could not be run backwards in time. 2. I got no sensible answer as to what would happen if they ran it from 1600 AD (effectively the same as point 1, but asked differently).
This is the e-mail:
FWD: Re: FWD: RE: re[2]: FWD: Climate prediction model‏
From: Dan ********** (enquiries@met************)
Sent: 06 January 2009 10:37:40
To: John Edmondson (john_edmondson@**************)
1 attachment
tett_etal…pdf (4.5 MB)
Dear John
Thank you for your email.
The attached (rather large) pdf is a scientific paper which you
might like to read to answer your follow-up questions.
I hope this helps.
Kind regards,
Dan
Weather Desk Advisor
Met Office, FitzRoy Road, Exeter, Devon, EX1 3PB, United Kingdom.
Tel: 0870 900 0100 Fax: 0870 900 5050 Email: enquiries@meto************** http://www.metoffice.gov.uk
Met Office climate change predictions can now be viewed on Google Earth
http://www.metoffice.gov.uk/research/hadleycentre/google/
>
>
> ———————– Forwarded Message ———————–
>
> From: John Edmondson
> To: “”
> Date: Mon, 5 Jan 2009 19:15:00 +0000
> Subject: RE: re[2]: FWD: Climate prediction model
>
> Hi Suzanne,
> Very interesting. So are you saying you could run your model from a starting point of 1600 AD, plug in your assumptions for what might have an influence on temperature, and see what happens? What exactly are your assumptions? Not in any detail, just as probable variables.
>
> Thanks,
>
> John Edmondson
>
>
>
>
>
> > Date: Mon, 5 Jan 2009 11:35:35 +0000
> > From: enquiries@metoff************
> > Subject: re[2]: FWD: Climate prediction model
> > To: john_edmondson@***************
> >
> > Dear John,
> >
> > We do not run climate models with time going backwards in a literal sense, as climate models are complex numerical tools that are definitely not designed for this purpose (an analogy being to try to operate a car by injecting exhaust gases through the tail pipe and expecting it to drive itself backwards and for petrol to accumulate in the tank – clearly nonsensical).
> >
> > Running a model “prediction” with time going forwards but climate forcing (such as greenhouse gases and aerosols) changing in a way that reverses what has happened in the past is possible, but we have no particular reason to do this since it does not help much in answering relevant scientific or modelling questions.
> >
> > We do, however, run climate models regularly (in a forward sense) as a means of evaluating their ability to reproduce past climate changes (of the last 150 years, or of more ancient periods when the Earth’s orbit around the Sun presented a significantly different pattern of solar insolation through the year at given latitudes).
> >
> > I hope this helps.
> >
> > Regards,
> >
> > Suzanne.
> > (On behalf of an expert)
> >
> > Weather Desk Advisor
> > Met Office, FitzRoy Road, Exeter, Devon, EX1 3PB, United Kingdom.
> > Tel: 0870 900 0100 Fax: 0870 900 5050 Email: enquiries@met************* http://www.metoffice.gov.uk
> >
> > Met Office climate change predictions can now be viewed on Google Earth
> > http://www.metoffice.gov.uk/research/hadleycentre/google/
> >
> >
> ————————————————————
> > >
> > > From: John Edmondson
> > > To: “”
> > > Date: Sun, 4 Jan 2009 13:31:00 +0000
> > > Subject: Climate prediction model
> > >
> > > I have a question about the model used to predict future possible climate.
> > >
> > > What happens if it is run backwards in time?
> > >
> > > Thanks
> > >
> > > John Edmondson
> > >
> > >
> > >
> > >
> > >
> ______________________________________________________________________
> > >
>
>
>

Graeme Rodaughan
April 16, 2009 1:39 am

MarkoL (23:59:29) :
Have you seen this latest bunch of scaremongering that has spread across the web?
“Coral Fossils Suggest That Sea Level Can Rise Rapidly”… and yet again a claim that is stated as fact and reported as fact by most newspapers in the world. It makes me mad that there is NO room for discussion and NO room for climate-cooling news, yet one study gets over-the-top media attention. http://news.nationalgeographic.com/news/2009/04/090415-sea-levels-catastrophic.html
Scaring people sells.

MarkoL – All too true. We must remember that most MSM is publicly owned and in the business of returning value to the shareholders. They generate revenue by selling advertising space. The value of that space is driven by ratings, and ratings by the ability of the MSM to capture and hold people’s attention. Scare stories work.
However – people get bored….
Boredom with AGW may be a factor in its eventual undoing. Any scary narrative must eventually wheel out the true horror – you can’t keep people in suspense forever. The monster must be revealed. The inability of AGW Catastrophism to actually deliver on its promise of catastrophe will certainly hinder its ability to keep the MSM onside. As the public gets bored they will want to move on, and the MSM must follow the public or die for lack of revenue.

Lance
April 16, 2009 2:14 am

“Have you seen this last bunch of scaremongering that has spread in the web?”
Unfortunately yes, and also with National Geographic.
NG has been a staple of reading for my whole life, a big influence fostered by my granddad.
The most informative, scientific and exciting digest ever spawned, most of it with world-class professional photographs, pictures that build an adventure and sometimes surpass the imagination.
Sadly, I never thought in a lifetime that one day (one year ago) I’d be forced to send an E(hate)-mail to the office of National Geographic denouncing the unscientific/fraudulent/biased approach their publications have taken in the last few years.
I expected more from this now so-called natural scientific journal; it seems a contrary view is not to be had anywhere these days.
AGW “science” consists of NO facts, NO evidence, and NO out-of-control warming; as of the last 10 years, this last year has been as cool and ice-building as 1979, the year they started measuring from satellite and stopped caring about real temperatures and the temperature stations on Earth. We now give ourselves over to Mr. JH’s scientific video game, based on 30-year-old science, when some cycles driven by natural cycles of thermal energy from our sun can last hundreds or thousands of years.
And yet we ignore that the sun has almost 100% of the effect on our temperatures and CLIMATE, and is why life exists on this rock in the first place.
RIP National Geographic, I’ll miss you.

David Watt
April 16, 2009 2:25 am

Anthony,
There is something wrong with the graphs or text.
The text and the graph headings talk of 1979 to present, but the graphs show 1999 to present.
It looks like it should all be 1999 to present.

Allan M R MacRae
April 16, 2009 3:09 am

Repost from Allan M R MacRae (02:26:43) :
Please see http://www.iberica2000.org/Es/Articulo.asp?Id=3774
The plot of Surface Temperature ST (HadCRUT3 global anomaly) versus Lower Troposphere Temperature LT (UAH global anomaly) shows a gradually increasing deviation of ST of ~0.2C above LT since 1979.
Probably, all of this 0.2C can be ascribed to ST measurement warming bias.
Absent the ST warming bias, there is no significant global warming since 1940.
I think both the GISS and Hadley STs are misleading and exhibit significant warming biases that render them practically useless as a basis to infer actual global warming.
The satellite-based LT temperature, while not perfect, is a far superior database for such a purpose, in my opinion.
**************************
OK – that was just an excuse to post something totally OT.
Seriously, you all should look and listen to this:

Moderator – we often get so upset about this global warming fiasco – I’m suggesting we all take a 7 minute time-out, to see something that is rather inspirational. Over 11 million views so far.
Best to all, Allan

MattN
April 16, 2009 3:13 am

How many years must the models and reality diverge before it is accepted that the models are wrong?

old construction worker
April 16, 2009 3:44 am

Methinks the “CO2 drives the climate” theory is a little out of whack. Somebody needs to go back to the drawing board.
As a taxpayer, I don’t mind funding climate research, but don’t give me garbage.

April 16, 2009 4:14 am

Who would like to bet the following outcomes on the models being correct?
[1] Higher taxes.
[2] Increased energy costs.
[3] Greater Government control.
[4] Biofuel-induced food price increases.
[5] Intermittent electricity supplies.
[6] Increased export of manufacturing to other countries (e.g. China).
[7] Reduced jobs as business costs increase.
——
Ah, but did you notice that all of these effects ARE the desired results of the AGW scheme (er, scam) ????

Richard111
April 16, 2009 4:21 am

MarkoL (23:59:29) :
“Scaring people sells.”
Absolutely. Just wait until they become aware of the coming reduced growing seasons.

schnurrp
April 16, 2009 4:25 am

It will be interesting to see how the AGW believers explain this result. And of course they will, with the use of their own graphs, etc. One of the most frustrating aspects of this debate, for the non-scientists who want to understand what is happening, is the lack of graphical data accepted by both sides.
HadCRUT is an accepted metric, I believe, but it appears that the trace of the ensemble of GCMs can change depending on which models are included. Are there only 16 GCMs, or are the believers going to include some in their “ensemble” that will expand the confidence intervals to include the current HadCRUT results? Have the skeptics left out some GCMs that should be in the ensemble? Is “Model Mean (No Essence NV)” an ensemble accepted by both sides?
What other climate measurements are out there that are accepted by both sides?

Darren Krock
April 16, 2009 4:29 am

OT, but I thought you might like it.
From Glenn Beck, today’s post. “I learned something from a lawyer friend of mine who won lots of cases in front of judges and lawyers — I asked him how he won so many cases. He said it’s easy: If the law supports my client’s position I argue the law. If not, I argue the facts. If the facts don’t support my client’s position, I just attack the opposition”
Sounds like the AGW crowd doesn’t it?

April 16, 2009 4:33 am

Until the climate models include the impact of ocean SST and ocean currents such as ENSO, PDO, AMO, SOI, NAO, MOC, THC, SAS, etc., their credibility will continue to be zero. It points out the problem we have today. If you cannot get the next season, the next year or the next decade right, you will most likely not get the next 100 years, the next 200 years or even the next 1000 years right. It seems to me that anyone can make forecasts 100 years ahead. No one will be around to hold you accountable or point out your errors. [Maybe that is why the IPCC uses this long-term outlook primarily.] You have to demonstrate near-term credibility before anyone will take you seriously. My hat is off to the meteorologists who have to be credible every week, every month and every year, and make a living at it.

anna v
April 16, 2009 5:19 am

I will once again point out that the errors given for model projections are not real errors. They are just estimates of the stability of the input conditions when testing their fit for chaotic inputs. They are not varying their numerous parameters within +/- 1 sigma. If they did that, the error bars would be close to 1C.
As I have often pointed out, when one changes the albedo by 1 sigma (assuming a 1% error) in the toy model http://junkscience.com/Greenhouse/Earth_temp.html, more than 0.3C is changed in the temperature projection. Let alone the effect of the systematic change in albedo seen in http://www.leif.org/research/albedo.png during the years plotted above.
Models are meaningless fitting games.
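anna v’s albedo figure is easy to sanity-check with a zero-dimensional radiative balance. This is a hedged illustration, not the junkscience.com toy model itself, and the solar-constant value used is an assumption:

```python
# Zero-dimensional energy-balance check of the albedo sensitivity
# (an illustration; not the junkscience.com toy model itself).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0      # solar constant, W m^-2 (assumed value)

def t_eff(albedo):
    # Equilibrium radiating temperature from S0 * (1 - albedo) / 4 = SIGMA * T^4
    return (S0 * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

base = t_eff(0.30)
bumped = t_eff(0.30 * 1.01)  # albedo perturbed by 1% (relative)
print(f"{base:.2f} K -> {bumped:.2f} K, shift {bumped - base:+.2f} K")
# Prints a shift of roughly -0.27 K, the order of magnitude anna v cites.
```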

Dan Lee
April 16, 2009 5:21 am

It’s human nature to believe that the conditions we’re experiencing now are how things will always be. It’s what drives economic bubbles. When housing prices were flying upward, millions bet their financial futures that they would continue to do so. When gas prices were skyrocketing we were treated to endless analyses of why we would be paying $5/gal from now on and never see oil below $70-$80 a barrel again. When we get into a years-long bull market, only dour curmudgeons complain about banks or insurance companies taking unwarranted risks.
All these things were self-evident at one time or another, and ANY theory that explained them made sense to people. Theories are “self-evident” and “make sense” when they both fit one’s worldview and offer justification for that worldview by explaining current conditions.
It’s been warming since the ’70s, so most of the world’s population has experienced a warming trend throughout their entire lives. Since trends don’t rise in a straight line, the current leveling of temperatures looks like other plateaus we’ve experienced during that time. When a trend peaks and begins a new trend downward, that is often indistinguishable from a normal plateau until years afterward, after the new trend has had time to establish itself.
Graphs such as these, showing divergence that is well outside the confidence intervals, clearly indicate that the trend models are wrong. Anyone who follows this forum knows that already (or should at least be experiencing a nagging feeling of doubt).
Any curmudgeon who calls the end of a trend is going to be correct sooner or later. It is those who can call it AND show dispassionate, non-worldview-reinforcing evidence for it who will eventually be held in regard by the scientific community. Only time and nature itself will show who is calling the trend correctly and who is simply offering explanations that make sense because they match prevalent beliefs about industry or the environment or the economy.

Jack Green
April 16, 2009 5:27 am

The models are all 2D simulators. We need 3D to get a good history match over the past several million years. Only then can you project the model into the future. With so many variables we have an indeterminate solution. However, the AGW crowd has focused on CO2 emitted by man. That’s like saying: while chewing gum I bit my lip and got lip cancer, therefore gum causes cancer.

Editor
April 16, 2009 5:28 am

Thank goodness the models only provide projections. Had they provided real predictions, the authors would have to go back and change the hypotheses behind the models.
I like Richard deSousa’s “I think the models are demonstrational.” Fits well with the status “operational” for weather prediction software once it’s put into service.

Frank K.
April 16, 2009 5:29 am

matt v. (04:33:19) :
“…their credibilty will continue to be zero.”
In my opinion, the credibility of many climate models will be close to zero because the code developers (in particular the NASA GISS Model E) do a horrible job documenting and validating their codes (there are some exceptions). In the case of Model E, we don’t even know for sure what equations it’s attempting to solve – which in my mind makes its value to science zero! Yet, year after year, GISS scientists publish papers about tipping points and AGW scare scenarios using Model E. I’ve been harping on Model E for some time now, but it appears that the people at GISS are simply NOT interested in documenting their code…
Another aspect of these temperature anomaly comparisons that I find striking is that we have scientists running spatially 3-D, time-dependent calculations on large supercomputers to get a zero-dimensional (“global average”), time-averaged (over a calendar year) result, which is then smoothed further to get running averages! And if it’s within 50% of reality, they claim their models are “accurate”! It has always amused me that people would claim that their codes “get the physics right” and have wonderful “predictive skill” based on a set of 3-D spatially averaged values, averaged in time and smoothed. And they need to ** tune ** their models to even get that set of numbers to show any reasonable agreement with reality…

Jon Jewett
April 16, 2009 5:33 am

“Lance (02:14:41) :
Sadly, I never thought in a lifetime that one day (one year ago) I’d be forced to send an E(hate)-mail to the office of National Geographic denouncing the unscientific/fraudulent/biased approach their publications have taken in the last few years.
RIP National Geographic, I’ll miss you.”
Lance,
Welcome to the club, not only with the NG but with dozens of publications of all sorts. That happened to me with National Public Radio back in 1994 and I started to question most everything I was told by most everyone.
The next question is:
Now that we know this is a lie and we know why they are lying, are they just stupid or are they “evil”?
The second question is:
What other lies have they told us?
Regards,
Steamboat Jack

Editor
April 16, 2009 5:38 am

David Watt (02:25:27) :

There is something wrong with the graphs or text.
The text and the graph headings talk of 1979 to present but the graphs show 1999 to present.

The graphs are not easy to read; at least they’re not intuitive.
The first graph’s legend is wrong; the caption is right. What it’s showing is the trend from 1979 to the time along the X-axis. I.e., the 1999 point shows the trend over the years 1979-1999; at 2005, it shows the trend from 1979-2005.
It might have been nice to include shorter time frames, but having the 20-year minimum is a good lead-in to the second graph. Note it starts 2 years earlier. (Augh!) At 1999, the data points are the trend over the previous 20 years and match the 1999 points on the first graph. Subsequent points look at the previous 20 years, so the 2005 points reflect 1985-2005, which emphasizes recent years more than the first graph.

Mike Bryant
April 16, 2009 5:43 am

Lucia and Anthony,
I really appreciate the way these graphs have been flattened, or normalized. It seems that most temperature graphs have been squeezed left to right and stretched up and down to exaggerate spikes. Maybe since we have been seeing the scare graphs for so long, it would be fair play to flatten them even further. If you did that, I think it would be even closer to what people really feel.

Douglas DC
April 16, 2009 6:21 am

I say cooling further. Soon, too…

PeteB
April 16, 2009 6:23 am

Can anybody reconcile this? It is quite heavily cited in the scientific literature – I would have expected any major flaw to have been raised in the literature.
http://www.pik-potsdam.de/~stefan/Publications/Nature/rahmstorf_etal_science_2007.pdf
We present recent observed climate trends for carbon dioxide concentration, global mean air temperature, and global sea level, and we compare these trends to previous model projections as summarized in the 2001 assessment report of the Intergovernmental Panel on Climate Change (IPCC). The IPCC scenarios and projections start in the year 1990, which is also the base year of the Kyoto protocol, in which almost all industrialized nations accepted a binding commitment to reduce their greenhouse gas emissions. The data available for the period since 1990 raise concerns that the climate system, in particular sea level, may be responding more quickly to climate change than our current generation of models indicates.
Given the relatively short 16-year time period considered, it will be difficult to establish the reasons for this relatively rapid warming, although there are only a few likely possibilities. The first candidate reason is intrinsic variability within the climate system. A second candidate is climate forcings other than CO2: Although the concentration of other greenhouse gases has risen more slowly than assumed in the IPCC scenarios, an aerosol cooling smaller than expected is a possible cause of the extra warming. A third candidate is an underestimation of the climate sensitivity to CO2 (i.e., model error). The dashed scenarios shown are for a medium climate sensitivity of 3°C for a doubling of CO2 concentration, whereas the gray band surrounding the scenarios shows the effect of uncertainty in climate sensitivity spanning a range from 1.7° to 4.2°C.

hareynolds
April 16, 2009 6:34 am

Lance (02:14:41) said:
RIP National geographic, I’ll miss you.
Personally, I gave up on them a few decades ago when they, along with everybody else in the coated-stock four-color full-page-bleed world (antique publishing talk), threw out DDT with the bathwater.
In the interest of preserving a select group of high-food-chain avian predators (the Peregrine Falcon was the cause celebre, IIRC), we ENTIRELY BANNED DDT worldwide. Not restricted its use in some way (e.g. within 100 meters of homes and schools, or even inside only), but an outright, total, worldwide ban.
“Even” the pesticide companies got on the bandwagon, but nobody at the time seemed to notice that DDT had years, nay decades, before gone off patent, and could be made simply and locally pretty much everywhere (a combination that is REALLY bad for what the economists call “excess rents”).
Meanwhile mosquito-borne diseases skyrocketed in Africa and elsewhere equatorial, but the white folks (that means me) got to feel good about “saving the Peregrine”.
I consider the WW ban on DDT to be the first “Mob Action” on the environment, that is, an uncontrolled, unconsidered, irrational public outcry which leads to over-reaction and, not incidentally, corporate profit-taking (in this case, on patent pesticides which are more expensive and don’t work as well).
We are on the cusp of another episode of Mob Action, but this time with BP, XOM and RDS making plans to be in the vanguard of carbon sequestration. As I’ve mentioned before, pumping carbon dioxide into a petroleum reservoir (a so-called “CO2 flood”) is a tertiary production technique already well established to increase oil production from old fields. It would be irrational for the oil companies NOT to jump on a bandwagon which promises to PAY them to use a technique that increases their profits.

April 16, 2009 6:38 am

Douglas DC (06:21:40) :
I say cooling further.Soon too…
Could you be more specific? What do you mean by “soon”?

wws
April 16, 2009 6:48 am

Question: “How many years must the models and reality diverge before it is accepted that the models are wrong?”
Easy answer – until just after cap’n’trade becomes enshrined in law, and then we can all be told it will be too “disruptive” to change things back, especially when we will need years and years and years and years of study to find out what to do next.
It’s not about the science; anyone who thinks that is missing the boat. The science is inconsequential – it’s all about the money. Whether it’s the governments, or the publications, or the grant getters – it’s all about the money.

April 16, 2009 7:03 am

Allan M R MacRae (03:09:55) :


Moderator – we often get so upset about this global warming fiasco – I’m suggesting we all take a 7 minute time-out, to see something that is rather inspirational. Over 11 million views so far.

Thank you for that… it actually brought tears to my eyes. I really needed a smile this morning. 🙂

John Galt
April 16, 2009 7:19 am

The climate models don’t show how the climate actually works, but how it must work if the runaway greenhouse effect exists.
They started with a hypothesis and built a model to demonstrate it. That’s all fine and can lead to better understanding of the actual climate. But the models must not be confused with the real world.
Most of the gloom-and-doom projections are based not on observation but on models and statistical extrapolation. But the doomsayers seem to always forget to have a look outside to see if the models match the real world.
Each year, the models diverge from the observations. Eventually, the models get tweaked and updated. Most people don’t seem to notice that the models were reset. This is not how a scientific study is supposed to be run.

Ray
April 16, 2009 7:20 am

I’m sure if they ran their models based on the effect of temperature on the concentration of CO2, this time they would get it right.

BarryW
April 16, 2009 7:26 am

If you extend the time period using HadCRUT, you get a graph of the 20-yr trends that has a sinusoidal shape with a 60-yr wavelength, with one of the peaks at the end of the twentieth century. Very obvious in the 30-yr trends.
Here’s a plot: http://farm4.static.flickr.com/3623/3446920199_52496549cb_b.jpg
If the models don’t show this sort of large structure then they are going to diverge from the observations.
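BarryW’s observation is easy to reproduce with synthetic numbers: feed a 60-year oscillation through a trailing 20-year trend and the oscillation reappears in the trend series. A hedged sketch (illustrative values, not HadCRUT):

```python
# Illustration of BarryW's point with synthetic data (not HadCRUT): a
# 60-year oscillation in the underlying series shows up as a ~60-year
# oscillation in the trailing 20-year trend.
import numpy as np

years = np.arange(1880, 2010)
series = (0.005 * (years - 1880)                              # slow warming
          + 0.1 * np.sin(2 * np.pi * (years - 1880) / 60.0))  # 60-yr cycle

def window_trend(end, width=20):
    m = (years > end - width) & (years <= end)
    return 10.0 * np.polyfit(years[m], series[m], 1)[0]  # C per decade

trend20 = {end: window_trend(end) for end in range(1900, 2010)}
# Plotted against end year, trend20 oscillates with the same 60-year
# period, riding on the 0.05 C/decade background trend.
```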

MattB
April 16, 2009 7:36 am

hareynolds and others… there has never been an outright, total, worldwide ban on DDT. If you can’t even get something as basic as that right, is it any wonder you don’t get AGW?
Have you ever thought of doing some independent research rather than just picking up lies from blogs that suit your views?

April 16, 2009 7:36 am

Graeme Rodaughan (23:28:42) :
I think there is an error in your line:
“[6] Increased export of manufacturing to other countries (e.g. China).”
It should read like this:
[6] Increased IMPORT of manufactured goods FROM other countries (e.g. China).
Reasons:
1. Decreased production due to your points 1 to 5.
2. Decreased production of crops due to increased cold in winter time.
3. Decreased production of crops and cattle due to great droughts in wide regions of the USA.

MattB
April 16, 2009 7:41 am

How come nearly every post has ignored the blog’s conclusion:
“is this the beginning of a new trend, or just short term climatic noise? Only time will tell us for certain.”
If this was a statistically relevant diversion, do you honestly think it would not be stated pretty clearly, rather than pondering if it means anything?

MattB
April 16, 2009 7:42 am

Oh yeah – in my post about DDT… I was not referring to WUWT when I said “picking up lies from blogs”… sorry if it reads as a dig at this blog.

neill
April 16, 2009 7:43 am

awe-inspiring video! Kind of feel like we’re that lady on stage BEFORE the song.
OT, FYI:
SCIENCE MEETING
California’s leading experts on the potential effects of climate change on the state will gather at Scripps Institution of Oceanography, UC San Diego next week to discuss recently released findings and gather feedback from the public.
The Science Meeting will take place:
APRIL 20, 2009
9:30 a.m. to 2:30 p.m.
Robert Paine Scripps Forum for Science
8610 Kennel Way (formerly Discovery Way), La Jolla, California
There will be a no-host lunch at noon.
The state Climate Action Team (CAT) draft assessment report, released on April 1, uses updated, comprehensive scientific research to outline environmental and societal climate impacts. Members of the science team who conducted the assessment will deliver presentations, and the public will have the opportunity to ask questions or give comments on the presentations. The feedback will be used to guide future CAT Research Group actions.
Members of the public wishing to attend the meeting are asked to RSVP to ensure that adequate parking, transportation and accommodations are available. (See link below.)
Parking and campus loop shuttle service is available at Birch Aquarium at Scripps, 2300 Expedition Way, La Jolla. The campus shuttle from the aquarium to the Scripps Oceanography Director’s Office runs every 15 minutes. Shuttle space is limited. Limited street parking on La Jolla Shores Drive is also available.
Hope some from SoCal can make it as well.

April 16, 2009 7:53 am

Re: Frank K. (05:29:12) :
“In my opinion, the credibility of many climate models will be close to zero because the code developers (in particular the NASA GISS Model E) do a horrible job documenting and validating their codes (there are some exceptions). ”
This is a sore point with me as well, since I have some background in programming.
On RealClimate.org, Gavin Schmidt argues against industry standard practices of source code management, configuration management, and disclosure of code and data. Here’s a salient quote from Schmidt in a response to comment 89 in the post On Replication:

“My working directories are always a mess – full of dead ends, things that turned out to be irrelevent or that never made it into the paper, or are part of further ongoing projects. Some elements (such a one line unix processing) aren’t written down anywhere. Extracting exactly the part that corresponds to a single paper and documenting it so that it is clear what your conventions are (often unstated) is non-trivial. – gavin]”

If this isn’t a reason to use source code control, documentation and configuration management, I don’t know what is.

SteveSadlov
April 16, 2009 7:56 am

Some things are just very obvious. The overall set of leading indicators is that we are in a cold period. The only questions are how far down, and for how long?

Tim Clark
April 16, 2009 8:03 am

Anthony:
I know this is OT from this thread and is political, but there isn’t really a good thread for it and it may interest your readers what is currently happening in Washington. If the link works, it opens a document mostly based on cap and trade legislative options.
NACD Submits Climate Document
The U.S. House Agriculture Committee recently distributed a questionnaire on climate legislation and related policy issues to more than 400 groups, including NACD. The questionnaire explored policy options for potential climate legislation. The NACD Legislative Committee completed the Association’s responses based on policy set by the NACD Board of Directors, and submitted the document to the Committee on Friday, April 10, 2009. To view a copy of the document, please click:
http://nacdnet.org/policy/naturalresources/energy/climate_legislation_questionnaire.pdf
The next steps of the House Agriculture Committee are not clear: the House Energy and Commerce Committee, which holds primary jurisdiction over climate policy, has released draft legislation and is expected to review and pass legislation by Memorial Day.

Chilling stuff!

CodeTech
April 16, 2009 8:14 am

Regarding DDT:

And the 1972 ban in the United States led to an effective worldwide ban, as countries dependent on U.S.-funded aid agencies curtailed their DDT use to comply with those agencies’ demands.

That is the reality. Maybe not an outright legal worldwide ban, but since strongarm tactics were used to stop its use, what is the difference?
http://www.enterstageright.com/archive/articles/0904/0904ddt.htm

Mark T
April 16, 2009 8:16 am

MattB (07:41:21) :
If this was a statistically relevant diversion, do you honestly think it would not be stated pretty clearly, rather than pondering if it means anything?

You ponder if it means anything because you don’t know what conclusion to draw without more information, but still find it interesting, correct? What exactly is wrong with that? The simple fact that it is happening calls into question the predictive power of the models in the first place, so it is clearly worth mentioning on that basis alone.
Mark

George Bruce
April 16, 2009 8:17 am

“The next question is:
Now that we know this is a lie and we know why they are lying, are they just stupid or are they “evil”?”
“The second question is:
What other lies have they told us?”
Regards,
Steamboat Jack
Jack: The complete answer to both questions is “yes.”

edward
April 16, 2009 8:25 am

Here’s a wise editorial from Michael Barone on climate, statistics and changing our economy around a carbon tax. Link at:
http://www.realclearpolitics.com/articles/2009/04/16/on_climate_and_health_beware_of_easy_formulas__96013.html
Thanks
Edward

Dave Middleton
April 16, 2009 8:27 am

Off topic…Replying to…
MattB (07:36:36) :
hareynolds and others… there has never been an outright, total, worldwide ban on DDT. If you can’t even get something as basic as that right, is it any wonder you don’t get AGW?
Have you ever thought of doing some independent research rather than just picking up lies from blogs that suit your views?

It is true that the Stockholm Convention made exceptions for the use of DDT in disease control… However, the over-hyping of the dangers of DDT, from the publication of Silent Spring to the US ban on DDT use in 1972, created an environment in which the use of DDT was so discouraged that it was effectively banned. Many African nations did stop using it, and malaria deaths did skyrocket. Since 1999, South Africa and other African nations have resumed dusting home interiors in mosquito-prone areas, and malaria deaths have plummeted.
On topic…
The reason that the climate models are so wrong is simple…They fail to properly account for water vapor and clouds. Every model assumes a positive feedback mechanism from water vapor. Yet, there is no empirical or observational evidence to support such a positive feedback mechanism. If anything, the observational evidence supports a negative feedback mechanism…
http://wattsupwiththat.com/2009/03/05/negative-feedback-in-climate-empirical-or-emotional/

Dave Middleton
April 16, 2009 8:29 am

Here’s a “funny” graphic. I digitized the HadCRUT3 curve onto Hansen’s 1988 model…:)
http://i90.photobucket.com/albums/k247/dhm1353/Hansen_1988.jpg

Mike Bryant
April 16, 2009 8:32 am

Where can we get a graph of model compared to reality on which each and every model adjustment is represented? I bet Lucia could produce such a graph. Come on Lucia, are you game?

April 16, 2009 8:47 am

neill (07:43:08) :
Science meeting. Is it a sort of “Focus Group” on carbon shares’ marketing?
Because that is just the purpose of all these climate models: there will be some who receive carbon credits and those who buy carbon shares; of course the “spread” will be fair (above 1000%). Carbon credits will be given to already existing sources (forests, etc.) and carbon shares will be bought (mandatorily) by existing “polluters”. In the end nothing will change, except for those who do the “business” (in the good old days it would have been called a swindle).
Those pirates in the Indian Ocean have got a lot to learn!

Gerald Machnee
April 16, 2009 8:49 am

Is this related in any way to RC’s post attacking Pat Michaels and his charts?

Frederick Michael
April 16, 2009 8:49 am

MattB (07:36:36) :
hareynolds and others… there has never been an outright, total, worldwide ban on DDT.

Interesting; I did not know this. After the World Health Organization banned DDT, who (no pun intended) continued to use it?

Tom P
April 16, 2009 8:55 am

BarryW,
That is a fascinating plot:
http://farm4.static.flickr.com/3623/3446920199_52496549cb_b.jpg
and gives an interesting extended perspective on the graphs in the article. Thanks for posting it.
The thirty-year trend is the one to look at – the twenty-year trend is strongly aliasing the trend period; the peaks are exactly twenty years apart!
Can you tell me the basis for calculating the trend from the HadCRUT data in these plots?

D Johnson
April 16, 2009 9:06 am

PeteB (06:23:42) :
With regard to the Rahmstorf et al 2007 paper, you should read Lucia’s blog, which states that the statistical approach taken in the paper is not appropriate.
http://rankexploits.com/musings/2008/comment-on-the-slide-and-eyeball-method/

geophys55
April 16, 2009 9:08 am

Quoting:
“Lance,
Welcome to the club, not only with the NG but with dozens of publications of all sorts. That happened to me with National Public Radio back in 1994 and I started to question most everything I was told by most everyone.”
Commenting:
Me, too. NPR and Scientific American and the local fishwrap Houston Chronicle. I started reading on AGW to get the facts straight – but I find there’s no “there” there.

Cathy
April 16, 2009 9:17 am

@ Allan M R Macrae
Dang. Thanks.
Is WUWT great or what!? Thanks to Anthony and moderators for letting Allan put through that Susan Boyle performance on You Tube.
Yes, we need a little beauty and wonderment as we slog through the mire of the climate change debate.
(I haven’t watched that clip yet without tearing up and I’ll bet I’m not the only one batting back the tears:0)

Ron de Haan
April 16, 2009 9:19 am

Graeme Rodaughan (23:28:42) :
“Who would like to bet the following outcomes on the models being correct?
[1] Higher taxes.
[2] Increased energy costs.
[3] Greater Government control.
[4] Biofuel-induced food price increases.
[5] Intermittent electricity supplies.
[6] Increased export of manufacturing to other countries (e.g. China).
[7] Reduced jobs as business costs increase.
Of course, if the models have no basis in reality and AGW Catastrophism is an artefact of modelling – then the above outcomes can be avoided by pretty much doing nothing.”
Graeme,
All very much to the point.
In this regard I would like to refer to the Monckton letter that was sent to the US House Committee, with a copy to President Obama, after his testimony.
The letter not only destroys the fiction of global warming but also points the finger at 50 cases of climate data fraud.
This is a bomb that has been released, and its effect will be noticed.
See:
http://scienceandpublicpolicy.org/images/stories/papers/reprint/markey_and_barton_letter.pdf

Sam the Skeptic
April 16, 2009 9:34 am

MattB — calm down, son. There’s no need to get angry just because we don’t happen to agree with you. What point are you trying to make? We’re all set to listen if you have something worthwhile to say.
In case you hadn’t noticed, the graphs we’re talking about show that computer-game climatology looks as if it differs a bit from what is happening in the real world. Are we missing something somewhere?
(By the way, I know that DDT wasn’t actually totally banned in the whole world, but we came pretty close to it, and an awful lot of poor kids in Africa died who didn’t need to, all because the original premise was W-R-O-N-G and a lot of idiots were conned and greedy corporations jumped on a very profitable bandwagon … and that sounds awfully familiar.)

April 16, 2009 9:36 am

John Edmondson (01:29:34) :
I e-mailed the Met Office some time ago about their model. There were a couple of interesting points. 1. Their model could not be run backwards in time. 2. I got no sensible answer as to what would happen if they ran it from 1600 AD (effectively the same as point 1, but asked differently).
> > Dear John,
> >
> > We do not run climate models with time going backwards in a literal sense, as climate models are complex numerical tools that are definitely not designed for this purpose (an analogy being to try to operate a car by injecting exhaust gases through the tail pipe and expecting it to drive itself backwards and for petrol to accumulate in the tank – clearly nonsensical).
Ho ho. Shoving exhaust gases up the zoompipe of a motor car and having this result in the petrol tank soon overflowing would plainly be in absolute contravention of the Second Law of Thermodynamics. Achieving such ends is well outside the reach of mere mortals, who played no part whatsoever in the construction of these immutable Laws. (Or, at least, in their underlying substance.) But mere mortals apparently have infinite control over the computer models they cook up so tirelessly. Wherein lies the problem?
Geoff Alder

WWS
April 16, 2009 9:36 am

MattB wrote: “If this was a statistically relevant diversion…”
I suspect that you do not understand the science behind what you’re attempting to comment on.
First, you should have used the phrase “statistically significant”, not relevant.
“Statistically relevant” comes from philosophy, not the hard sciences, and as such means next to nothing. (Almost any relationship can be relevant.) But statistical significance is a mathematically determined value.
You must have missed the 3rd graph, which shows that these results are indeed statistically significant. (Not to mention relevant!) You misread Andrew’s editorial speculation as a mathematical conclusion. Correctly stated, in your terms: “will this statistically significant deviation continue into the future? Only time will tell.” This is a true statement because mathematics can give us a reasonable guess about the future, but never an absolute certainty.
english majors. sheesh.

Jim G
April 16, 2009 9:49 am

I personally think that the National Geo issue can be summed up into one of the printed letters to the editor.
The writer was looking out the window of her home and noting that the country around her was being ravaged. And went on to lament the destruction of our environment and its beauty.
I guess it was OK that the land was ravaged to build her home, but now that she has hers, it is wrong for others to build too.

John F. Hultquist
April 16, 2009 9:59 am

PeteB (06:23:42) : “This is quite heavily cited …”
The period of interest includes the 1993 cooling from the Mt. Pinatubo event and the 1998 El Nino warming …
See: http://www.drroyspencer.com/latest-global-temperatures/
which took about three years to dissipate. WUWT had a guest post by Bob Tisdale beginning in January. Part 1 is here:
http://wattsupwiththat.com/2009/01/11/can-el-nino-events-explain-all-of-the-global-warming-since-1976-%e2%80%93-part-1/
Bob T’s material is longer, requires more thought, brings in more ideas, and in the end provides insight into Earth’s processes.
The paper you link to does none of this. The group of authors, and the years chosen, may indicate that they got together and deliberately produced a piece that would tell their story in a way that would fit the data, be scary, and be widely cited.
It is not helpful to the science.

Adam from Kansas
April 16, 2009 10:00 am

Climate dip or not, the sea-ice extent chart continues to laugh in the face of ice alarmists, showing a virtual three-way tie between this year, 2008, and 2003.
The newest SST data should ensure global temps will not rise above the peak of several years ago for at least the next several months; the SST chart provided by NOAA looks like SSTs have just recently hit the next peak and are starting to drop again, particularly south of the equator.

April 16, 2009 10:14 am

WWS (09:36:31):
Agree! Now AGWers are trying to discredit scientific algorithms and procedures. It seems to be an innovative strategy from AGW pseudoscience. Lucia Liljegren’s procedures for evaluating data are quite correct, and those procedures are used even by the purest AGW defenders. Nonetheless, AGWers try to say that we’ve applied those procedures in a wrong way, whereas if the same procedure is applied in the same way by an AGWer, but with changed constants and flawed data, they raise it to a supernatural level. I am not used to speaking in this tenor, but I have experienced the same thing with people who have not shown a single scientific foundation against my papers. The argument has been always the same.

Bill Illis
April 16, 2009 10:20 am

I’ve updated the chart extending GISS Model E by 10 years (from 2003 to 2013) for the newest temp data.
I’ve also built in a 0.1C decline for solar forcing given the state of the Sun. All the models will be building this in now.
The 2003 version of Model E would be off by 0.23C in just 5 years.
http://img259.imageshack.us/img259/6594/modeleextramar09n.png

Robert Wood
April 16, 2009 10:21 am

O/T, but this is very interesting from Spaceweather. Perhaps someone has the time to provide a link to the original report.
The most powerful solar explosions are now moving in slow motion. “Lately, coronal mass ejections (CMEs) have become very slow, so slow that they have to be dragged away from the sun by the solar wind,” says researcher Angelos Vourlidas of the Naval Research Lab.
Each second in the SOHO animation corresponds to an hour or more of real time. “The speed of the CME was only 240 km/s,” says Vourlidas. “The solar wind speed is about 300 km/s, so the CME is actually being dragged.”
Vourlidas has examined thousands of CMEs recorded by SOHO over the past 13 years, and he’s rarely seen such plodding explosions. In active times, CMEs can blast away from the sun faster than 1000 km/s. Even during the solar minimum of 1996, CMEs often revved up to 500 or 600 km/s. “Almost all the CMEs we’ve seen since the end of April 2008, however, are very slow, less than 300 km/s.”

Is this just another way of saying “the sun is very quiet”? Or do slow-motion CMEs represent a new and interesting phenomenon? The jury is still out. One thing is clear: solar minimum is more interesting than we thought.

Mark T
April 16, 2009 10:27 am

Geoff Alder (09:36:08) :
> > We do not run climate models with time going backwards in a literal sense, as climate models are complex numerical tools that are definitely not designed for this purpose (an analogy being to try to operate a car by injecting exhaust gases through the tail pipe and expecting it to drive itself backwards and for petrol to accumulate in the tank – clearly nonsensical).
But mere mortals apparently have infinite control over the computer models they cook up so tirelessly. Wherein lies the problem?

The analogy supplied is an insufficient reason for not running a climate model in reverse. The only reason you should not be able to run it in reverse is non-linearity; an invertible linear process can be run backwards exactly. Whoever explained this to the desk clerk either did not understand it or did not explain it sufficiently.
I would guess that most climate models are indeed non-linear and thus wholly incapable of being run in reverse. I don’t know this for sure, however.
Mark
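Mark T’s distinction can be shown in a few lines. This is a toy recursion, of course, not a GCM, and it sidesteps a caveat: even an invertible linear step (backward diffusion, say) can be hopelessly ill-conditioned in reverse; the sketch below is the friendly case.

```python
# Toy illustration of reversibility (not a GCM): an invertible linear
# update runs backwards exactly, while a non-injective nonlinear update
# has no unique reverse step.
def forward(x, a=0.9, f=0.1):
    return a * x + f             # linear step toward the fixed point

def backward(y, a=0.9, f=0.1):
    return (y - f) / a           # exact algebraic inverse of forward()

x = 2.0
for _ in range(50):
    x = forward(x)
for _ in range(50):
    x = backward(x)
print(x)  # recovers 2.0 up to floating-point rounding

# Nonlinear counterexample: the logistic map sends x and 1 - x to the
# same output, so a backward step is not even well-defined.
y = 4.0 * 0.3 * (1.0 - 0.3)      # x = 0.3 and x = 0.7 both give y = 0.84
```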

Sam the Skeptic
April 16, 2009 10:33 am

O/T
My eyesight is not what it was but is there a very small sun-speck at about 12 o’clock?

Douglas DC
April 16, 2009 10:41 am

As to “soon”: within two (2) years, further into a Dalton-type minimum (God help us if it’s a Maunder), as I’m already hearing about possible upper-Midwest crop damage and possible failure…
(Documentation forthcoming.)

JamesG
April 16, 2009 10:43 am

J. Scott Armstrong argues very well that calculated statistical significance shouldn’t be used either, thanks to its many pitfalls and abuses. From Wikipedia: In the papers “Significance Tests Harm Progress in Forecasting”[4] and “Statistical Significance Tests are Unnecessary Even When Properly Done”[5], Armstrong makes the case that even when done properly, statistical significance tests are of no value.
I’ve long thought the same. It only ever seems to be used for spurious correlations. If you reject any paper on first sighting the phrase then you won’t ever miss anything useful. Dr Wm Briggs agreed with me on his blog that we’d be all better off if it just disappeared entirely from the field of statistics.

pmoffitt
April 16, 2009 10:48 am

AGW seems to be running up against Saul Alinsky’s 7th Rule for Radicals- “A tactic that drags on too long becomes a drag” Alinsky, a hero of many in the Green Movement, understood that a political issue- especially one that relies on an emotional response- becomes boring in time to both the radical and those they target. Alinsky should have also warned to reach a political solution quickly before the “cause” is exposed to full analysis.

April 16, 2009 10:55 am

MattB (07:36:36) :
“hareynolds and others… there has never been an outright, total, worldwide ban on DDT. If you can’t even get something as basic as that right is it any wonder you don’t get AGW.”
The Stockholm Convention, with 98 signatory nations, banned the use of DDT worldwide, effective 2002, with some parts of the world, controversially, still using it for disease vector control. Nations that required it were severely limited, or couldn’t get it because the reduced supply increased the cost, and organizations that funded many of the eradication efforts succumbed to political-correctness pressures and refused to fund DDT use.
If one is to be anally literal, your statement would stand. In a pragmatic sense, though, the ban is worldwide and total.
For such a literal-minded person, I’m surprised you are not concerned by the dichotomy between the AGW theories and models, and real-world observation.

Ray
April 16, 2009 11:01 am

This is an excellent document from Lord Monckton: http://scienceandpublicpolicy.org/images/stories/papers/reprint/markey_and_barton_letter.pdf
It’s funny also how he rubs NOAA’s nose in it by pointing to the Santa Rosa station, which has a rating of 5 in the surfacestations project database. Excellent work!

April 16, 2009 11:01 am

Mike Bryant (08:32:49),
Is this what you’re looking for?: click

timetochooseagain
April 16, 2009 11:07 am

DJ: They aren’t forecasts, they are “projections” – and as meteorologists like Anthony can tell you, the degree of impressiveness of a “forecast” is not how certain it is, but how successful it is…
Someone above mentioned Rahmstorf (2007). Note that since he was looking at a sixteen-year period starting in 1990, the Pinatubo cooling sits near the start point and the 2005 El Nino near the end point; this combination will inflate the measured trend. David Stockwell also has some posts on it:
http://landshape.org/enm/rahmstorf-revisited/
and you gotta love that graph!

Frank K.
April 16, 2009 11:17 am

CO2 Realist (07:53:41) :
On RealClimate.org, Gavin Schmidt argues against industry standard practices of source code management, configuration management, and disclosure of code and data. Here’s a salient quote from Schmidt in a response to comment 89 in the post On Replication:
“My working directories are always a mess – full of dead ends, things that turned out to be irrelevent or that never made it into the paper, or are part of further ongoing projects. Some elements (such a one line unix processing) aren’t written down anywhere. Extracting exactly the part that corresponds to a single paper and documenting it so that it is clear what your conventions are (often unstated) is non-trivial. – gavin]”
If this isn’t a reason to use source code control, documentation and configuration management, I don’t know what is.

Thanks for the quote. This is pretty shocking to say the least! Now, this would not be an issue to me if these codes were truly “research codes”. However, the fact of the matter is that the numerical solutions being generated by AOGCMs such as Model E have formed the basis of ** numerous ** scare stories in the media about tipping points, species extinction, ice free poles, etc. And now, with Cap and Trade looming, the influence of these codes and their “solutions” is going to have a crippling effect on our economy.
For this and many more reasons, I think all of the major climate models employed by the government for use in climate prediction studies should undergo a rigorous and thorough review process, where the authors are ** required ** to produce complete documentation of their algorithms, verification and validation procedures for all subroutines, and source code control documents which can trace all changes to the source code. This is the very least these people can do. But, alas, it's not a very rewarding activity because, as Dr. Schmidt has said before, he's not paid to document his work, he's paid to do "science"!
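
To make that concrete, here is a minimal sketch, in Python, of the kind of subroutine-level verification being asked for. The routine is a stand-in (a standard Magnus-type approximation), and the reference values and tolerances are illustrative; none of this is GISS's actual code:

import math

# Stand-in for a model subroutine: Magnus-type saturation vapor pressure
# over liquid water, in hPa, for temperature in degrees C.
def saturation_vapor_pressure_hpa(t_celsius):
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Verification against known reference values: 6.112 hPa at 0 C by
# construction, and roughly 23.4 hPa at 20 C from standard tables.
def test_saturation_vapor_pressure():
    assert abs(saturation_vapor_pressure_hpa(0.0) - 6.112) < 1e-9
    assert abs(saturation_vapor_pressure_hpa(20.0) - 23.4) < 0.2
    print("saturation_vapor_pressure_hpa: OK")

test_saturation_vapor_pressure()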

Peter Plail
April 16, 2009 11:22 am

Re DDT – for info the following from Wikipedia (http://en.wikipedia.org/wiki/DDT):
Production and use statistics
From 1950 to 1980, when DDT was extensively used in agriculture, more than 40,000 tonnes were used each year worldwide,[7] and it has been estimated that a total of 1.8 million tonnes of DDT have been produced globally since the 1940s.[1] In the U.S., where it was manufactured by Ciba,[8] Montrose Chemical Company and Velsicol Chemical Corporation,[9] production peaked in 1963 at 82,000 tonnes per year.[3] More than 600,000 tonnes (1.35 billion lbs) were applied in the U.S. before the 1972 ban, with usage peaking in 1959 with about 36,000 tonnes applied that year.[10]
Today, 4-5,000 tonnes of DDT are used each year for the control of malaria and visceral leishmaniasis, with India being the largest consumer. India, China, and North Korea are the only countries still producing and exporting it, and production is reportedly on the rise.[11]

April 16, 2009 11:26 am

I’m not really sure you can call it ‘diverging’ since, except at the 1999 start point, the lines on that graph were never that close anyway.
‘Getting further apart than ever’ would be a more accurate description.

Nasif
April 16, 2009 11:28 am

Sam the Skeptic (10:33:39) :
O/T
My eyesight is not what it was but is there a very small sun-speck at about 12 o’clock?

Well… Your eyesight is fine. There is a bright spot at about 12 o'clock and another tiny black spot at about 5 o'clock. The latter has been there for weeks. I don't have a good explanation for that tiny speck.

M White
April 16, 2009 11:31 am

“How can a graph be so very wrong?”
The subject is the population of the UK.
http://news.bbc.co.uk/1/hi/magazine/8000402.stm
“The answer, crudely, is that the track record of population projection is abysmal. It borders on being a statistical lottery.”
The same people who spread alarm over AGW also see population as a problem
http://news.bbc.co.uk/1/hi/sci/tech/4584572.stm

George E. Smith
April 16, 2009 11:49 am

As the old faint praise goes:- “You may not do very good work ; but you sure are slow !”
In this context; I really don’t care that the Playstation Video-Game climate models; or Global Circulation Models; if you want to be pedantic, are no good; because I’m absolutely sure that the raw data that goes into them is pure garbage.
Until those who make the experimental observations start complying with the Nyquist sampling criterion; that governs ALL sampled data systems; these models will always predict nonsense.
Please wake me up when ANY of these GCMs, run backwards, correctly predicts the LIA and the MWP.
A good scientific model ought to be able to reproduce the raw data that was used to create the model.
Ho hum.
George
PS But nice work Lucia.
Reply: “You may not do very good work ; but you sure are slow !” George…you’re killing me! ~ charles the moderator

SteveSadlov
April 16, 2009 11:58 am

I also see (the lack of growth in) population (of native born residents of Western democracies) as a problem. It will shortly become an even bigger problem. At some point, the crash will affect the already weak economy. The global economy has never previously experienced a decline in market size. Previous declines in market size were prior to a truly global economy. Those two previous declines were the Plague and the Dark Ages. What we face is worse, from an economic impact perspective. Even the third world is coming up against fecundity issues – meanwhile in the more developed areas the die is already cast, too late to turn back from the slippery increasingly steep slope.

SteveSadlov
April 16, 2009 12:01 pm

Note – the impacts I mentioned will start to hit over the next 10 years in the developed world, and elsewhere no later than 2060. The problem is compounded by the retirement and aging of the Baby Boomers.

Mark T
April 16, 2009 12:04 pm

George E. Smith (11:49:03) :
because I’m absolutely sure that the raw data that goes into them is pure garbage.

I don’t disagree.
Until those who make the experimental observations start complying with the Nyquist sampling criterion; that governs ALL sampled data systems; these models will always predict nonsense.
I disagree. Actually, it all depends. What are the highest frequencies present in our climate? Most likely the day/night cycle is the highest, or at least, the highest that has any real impact. There may be higher frequencies, but I’ve never seen any discussion that indicates they have significance, which means they won’t cause problems if they get aliased. That said, at what time resolution do GCMs run?
Please wake me up when ANY of these GCMs, run backwards, correctly predicts the LIA and the MWP.
Given that they are likely highly non-linear, this can’t happen, either. An unfortunate consequence of non-linearity, indeed.
Mark

jack mosevich
April 16, 2009 12:14 pm

Nasif and Sam: The black speck at 5 o'clock may be a bad pixel in the camera. Leif has mentioned such things before.

Bart van Deenen
April 16, 2009 12:30 pm

CO2 Realist (07:53:41) writes about the lacking software quality control for the CGM’s.
I have some experience with a large Fortran model with similar lack of software quality: the Dutch airport risk calculation tool. I left the Dutch National Aerospace Lab mostly because this piece of crap software was being used to generate risk contours on a map with actual legal consequences.
The errors in this p.o.s were staggering. It has generated contours that were turned into law, where only 1 in 25 gridcells were correctly calculated, most were randomly wrong, and about 8 in 25 were calculated with undetermined data.
All this happened because one of the data files (surface texture) had two spaces between each figure instead of three! I kid you not!
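
For readers who have not fought this particular dragon, here is a minimal Python sketch of the failure mode; the data and field width are invented, not the actual Dutch tool:

GOOD = "10.5   20.5   30.5"   # figures separated by three spaces
BAD  = "10.5  20.5  30.5"     # same figures, two spaces

# Naive fixed-width read: trusts the layout completely.
def slice_columns(line, width=7):
    return [line[i:i + width] for i in range(0, len(line), width)]

print(slice_columns(GOOD))   # ['10.5   ', '20.5   ', '30.5']  -> as intended
print(slice_columns(BAD))    # ['10.5  2', '0.5  30', '.5']    -> misaligned junk
# Python's float() would at least choke on '0.5  30'; a Fortran formatted READ
# that treats blanks as zeros can quietly turn misaligned fields like these
# into plausible-looking but wrong numbers, hence "randomly wrong" grid cells.

# Whitespace-agnostic parse that fails loudly if the layout ever changes.
def parse_checked(line, nfields=3):
    fields = line.split()
    if len(fields) != nfields:
        raise ValueError(f"expected {nfields} fields, got {len(fields)}: {line!r}")
    return [float(f) for f in fields]
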
Another important bug in this model was in the actual equations, which would lead to an infinite risk density right at the center point of any circle segment of an aircraft's track.
I assume that the programmer who wrote it put some arbitrary limit in it, or that it actually never hit the infinity value because of non-grid alignment of the aircraft tracks.
When I got there, nobody knew how to visualize data, and I spent some time building something to get false-color pictures from the risk probability density files. Then I noticed these weird patterns every 500 meters or so, and very bright spots right in the center of an aircraft track.
The moral of the story is: You GCM modellers, you better show us your code and every single line of it, and ALL data, and the actual equations you’re trying to model, and I’m damn sure we’ll find plenty of bugs in them, some of them major.
The amazing gall of the GCM people, thinking they can program something that big without very tight software quality management, is mindblowing. The credibility of your model calculations is NULL.

Sam the Skeptic
April 16, 2009 12:30 pm

Nasif, thanks for confirming.
Jack — that was what I thought as well. I remember somebody mentioning it.

Aron
April 16, 2009 12:33 pm

Read this nonsense
http://news.bbc.co.uk/1/hi/sci/tech/8003060.stm

stumpy
April 16, 2009 12:50 pm

I would be interested to see the graphs compared to RSS or UAH data, as when HadCRUT and GISS are compared with satellite data they have been diverging.
Over the last 32 years for example UAH and GISS have diverged by approx. 0.3 degrees. I expect if the IPCC projections were compared with the more reliable (though not perfect) satellite data, the model divergence from reality would become even more apparent.

Dave Middleton
April 16, 2009 12:59 pm

Replying to…
Aron (12:33:02) :
Read this nonsense
http://news.bbc.co.uk/1/hi/sci/tech/8003060.stm

Apart from the bits about “man made” and “greenhouse gas emissions”, it’s a very good article…;)

AKD
April 16, 2009 12:59 pm

Aron (12:33:02) :
Read this nonsense
http://news.bbc.co.uk/1/hi/sci/tech/8003060.stm

Hmm…I wonder what might have contributed to the 1650-1750 mega-drought in West Africa…?

April 16, 2009 1:04 pm

Regarding the discussion of quality control and disclosure of the models:
Frank K. (11:17:51) :
“…I think all of the major climate models employed by the government for use in climate prediction studies should undergo a rigorous and thorough review process, where the authors are ** required ** to produce complete documentation of their algorithms, verification and validation procedures for all subroutines, and source code control documents which can trace all changes to the source code.”
Bart van Deenen (12:30:04) :
“The moral of the story is: You GCM modellers, you better show us your code and every single line of it, and ALL data, and the actual equations you’re trying to model, and I’m damn sure we’ll find plenty of bugs in them, some of them major…The credibility of your model calculations is NULL.”

I couldn't agree more with both of you, and my experience is only with commercial business software. I've even heard arguments that there's no need to document code: just read the code. But what if the code is wrong? Then you're gleaning the logic and goal of the code from incorrectly written code.
With the huge impacts financially, socially, and politically, the models need to at the very least be held to the standards of commercial software. And don’t get me started on garbage in, garbage out. We know from this blog that there are huge issues with data quality.

Paddy
April 16, 2009 1:06 pm

Frank K and CO2 realist:
Perhaps I state the obvious about conclusions to be drawn from comments about Gavin S’s work. Gavin appears to be a first rate garbage collector.

Ray
April 16, 2009 1:14 pm

“jack mosevich (12:14:20) :
Nasif and Sam: Black speck at 5 o’clock may be a bad pixel in the camera. Lief has mentioned such things before.”
Yeah, it is a burnt pixel on the detector. It might go away the next time they do a burnout on the detectors, but for now it is a permanent feature. Actually, they are so desperate to see the sunspot number going up that they want to give this burnt pixel a permanent sunspot number. 😉

Tom in Texas
April 16, 2009 1:16 pm

Ron de Haan (09:19:27) :
Ray (11:01:19) : This is an excellent document from Lord Monckton:
http://scienceandpublicpolicy.org/images/stories/papers/reprint/markey_and_barton_letter.pdf
Excellent letter to the House Committee. Now if only a few would read the 40 pages.
The source of the Santa Rosa graphs was quoted as “Dr. Anthony Watts”.

pmoffitt
April 16, 2009 1:29 pm

Aron (12:33:02) :
The public is being sold a message that if we control CO2 we control climate change. It is a dangerous message. The BBC story correctly confronts the very real paleo drought cycles, yet only cautions against CO2. It is time for us to look at our infrastructure and its ability to withstand known weather/climate variations: drought does not require global warming to return, a major hurricane striking NYC does not require global warming, etc. Some of our infrastructure needs to be hardened whether or not we agree with the climate models. This important question is unfortunately being lost in the heat of the "debate."

JamesG
April 16, 2009 1:30 pm

SteveSadlov
I think you're ignoring the fact that economic forecasts are historically much worse than even climate forecasts. And the current economic crisis is only the latest example of such abject failure. The failure comes from projecting the future as if it would be linearly similar to today, but it won't be. Regarding the aging population, the Japanese, who are already in the middle of this problem, are already using personal robots that will in future take over vast amounts of menial tasks, freeing up human labour for other areas. Meanwhile, in the transition to mass robotry, nobody is currently short of cheap immigrant labour. And what about the effect of the internet, including teleworking and information gathering? Whole classes of work that used to take hours, days, weeks or even years are either rendered unnecessary or are far quicker. Was that predicted? And let's not forget that when you communicate better there is a larger available market. 50% of Nokia's business right now is in Africa. And what do Africans want? Everything we have! And what do they have to sell? Huge amounts of those supposedly scarce raw materials (recyclable, of course)! I'm not into misplaced optimism, but perceived future problems often turn out to be just new opportunities for marketable solutions.
Economists should finally learn some humility and just shut the heck up. Clearly they can chronicle events with 100% hindsight, but for forecasts you'd do better with a blindfold and a dartboard. Well, OK, they won't ever shut up, but you really shouldn't listen to them any more: their simplistic linear theories are obviously just pure rubbish.

JamesG
April 16, 2009 1:41 pm
Mark T
April 16, 2009 1:54 pm

JamesG (13:30:43) :
Clearly they can chronicle events with 100% hindsight, but for forecasts you'd do better with a blindfold and a dartboard.

It would seem to me the Keynesians are using a blindfold and a dartboard for their hindcasts, too. Or simply rewriting what happened.
Mark

George E. Smith
April 16, 2009 1:56 pm

“”” Mark T (12:04:38) :
George E. Smith (11:49:03) :
because I’m absolutely sure that the raw data that goes into them is pure garbage.
I don’t disagree.
Until those who make the experimental observations start complying with the Nyquist sampling criterion; that governs ALL sampled data systems; these models will always predict nonsense.
I disagree. Actually, it all depends. What are the highest frequencies present in our climate? Most likely the day/night cycle is the highest, or at least, the highest that has any real impact. There may be higher frequencies, but I’ve never seen any discussion that indicates they have significance, which means they won’t cause problems if they get aliased. That said, at what time resolution do GCMs run?
Please wake me up when ANY of these GCMs, run backwards, correctly predicts the LIA and the MWP.
Given that they are likely highly non-linear, this can’t happen, either. An unfortunate consequence of non-linearity, indeed.
Mark “””
Surely you jest, Mark? From all the postings I have seen here, GIStemp inputs are based on daily max/min readings of what; anomalies, or temperatures? Either way, a min/max can only be a suitable sample if the signal is a pure sinusoid, or some other time-symmetrical waveform; which is most unlikely. Clearly the morning heat-up is faster than the evening cool-down, so there is no way that the daily cycle is time symmetric. So it has to have at least a second harmonic component; which means that two samples per day is already a Nyquist violation by a factor of two; which is all it takes to give aliasing errors in the zero frequency, or average, of the signal.
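
A quick numeric illustration of that point, with an idealized daily cycle (made-up amplitudes, not station data): a fundamental plus a second harmonic has a true daily mean that (Tmin+Tmax)/2 simply cannot recover.

import math

# Hypothetical daily cycle, t in days; the true mean is exactly 15.0 C,
# but the second harmonic makes the waveform time-asymmetric.
def temp(t):
    return 15.0 + 8.0 * math.sin(2 * math.pi * t) + 3.0 * math.sin(4 * math.pi * t + 1.0)

samples = [temp(i / 1440) for i in range(1440)]   # one reading per minute
true_mean = sum(samples) / len(samples)           # 15.000 by construction
minmax_mean = (min(samples) + max(samples)) / 2   # what a min/max thermometer gives

print(f"true daily mean : {true_mean:.3f} C")     # 15.000
print(f"(Tmin+Tmax)/2   : {minmax_mean:.3f} C")   # biased low by over a degree here
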
Even at 4 times per day; one is forced to conclude that the effect of clouds cannot be included; so already there is a gross departure from reality.
So already we have a failure in temporal space.
There's that little matter of spatial sampling. Given that 73% of the earth's surface is ocean, just think of the sampling failure there.
The Arctic (north of +60) has somewhere in the range of 70-80 surface sampling locations.
Don’t even mention Antarctica; as the Steig paper demonstrates; that place is a textbook case of sampling genius; was it 12 stations or some similar number that were used to concoct his analysis ?
Just as wild guess, I would guess that the spatial undersampling would be by 4-5 orders of magnitude at the minimum.
I just have to look at the 6:00 PM weather report for the SF Bay area, to see how high the spatial frequencies can get.
Nyquist does not require uniform sampling; which is the minimum sample solution; but the maximum sample spacing that occurs in a non uniform sampling still has to conform to the highest signal frequency. Maybe its six orders of magnitude undersampling.
In any case; even a correct sampling regimen, to give a believable average is of little value, since the thermal processes that determine the energy balance of the planet are non linear functions of local temperatures; are terrain dependent; and in the case of radiative transfer are at least fourth power of temperature.
When it comes then to the effect of GHGs such as CO2, then it is a fifth power function of temperature.
The central limit theorem and other prestidigitations can’t buy you a reprieve from Nyquist violations.
But climate "Scientists" are only too happy to drill a handful of ice cores in Antarctica and/or Greenland, and then declare the results pertinent to the entire planet.
They need to watch more horse operas on TV, to watch the wheels go backwards, as the hero rides to rescue the damsel in distress on the runaway wagon.
By the way; the average of the Nth power of any cyclic function, is always greater than the average of the function; so taking the 4th power of some fictional earth average temperature, to compute the radiative balance always underestimates the true cooling that is occurring (due to radiation).
George

Aron
April 16, 2009 1:59 pm

Populaton explosion. Perdictions vs reality
http://news.bbc.co.uk/1/hi/magazine/8000402.stm
Every prediction about population size since the 50s has turned out, in the medium and long term, to be wrong.

George E. Smith
April 16, 2009 2:01 pm

I should have said that the Nth root of the average of the Nth power of any cyclic function is always greater than the average of the function; but then you already knew I meant that, right?
George
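
For what it's worth, that statement is easy to check numerically. A quick sketch with idealized numbers (a 288 K mean and a 10 K sinusoidal swing; nothing measured):

import math

# Power-mean check: the fourth root of the mean of T^4 exceeds the mean of T
# whenever T varies, so radiating at the "average temperature" understates
# the average radiative loss.
T = [288.0 + 10.0 * math.sin(2 * math.pi * k / 1000) for k in range(1000)]
mean_T = sum(T) / len(T)                                  # exactly 288.0
effective_T = (sum(t ** 4 for t in T) / len(T)) ** 0.25   # slightly above 288

print(f"mean(T)           = {mean_T:.3f} K")
print(f"(mean(T^4))^0.25  = {effective_T:.3f} K")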

Aron
April 16, 2009 2:06 pm

The BBC article clearly shows that parts of Africa experienced drought during the Little Ice Age, which reversed as temperatures rose back up to their Holocene average. Global warming or climate change since 1750, according to the article, has seen an increase in precipitation and rainfall, which has improved the health of Africa's tropics. How then, within the same context, does Richard Black or anyone else conclude that (pay attention to the dramatic wording) mega-drought will occur?
It reads like it comes from the same spinmasters who sold the Maldives-is-drowning story in order to get its politicians on board the gravy train. Now they want Africans to stop using their source of energy, live off high interest loans from the World Bank and IMF, and accept welfare cheques from developed countries. I consider that as saying Africa should sit at the back of the bus in order to save the white man’s wealth….I’m sorry…I meant planet.

George E. Smith
April 16, 2009 2:06 pm

Sorry Charles; just trying to keep a smile on your face; in view of the dismal predictions of scientifically dismal substitutes for observations.
George; who likes to laugh too !

April 16, 2009 2:10 pm

I suspect those "models" could run perfectly on an old Commodore computer…

Dave Wendt
April 16, 2009 2:15 pm

You have to love that Monckton. His letter to Congress and the president, linked in the comments above, is a highly recommended read. If only we could find a way to ensure that they all read it, though that would probably entail strapping them all into chairs and stapling their eyelids to their foreheads until they were done. Alas, given Obama's continuing reputation for "brilliance," which strongly indicates that possessing a room-temperature IQ puts you 2-3 sigma above the mean inside the Beltway, most of them wouldn't be able to understand it, even if they had any inclination to actually consider the evidence before sending us all to carbon-phobia-induced Hell, which it is fairly obvious they do not. I'd say the probability that Algore will respond to Monckton's challenge and agree to a public debate can now officially and permanently be posted at negative infinity.

BarryW
April 16, 2009 2:16 pm

Tom P (08:55:55) :
Thanks. I'm not sure I understand what you meant by "basis," but here's what I did.
Data was Hadcrut3.
Plot was done using R and the lm function, "straight with no chaser" (ordinary least squares linear fit). 240 or 360 contiguous months were used in the fit, respectively. The data set was then moved forward by one month, using the same number of months, and another calculation made. The trend was plotted using the last month of the set as the plot point (so the trend at each month comprises the trend for the preceding 240- or 360-month period). Here's another plot I did with 10- and 50-year trends.
Could the twenty year cycle actually be 22 years, just twice an 11 year solar cycle?
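
For anyone wanting to reproduce the plots, here is a minimal Python sketch of that moving-window calculation, run on a synthetic anomaly series rather than the real HadCRUT3 file:

import numpy as np

# Moving 240-month (20-year) OLS trend, plotted against the window's END
# month, as described above. The anomaly series here is synthetic toy data.
months = np.arange(1200)                                           # 100 years, monthly
anom = 0.0005 * months + 0.1 * np.sin(2 * np.pi * months / 720)

window = 240
t_years = np.arange(window) / 12.0
trends = []
for end in range(window, len(anom) + 1):
    segment = anom[end - window:end]
    slope = np.polyfit(t_years, segment, 1)[0]   # OLS slope, degrees per year
    trends.append((end, slope))

print(f"{len(trends)} windows; trend ending at month {trends[-1][0]}: {trends[-1][1]:.4f} C/yr")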

April 16, 2009 2:19 pm

Aron:
I consider that as saying Africa should sit at the back of the bus in order to save the white man’s wealth….I’m sorry…I meant planet.
History always laughs at wishful thinking. Remember that after the Second World War a certain general was asking permission to wipe out the communist menace in China… Now China is the biggest buyer of US debt.
As the Latino song says, "vueltas que da la vida" (life turns around, ya know).

Sam the Skeptic
April 16, 2009 2:22 pm

I’ve just finished the Monckton letter. Great stuff!
In Scots Law I think he would be charged with "Assault to their severe injury, in that he did kick and punch them 50 times about their person, to the risk of their life."
The “risk of their life” bit is probably wishful thinking; they’ll probably resurrect themselves like Terminator!
Seriously, this is devastating stuff and sooner or later the politicos are going to realise the extent to which they have been taken for a ride. This might just be the start!

Equalizer
April 16, 2009 2:28 pm

To Aron’s (13:59:02) :
Populaton explosion. Perdictions vs reality.
Not to inject any bodily jokes in rather serious discussion, but…
The misspelled word “perdiction” sounds very much like a Russian word that describes an act of flatulence, i.e. farting. I believe that it turned out to be very appropriate in the context it was used:
BBC’s “perdictions” vs reality

John Galt
April 16, 2009 2:43 pm

I seem to recall from science class, that an experiment is supposed to be documented. When an experiment is finished, the results are reviewed as is the methodology and the data.
_ Results don’t match predicted outcome? – Your hypothesis is likely wrong (assuming no other problems with the experiment).
_ Methodology is poor? – Results are invalid.
_ Data is bad? – Results are invalid.
_ Mistakes in calculating results? – Results are invalid.
Contrast this with the computer climate models.
_ How can the models be verified if the results can’t even be considered a prediction?
_ How can data be verified if it’s not published?
_ How can mistakes in calculations be confirmed if complete, working source code isn’t published?

DJ
April 16, 2009 3:11 pm

>2005 El Nino…
“timetochooseagain” 2005 was not an El Nino – it was a neutral year. Indeed, if anything it was influenced more by the weak La Nina of 2004 which makes it more remarkable as a record hot year.

Aron
April 16, 2009 3:16 pm

the misspelled word “perdiction” sounds very much like a Russian word that describes an act of flatulence
In South and Central Asia it is ‘pad’ (sounds like pud). Derives from padam meaning to step or to drop an object on the ground. Related to Italian piede, English pedestrian, etc. In this case, the object touching the ground is poop.
In the case of projecting global warming and population explosions, perdiction should replace prediction. Good observation!

Editor
April 16, 2009 3:26 pm

Allan M R MacRae (03:09:55) :
Allan, that video was stunning. Thank you.

Mike McMillan
April 16, 2009 3:36 pm

BarryW (07:26:06) :
If you extend the time period using Hadcrut, you get a graph of the 20 yr trends that has a sinusoidal shape with a 60 yr wavelength and with one of the peaks at the end of the twentieth century. Very obvious in the 30 yr trends.
Here’s a plot: http://farm4.static.flickr.com/3623/3446920199_52496549cb_b.jpg

This looks like what I expected, every thirty or forty years we swap between worrying about freezing and worrying about boiling. How exactly do you figure the trends? How did you do trends at the end points?

Aron
April 16, 2009 3:43 pm

At the end of the BBC article on population, an alarmist comment is left by a reader who ignores the fact that birth and death rates have been dropping worldwide for some time. This is the reader's comment: "Something clearly must be done to stem that growth if we're to avoid future environmental problems and resource shortages."
It shows you the totalitarian thinking that lies in the minds of these save-the-world types (which normally means "Save me from coloured people, please! I don't want them to be wealthy and free! All the resources must belong to me only!!").
He wants something radical to be done to stem population growth. Well, that has been done before. The Eugenics movement sought to eliminate the weak from the gene pool and promote genes which matched certain defined ethno-types so that the planet’s resources would not be “wasted” on inferior members of the human species.
The Nazis sought to sterilise ethnic groups who they felt were physically, mentally and spiritually inferior so that resources would only be for people they deemed of Aryan stock (a fictional ethnic group). The Holocaust of Jews and gypsies was an example of population growth control driven by eugenics and resource allocation.
In China they enforced a one-child-per-family law that caused many female fetuses to be aborted because males were preferred. Female babies were also killed. In some cases, when a couple had more than one child, law enforcers would forcibly take babies away from mothers, or kill babies to have their organs sold. This again was all done in the name of allocating resources under a model of sustainable development. Not only did the one-child-per-family policy fail, but so did the Chinese Cultural Revolution.
Free market policies increased China's life expectancy, reduced death rates, reduced infant mortality and lowered birth rates, despite a relaxing of the one-child-per-family law. At this rate it is highly likely China's population will drop below a billion by the middle of the century, and so will India's.

Richard Lawson
April 16, 2009 4:07 pm

Picking up on errors in code, NASA’s code used to control the flight operations of the Shuttle used to have 1 error per thousand lines of code.
After 25 years and 260 permanent software staff this has dropped to a mere 1 error per 25000 lines.
Do you think climate modelers have this many man-hours applied to their code? I think not. Enough said.

Jeff B.
April 16, 2009 4:14 pm

One could easily imagine a similar graph with the plots instead being the public’s willingness to believe in the AGW scam as promulgated by Al Gore, and what the Mainstream Media is reporting. I would expect the same inflection point in about 2006 when folks gradually started waking up to the real science.

Tom P
April 16, 2009 4:16 pm

BarryW,
Thanks – that’s just the information I was after.
“Could the twenty year cycle actually be 22 years, just twice an 11 year solar cycle?”
The recent solar cycle has been much nearer to 20 years, and I think a lot of the signal energy at this period will be injected into a 20-year trend window. The divergence of the models from the data could partly be due to a failure to predict and include the depth and length of the current solar minimum into the GCMs (fig. 1), compounded by an unfortunate choice of trend window length in the comparison with the temperature data (fig. 2).

Smokey
April 16, 2009 4:18 pm

For those just getting up to speed on the global warming debate, there is no better place to start than with Christopher Monckton’s report to Congress: click
In the report, Viscount Monckton thoroughly demolishes every claim made by the global warming contingent. Highly recommended! In fact, everyone who reads the report will find new reasons to be skeptical of the climate alarmist argument.

April 16, 2009 4:34 pm

Smokey (16:18:35) :
In the report, Viscount Monckton thoroughly demolishes every claim made by the global warming contingent. Highly recommended! In fact, everyone who reads the report will find new reasons to be skeptical of the climate alarmist argument.
I had always been suspicious of that abrupt, year-long temperature spike in 1998. After Lord Monckton's report, I can easily explain the disparity by attributing it to unusual solar activity plus the El Niño oscillation.

Arn Riewe
April 16, 2009 4:34 pm

Several have posted about Nat Geo. I agree that it’s become no more than an AGW advocacy site.
One of my favorites from the “HEAD I WIN, TAILS YOU LOSE DEPARTMENT”
http://news.nationalgeographic.com/news/2006/09/060911-growing-glaciers.html

April 16, 2009 4:53 pm
April 16, 2009 5:11 pm

Hi all,
I have again shamelessly stolen your chart for use on my weblog (hopefully with adequate credit to WUWT–let me know if you want more praise, and I’ll cheerfully give it to you). It’s here with my additional point: http://newsfan.typepad.co.uk/liberals_can_be_skeptics_/2009/04/predictions-vs-observations.html
My point is, let’s play to win. I am a liberal Democrat and I want to change liberal Democratic policy. These figures, as well as the rest of the points made on this blog, as well as Climate Skeptic, Jennifer Marohasy and the Blackboard, strongly indicate that we at the very least have time to go out and collect more and better data. To site temperature stations correctly, and to infill the blank spots on the map. To get better proxies, to put some radiosonde balloons in the troposphere and anchor them, to make sure the satellites are getting and reporting the data in the best way possible.
If nothing else–even if global warming is all true (which it obviously isn’t), even if catastrophe is coming down the road (which I don’t believe for a minute), it is further down the road than the alarmists thought. We have time to do the science right.
Isn’t this something we can push for?

Mike S
April 16, 2009 5:18 pm

DJ – Sorry but you sound like a victim of revisionism. 2004 was El Nino in the latter months, while 2005 was more or less ENSO neutral. Don’t you just hate it when they go back and do stuff like that ? Don’t shoot the messenger, I’m just trying to figure this stuff out myself.

Squidly
April 16, 2009 5:21 pm

CO2 Realist (07:53:41) :
Re: Frank K. (05:29:12) :
“In my opinion, the credibility of many climate models will be close to zero because the code developers (in particular the NASA GISS Model E) do a horrible job documenting and validating their codes (there are some exceptions). ”
This is a sore point with me as well, since I have some background in programming.
On RealClimate.org, Gavin Schmidt argues against industry standard practices of source code management, configuration management, and disclosure of code and data….

You have hit on what I consider to be one of many extremely important points. As a long-time computer scientist myself, I cannot imagine how one could seriously consider conclusions from code developed so haphazardly. The absence of sound design and construction methodologies, evident within the code and mechanisms of all of the models I have inspected (Model E, for example), leaves me completely unconvinced of the accuracy and consequence of their results. I simply cannot take seriously a piece of software so poorly written, poorly designed, poorly constructed, poorly managed and poorly maintained. Simply non-credible, and bad science in the worst of ways.

Squidly
April 16, 2009 5:37 pm

Dave Middleton (08:27:10) :

The reason that the climate models are so wrong is simple…They fail to properly account for water vapor and clouds. Every model assumes a positive feedback mechanism from water vapor. Yet, there is no empirical or observational evidence to support such a positive feedback mechanism. If anything, the observational evidence supports a negative feedback mechanism…
http://wattsupwiththat.com/2009/03/05/negative-feedback-in-climate-empirical-or-emotional/

That's not the only reason, but it's certainly a very important one nonetheless. A very simple everyday demonstration of this is to compare deserts to the tropics. Which is warmer? And why?
Further, has there ever been a runaway heating effect found on Earth? If not, why not?
The answer to all of these questions is quite simple (and you don’t need a computer model), the feedback effect produced by radiative forcing between CO2 and water vapor is NEGATIVE not positive. Like almost all natural processes and responses, our climate is dominated by negative feedback, not positive feedback.
“We now conclude this test of the AGW Broadcast System, had this been a real emergency, you would have been instructed to relinquish your hard earned cash and give up your right to breathe.”

BarryW
April 16, 2009 5:48 pm

Mike McMillan (15:36:52) :
Mike I explained it for Tom here: BarryW (14:16:34) :
The trick is to plot the trends based on the ending year, not the start year. So the trend plotted is the trend based on the months prior to that date, not after it. So for January 2009 the plotted trend is based on the preceding 240 months of data for the 20-yr trend (the 240 months through January 2009).

Thom Scrutchin
April 16, 2009 5:54 pm

I would like to make a suggestion. We should stop using the acronym AGW since it is becoming clearer and clearer that AGW doesn’t exist.
The new acronym should be AGWA for Anthropogenic Global Warming Alarmism. Since that is not a scientific claim, I can appeal to the new consensus to prove that AGWA really does exist.

Philip_B
April 16, 2009 5:54 pm

Lucia, I think the $64K question here is ‘How does the current climate trend downturn compare with the last cyclic downturn and the last cyclic upturn’.
If this downturn is similar or less than the last downturn and the last upturn, then we are just going back to where we were 60 or so years ago. Not good news, but not particularly bad news either.
Eyeballing the data with volcanos removed (link below) the current downturn looks very similar to the 1940s downturn. The 1940s downturn stopped at around the current level (in terms of the drop in the anomaly) and then was basically flat for a decade or so, before resuming the up trend.
If we repeat the 1940s then temperatures will stay around current levels for a decade or more, which is nothing to worry about. Although the models will be categorically invalidated.
If we see further falls from here, say by more than -0.2C, then the current temperature downturn will look more like the 30 years of cooling temps from 1890.
If we see significantly more than -0.2C then we are into uncharted territory in terms of the (thermometer) temperature record and who knows where we go from there.
http://www.worldclimatereport.com/index.php/2008/12/17/recent-temperature-trends-in-context/
How large can temperature changes get? Well, the study below shows a 22C rise (no mistake there) in temperatures at the start of the current interglacial, around 10K years ago.
http://cat.inist.fr/?aModele=afficheN&cpsidt=2934948

KimW
April 16, 2009 6:12 pm

The problem is not just with climate models, but the refusal to think on what evidence implies. From an article in Science,
” Overpeck and his colleagues studied sediments beneath Lake Bosumtwi in Ghana that gave an almost year-by-year record of droughts in the area going back 3,000 years. Until now, the instrumental climate record in this region stretched back only 100 years or so. The researchers found a pattern of decades-long droughts like the one that began in the Sahel in the 1960s that killed at least 100,000 people, as well as centuries-long “megadroughts” throughout this long period, with the most recent lasting from 1400 to 1750.
The scientists also described signs of submerged forests that grew around the lake when it dried up for hundreds of years. The tops of some of these tropical trees can still be seen poking up from the lake water. …
… The cause of centuries-long megadroughts is not known, but he said the added burden of climate change could make this kind of drought more devastating. Temperatures in this region are expected to rise by 5 to 10 degrees F (2.77 to 5.55 degrees C) this century, the scientists said, even if there is some curbing of the greenhouse emissions that spur climate change. "We might actually proceed into the future … we could cross a threshold driving the (climate) system into one of those big droughts without even knowing it's coming," Overpeck said. ”
I am 'gobsmacked'!! I mean to say, if there were these 'megadroughts' in history before the present, there had to be climate change; so how the heck can he assume there is "man-made climate change" now, rather than a continuing cyclic climate pattern?

wenx
April 16, 2009 6:30 pm

Which curve is reality?
Or which one is closer to reality?
Could anyone tell me?
Thank you.

Philip_B
April 16, 2009 6:37 pm

A very simple everyday demonstration of this is to compare deserts to tropics. Which is warmer? and why?
The humid tropics are a lot warmer than hot deserts such as the Sahara, by about 10C (annual average). This is due to the water vapour greenhouse effect (in the tropics).
Although, having lived in both the humid tropics and hot desert climates, I know clouds have a much greater cooling effect in hot deserts than the humid tropics.
So, a wetter (more water vapour) world is a warmer world, but a cloudier world is a cooler world.

Tom in Texas
April 16, 2009 6:40 pm

KimW (18:12:41) :…centuries-long “megadroughts”…climate change could make this kind of drought more devastating…
Wow – AGW could turn the “megadroughts” into “super megadroughts”.
Is anyone else getting tired of hearing this crap?

CO2 Realist
April 16, 2009 6:42 pm

Continuing on the quality of the models:
Richard Lawson (16:07:10) writes:
“Do you think climate modelers have this many man-hours applied to their code? I think not. Enough said.”
And Squidly (17:21:09) writes:
“As a long time computer scientist myself, I cannot imagine how one could seriously consider conclusions from codes developed so haphazardly… I simply cannot take seriously a piece of software so poorly written, poorly designed, poorly constructed, poorly managed and poorly maintained.”

Looks like many of us are on the same page regarding the quality and documentation of the models. Even though I'm not a scientist, the part of the climate debate I do know something about (programming and standard practices) is hopelessly corrupted. Why should I believe the parts I'm less well versed in are of any better quality?

Law of Nature
April 16, 2009 7:45 pm

Hi John and Douglas, and a very warm "Hello there!" to Anthony,
and many thanks to Lucia and Chad for the graphs! 🙂
Greetings, reader! 🙂
John Finn (06:38:54) :
Douglas DC (06:21:40) :
I say cooling further.Soon too…
Could you be more specific. What do you mean by “soon”?
Well, I just wanted to comment that what you are seeing here are the integrated long-term trends.
Even if temperatures were to start rising faster, the trends shown on these graphs would still differ from the models for years to come.
In other words, this year's data affects the 20-year trend for the next 20 years!
All the best,
LoN

BarryW
April 16, 2009 8:20 pm

CO2 Realist (18:42:37) :
Guys, welcome to the 20th century. Go back to the 1980s and this is what you got for code quality and development. I heard those sorts of arguments from some of the big defense contractors when monitoring their software development back then. They couldn't understand why they needed specs, documentation and testing. They just threw the code and hardware into the integration facility and kept banging on it till it worked (mostly). These guys are just hacking code until they get the answer that "looks right".

Frank K.
April 16, 2009 9:50 pm

CO2 Realist (18:42:37) :
“Looks like many of us are on the same page regarding the quality and documentation of the models. Even though I'm not a scientist, the part of the climate debate I do know something about (programming and standard practices) is hopelessly corrupted. Why should I believe the parts I'm less well versed in are of any better quality?”
Before we get off this topic, those who are interested should check out the worst example of climate code documentation – the GISS Model E. You can read about it here:
http://www.giss.nasa.gov/tools/modelE/index.html
For their code “documentation” they reference the following paper:
http://pubs.giss.nasa.gov/abstracts/2006/Schmidt_etal_1.html
Please look at this paper and count the number of equations. I think there are six altogether.
I’ve looked around and there doesn’t appear to be any place where the GISS researchers are brave enough to list the differential equations they are solving. And of course there is no complete description of their algorithms beyond the cursory descriptions given in the link above. Like many research organizations, GISS feel that a list of possibly relevant papers is enough code documentation for them.
And then there is the code itself, which can be found here…
http://www.giss.nasa.gov/tools/modelE/modelEsrc/
The FORTRAN here is typical research code i.e. mostly no comments, poor formatting, very few pointers to any documentation, and nearly impossible to follow, unless you were one of the original developers.
For another example of GISS coding standards, there is the infamous GISTEMP here:
http://data.giss.nasa.gov/gistemp/sources/
Try to figure out what this code is really doing versus the description given in the “documentation”…

David Ball
April 16, 2009 10:15 pm

George E. Smith, I was wondering if you had the time and inclination to run in the next presidential election, assuming of course that you live in the U.S. You are one of my heroes!!! And a great sense of humor, to boot !!! I was going to add “master debator”, but I have too much class for that, …..

AJ
April 16, 2009 10:28 pm

Great information… I have bookmarked this site and will come back from time to time.

CO2 Realist
April 16, 2009 10:32 pm

More on model quality:
BarryW (20:20:41) says:
Guys, welcome to the 20th century. Go back to the 1980’s and this is what you got for code quality and development.

Barry, you made me laugh! I was working on IBM mainframes (COBOL, CICS, IMS database) in the mid 80s. I can certainly agree with your statement. How come our “scientists” haven’t progressed to the 21st century?
Frank K. (21:50:45) says:
Try to figure out what this code is really doing versus the description given in the “documentation”…

I’ll give it a go, but I’m sure I could take your word for it! It will be an enlightening exercise.
While I can certainly appreciate all that Lucia, Steve McIntyre, Jeff ID and others do with the statistics and math, the models fall apart much sooner than that. Like try starting with surfacestations.org and the poor quality of input data.
Seriously, I'm surprised more scientists, mathematicians, statisticians, programmers, and others don't call B.S. on the models and the claims of 90% certainty. Sheesh!
Oh, and for those that can remember, doing these HTML embedded format codes for italics and the like reminds me of the commands for “pretty printers” on mainframes and WordStar. Oops! Did I just date myself?
While I like WordPress, you’d think they could add a WYSIWYG editor for comments.

Squidly
April 17, 2009 12:22 am

Philip_B (18:37:10) :
A very simple everyday demonstration of this is to compare deserts to tropics. Which is warmer? and why?
The humid tropics are a lot warmer than hot deserts such as the Sahara, by about 10C (annual average). This is due to the water vapour greenhouse effect (in the tropics).
Although, having lived in both the humid tropics and hot desert climates, I know clouds have a much greater cooling effect in hot deserts than the humid tropics.
So, a wetter (more water vapour) world is a warmer world, but a cloudier world is a cooler world.

Yes, the average temperature in the tropics will stay warmer because the water vapor holds the heat; however, tMax will not reach nearly as high in the tropics as it will in the desert, because of the negative feedback of the water vapor. Several papers out there (many listed in other threads on WUWT) clearly show this to be true.

anna v
April 17, 2009 1:17 am

KimW (18:12:41) :
from your quote:
The problem is not just with climate models, but the refusal to think on what evidence implies. From an article in Science,
” Overpeck and his colleagues studied sediments beneath Lake Bosumtwi in Ghana that gave an almost year-by-year record of droughts in the area going back 3,000 years. Until now, the instrumental climate record in this region stretched back only 100 years or so. The researchers found a pattern of decades-long droughts like the one that began in the Sahel in the 1960s that killed at least 100,000 people, as well as centuries-long “megadroughts” throughout this long period, with the most recent lasting from 1400 to 1750.

Bold mine, to point out that the greatest drought happened during what was the little ice age for the rest of the world.
Actually, if one looks at the ice core records, desertification happened during the deep cold (note the dust levels, red curve):
http://upload.wikimedia.org/wikipedia/commons/thumb/c/c2/Vostok-ice-core-petit.png/400px-Vostok-ice-core-petit.png
This is because a lot of the water is locked up in the ice sheets, the sea level is very low, and there are no rains.
We might actually proceed into the future … we could cross a threshold driving the (climate) system into one of those big droughts without even knowing it’s coming,” Overpeck said. ”
Not unless he is prophesying the imminence of the next ice age.
This is sophistry in the extreme.

masonmart
April 17, 2009 1:21 am

Am I missing something? I've worked in the Sahara and Arabian deserts, and in Malaysia/Indonesia in the tropics, and the deserts are much hotter. Last year I took a walk in 55C. The tropics are typically 20C below that. The tropics can feel uncomfortable because of the humidity, but a dry 55C is still pretty shrivelling.

Editor
April 17, 2009 1:57 am

George E. Smith (13:56:13) : From all the postings I have seen here; GISStemp inputs are based on daily max/min readings of what; Anomalies; or temperatures ?
The bulk of the GIStemp input comes as an already computed MONTHLY average temperature. NOAA has taken MIN / MAX for each day and turned it into a monthly average single data point for each month for each year and it is that which is taken in by GIStemp and {averaged, homogenized, blended, chopped, formed, anomalized, and extruded} as pasteurized processed data food product anomaly maps.
See: http://chiefio.wordpress.com/2009/02/26/gistemp-step0-input-files/
This is only for the land data, which is too sparse in the time domain (not enough old thermometers) and too sparse in the space domain (most thermometers in the U.S.A., Europe, and Japan; not much elsewhere). See:
http://chiefio.wordpress.com/2009/02/24/so_many_thermometers_so_little_time/
For the ocean and arctic coverage, the sampling is even worse. The anomaly maps are adjusted in step4_5 by applying an anomaly map from elsewhere that uses satellite derived anomalies and simulations based on ice coverage estimates. From the bottom of:
http://chiefio.wordpress.com/2009/02/25/inside-gistemp-an-overview/
In STEP4_5 there are a couple more bits of data downloaded. gistemp.txt lists these as:
http://www.hadobs.org (HadISST1: 1870-present)
http://ftp.emc.ncep.noaa.gov cmb/sst/oimonth_v2 (Reynolds: 11/1981-present)
One of these (oimonth_v2) is a Sea Surface Temperature anomaly map. Some folks have asserted this means that GIStemp uses satellite data since this anomaly map is used. Yes, the map is made from a combination of surface records and satellite data, but by the time it gets here is it just a grid of 1 degree cells (lat long) with a single anomaly number per month. Not exactly what I’d call “satellite data”. More like a “Satellite derived anomaly map” product.

Which I critique in:
http://chiefio.wordpress.com/2009/03/05/illudium/
And, of course, there is the whole issue of False Precision that runs through GIStemp. It starts with the NOAA data, which are presented with a precision in the 1/100 F place yet are derived from readings that were only recorded to the whole degree F (and are NOT oversampled, each day being unique). So exactly where do you get any accuracy at all in the 1/10ths place when there is no data in the 1/10ths place? Hmmm?
GIStemp then converts this to 1/10 C and runs with the false precision as I discuss in:
http://chiefio.wordpress.com/2009/03/05/mr-mcguire-would-not-approve/
So we end up with folks panicking over 1/10 C “trends” and what import this has for the world when it is all fantasy precision and trash values based on inadequate data sources thoroughly masticated.
Don’t even mention Antarctica; as the Steig paper demonstrates; that place is a textbook case of sampling genius; was it 12 stations or some similar number that were used to concoct his analysis ?
Or the Arctic, which is based on anomaly maps from interpolations from simulations based on estimates of ice cover… (No exaggeration.)
Just as wild guess, I would guess that the spatial undersampling would be by 4-5 orders of magnitude at the minimum.
Given that the GIStemp sea “temperature” input is a 1 degree cell anomaly map, you may be off by a few more orders of magnitude… and even much of THAT may be made up. From:
http://chiefio.wordpress.com/2009/02/28/hansen-global-surface-air-temps-1995/
Yup, next paragraph. They talk about "Empirical Orthogonal Functions" used to fill in some South Pacific data… but it uses "Optimal Interpolations," which sure sounds like they are just cooking each datapoint independently… From here on out, when they use EOF data they are talking about this synthetic data. It also looks like they use 1982-1993 base years to create the offsets that are used to cook the data for 1950-81. Wonder if any major ocean patterns were different in those two time periods, and just what surface (ship/buoy) readings were used to make the Sea Surface Temp reconstructions? They do say "The SST field reconstructed from these spatial and temporal modes is confined to 59 deg. N – 45 deg S because of limited in situ data at higher latitudes." OK, got it. You are making up data based on what you hope are decent guesses. But in GIStemp "nearby" can be 1000 km away with no consideration for climate differences, so I'm concerned whether the same quality of care is being given here.
So you have a huge gap from 59 N to 90 N and another from 45 S to 90 S (when you get to fill in some Antarctic data, via more massaging from too few sites).
But even that doesn’t quite make it clear. Inspection of their map at:
http://www.emc.ncep.noaa.gov/research/cmb/sst_analysis/images/inscol.png
Shows that ship based data are highly concentrated in the northern hemisphere and near N. America, Europe, China Sea while the ice is estimated crudely and the buoy data have better distribution, but being largely from ‘floaters’ may have very poor temporal resolution for any one space.
My conclusion from all this is that GIStemp is largely a made up fantasy with no significance and is substantially divorced from reality.
Other than that, no problem (with sarcastic tone, please!)

PeteB
April 17, 2009 2:04 am

Thanks all,
I found the exchange here between David Stockwell and Stefan interesting
http://www.realclimate.org/index.php?p=554#comment-84440
and also the stuff on David Stockwell’s site
I am not sure I understand the statistics well enough, but 16 years seems too short a period (certainly for temperature, maybe not for sea level) to say that temperature is increasing faster than the IPCC estimates. To be fair, Stefan did acknowledge that in the original paper:
Given the relatively short 16-year time period considered, it will be difficult to establish the reasons for this relatively rapid warming, although there are only a few likely possibilities. The first candidate reason is intrinsic variability within the climate system.

It's a pity somebody couldn't have taken up Stefan's challenge, though:
[Response: If you really think you’d come to a different conclusion with a different analysis method, I suggest you submit it to a journal, like we did. I am unconvinced, though. -stefan]

Editor
April 17, 2009 2:22 am

Squidly (17:21:09) : As a long time computer scientist myself, I cannot imagine how one could seriously consider conclusions from codes developed so haphazardly.
Then you will just love this. From the top-level control script of Step4 of GIStemp, we have these "operator instructions" embedded in an error message:

fortran_compile=$FC
if [[ $FC = '' ]]
then echo "set an environment variable FC to the fortran_compile_command like f90"
echo "or do all compilation first and comment the compilation lines"
exit
fi

Can you imagine what would happen in a real production shop if your compilation procedure is ~”or whatever, and edit the code a bit”…
From:
http://chiefio.wordpress.com/2009/03/07/gistemp-step4-the-process/

It then proceeds to check that the environment variable "FC" is set to your FORTRAN compiler. Interestingly enough, it suggests hand-editing the script to comment out the compilation steps if you want to do everything by hand. Have these people never heard of using "#DEFINE" or passed parameters to control script execution paths?
Next up, we check that the input file input_files/SBBX.HadR2 exists.
(Which it doesn’t right now. STEP3 at the end says to create it or use STEP4 code to update it. It looks like the creation of the file is left as an exercise for the student.)
The second passed parameter is assigned to “mo” and this is manipulated. We prepend an “0” on any single digits, then look for an input file of the form input_files/oiv2mon.[year][start_month]
(which also doesn't exist right now, so is also left as an exercise for the student. Maybe malt liquor would help… or Scotch… yeah, a nice single malt… Sorry, the mind wanders at times like this, when faced with crap like this.)
If we have a third parameter, we assign it to “mo2” though no guidance is given as to when you would want 2 vs 3 parameters or what choices might make the anomalies work out “best”. More play time for the students…
Finally, we get to the meat of it. We compile and run the FORTRAN program convert1.HadR2_mod4.f passing to it the parameters year, monthstart and monthend that were passed into the script, or using the monthstart as the monthend if only one month was given.
At the end we are told what to do as a manual step “if all went well”. One is left to wonder how one knows if all went well, and what the acceptance criteria might be…
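
For contrast, a rough sketch of what an explicit driver could look like: the compile step behind a flag, arguments validated up front, failures loud. The Fortran file name echoes the one above; the rest (compiler choice, binary name) is hypothetical, not the actual GIStemp layout:

import argparse
import subprocess

# Hypothetical replacement driver for the STEP4 script described above.
def main():
    p = argparse.ArgumentParser(description="STEP4-style driver (sketch)")
    p.add_argument("year", type=int)
    p.add_argument("month_start", type=int)
    p.add_argument("month_end", type=int, nargs="?")
    p.add_argument("--compile", action="store_true", help="rebuild the Fortran binary first")
    p.add_argument("--fc", default="gfortran", help="Fortran compiler to use")
    args = p.parse_args()

    if args.compile:
        subprocess.run([args.fc, "-o", "convert1", "convert1.HadR2_mod4.f"], check=True)

    month_end = args.month_end if args.month_end is not None else args.month_start
    # check=True makes any non-zero exit status a hard error instead of a shrug.
    subprocess.run(["./convert1", str(args.year), str(args.month_start), str(month_end)], check=True)

if __name__ == "__main__":
    main()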

3x2
April 17, 2009 2:35 am

Frank K. (11:17:51) :
CO2 Realist (07:53:41) :
(…) Extracting exactly the part that corresponds to a single paper and documenting it so that it is clear what your conventions are (often unstated) is non-trivial. – gavin]” (…)

Given the huge sums invested in “climate science” together with the astronomical carbon taxes being proposed it really is a disgrace. Quite unbelievable that they are allowed to continue with this attitude.
It may be interesting to add up the climate science money plus carbon taxes worldwide and divide by the papers produced, to put a figure on just how "non-trivial" all that works out per paper.
A complete house of cards.

Editor
April 17, 2009 2:51 am

KimW (18:12:41) : The problem is not just with climate models, but the refusal to think on what evidence implies. From an article in Science,
[…]”with the most recent lasting from 1400 to 1750.”
[…]The cause of centuries-long megadroughts is not known, but he said the added burden of climate change could make this kind of drought more devastating. Temperatures in this region are expected to rise by 5 to 10 degrees F (2.77 to 5.55 degrees C) this century,”

Little Ice Age, roughly 1350 to 1850. And they don’t see this connection to the ‘sweet spot’ in the depths of the middle? They are worried about warming, when clearly the drought strongly correlates with an epic cold period?
“I am ‘gobsmacked’!!”
Me too! I think you got it ‘spot on’ in your opening: “refusal to think on what evidence implies.”

Editor
April 17, 2009 3:02 am

Per GIStemp:
CO2 Realist (22:32:59) :
Frank K. (21:50:45) says: Try to figure out what this code is really doing versus the description given in the “documentation”…
I’ll give it a go, but I’m sure I could take your word for it! It will be an enlightening exercise.

I have the code on line and with comments at:
http://chiefio.wordpress.com/gistemp/
You will probably find that an easier start than trying to decipher the code as downloaded. I’m hoping to get a few more folks looking at it and ‘documenting’ what bits do in the comments to the pages…

3x2
April 17, 2009 4:00 am

On the subject of GIGO it is worth noting that, presuming the models are under continual development, GI also alters the code in the process.
As an example, suppose that as a modeller you are happy with your model representation of the Antarctic. Your model correctly (in line with the other models) sees no warming and possibly some cooling in the region. Along comes Steig and tells you that there is significant Antarctic warming.
As a modeller, what do you do with this information? If you uncritically take it at face value, then you accept that your model is missing something big and modify it accordingly. Your model now agrees with the new data. This may require major changes that ripple out through the rest of the model. What if it turns out, as many suspect, that Steig’s result is a complete fabrication?
I propose “Steig Amplification” – similar to GIGO, only it alters the code on the way through.

doodle
April 17, 2009 4:06 am

You need a celebrity to take up the cause and make it fashionable to question climate change – maybe Susan Boyle can be the spokesperson in a year’s time. LOL

Frank K.
April 17, 2009 5:44 am

E.M.Smith (03:02:13) :
Thanks for the link! Your analysis of GISTEMP is extraordinary and worthy of a post here at WUWT (perhaps in several installments). How about it, Anthony? :^)

CO2 Realist
April 17, 2009 7:20 am

E.M.Smith (02:22:33) says:
Can you imagine what would happen in a real production shop if your compilation procedure is ~”or whatever, and edit the code a bit”…
and
At the end we are told what to do as a manual step “if all went well”. One is left to wonder how one knows if all went well, and what the acceptance criteria might be…

Now you’re really scaring me. Apparently it is much worse than I thought.
E.M.Smith (03:02:13) says:
I have the code on line and with comments…

Great effort. I’ll have a look, though Fortran is not my strong point.
3x2 (02:35:47) says:
A complete house of cards.

Quite the understatement.
Anthony, here’s another vote for a post of E.M.Smith’s analysis.
REPLY: Convince him to pack it up into a single document and I’ll have a look. – Anthony

John Galt
April 17, 2009 8:03 am

@E.M. Smith:
We’re constantly told how complex the climate models are. Complexity is subjective, but how about lines of code? How big is the codebase? Does it really require a supercomputer to run?

CO2 Realist
April 17, 2009 8:17 am

Frank K. (05:44:04) writes:
E.M.Smith (03:02:13) :
Thanks for the link! Your analysis of GISTEMP is extraordinary and worthy of a post here at WUWT (perhaps in several installments). How about it, Anthony? :^)

I added my vote at CO2 Realist (07:20:39) and Anthony replied:
REPLY: Convince him to pack it up into a single document and I’ll have a look. – Anthony
So E.M.Smith, what do you say? I think it would be educational for many here.

Philip_B
April 17, 2009 9:25 am

Masonmart, hot deserts are colder than the humid tropics most of the time (nighttime, winter, spring, autumn). That is why their average temperature is 10C lower.
For example, the hottest temperature ever recorded on Earth was 58C in El Azizia, Libya (Sahara desert). Yet the winter (January) average temp in Azizia is only 11.5C.
Even in the middle of summer, hot deserts aren’t any hotter than the humid tropics. Both have average temps around 27C/28C. Compare El Azizia with Singapore.
http://hypertextbook.com/facts/2000/MichaelLevin.shtml
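A toy average makes the point; the monthly values below are rough illustrative numbers, not actual station records for either location:

# Rough illustrative monthly mean temperatures (deg C) -- toy values only,
# not real data for El Azizia or Singapore.
desert  = [11, 13, 16, 20, 24, 28, 31, 31, 28, 23, 17, 12]  # hot summers, cool nights/winters
tropics = [27, 27, 28, 28, 28, 28, 27, 27, 27, 27, 27, 27]  # nearly constant all year

print(sum(desert) / 12.0)   # ~21.2 -- dragged down by the cool seasons
print(sum(tropics) / 12.0)  # ~27.3 -- a one-day heat record barely moves an annual mean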

Tom P
April 17, 2009 11:51 am

The scientific problems with Monckton’s analysis have been presented elsewhere, e.g.:
http://arxiv.org/abs/0811.4600?context=physics
and the plots at the top of this very thread are hardly consistent with his claim of current global cooling.
But what most amused me in his submission to Congress referenced above was his frequent use of a logo in the background of his figures that bears a striking resemblance to the portcullis insignia of the Houses of Parliament:
http://img5.imageshack.us/img5/3170/portcullis.png
I suppose these were used by Monckton as a vague appeal to authority, although the use of the portcullis insignia is tightly regulated:
“In 1996, the usage of the crowned portcullis was formally authorised by licence granted by Her Majesty the Queen for the two Houses unambiguously to use the device and thus to regulate its use by others. The emblem should not be used for purposes to which such authentication is inappropriate, or where there is a risk that its use might wrongly be regarded, or represented as having the authority of the House.”
But what is amusing is not so much whether Monckton is misusing the insignia in his submission as the fact that, although he inherited a title, Viscount Monckton is not even a member of the House of Lords. When he stood for election in 2007 by his fellow Conservative peers, he received precisely zero votes.

Roger Knights
April 17, 2009 4:14 pm

Thom Scrutchin (17:54:00) wrote:
“The new acronym should be AGWA for Anthropogenic Global Warming Alarmism.”
Great idea, because it’s pronounceable – and sounds silly.
(BTW, since AGW isn’t pronounceable, it isn’t an acronym.)

Kohl Piersen
April 17, 2009 5:16 pm

@ Tom P – “But what most amused me …” and what follows
Why do you think that this public demonstration of your personal antipathy towards Lord Monckton is of any interest to anyone?
Rather than continuing to speak through your backside, I suggest that you put it back in your pants, where it undoubtedly belongs, and then… sit on it!

Philip_B
April 17, 2009 7:02 pm

“the plots at the top of this very thread are hardly consistent with his claim of current global cooling.”
Tom P, thanks for giving us an example of the reality denial that pervades the Warming Believer camp.
According to you, temperatures going down is not cooling.

Tom P
April 18, 2009 12:36 am

Philip_B,
Please read the y-axis label of the three figures: it is the trend, not the temperature, that is decreasing. All three figures indicate we continue to warm, not cool.
Can you offer any explanations for that?
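The distinction is easy to see with a toy series – hypothetical numbers, not any of the three datasets: a temperature curve can rise every single year while its fitted trend falls.

import numpy as np

years = np.arange(1979, 2010, dtype=float)
# Toy series: warmer every year, but decelerating.
temps = 0.4 * np.log1p(years - 1979) / np.log(31.0)

# Trend from 1979 to each end year, as in the figures at the top of the post.
for end in (1999, 2004, 2009):
    m = years <= end
    slope = np.polyfit(years[m], temps[m], 1)[0] * 10
    print(end, round(slope, 3), "C/decade")  # slope falls, yet temps never drop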

Deep Climate
April 18, 2009 12:20 pm

Lucia’s analysis was inspired by my observation and analysis showing that observed long-term temp trends were rising in the 2001-present period, even though the trend within the period was flat or declining.
I compare these observed longer (20-year) trends to the IPCC early 21st century projected trend of 0.2 C/decade, based on the multi-model ensemble used in IPCC AR4 WG1 Chapter 10 projections. This same benchmark was used by Roger Pielke Jr and Pat Michaels in their analyses, and is fairly standard as far as I know.
The 20-year trends for GISTemp, NOAA and HadCRUT were all ahead of the benchmark earlier this decade, and are presently a little under. This fluctuation about the benchmark value is hardly surprising, given normal interannual variations and a relatively cool La Nina year at the end point.
See:
http://deepclimate.org/2009/04/18/20-year-surface-trends-close-to-models/
http://deepclimate.files.wordpress.com/2009/04/20-year-trend.gif
Thanks for reading.
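For readers who want to try this at home, a minimal sketch of the comparison being described – the anomaly series here is a stand-in, not real GISTemp/NOAA/HadCRUT data; only the 0.2 C/decade benchmark comes from the comment above:

import numpy as np

def trend_c_per_decade(years, anoms):
    # OLS slope of annual anomalies, converted to C per decade.
    return 10.0 * np.polyfit(years, anoms, 1)[0]

# Hypothetical 20-year window of annual anomalies (deg C).
years = np.arange(1989, 2009)
anoms = 0.018 * (years - 1989) + np.random.default_rng(0).normal(0, 0.1, 20)

benchmark = 0.2  # IPCC AR4 early-21st-century projection, C/decade
print(trend_c_per_decade(years, anoms), "vs benchmark", benchmark)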

John M
April 18, 2009 1:26 pm

Deep Climate

I compare these observed longer (20-year) trends to the IPCC early 21st century projected trend of 0.2 C/decade, based on the multi-model ensemble used in IPCC AR4 WG1 Chapter 10 projections.

Except Lucia and others have pointed out to you that you’ve sort of cherry-picked that 0.2 C/decade benchmark (how ’bout those volcanoes?)
http://rankexploits.com/musings/2009/hadcrut-march-data-available/#comment-12890
http://rankexploits.com/musings/2009/hadcrut-march-data-available/#comment-12900

John M
April 18, 2009 1:33 pm

Hmm.
Looks like the individual comment links are not unique.
You’ll have to scroll down to read Lucia’s relevant comments.

Niels A Nielsen
April 18, 2009 1:47 pm

Deep Climate (12:20:03) :

I compare these observed longer (20-year) trends to the IPCC early 21st century projected trend of 0.2 C/decade, based on the multi-model ensemble used in IPCC AR4 WG1 Chapter 10 projections.

Deep is not comparing apples to apples. The models’ projected 20-year trends with endpoint 2009 would be far lower than 0.2C/decade if a couple of stratospheric volcanic eruptions had occurred within the last 5 years (volcanoes cause cooling) and no eruptions had occurred for the preceding 15 years.
That is not the case. El Chichon (’82) and Pinatubo (’91) erupted, followed by a volcanic lull until now, making the models’ projected trend higher than 0.2C/decade, as Lucia has explained.
Deep refuses to understand. I trust he will maintain his point about the fixed benchmark of 0.2C/decade should a large stratospheric volcano erupt in the near future and the 20-year observational trend in 5 years’ time no longer be affected by the Pinatubo eruption at the other end of the 20-year trend line 😉
http://rankexploits.com/musings/2009/hadcrut-march-data-available/
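The asymmetry argument can be checked with a toy calculation – invented numbers, not model output: a Pinatubo-like cooling dip near the start of a 20-year window tilts the fitted trend upward.

import numpy as np

years = np.arange(1990, 2010)           # 20-year window ending 2009
base = 0.015 * (years - 1990)           # underlying 0.15 C/decade warming (toy value)

# Impose a volcanic-style cooling dip near the start of the window.
dip = np.where((years >= 1991) & (years <= 1993), -0.3, 0.0)

print(np.polyfit(years, base, 1)[0] * 10)        # 0.15 C/decade
print(np.polyfit(years, base + dip, 1)[0] * 10)  # higher: early cooling steepens the fit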

David Stockwell
April 18, 2009 1:54 pm

PeteB:
I have submitted a different conclusion with the same analysis method, simply with two additional years’ data. The non-linear trend is no longer in the upper region of the IPCC projections for temperature. See the PowerPoint at http://landshape.org/enm/newcastle-lecture-update/ for the image.

Mike Bryant
April 18, 2009 2:04 pm

Thom Scrutchin (17:54:00) wrote:
“The new acronym should be AGWA for Anthropogenic Global Warming Alarmism.”
How about GWAVA, Global Warming, Anthropogenic, Very Alarming…
That one sounds silly too…

Tom P
April 18, 2009 3:16 pm

David Stockwell,
Your analysis is heavily skewed by the inclusion of a single year, 2008. 2009 is shaping up considerably warmer, and so the trend is now being pushed up again.
Given the noise in the data, the end of the trend line will always be wiggling around like the head of a snake. You’ve snapped the snake’s head looking down, as it has before, but the full movie will show it’s often pointing above trend. If you use monthly rather than yearly data and calculate up to the present, you’ll see the snake is starting to look up again.
Should a paper on this topic be so time sensitive that its conclusions depend on whether the trend is calculated in 2006, 2008 or 2009?
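The snake’s head is easy to animate with a toy series – synthetic numbers, not HadCRUT: the same underlying trend plus noise gives end-year trend estimates that wander both above and below the truth.

import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 2010)
temps = 0.017 * (years - 1979) + rng.normal(0, 0.12, years.size)  # true trend 0.17 C/decade

# Full-record trend up to each end year: the wiggling "snake head".
for end in range(2000, 2010):
    m = years <= end
    print(end, round(np.polyfit(years[m], temps[m], 1)[0] * 10, 3))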

David Stockwell
April 19, 2009 2:15 am

Tom P
“Given the noise in the data, the end of trend line will always be wiggling around like the head of a snake.”
That is the point: Rahmstorf, Hansen and others made their heavily cited conclusions on the basis of an uncertain pretext. To say, “Oh, but it’s only a couple of years’ data and the endpoint is very uncertain”, and then say that Rahmstorf et al.’s analysis is correct and wonderful, is completely inconsistent.
I am not claiming that the trend has turned down. I am claiming that Rahmstorf’s conclusion that the climate models underestimate warming was so fragile that even two years’ data shows it to be bogus.

Tom P
April 19, 2009 5:23 am

David,
I see your point, though “fragile” is the right word, not “bogus” – you can’t show Rahmstorf is wrong, just that he hasn’t made the case.
There is certainly a contradiction between your results and Lucia’s analysis. You show that temperatures are near the mid-range of IPCC TAR,
http://landshape.org/enm/wp-content/uploads/2008/10/liite5_paivitetty_rahmstorf.jpg
while Lucia shows them falling below. I think the model comparison is different, but this might also be due to her using long-term linear least-squares trends, which will always fall below even a perfect model that predicts temperatures above a linear trend.
An OLS trend is also a poor choice, as it weights data points at the beginning of the series equally with the most recent. The best estimate of the future trend at any point will be from a Kalman filter, which reduces to an exponentially weighted least-squares trend. Wouldn’t this be the best way of comparing data to the model?
By the way, have you posted a version of your submitted article on this?
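For the curious, a minimal sketch of the exponentially weighted trend being proposed here, next to plain OLS – synthetic data, and the decay constant lam is a free choice, not anything from the thread:

import numpy as np

def wls_slope(x, y, w):
    # Weighted least-squares slope for y = a + b*x (normal equations).
    sw, sx, sy = w.sum(), (w * x).sum(), (w * y).sum()
    sxx, sxy = (w * x * x).sum(), (w * x * y).sum()
    return (sw * sxy - sx * sy) / (sw * sxx - sx * sx)

rng = np.random.default_rng(2)
x = np.arange(1979, 2010, dtype=float)
y = 0.017 * (x - 1979) + rng.normal(0, 0.1, x.size)

ols = wls_slope(x, y, np.ones_like(x))  # every year weighted equally
lam = 0.9                               # per-year decay -- a free choice
w = lam ** (x.max() - x)                # recent years count the most
print(ols * 10, wls_slope(x, y, w) * 10)  # C/decade, OLS vs exponentially weighted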

David Stockwell
April 19, 2009 2:18 pm

Tom P
The specific error I claim is that they made a Type 1 error: rejecting a null hypothesis when there was no change in trend. While this appears inconsistent with Lucia, it actually makes no claim about that, and I don’t believe you can conclude that the trend has not changed, as that is another test. Lucia uses different methods, so they are not comparable. And anyway, their claim was that the trend had changed into high model areas. The data show the trend has not. That says nothing about whether it has changed middle or low, and there is also the fuzzy issue of how model projections are spliced onto the temperature data, which makes their analysis very poorly structured from an analytical point of view.
If a forecast is wrong, it is not surprising for it to be quickly shown to be wrong by subsequent data. But the method needs to be held constant. Sure, there are other methods, but then you get into my method vs. your method, rather than showing an error of judgment on their chosen method.
I don’t think it’s fair to the journal to post before publication, and editors can reject on that basis.
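A bare-bones version of the kind of test being argued about – toy data with a constant trend, plain OLS standard errors, and no allowance for autocorrelation, so a sketch rather than a publishable test:

import numpy as np

def slope_and_se(x, y):
    # OLS slope and its standard error.
    b, a = np.polyfit(x, y, 1)
    resid = y - (a + b * x)
    s2 = (resid ** 2).sum() / (x.size - 2)
    return b, np.sqrt(s2 / ((x - x.mean()) ** 2).sum())

rng = np.random.default_rng(3)
x = np.arange(1979, 2010, dtype=float)
y = 0.017 * (x - 1979) + rng.normal(0, 0.1, x.size)  # no actual change in trend

b_all, _ = slope_and_se(x, y)
b_rec, se_rec = slope_and_se(x[x >= 2001], y[x >= 2001])

# Reject "no change in trend" at ~95% only if |z| > 1.96; with a short,
# noisy window it usually is not, which is why endpoint-driven claims of a
# changed trend risk exactly the Type 1 error described above.
z = (b_rec - b_all) / se_rec
print(round(z, 2))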