#AGU14 poster demonstrates the divergence problem between IPCC climate models and observations

Earlier this week I reported on some of the poster sessions at the American Geophysical Union meeting, but was told the next day that I’m not allowed to photograph such posters to report on them. However, when the authors send me the original, for which they own the copyright, there’s nothing AGU can complain about related to me violating their photography policy.

This poster from Pat Michaels and Chip Knappenberger builds on their previous work in examining climate sensitivity differences between models and reality.

The annual average global surface temperatures from 108 individual CMIP5 climate model runs forced with historical (+ RCP4.5 since 2006) forcings were obtained from the KNMI Climate Explorer website. Linear trends were computed through the global temperatures from each run, ending in 2014 and beginning each year from 1951 through 2005. The trends for each period (ranging in length from 10 to 64 years) were averaged across all model runs (black dots). The range containing 90 percent (thin black lines) and 95 percent (dotted black lines) of trends from the 108 model runs is indicated. The observed linear trends for the same periods were calculated from the annual average global surface temperature record compiled by the U.K. Hadley Centre (HadCRUT4) (colored dots); the value for 2014 was the 10-month, January through October, average. Observed trend values less than or equal to the 2.5th percentile of the model trend distribution are colored red; observed trend values between the 2.5th and the 5th percentile are colored yellow; and observed trend values greater than the 5th percentile are colored green.
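The trend-comparison procedure described in the caption can be sketched in a few lines of Python. The arrays below are synthetic stand-ins (the poster used the 108 CMIP5 runs from KNMI Climate Explorer and the HadCRUT4 record); only the mechanics of the start-year loop and the percentile coloring follow the caption:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1951, 2015)                   # 1951..2014 inclusive
# Synthetic stand-ins for the 108 model runs and the observations.
model_temps = rng.normal(0.02, 0.1, (108, years.size)).cumsum(axis=1)
obs_temps = rng.normal(0.01, 0.1, years.size).cumsum()

def trend(y, x):
    """Least-squares linear trend in units per year."""
    return np.polyfit(x, y, 1)[0]

results = []
for start in range(1951, 2006):                 # trend lengths of 10..64 years
    m = years >= start
    run_trends = np.array([trend(r[m], years[m]) for r in model_temps])
    obs = trend(obs_temps[m], years[m])
    p2_5, p5 = np.percentile(run_trends, [2.5, 5.0])
    # Caption's color scheme: red <= 2.5th pct, yellow <= 5th pct, else green.
    color = "red" if obs <= p2_5 else "yellow" if obs <= p5 else "green"
    results.append((start, run_trends.mean(), obs, color))
```

Each tuple in `results` corresponds to one point on the poster's figure: the start year, the multi-model mean trend (black dot), the observed trend, and its color.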
Introduction:

Recent climate change literature has been dominated by studies which show that the equilibrium climate sensitivity is better constrained than the latest estimates from the Intergovernmental Panel on Climate Change (IPCC) and the U.S. National Climate Assessment (NCA) and that the best estimate of the climate sensitivity is considerably lower than the climate model ensemble average.

From the recent literature, the central estimate of the equilibrium climate sensitivity is ~2°C, while the climate model average is ~3.2°C, or an equilibrium climate sensitivity that is some 40% lower than the model average.
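The percentage quoted above is easy to verify; the inputs are the article's own round figures:

```python
model_mean_ecs = 3.2   # deg C per CO2 doubling: climate model ensemble average
literature_ecs = 2.0   # deg C per doubling: recent-literature central estimate

reduction = (model_mean_ecs - literature_ecs) / model_mean_ecs
print(f"{reduction:.1%}")  # → 37.5%, i.e. "some 40% lower"
```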

To the extent that the recent literature produces a more accurate estimate of the equilibrium climate sensitivity than does the climate model average, it means that the projections of future climate change given by both the IPCC and NCA are, by default, some 40% too large (too rapid) and the associated (and described) impacts are gross overestimates.

A quantitative test of climate model performance can be made by comparing the range of model projections against observations of the evolution of the global average surface temperature since the mid-20th century.

Here, we perform such a comparison on a collection of 108 model runs comprising the ensemble used in the IPCC’s Fifth Assessment Report and find that, for trend-start years since 1980 (with a few exceptions), the observed global average temperature trend lies below 97.5% of the model trend distribution, meaning that the observed trends are significantly different from the average trend simulated by climate models.

For periods approaching 40 years in length, the observed trend lies outside of (below) the range that includes 95% of all climate model simulations.

Quantifying the Lack of Consistency between Climate Model Projections and Observations of the Evolution of the Earth’s Average Surface Temperature since the Mid-20th Century
Patrick J. Michaels and Paul C. Knappenberger Center for the Study of Science, Cato Institute, Washington DC
Published at AGU: https://agu.confex.com/agu/fm14/meetingapp.cgi#Paper/20121
Full poster here: http://object.cato.org/sites/cato.org/files/articles/agu_2014_fall_poster_michaels_knappenberger.pdf
137 Comments
Dr Norman Page
December 19, 2014 8:54 am

The abstract of this paper states
“We conclude that at the global scale, this suite of climate models has failed. Treating them as mathematical hypotheses, which they are, means that it is the duty of scientists to, unfortunately, reject their predictions in lieu of those with a lower climate sensitivity.
Unless (or until) the collection of climate models can be demonstrated to accurately capture observed characteristics of known climate changes, policymakers should avoid basing any decisions upon projections made from them. Further, those policies which have already been established using projections from these climate models should be revisited.”
Section 1 of my post at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
concerns the inutility of the IPCC climate models for forecasting purposes. It concludes in complete agreement with Michaels and Knappenberger:
“In summary the temperature projections of the IPCC – Met office models and all the impact studies which derive from them have no solid foundation in empirical science being derived from inherently useless and specifically structurally flawed models. They provide no basis for the discussion of future climate trends and represent an enormous waste of time and money. As a foundation for Governmental climate and energy policy their forecasts are already seen to be grossly in error and are therefore worse than useless. A new forecasting paradigm needs to be adopted.”
Using a new forecasting paradigm, the same post contains estimates of the timing and amplitude of the possible coming cooling, based on the natural 60-year and important 1000-year quasi-periodicities seen in the temperature data, and using the 10Be and neutron data as the best proxy for solar “activity”.

RoHa
Reply to  Dr Norman Page
December 19, 2014 2:43 pm

So the models don’t actually work? They just stand there looking pretty? Shame we can’t have any photos to make our own observations of them.

Reply to  RoHa
December 19, 2014 2:52 pm

So the models don’t actually work? They just stand there looking pretty?
That is work, for models…
http://upload.wikimedia.org/wikipedia/en/f/f4/PlNTM1.jpg

RoHa
Reply to  RoHa
December 20, 2014 4:48 am

Thank you for that. I shall have to make very careful observations. It may take some time.

Stargazer
Reply to  RoHa
December 20, 2014 7:06 am

The baker’s dozen model ensemble presented so colorfully by Leo Smith contains a number of figures that I think should be thoroughly validated. I hereby nominate myself to take on this challenging task. Any other volunteers?

Bill Fish
Reply to  Dr Norman Page
December 25, 2014 12:54 pm

Patrick Michaels: Cato’s Climate Expert has a History of Getting It Wrong! http://www.skepticalscience.com/patrick-michaels-history-getting-climate-wrong.html

David Schofield
December 19, 2014 8:54 am

If people put up a poster they are openly publicising it. Don’t understand why you can’t photo them.

brians356
Reply to  David Schofield
December 19, 2014 9:22 am

They’re more concerned with unflattering photos of attendees snoozing or picking their noses.

Frank
Reply to  David Schofield
December 19, 2014 9:35 am

Because it might embarrass the conference organizers?

Steve from Rockwood
Reply to  David Schofield
December 19, 2014 10:40 am

Not really as the AGU conference is not a free event open to the general public.

Reply to  David Schofield
December 19, 2014 10:56 am

Not exactly true – you have to pay to get in. If you photograph & distribute, then the AGU is losing potential revenue from those who otherwise might have paid to see it. This is pretty standard for most technical society conferences.

AndyE.
Reply to  David Schofield
December 19, 2014 12:57 pm

They are not scientists – they are typical politicians. Real scientists actually seek to be proved wrong so that they can change and improve their theories (ref. Richard Feynman). Politicians are just the opposite. Try making a politician admit he or she is wrong.

Crispin in Waterloo
Reply to  David Schofield
December 19, 2014 10:59 pm

Events which charge admission frequently have contract clauses for the participants which give exclusivity or copyright to the hosting organisation. They also penalise stand and poster owners who do not appear or who depart early. The reason is they advertise “x-many stands and poster sessions” and “y-many classroom sessions”. When people pay and enter and see half the place empty or the stand staff heading off for beer at 3 PM they demand their money back. So there are clauses that keep the content unique and the stands attended for specified hours. The organiser is selling what you are paying to display or talk about. Nice work if you can get it.
Sending Anthony the poster before the end is probably a technical foul.

Adrian_O
Reply to  David Schofield
December 20, 2014 10:31 am

The main source of money (I know this from the American Math Society, but I assume it’s similar here) is an annual membership fee of about $100–$200, from which they provide some professional services, organize events such as this one (at which they pay only the main speakers), and publish a few journals and book series.
The fellows who happen to walk in front of the Moscone Center, get curious to see the posters, and pay the entrance fee are NOT a significant source of income…

Reply to  Adrian_O
December 20, 2014 5:44 pm

The evidence points to the ‘Meth Society’ running the AGU 😉

n.n
December 19, 2014 9:14 am

The system is incompletely or insufficiently characterized and unwieldy, which ensures our perception will remain limited to correlation between cause and effect. It’s like the scientific consensus has lost touch with the scientific domain. They really should resist urges to dabble in predictions.

Stacey
December 19, 2014 9:15 am

I think if the King came into the room naked whilst they all admired his clothes then Anthony’s photographs would reveal all 🙂

cheshirered
December 19, 2014 9:22 am

Denying the world’s leading climate blogger the opportunity to use photos of their oh-so-important climate models doesn’t sound much like transparency to me. I wonder why they would take such a position…?

Steve from Rockwood
Reply to  cheshirered
December 19, 2014 10:43 am

Perhaps they are trying to reduce the dominance (hijacking) of the climate change issue, and moving climate scientists to the most remote presentation venues and limiting photography are part of that effort.

kenw
Reply to  cheshirered
December 19, 2014 10:43 am

simple copyright. It works both ways.

george e. smith
Reply to  kenw
December 19, 2014 12:02 pm

I don’t mind copyright protection, although it is generous beyond belief compared to patent protection. And inventive ideas have to be useful, as well as novel, and non-obvious to one “of ORDINARY skill in the art”.
And if you continue to pay fees, you can get 17 years or so (maybe 20) of protection.
With copyright, even if the work is total rubbish (totally fictional, of no commercial value to anyone), you get protection for life, I think plus 50 years, and evidently extendability.
Such a deal.
But if taxpayer funded grant money is used to produce such materials, the public should be the copyright owner.
If I invent something while on the job, my employer, who pays the bills, retains ownership of any patents, and I’m happy for him.
But just remember; there is NO requirement that copyrighted materials be factual, or accurate, or even useful for any purpose.
And as we can see in the “science” literature, much of it is worthless rubbish.
Two thirds of all US Physics PhD “graduates” wrote a thesis on something so totally useless, (but original) that nobody is willing to pay them to come and work on their “speciality” for them. So they end up as Post doc fellows somewhere, where they can try and interest some new naïve students in their arcane trivia.
If there was some expectation that PhD thesis results, had to be of some redeeming value, rather than ‘nobody thought of doing this before’, there would be a lot less doctors, and a lot more actual working physicists doing useful things for a living.
But I’m happy that they can choose what to put their name to for posterity.

garymount
Reply to  kenw
December 19, 2014 5:38 pm

If my poster has nothing but a drawn circle on it, then although you might not be able to photograph it and use the image elsewhere due to my copyright, there is nothing stopping you from re-creating the circle yourself and widely distributing its image.
Other types of copyrighted work, such as an image of Mickey Mouse, cannot be reproduced unless the reproduction falls under the fair use provisions of the law.
And then there are trademarks…

brians356
December 19, 2014 9:25 am

“Black circles – multi-model mean trend” Huh? Which models ever showed a sharp dip like that?

michaelspj
Reply to  brians356
December 19, 2014 5:36 pm

The models now have Pinatubo tuned into them. That’s why there’s the dip.

December 19, 2014 9:38 am

What I notice more and more is that no one wants to admit they are wrong when it comes to forecasting the climate, while the reality is that everyone thus far has been wrong. No one can take being held accountable. Every time I try to take that path it is met in a hostile way. If you can’t stand the heat, get out of the kitchen, which is what they should all do. There is always an excuse, or this did not happen or that.
I might add the solar forecasts have been equally bad.
This is why I do not subscribe to any one particular theory when it comes to why and how the climate may change going forward. I have my own thoughts, which I have expressed many times, but my thoughts are tied to solar activity, which still remains much stronger at present than I ever imagined at this point in time. If solar activity should reach my low value parameters and the climate does not respond the way I think it should, I will say I am wrong. No excuses.
I will not know, however, unless solar activity becomes very minimal and lasts for quite some time. My confidence in this is not as high as it was some two years ago. I have been fooled by this cycle and really have no clue what lies ahead for solar activity. I think/guess it will be on the decline soon and stay quite weak for some time.
I guess monitoring is the best way. Time will tell.

Reply to  Salvatore Del Prete
December 19, 2014 12:29 pm

I printed off the predictions for the last solar cycle. Nobody predicted what happened. It was supposed to be a lot like the one before it. Since the climate seems to have gone into a hiatus, as far as temps, it will be interesting to see if the climate tracks the solar cycles if solar activity continues to decrease.

BobW in NC
Reply to  Salvatore Del Prete
December 19, 2014 4:29 pm

Salvatore – Don’t know if you saw this, but I think this point answers your first paragraph: “Once government takes up an issue it will expand and never be resolved. There is nothing ironic about the fact that, as always, the people will pay the price and the politicians and deceivers will not be held accountable.” http://wattsupwiththat.com/2014/12/18/ironically-change-catches-up-with-climate-change-alarmists-in-lima/ (next to last paragraph)
I understand your frustration…

bones
December 19, 2014 9:41 am

What happened about 1992 to change the modeled trend so abruptly?

Reply to  bones
December 19, 2014 10:58 am

Mt Pinatubo.

bones
Reply to  Chip Knappenberger
December 19, 2014 12:52 pm

and it changed the trends of a lot of models forever after?

Michael C. Roberts
December 19, 2014 9:43 am

Governor Jay “I’ll Pass a Carbon Tax on my Watch” Inslee of the State of Washington, in the good ole US of A, really needs to have the results of this study read to him aloud, and the point must be made – preferably in a public forum, with cameras rolling – repeatedly, until he cannot just use the “but it’s for the children” semi-truth as he proposes costly “carbon (sic) taxes” on “large carbon polluters” that purvey fuels based upon the carbon atom, in the (former) Great State of Washington. After all, it would only raise an additional (estimated) $947,000,000.00 in the first year of 2017 (equating to approximately $147.50 in new tax for every man, woman, and child in the State of +/- 6.5 million inhabitants).
http://www.king5.com/story/news/politics/2014/12/18/inslee-capital-gains-tax/20593487/
http://www.washingtonpost.com/blogs/govbeat/wp/2014/12/18/washington-governor-proposes-billion-dollar-carbon-emissions-cap-and-trade-plan/
I doubt actual facts would be able to pierce his watermelon rind of a cranium to actually influence his steamroll of a plan. The best scenario would be that the green taxing of the proletariat through secondary means such as this would slow the immigration of Americans and foreign immigrants to the overtaxed State of Washington; there are places to move to that do not tax the air!!!
Ramble ended, thanks for reading.
Michael C. Roberts

December 19, 2014 9:44 am

http://spaceweatherlive.com/en/solar-activity/solar-cycle-progression
See how far off this is becoming, especially the solar flux.

December 19, 2014 9:44 am

Methane, it is said, is way over 20 times more powerful than CO2 as a greenhouse gas. I’ve found that the biggest reason for that claim is that at 1 or 2 ppm or so it doesn’t take much to double its concentration. That implies that the logarithmic nature of a greenhouse gas is in operation at 1 or 2 ppm. In the world of geese and ganders, that should apply to CO2 as well.
Dr. James Hansen tells us in Chapter 8 of the IPCC’s AR4 Report
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8s8-6-2-3.html
… the climate response to a doubling of … CO2 … with no feedbacks operating … the global warming from GCMs would be around 1.2°C.
If you double 1 or 2 ppm 7 or 8 times you get around 400 ppm and it follows that the warming would be around 9°C.
We commonly hear that greenhouse gases keep us 33°C warmer than we otherwise would be. And we also hear that CO2’s contribution is anywhere from 9% to 26% of that. As can be seen from the above, the no-feedback warming of 9°C is 27% of the 33 degrees and exceeds those limits.
Our friends at SkepticalScience provide this rebuttal:
http://www.skepticalscience.com/argument.php?p=2&t=318&&a=115
How sensitive is our planet?
#53 Glenn Tamblyn at 18:15 PM on 21 September, 2010
I remain unconvinced considering the limits of the logarithmic nature of Green House Gas that CO2’s Climate Sensitivity is much different with or without feed backs. If anything, it’s less.
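The doubling arithmetic in the comment above can be checked directly. The 1.2°C-per-doubling figure is the comment's AR4 quote; treating the greenhouse response as logarithmic all the way down to ~2 ppm is the comment's assumption, not an established result:

```python
import math

per_doubling = 1.2                        # deg C per CO2 doubling, no feedbacks (AR4 quote)
doublings_from_2ppm = math.log2(400 / 2)  # ≈ 7.6 doublings from 2 ppm to ~400 ppm
warming = per_doubling * doublings_from_2ppm
share = warming / 33                      # fraction of the conventional 33 C greenhouse effect
print(round(warming, 1), round(share, 2))  # → 9.2 0.28
```

So "7 or 8 doublings" and "around 9°C, about 27% of 33°C" both follow from the stated premises; the question is whether the logarithmic premise holds at such low concentrations.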

Reply to  Steve Case
December 19, 2014 3:52 pm

Do you not simply shake your head when you see information saying CO2 represents between 9% and 26% of the 33C of supposed greenhouse warming? Not exactly a solid foundation of physics and mathematics to build a computer model from, is it?

The interesting thing about the two extremes of what CO2’s greenhousyness is supposed to be is how each one can be used for the narrative of global warming. The 26% one fits nicely into the logarithmic graph you mentioned, allowing proponents to blame the ice ages on low CO2, with zero CO2 being some 8.55C colder. Sadly that narrative doesn’t correspond well to the doomsday catastrophic warming scenario, though: if the last 120ppm rise between 1860 and today only caused 0.9C of temperature increase, when the previous 280ppm is responsible for over 7.6C, then CO2 is already spent.

So bring in the 9% narrative. That fits well with the “2C rise above pre-industrial levels with a doubling of CO2 since 1860”. If you draw that one on a graph you get a beautiful straight line from zero through 280ppm to the present that corresponds perfectly to the 0.9C rise in temperature and predicts the total 2C of post-industrial warming by 560ppm. The only issue is it puts the total greenhousyness of CO2 today at 2.97C and leaves the temperature of the ice ages only 1.34C lower than today. Then there’s Al Gore and Michael Mann’s graphs! Oh dear!

The problem with trying to reconcile the last 0.9C of warming and 120ppm of CO2 rise with a compounding effect is twofold. First, it takes the total greenhousyness of CO2 well below the 9% parameter, since if it is compounding going forward it is logarithmic going backwards. Then there is the pure problem of compounding itself: if you’re going to predict that the next 120ppm of CO2 will produce double the temperature rise of the last 120ppm, then you have to admit you’re advocating that Earth’s surface temperature will surpass that of the planet Venus before CO2 levels reach those commonly experienced in a room full of people as they exhale! I wouldn’t put it past Al Gore to argue for it, but even for him it’s a hard sell!!

Reply to  wickedwenchfan
December 19, 2014 5:12 pm

The rebuttal I linked to says that climate sensitivity changes with climate. And still if you try to make a sensitivity of 3.2°C fit you have to say that the green house effect isn’t logarithmic below 20 ppm. That’s a figure I’ve heard bandied about, and now I know why.
Claiming a sensitivity of 3.2°C is like trying to put ten gallons of gas in a five gallon can.
Thanks for the reply.

December 19, 2014 9:51 am

Methane, it is said, is way over 20 times more powerful than CO2 as a greenhouse gas. I’ve found that the biggest reason for that claim is that at 1 or 2 ppm or so it doesn’t take much to double its concentration. That implies that the logarithmic nature of a greenhouse gas is in operation at 1 or 2 ppm. In the world of geese and ganders, that should apply to CO2 as well.
Dr. James Hansen tells us in Chapter 8 of the IPCC’s AR4 Report
http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8s8-6-2-3.html
… the climate response to a doubling of … CO2 … with no feedbacks operating … the global warming from GCMs would be around 1.2°C.
If you double 1 or 2 ppm 7 or 8 times you get around 400 ppm and it follows that the warming would be around 9°C.
We commonly hear that greenhouse gases keep us 33°C warmer than we otherwise would be. And we also hear that CO2’s contribution is anywhere from 9% to 26% of that. As can be seen from the above, the no-feedback warming of 9°C is 27% of the 33 degrees and exceeds those limits.
I guess I can’t link to or mention John Cook’s web site but you can find a rebuttal for this argument over there on a search for the following title and post number:
“How sensitive is our planet?”
#53 Glenn Tamblyn at 18:15 PM on 21 September, 2010
I remain unconvinced considering the limits of the logarithmic nature of Green House Gas that CO2’s Climate Sensitivity is much different with or without feed backs.

December 19, 2014 9:52 am

I give up

Ralph Kramden
December 19, 2014 9:54 am

I think most scientists and people would agree the climate models have failed. But I also think the politicians and activists will continue to use them.

Rob Dawg
December 19, 2014 9:54 am

The real takeaway from this study is that even if temperatures begin to fall back within very generous error bounds, the models have spent so long so consistently out of range that they are wrong. That’s actually good news – at least it will be, once the climate modellers admit they are wrong.
“I have not failed. I’ve just found 10,000 ways that won’t work.”
~ Thomas A. Edison
Now the job begins trying to find a model that works.

JeffC
Reply to  Rob Dawg
December 19, 2014 11:33 am

Why? It’s silly and arrogant to think we can accurately model a chaotic system … it’s a waste of time and money …

Reply to  JeffC
December 19, 2014 11:47 am

It is silly to assume that the climate system is chaotic when it clearly isn’t. There are obvious periodicities in the Milankovitch cycles which have been stable for hundreds of millions of years. Similar quasi-periodicities are seen in the solar activity and temperature data. I agree that numerical reductionist models are useless. My approach is quite different – you should actually read the linked post before commenting.

Reply to  JeffC
December 19, 2014 12:52 pm

It is silly to think that models of chaotic systems cannot provide useful information, or that a chaotic system has no stable or even periodic trajectory.

Reply to  Rob Dawg
December 19, 2014 2:56 pm

A model that works?
dT/d(CO2) ~= 0?

John Francis
Reply to  Leo Smith
December 21, 2014 12:20 am

Much closer to the sensitivity than any other figures being bandied about. How can any “scientist” believe the 33 degree figure, after you research where it came from (flat earths, etc.)? Astounding.

John Whitman
December 19, 2014 10:07 am

Patrick J. Michaels and Paul C. Knappenberger,
It is a very well-conceived, debate-stimulating poster. Thanks; debate is what we need more of in climate-focused science if that science is to regain some trust in the critical public’s eye.
My understanding of your poster is that the models run too warm to be statistically credible when compared to observations. I would add my view that your poster’s finding is even more significant when one considers that the observed temperature dataset it uses can reasonably be considered to have significant biases in the warm direction.
Happy holiday season to you guys.
John

whiten
December 19, 2014 10:14 am

Thanks Anthony.
Very interesting information.
The summary of the introduction above reads to me like:
“The test of the models against reality shows that model performance is not good enough.
The range of the possible AGW is not correct.
AGW is not as strong as projected by the range of the models, but it is nevertheless AGW.
Compared to reality, AGW seems to be less strong and possibly not that catastrophic, but again nevertheless AGW.”
The main problem is that the models these guys complain about are much, much better at estimating the possible range of AGW than they are.
That range means that there is no possibility under any circumstances of AGW above the upper range, and below the lower range there is no chance of AGW. The models perform very well at estimating that.
Simply put, reality shows that the certainty of AGW drops from 97% to nearly nothing, because the models project a very clear picture of how AGW should look, and reality seems nowhere near it.
These guys seem to claim that it is still AGW, but below the range the models projected as possible.
My bet is that the models are far better than these two at estimating how AGW should look.
If it does not look like AGW, most probably it is not. There is no need to keep up the 97% certainty of AGW by moving it to a lower range of impact.
Little tip:
Climate sensitivity (CS), when considered as ECS (equilibrium climate sensitivity) at any value above 2C, is a metric for estimating AGW.
CS becomes, and is considered, ECS only in the AGW context, as a requirement to estimate the possible range of AGW and measure it at any given point in time.
The ECS has no value for estimating or measuring variation in climate with respect to temp/CO2 variation before the anthropogenic era, as it is in principle just an exaggeration of the CS as far as natural climate is concerned (it actually implies, in principle, a possible change of the metric to a different range as the climate moves toward a new equilibrium, aka ACC-AGW).
So if someone were to suggest to a GCM that the ECS is ~2C and require the GCM to take that into account and adjust its AGW projections, the most probable response from the GCM would be something like:
“Please do grow up, and don’t ask for such silliness.”
cheers

Robert of Ottawa
December 19, 2014 10:29 am

That graph is really stretching it – the 2.5th to 97.5th percentile band runs between 0.55 and MINUS 0.1 C/decade.

average joe
Reply to  Robert of Ottawa
December 19, 2014 12:29 pm

Robert – that’s because at the far right the trend is taken over a period of only 10 years; the shorter the time period of the trendline, the noisier the signal. The wide error bars at the right threw me too, until I gave it some thought; then it made sense.
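The point about short trends being noisier is easy to demonstrate with a simulation (synthetic series with a fixed underlying trend and hypothetical noise levels, not actual temperature data):

```python
import numpy as np

rng = np.random.default_rng(1)

def trend_spread(n_years, n_runs=2000, noise=0.1, true_trend=0.02):
    """Standard deviation of fitted linear trends across many noisy series."""
    x = np.arange(n_years)
    y = true_trend * x + rng.normal(0, noise, (n_runs, n_years))
    slopes = np.polyfit(x, y.T, 1)[0]   # one fitted slope per simulated series
    return slopes.std()

short, long_ = trend_spread(10), trend_spread(40)
print(short > long_)   # → True: 10-year trends scatter far more than 40-year trends
```

For least-squares trends, the slope variance falls roughly as the cube of the window length, which is why the error bars fan out so dramatically on the right of the figure.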

pdtillman
December 19, 2014 10:35 am

Make sure to visit
http://climateaudit.org/2014/12/11/unprecedented-model-discrepancy/
Great post & thread, one of McIntyre’s best, lots of heavy hitters in the comments. Not to be missed, to stay up to date on the model-failure problem.
Cheers, Pete Tillman
Professional geologist, amateur climatologist

Resourceguy
December 19, 2014 10:37 am

From the abstract: the word “unfortunately” is inappropriate in professional statistical presentations. It is either acceptance or rejection of the null, and there is no unfortunate this or that to it. Period.

Walt D.
December 19, 2014 10:54 am

By trying to condense global temperatures into a single arithmetic average, we are throwing away a huge amount of detail. We have a time series at every point. Since the effect we are looking for is global, the hypothesis we are trying to test is whether or not the actual temperature at any given location has changed over time. We can partition the data set into locations where the temperature has increased and locations where it has not. If significant global warming is taking place, the majority of locations should show significant increases.
Another question – why use the arithmetic average as opposed to T squared or T to the fourth power?
It would seem that T to the fourth power might give something which relates to a physical property. (I know that the Earth, and the oceans in particular, do not behave as a perfect black body.)
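The commenter's T-versus-T**4 question can be made concrete with a toy example (hypothetical station temperatures, not real data):

```python
import numpy as np

# Hypothetical station temperatures in kelvin.
temps_k = np.array([230.0, 270.0, 288.0, 300.0, 310.0])

arith_mean = temps_k.mean()                      # plain arithmetic average
radiative_mean = np.mean(temps_k ** 4) ** 0.25   # T^4 (Stefan-Boltzmann) average

# The T^4 mean is always >= the arithmetic mean (power-mean inequality),
# and the gap grows with the spread of the temperatures.
print(arith_mean, radiative_mean)
```

The two averages agree only when every location has the same temperature, which is one reason the choice of average is not a purely cosmetic detail for radiation-balance arguments.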

LeeHarvey
December 19, 2014 11:06 am

What strikes me:
Michaels and Knappenberger essentially show the inverse of model certainty on their graph – that is to say, the ‘error bars’ take into account a greater range of models than the primary trendlines.
When including another five percent of model runs, the aggregate accuracy gets pretty drastically worse.
This should be a learning moment for those in power… they should realize that the most extreme models are to be summarily disregarded rather than focused upon.
Sadly, I doubt there’s much money in pragmatism.

John Whitman
Reply to  LeeHarvey
December 19, 2014 12:10 pm

{bold emphasis mine – JW}
LeeHarvey says on December 19, 2014 at 11:06 am ,
“[. . .]
This should be a learning moment for those in power… they should realize that the most extreme models are to be summarily disregarded rather than focused upon.
Sadly, I doubt there’s much money in pragmatism.”

LeeHarvey,
If by the word ‘pragmatism’ one refers in any way to the American Pragmatism School of Thought (Philosophy)*** then an immense amount of American money is involved; specifically all the American money involved to date and ongoing in the myopic promotion of the observationally failed theory of significant climate change by CO2 from fossil fuel use.
The American Pragmatism Tradition in Philosophy basically says that what is right (in the realm of cultural values, social structure and gov't economic policy) is what works: one constructs continuous, politically implemented social/cultural/economic experiments to see if they work. As to the meaning of 'they work', in the American Pragmatism Tradition of Philosophy it has always meant that they work to the benefit of some collective. Pragmatism fully anticipates that the experiments, if they work, would likely work only for a limited period; thus it expects endless experimentation as a normal process. You can see in the Climate Change Cause in America the essence of the American Pragmatism Tradition.
***The American Pragmatism School of Thought (Philosophy) is the well-known and currently widely held philosophical tradition, in both academia and in politics, that was started and established by the following Americans: Charles Sanders Peirce (1839–1914), William James (1842–1910) and John Dewey (1859–1952).
John

David in Cal
December 19, 2014 11:15 am

Naïve question, but I hope someone can answer. Precisely what is meant by “Multi-model Mean Trend”, shown as the big black dots? I guess this is the average trend of a certain set of models. Presumably, for each model the trend is derivable from whatever climate sensitivity that model derives. Is that right? And the Multi-model Mean Trend changes over time, so apparently the set of model trends being averaged keeps changing. So, precisely which set of models is used for an average at any point in time?
Or, am I all wet and the Multi-model Mean Trend is calculated some other way?
Thanks to anyone who can explain.

michaelspj
Reply to  David in Cal
December 19, 2014 5:40 pm

It’s the average of the IPCC’s 108 ensemble model runs (2014 vintage). And the error bars are based upon the spread of the model results. If you download our entire poster you will see that they are normally distributed, and therefore we can do very straightforward tests on the mean output versus “reality”.
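The calculation described in the poster caption can be sketched like this, with synthetic series standing in for the actual CMIP5 runs from the KNMI Climate Explorer (the 0.015 C/yr drift and noise level are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1951, 2015)          # annual values, 1951..2014
n_runs = 108
# Stand-in "runs": a common 0.015 C/yr drift plus independent annual noise.
runs = 0.015 * (years - years[0]) + rng.normal(0.0, 0.1, (n_runs, years.size))

multi_model_mean_trend = {}
for start in range(1951, 2006):        # trend lengths of 10 to 64 years
    window = years >= start
    # Least-squares slope of each run over [start, 2014], then average across runs
    slopes = [np.polyfit(years[window], run[window], 1)[0] for run in runs]
    multi_model_mean_trend[start] = float(np.mean(slopes))
```

Each black dot on the poster corresponds to one entry of `multi_model_mean_trend`; the percentile lines come from the spread of `slopes` at each start year.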

David in Cal
Reply to  michaelspj
December 20, 2014 8:14 am

Thank you very much michaelspj. I am still confused about how the Multi-model mean trend varies with time. Are these model trends assigned to a point in time the same way the actual temperatures are? E.g., would a model value at “50” show the model trend for the period 1964 to 2014? If so, would not the model trend be close to the actual trend, since actual data was available when these models were created? Any further explanation would be appreciated.

Jake J
December 19, 2014 11:32 am

Terrible chart. Too many lines, impossible for a non-specialist to interpret or use.

michaelspj
Reply to  Jake J
December 19, 2014 5:41 pm

It’s a very simple chart. Read the legend and the text. We are totally straightforward and only ask that you read our words. Thanks.

rgbatduke
December 19, 2014 11:58 am

Interesting, flawed, and curious. Interesting because it quantifies to some extent the observation that the climate models “collectively” fail a hypothesis test. Flawed because it once again in some sense assumes that the mean and standard deviation of an “ensemble” of non-independent climate models have some statistical meaning, and they do not. Even as a meta-analysis, it is insufficient to reject “the models of CMIP5”, only the use of the mean and variance of the models of CMIP5 as a possibly useful predictor of the climate. But we didn’t need a test for that, not really. The use of this mean as a predictor is literally indefensible in the theory of statistics without making assumptions too egregious for anyone sane to swallow.
What we (sadly) do not see here is the 105 CMIP5 model results individually compared to the data. This would reveal that the “envelope” being constructed above is a collective joke. It’s not as if 5 models of the 105 are very close to the actual data at the lower 5% boundary — it is that all of the models spend 5 percent of their time that low, but in different places. Almost none of the models would pass even the most elementary of hypothesis tests compared to the data as they have the wrong mean, the wrong variance, the wrong autocorrelation compared to the actual climate. Presenting them collectively provides one with the illusion that the real climate is inside some sort of performance envelope, but that is just nonsense.
The curiosity is they are plotting “trend”, not the data itself, that is, the derivative of the data (model or otherwise), and that the derivative of the data has a peak and complex structure over the last 20 years. Say what? Looking at figure 9.8a, AR5, this is difficult to understand. The “pause” is something that actually was neither explained nor anticipated as of 2006 in the models. So I’m a bit suspicious when CO_2 cranks up but “forcings” have somehow been found that moderate the trend, not in just one model but in the bulk of them, hindcasting the pause that wasn’t there four or five years ago. Really?
A final flaw is all of the usual nonsense about fitting a linear trend to a nonlinear timeseries in the first place. Note well that the authors say nothing about error in the fit trends themselves, they just plot out the mean fit trend and some sort of standard deviation of sample fit trends without ever talking about the probable error in each fit trend from data that itself is systematically diverging from the data being fit.
And a good thing, too — they’d be eaten alive by e.g. William Briggs:
http://wmbriggs.com/blog/?p=3266
or in more detail:
http://wmbriggs.com/blog/?p=5172
and ff. Not that they’d notice.
So, very curious. If I wish to compare two different timeseries (say) the measured global anomaly and the global anomaly predicted by a single model run of a single model, or for that matter the mean over many runs of a single model with perturbed parameters, there are straightforward ways to do it. One of them is to look at the linear trends, to be sure, because if they differ at all then the models will separate without bound over time. Better still one can use e.g. Kolmogorov-Smirnov tests, look at the symmetry of the models, look at the variance of the models, and so on — all of their statistical moments, not just one moment that is picked to make some point.
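A minimal sketch of the comparisons rgb lists, assuming scipy is available; both series here are synthetic stand-ins, not actual model output or HadCRUT4:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
obs = rng.normal(0.00, 0.10, 60)      # stand-in for observed anomalies
model = rng.normal(0.10, 0.15, 60)    # stand-in for one model's anomalies

ks_stat, ks_p = stats.ks_2samp(obs, model)                  # whole-distribution test
t_stat, t_p = stats.ttest_ind(obs, model, equal_var=False)  # difference of means
var_ratio = np.var(model, ddof=1) / np.var(obs, ddof=1)     # variance comparison

def lag1_autocorr(x):
    """Lag-1 autocorrelation, another moment worth comparing."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

ac_obs, ac_model = lag1_autocorr(obs), lag1_autocorr(model)
```

The point is that each single model run can be tested against the data on several moments at once, rather than burying every run inside an ensemble envelope.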
With that said, I agree well enough with the conclusion. My own fits to HadCRUT4 indicate a total climate sensitivity of around 1.8 C, just under the 2.0 C and far under the 3+ C (still) being asserted by various parties. 3 C cannot be rationally fit to HadCRUT4, period, unless one takes back the co-assertion that natural variation is irrelevant, which is (incidentally) confounded by the substantial variation in linear trend in the curves presented above given steadily increasing CO_2.
rgb

Reply to  rgbatduke
December 19, 2014 12:38 pm

+1 . Good for you.
Moreover, even if this collection of model results were a valid statistical ensemble, where are the tests for statistical independence or that the distribution is Gaussian to justify plotted limits — or was nonparametric statistics used (and if so, where is this explained)?
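For what it is worth, the normality check being asked for is a one-liner; this sketch uses a synthetic spread of 108 trend values in place of the real per-run trends:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
trends = rng.normal(0.02, 0.005, 108)     # stand-in per-run trend values

w_stat, p_value = stats.shapiro(trends)   # null hypothesis: the sample is Gaussian
# A small p_value would undermine any Gaussian-based plotted limits.
# A nonparametric alternative needs no distributional assumption at all:
lo, hi = np.percentile(trends, [2.5, 97.5])
```

Neither test addresses independence between runs, which is the deeper objection.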

rgbatduke
Reply to  Philip Lee
December 20, 2014 3:11 pm

In AR5 (which nobody ever reads, of course) it clearly states that they are not a valid statistical ensemble, and that in particular they are not independent. Indeed, of the 36 or so models in CMIP5 portrayed there, there are only maybe 7 to 11 independent models. If you read the names off, you can see furthermore that the big players (e.g. NASA GISS) have disproportionate representation with 7 or 8 named models that are part of the “ensemble” all by themselves. But the whole idea is silly beyond compare.
My favorite analogy is to the Hartree (mean field) model in quantum mechanics. We know that the Hartree model is fundamentally flawed as a means of computing e.g. electronic energy levels for a multi-electron atom. It ignores the Pauli exclusion principle and does not allow for the powerful short range repulsion between electrons. Both of these things increase the size of atoms by pushing the electrons further apart than the Hartree model allows for, creating a systematic bias in the energy structure of a Hartree (modelled) atom compared to reality. You can get a decent idea of how quantum atoms work — get energy states out with reasonable labels, for example — from the Hartree model, but quantitatively it isn’t so good.
You can, of course, have 36 different people program in a Hartree model, and run the resulting programs on different hardware to different precision and tolerance, and get 36 different results. Those results might well be normally distributed around some sort of “mean Hartree model result”! And even if this were true, it would not ever, under any circumstances, make the “multimodel ensemble mean” of the Hartree model a good predictor or descriptor of reality!
You can average an infinite number of broken or incorrect models and still not converge to a correct model. The entire idea is silly. Under an enormously fortuitous, incredibly unlikely special state of affairs where the systematic errors attributable to different models happen to cancel, you might find an “ensemble” of models that converges in the mean to reality, but one cannot even sanely argue that this extremely special circumstance holds for climate models.
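A toy numerical illustration of that point, where each "model" is just a true value plus a random systematic bias:

```python
import numpy as np

rng = np.random.default_rng(4)
truth = 1.0
biases = rng.uniform(0.5, 1.5, 1000)   # each "model" carries its own systematic error
model_outputs = truth + biases         # every model is wrong, by a different amount

ensemble_mean = float(model_outputs.mean())
# The ensemble mean converges to truth + mean(bias), about 2.0 -- not to the truth.
```

No matter how many such models are averaged, the ensemble mean converges to the mean bias, not to reality, unless the biases happen to cancel.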
Indeed, nobody expects climate models to work at all, let alone work in the multimodel ensemble mean. And by “nobody”, I include climate scientists. They all know perfectly well that it is unlikely that one single climate model is working correctly even in a mean sense one model at a time! They know this because no two unrelated models converge to the same model predictions in the mean. At most one model could be correct, and it is far more likely that none of them are correct. There is overwhelming evidence — some of it presented above — of systematic bias in the CMIP5 models. And yet again in AR5, this is acknowledged, right before they state that they’re going to ignore this inconvenient truth and just use the CMIP5 MME mean wherever as if it were some sort of meaningful predictor, and even attach words like “high confidence” to it in the Summary for Policy Makers where the term “confidence” at any level, high, low, or medium hasn’t the slightest defensible meaning in any sense of statistical confidence.
This is what drives me personally bananas. It makes the entire report a “confidence” game. Confidence in science is not a statement of opinion. It isn’t even a statement of the opinion of authoritative experienced researchers in the field. It is a defensible statement of a result from statistical analysis. It is a p-value.
AR5’s use of the term is a direct violation of the very precepts of science. It has reduced it from quantitatively defensible analysis to punditry and politics. It is abhorrent. It is despicable. It is just plain wrong.
rgb

bones
Reply to  rgbatduke
December 19, 2014 12:58 pm

+10! Your comments are always spot on and interesting.

whiten
Reply to  rgbatduke
December 19, 2014 1:05 pm

Hello rgb.
Considering what you have stated:
“With that said, I agree well enough with the conclusion. My own fits to HadCRUT4 indicate a total climate sensitivity of around 1.8 C, just under the 2.0 C and far under the 3+ C (still) being asserted by various parties.”
————-
Would you consider that AGW is impossible?
Estimating a CS of ~1.8C through the assessed reality of HadCRUT4 puts the possible range of ECS at ~1.2C to ~1.8C (with an average of ~1.5C), far below what a possible AGW requires.
With the CS at ~1.8C, the possible range of CS is ~1.6C to 2C (with an average of ~1.8C, as you put it), which puts the ECS in the range given above.
The ECS in principle stands for a condition while the CS moves to a new range, as in the case of AGW… and it will always move to a lower value… but nevertheless CONSIDERING the ECS as ~2.4C to 4.5C (with an average of ~3.4C), as prior to AR5, means most possibly an AGW.
In that range of ECS the CS would have been somewhere in between 2.8C and 4.4C (with an average of ~3.6C)… AND AT SUCH VALUES THERE IS NO REAL IMPORTANT DIFFERENCE TO SPEAK OF, and no problem for AGW; but if you lower the CS significantly, the value of ECS drops too low for comfort and becomes meaningless and paradoxical, and therefore so does the AGW.
Given this kind of interpretation, would you really consider that AGW is impossible?
cheers.

rgbatduke
Reply to  whiten
December 20, 2014 3:37 pm

Forgive me, but I’m having a hard time understanding what you are saying here. Let me instead clarify what I’m saying.
* IF one takes HadCRUT4 at face value (not arguing about whether or not that is justified, as that’s a distinct issue).
* AND one takes the hard result of real physics line by line computations as well as slightly approximated models that the average surface temperature ought to vary with the log of the CO_2 concentration (at absorption saturation, where we long since are)
* AND one constructs a “reasonable” model for CO_2 from 1850 to the present that almost perfectly matches Mauna Loa data from 1959 through the present (so it is pretty much dead on there) and ice core data back to 1850 (so it is at least likely to be approximately correct there and in between)
* AND one uses the latter two assumptions to fit the former data,
* THEN one obtains a very, very good fit to the data. That is, it is absolutely impossible to reject the null hypothesis that CO_2 was the proximate cause of the average warming in between. Quite the contrary, it is a hypothesis with sufficient explanatory power that there is nothing much left to explain!
Do you see how that works? It is simply a matter of fact that CO_2 concentration is a sufficient explanation for HadCRUT4 if the latter is a valid and reasonably accurate statement of the surface temperature anomaly. It is as simple as that. I freely acknowledge that there could be alternative hypotheses that would also work. I freely acknowledge that HadCRUT4 might not be accurate. I freely acknowledge that my interpolatory CO_2 model might not be valid. But it is undeniably true that the three assumptions above form a very coherent and consistent result, one that directly measures a TCS of 1.8 C in the warming observed so far — a result that really only depends on the beginning and end points but the fact that it works well in between is fairly strong evidence that the hypotheses above could be mutually correct. Or that HadCRUT4 was cooked up to support a TCS of 1.8 (unlikely, since most of the IPCC seems to want it to be much higher). Or that my CO_2 model is wrong but coincidentally happens to produce a good fit by pure chance. Or…
So please, by all means, come up with or express your own explanations for the warming, but if you are going to argue with mine please understand that a) you need to be quantitative. I can defend mine with a physics based, quantitative fit built using R and the hypotheses and the data. No handwaving necessary. b) Be clear about what you are asserting. I can’t figure out if you agree or disagree with HadCRUT4, with any given model for CO_2 increase, I can’t figure out whether or not you agree or disagree with the expected \Delta T = A \ln(cCO_2(t)/cCO_2(0)) warming (and are arguing about the particular value of A that is reasonable) or what. c) If you are asserting that no warming has occurred, by all means say so. Then we can terminate the discussion early, because (note well) that I’m “assuming that HadCRUT4 is reasonably accurate”. If you change this assumption, of course you will arrive at different conclusions, but then the discussion has to be about something else, that is, the problems or lack thereof with HadCRUT4. I might even AGREE that it has problems, but that doesn’t affect the value of the exercise above.
rgb
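The fit rgb describes can be sketched as follows. The CO2 curve and noise level below are invented stand-ins, not his actual interpolatory model or HadCRUT4; the point is only the mechanics: fit the anomaly against ln(CO2 ratio), then TCS = A·ln 2.

```python
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1850, 2015)
co2 = 285.0 * np.exp(0.004 * (years - 1850))   # crude stand-in CO2 curve (ppm)
A_true = 1.8 / np.log(2.0)                      # slope implying TCS = 1.8 C
anomaly = A_true * np.log(co2 / co2[0]) + rng.normal(0.0, 0.1, years.size)

x = np.log(co2 / co2[0])
A_fit = np.polyfit(x, anomaly, 1)[0]            # least-squares slope A
tcs = A_fit * np.log(2.0)                       # warming per CO2 doubling
```

With real HadCRUT4 anomalies and a real CO2 reconstruction in place of the synthetic series, this is the whole calculation behind the quoted ~1.8 C.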

whiten
Reply to  whiten
December 20, 2014 5:40 pm

Hi rgb.
I think your comment was a reply to me.
You say:
“Forgive me, but I’m having a hard time understanding what you are saying here. Let me instead clarify what I’m saying.”
———————–
Forgive me too, but I have to put my understanding of this little debate of ours as clearly as I can, no offence intended, honestly.
Reading your reply, I have a very hard time accepting that statement of yours as true, as you have actually, in a very clever way, managed to avoid answering my question by dismissing the point on which it stood.
In your latest reply you are saying that when you wrote “total climate sensitivity of around 1.8 C, just under the 2.0 C”, you actually did not mean the CS (as I happened to understand it) but the TCS, and therefore you do not need, or do not have, to answer my question anymore, simply because any value of TCS (whatever that value may be) on its own proves or disproves nothing about AGW, and the interpretation my question was based on happens to be outside the meaning you had in mind for your “total climate sensitivity”.
Very clever of you I must say.
About the rest of your latest reply, I can’t make heads or tails of most of it, to be honest… and no, I did not dispute the accuracy of HadCRUT4; it was taken at face value, as you say, because it does not really matter to the subject of the question you had to answer.
So you did not need to go to all that trouble and write such a long reply; you just had to say that by “total climate sensitivity” you meant TCS, contrary to what I thought, the CS.
That would have been good enough.
You see, TCS does not actually mean “total climate sensitivity”; for what it is worth it means Transient Climate Sensitivity, a kind of CS needed to explain, uphold and measure the possibility of the climate moving towards a new climate equilibrium, aka the AGW.
Forgive me for thinking that “total” in relation to CS has no actual meaning, and therefore my mistake of thinking that you simply meant CS when you said “total climate sensitivity”, BUT YOU SEE THERE WAS NOT MUCH CHOICE THERE, IF ANY.
Simply, as far as I can see, you did manage to avoid and dodge my question by very cleverly and stealthily moving the goal posts. :-)
Nevertheless you have indirectly answered my question, I assume… you can’t really consider AGW to be impossible, no matter what, under any circumstances.
But while at it, as I am not very comfortable with assumptions, allow me and also forgive me for asking again the same question to you but now by aiming at it to where you have moved the goal posts.
So, according to your computation that puts the TCS at present at a value of 1.8C, and considering that the same computation at a “present” 14 years ago would have produced a TCS value distinctly higher than 1.8C, and therefore that the TCS value is going downhill, contrary to what is expected in an AGW scenario moving towards a new climate equilibrium, would you consider that AGW is impossible, according to such an interpretation?
cheers

Robert B
Reply to  rgbatduke
December 19, 2014 3:01 pm

“My own fits to HadCRUT4 indicate a total climate sensitivity of around 1.8 C, just under the 2.0 C and far under the 3+ C (still) being asserted by various parties” I’m assuming that is a fit of a sin+linear function that treats the dots as just noise. Maybe a bigger envelope, but still a back of the envelope, and still the same conclusion. The lowest estimates of the models are the only ones that you can take seriously – with a pinch of salt.

rgbatduke
Reply to  Robert B
December 20, 2014 3:57 pm

I’ve played with linear in time as well, but it doesn’t work as well as the natural log of the concentration of CO_2 (which is itself not linear in time):
http://www.phy.duke.edu/~rgb/Toft-CO2-PDO-jpg
As you can see, it works really, really well — especially with the sin variation that I have no explanation for at all, it merely seems to improve the fit empirically and is probably nothing meaningful (certainly nothing I’d gamble on into the distant future:-).
The hard question is this: This result has been obvious for decades now. Hansen could have computed the fit back in 1980 and would have gotten nearly the same thing. By 2000 it was very clear, and by then we had enough Mauna Loa data to fit back to at least the 1940s on enormously simple assumptions. Where, precisely, is there any reason to think that TCS is over 3 in this figure? Note well that by the time the anomaly increases by 3 C, CO_2 has nearly tripled. It isn’t even close. And this figure includes all feedbacks — by ignoring them, and assuming a linear response added on to the otherwise logarithmic increase.
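The “nearly tripled” arithmetic checks out: inverting ΔT = A ln(ratio) with TCS = 1.8 C per doubling gives the CO2 ratio needed for a 3 C anomaly.

```python
import math

A = 1.8 / math.log(2.0)            # log-slope corresponding to TCS = 1.8 C per doubling
ratio_for_3C = math.exp(3.0 / A)   # CO2 ratio needed for a 3 C anomaly, ~3.2
```

So under the 1.8 C fit, a 3 C rise indeed requires roughly a tripling of CO2, not a doubling.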
IMO the single stretch from roughly 1983 to 2000 caused a widespread panic (even among climate scientists who really should have known better) because people simply could not see the overall pattern of temperature variation back to 1850. What they probably saw was a moderate warming due to CO_2 plus a strong, rapid warming due to the unexplained harmonic, capped by an unusually strong ENSO. But in perspective, even the unusually strong ENSO was just another transient modulating the climate around the dominant trend, one that very likely is CO_2 driven but which is unlikely to be catastrophic.
This situation was not improved by appointing Hansen to be the head of NASA GISS. Talk about political disaster! Appointing somebody whose mind is clearly already made up and who cannot even maintain the facade of objectivity, somebody who gets arrested at protests against nuclear power (at the same time he is demonizing CO_2, pretty much working to bring down civilization itself) — madness!
rgb

Robert B
Reply to  Robert B
December 23, 2014 8:01 pm

Sorry for getting back to this so late. My tractor caught fire. The battery decided to arc with the bonnet.
The link doesn’t work but I can guess what it is. I noticed about a year ago that there is a significant warming of about 0.2°C since 1950, as well as the 60-year period, but I’m convinced that it is bigger than reality because of the inconvenient ’40s blip.

John Whitman
Reply to  rgbatduke
December 19, 2014 4:19 pm

rgbatduke on December 19, 2014 at 11:58 am

rgbatduke,
Two points:
Point #1 – Every time I see your comments here at WUWT, I have this vision of being a freshman at university again. In the vision, instead of majoring in Engineering Science with a Nuclear Power focus as I did, I see myself majoring in both statistics and the philosophy of science.
Point #2 – You said, “ [. . .] The “pause” is something that actually was neither explained nor anticipated as of 2006 in the models. So I’m a bit suspicious when CO_2 cranks up but “forcings” have somehow been found that moderate the trend, not in just one model but in the bulk of them, hindcasting the pause that wasn’t there four or five years ago. Really?” You caught the GCMers / IPCC GCM assessers in a real gotcha there alright! : )
Have a happy Holiday Season!
John

michaelspj
Reply to  rgbatduke
December 19, 2014 5:45 pm

Pls download the entire document so you can see the normal distribution of the predictions. Thx!

scf
Reply to  rgbatduke
December 21, 2014 9:38 am

I have also been disgusted by the use of the word “confidence” in AR5. Not only was it devoid of any statistical or scientific meaning, it was also devoid of any honesty: an honest person would have noted that the failure of global temperatures to rise for many years should have prevented anyone from expressing their own non-scientific, non-statistical confidence in climate science.

December 19, 2014 12:17 pm

Obviously we can’t begin to estimate from the empirical data what the climate sensitivity is until we have a reasonable understanding of the natural variation, see
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html

Reply to  Dr Norman Page
December 20, 2014 12:27 am

“Obviously we can’t begin to estimate from the empirical data what the climate sensitivity is until we have a reasonable understanding of the natural variation.”
I agree with Dr. Page’s above statement.
I further suggest that we can hypothesize that current ECS (to atmospheric CO2 on Earth) is so low as to pose no threat to humanity or the environment (I say ECS is less than ~1C), and observed warming and cooling in the 20th Century was almost entirely caused by natural variation.
I suggest this hypo enables better prediction of actual climate performance to date (late 2014) and into the future than all assumptions that ECS is ~2C or greater.
Is there ANY credible observational data that disproves this hypo? If so, let’s hear it here and now.
As a test of this hypo, I suggest that despite increasing atmospheric CO2, Earth will cool measurably in the next decades, due primarily to natural causes.

Reply to  Allan MacRae
December 20, 2014 6:23 am

The post at http://climatesense-norpag.blogspot.com/2014/10/comment-on-mcleans-paper-late-twentieth.html
provides estimates of the timing and amplitude of the likely coming cooling. The main control is the 1000 year quasi-periodicity in the temperature data. Forecasts which ignore this obvious periodicity are really worthless and can be safely ignored.

whiten
Reply to  Dr Norman Page
December 20, 2014 8:59 am

Hello Dr Norman Page.
“Obviously we can’t begin to estimate from the empirical data what the climate sensitivity is until we have a reasonable understanding of the natural variation, see.”
——————
You may be right, yes, but my take, and my understanding of the relation of RF and CO2 emissions to climate change where the CS is concerned, is that not knowing the exact value or range of the CS is not the problem anymore.
The problem is the hiatus, which simply means that whatever the CS, the effect of man on the climate is unmeasurable, and in that regard the bigger the considered range of CS, the more certain it is that man is not affecting the climate. Strange, I know, BUT THAT IS WHAT IS CALLED A TRAVESTY.
The reality of the hiatus is very confusing.
It does not allow for any consideration of man’s effect on the climate… and a hiatus 14 years long within only 100+ years of GW is really significant and beyond the possibility of being ignored… that is why it is not ignored anymore, and is accepted even by the IPCC.
So if the CS is considered too low (anything below 0.5C for a doubling), the effect of the CO2 emissions is too little to be really measurable in climate terms, given the 0.8C of global warming up to the present, and therefore by default man cannot claim any measurable effect on the climate.
On the other hand, if the CS is considered to be above 0.7C for a doubling, then the higher the value of CS, the more certain it becomes that man is not having an effect on the climate, because the expected measurable warming is higher and easier to detect… a warming that actually is missing and has remained undetectable for the last 14 years… and that is the problem.
That is the problem: the missing fingerprint of man in the climate, regardless of the CS.
That is what is called a travesty.
Either the hiatus ends very soon and warming resumes, or these AGW climatologists had better start pulling their hair out, because arguing over the exact value of the CS, or of what they call the ECS, won’t really help while facing a stubborn hiatus.
cheers

Berényi Péter
December 19, 2014 12:28 pm

There is a much more serious and way deeper problem with the current computational climate modelling paradigm than failure to replicate observed temperature trends.
As long as the solar constant is constant indeed, annual cumulative insolation of the two hemispheres matches exactly. That’s because of a peculiar geometric property of Keplerian orbits, which cancels variations along a tropical year.
Now, physical properties of the two hemispheres are very different, because most land masses are located North of the equator. Water being much darker than land, clear-sky albedo of the Northern hemisphere is some 1.8% higher. In spite of this fact all-sky albedo of the two hemispheres matches within observational error (the difference is less than 0.03%, that is, almost two orders of magnitude lower).
That’s because of higher abundance and/or reflectivity of clouds in the Southern hemisphere.
The upshot is that energy input to the climate system is symmetric with respect to the equator on an annual scale.
This symmetry is not replicated by computational climate models, which in itself is sufficient to falsify the underlying paradigm.
Q.E.D.
Journal of Climate, Volume 26, Issue 2 (January 2013)
doi: 10.1175/JCLI-D-12-00132.1
The Observed Hemispheric Symmetry in Reflected Shortwave Irradiance
Aiko Voigt, Bjorn Stevens, Jürgen Bader and Thorsten Mauritsen
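The orbital cancellation Berényi invokes is easy to verify numerically. The sketch below checks only the global, full-orbit statement (flux × time per unit true anomaly is constant, so the annual total is independent of eccentricity and perihelion phase); the hemispheric version additionally involves the declination geometry, which this toy check omits.

```python
import numpy as np

e = 0.0167                                          # Earth's orbital eccentricity
theta = np.linspace(0.0, 2.0 * np.pi, 100000, endpoint=False)
dtheta = theta[1] - theta[0]
r = (1.0 - e**2) / (1.0 + e * np.cos(theta))        # ellipse, in units of the semi-major axis

flux = 1.0 / r**2      # TOA flux scales as (a/r)^2 (solar constant set to 1)
dt = r**2              # Kepler's second law: time per unit true anomaly scales as r^2

annual_total = float(np.sum(flux * dt) * dtheta)    # the 1/r^2 cancels exactly
# annual_total equals 2*pi for ANY eccentricity and any perihelion phase --
# the peculiar geometric property that cancels variations over a tropical year.
```

The same cancellation is why the annual cumulative insolation of the two hemispheres matches despite the eccentric orbit.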

rgbatduke
Reply to  Berényi Péter
December 20, 2014 4:05 pm

Hi Berényi,
OK, so this makes something else that is commonly said even harder to understand. The Earth is closest to the Sun in January, farthest in July. Yet global average temperature is lowest in January and highest in July. The variation in TOA insolation is substantial — 91 W/m^2, a figure that dwarfs the forcing from CO_2, almost 7% of the average insolation. Usually the claim is made that it is the variation in albedo that is responsible for this, but if albedo is symmetric, then nothing makes sense. How in the world can global temperature countervary with TOA insolation when global albedo does not change?
rgb
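rgb’s 91 W/m² figure is easy to reproduce, assuming a solar constant of about 1361 W/m² at 1 AU and the usual perihelion/aphelion distances:

```python
S0 = 1361.0                               # W/m^2 at 1 AU (assumed solar constant)
r_perihelion, r_aphelion = 0.9833, 1.0167 # Earth-Sun distance in AU
delta = S0 / r_perihelion**2 - S0 / r_aphelion**2   # ~91 W/m^2
fraction = delta / S0                               # ~0.067, almost 7 percent
```

So the quoted numbers are consistent: the TOA insolation swing over a year is roughly 91 W/m², about 7% of the solar constant.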

Berényi Péter
Reply to  rgbatduke
December 21, 2014 12:05 am

Good question. Annual average reflected shortwave radiation is indeed the same for the two hemispheres. That implies absorbed radiation should also be the same.
However, the curious fact is its spatio-temporal distribution is very different between the hemispheres within a year (definitely non-symmetric).
Also, annual average outgoing longwave (thermal IR) radiation is higher in the Northern hemisphere (by some 1.2 W/m²); that is, it is not symmetric: the Northern hemisphere is cooling more efficiently.
Yet global average temperature is substantially higher during Northern summer.
The difference is compensated for by trans-equatorial oceanic heat transport from the Southern hemisphere to the Northern one, another minuscule detail which computational climate models fail to capture (besides clouds).

Tom
December 19, 2014 3:09 pm

Quoting from the poster:
“From the recent literature, the central estimate of the equilibrium climate sensitivity is ~2°C, while the climate model average is ~3.2°C, or an equilibrium climate sensitivity that is some 40% lower than the model average.
….., it means that the projections of future climate change given by both the IPCC and NCA are, by default, some 40% too large (too rapid) and the associated (and described) impacts are gross overestimates.”
No.
I buy that 2 is about 40% lower than 3.2 degrees, but it does not work in reverse. 3.2 is 60% higher than the appropriate estimate of ECS, not 40%.
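Tom’s correction, in numbers:

```python
low, high = 2.0, 3.2
pct_lower = (high - low) / high * 100.0    # 2 is 37.5% lower than 3.2
pct_higher = (high - low) / low * 100.0    # but 3.2 is 60% higher than 2
```

Percentage differences are not symmetric: the base of the comparison matters, so a projection built on the model average overshoots the lower estimate by 60%, not 40%.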

December 19, 2014 3:14 pm

Does the AGU publish EOS? If so, have they publicly apologized for their unwarranted attacks in EOS on Baliunas and Soon and Veizer and Shaviv, as described here?
http://wattsupwiththat.com/2011/11/28/the-team-trying-to-get-direct-action-on-soon-and-baliunas-at-harvard/#comment-811913
If they have not apologized, then I suggest they are unfit for human consumption.

December 19, 2014 3:30 pm

“From the recent literature, the central estimate of the equilibrium climate sensitivity is ~2°C, while the climate model average is ~3.2°C, or an equilibrium climate sensitivity that is some 40% lower than the model average.”
Sorry, but I can find no compelling evidence (much less “proof”) that a doubling in CO2 atmospheric concentration would increase the average temperature (whatever that is) of planet earth by 2°C. In fact, I don’t think we have been able to measure any warming caused by CO2 at all. They say that the question of God is totally outside of science since there is no way to detect or find evidence of said entity. But by the same token, if we cannot detect temperature rise caused by CO2, then we have no scientific basis for claiming that CO2 causes any warming. We are still trying to do science, are we not?
~Mark

michaelspj
Reply to  markstoval
December 19, 2014 10:05 pm

Jeez we’re just reporting the literature. Got a problem with that?

Reply to  michaelspj
December 19, 2014 10:48 pm

The mainstream media says it is “just reporting the literature” also.
In this case, we can have the delusion of 2°C or the delusion of 3.2°C, neither of which is supported by the facts and real-world observations. I prefer neither one. I prefer we use measurements that demonstrate the effect of CO2.

Chris Wright
Reply to  markstoval
December 20, 2014 4:44 am

Absolutely. The basic physics predicts roughly a one degree warming for a CO2 doubling. But we’re not talking about basic physics, we’re talking about the climate system, where everything is a function of everything else and every constant is a variable.
As far as I’m aware, there is no reliable historical data that shows an increase in CO2 followed by a corresponding increase in temperature, as predicted by AGW. The ice core data shows that it works the other way around: it’s temperature driving the amount of CO2. The ice core data appears to be a complete disproof of AGW. If science were working properly, AGW would long ago have been scrapped, allowing scientists to concentrate on what really drives the climate. Unfortunately, at least for now, a poisonous combination of vested interests and green extremism continues to corrupt the science.
Even in the last century nearly half of the warming occurred before there was sufficient CO2. It is truly remarkable that a theory that is so soundly disproven by all the empirical scientific data still flourishes. But it won’t last forever. I’m confident that science will regain its integrity, but I’m not holding my breath….
Chris

rgbatduke
Reply to  markstoval
December 20, 2014 4:13 pm

I have no idea why you are asserting this. Attribution of cause is, in all cases, based on observation of coincidence. You let go of a penny, it falls, you attribute the cause of falling to “gravity” because the explanation works to explain the past and has predictive value.
If you look at the figure I posted above, CO_2 concentration is a truly excellent explanation of observed past warming, at least from 1850 to the present. Outside of that range the uncertainties in the data very likely defeat the model (if they aren’t already too great within the range of the model). Time only will tell if this has predictive value.
However, there are some very good reasons — as is also the case with gravitation — to think that CO_2 concentration increases will cause warming. Those good reasons are what motivate fitting the model in the first place, and the fit ends up working very well.
So has anybody “proven” that CO_2 has caused warming? No, no more than anybody has “proven” that gravity makes things fall. In the case of gravity, we have little doubt left. In the case of CO_2 the evidence isn’t as strong and you might well have some doubt, but to assert that there is no compelling evidence — well, I disagree. In fact, I think that the curve above that I posted is pretty compelling. The residual standard error of the fit is around 0.1 for 163 degrees of freedom. That’s compelling.
rgb
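Since rgb’s link is dead, here is a minimal sketch of the kind of fit he describes: temperature anomaly regressed on the natural log of CO2 concentration. The data below are synthetic placeholders (not the HadCRUT4 or Mauna Loa records), so only the method, not the numbers, reflects his comment.

```python
import numpy as np

# Sketch of a fit T = a + b*ln(CO2), as rgb describes.
# Synthetic placeholder data, NOT the real temperature/CO2 records.
years = np.arange(1850, 2014)                        # 164 annual points
co2 = 285.0 + 115.0 * ((years - 1850) / 163.0) ** 2  # rough CO2 rise, ppm
rng = np.random.default_rng(42)
temp = -0.4 + 3.0 * np.log(co2 / 285.0) + rng.normal(0.0, 0.1, years.size)

# Least-squares fit; the slope b plays the role of a sensitivity
b, a = np.polyfit(np.log(co2), temp, 1)
resid = temp - (a + b * np.log(co2))
rse = np.sqrt(np.sum(resid ** 2) / (years.size - 2))  # residual std. error
print(f"slope b = {b:.2f}, residual standard error = {rse:.3f}")
```

With noise of 0.1 built in, the residual standard error comes out near 0.1, the sort of figure rgb quotes; whether the real records fit that well is exactly what his (now missing) plot was meant to show.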

Reply to  rgbatduke
December 21, 2014 12:49 am

Rgbatduke
You write

So has anybody “proven” that CO_2 has caused warming? No, no more than anybody has “proven” that gravity makes things fall. In the case of gravity, we have little doubt left. In the case of CO_2 the evidence isn’t as strong and you might well have some doubt, but to assert that there is no compelling evidence — well, I disagree. In fact, I think that the curve above that I posted is pretty compelling. The residual standard error of the fit is around 0.1 for 163 degrees of freedom. That’s compelling.

Sorry, but what you have presented is certainly NOT “compelling”.
Having checked this thread, I assume that the “curve” you mention is supposed to be provided by the link in your post at December 20, 2014 at 3:57 pm, where you wrote

I’ve played with linear in time as well, but it doesn’t work as well as the natural log of the concentration of CO_2 (which is itself not linear in time):
http://www.phy.duke.edu/~rgb/Toft-CO2-PDO-jpg
As you can see, it works really, really well — especially with the sin variation that I have no explanation for at all, it merely seems to improve the fit empirically and is probably nothing meaningful (certainly nothing I’d gamble on into the distant future:-).

Well, no, I cannot “see” anything because when I click on the link I get

Not Found
The requested URL /~rgb/Toft-CO2-PDO-jpg was not found on this server.
Apache/2.2.22 (Fedora) Server at http://www.phy.duke.edu Port 80

However, let us assume the graph does exist and that it does show CO2 and temperature correlate. Then so what? Correlation does not indicate causation.
Both CO2 and temperature have risen, so they will exhibit some correlation.
And changes to CO2 follow the temperature at all time scales. This coherence indicates that if there is a causal relationship between them, then the “compelling evidence” is that temperature changes induce the CO2 changes, which is the opposite of what you suggest.

Richard

Reply to  rgbatduke
December 21, 2014 7:52 am

richardscourtney
December 21, 2014 at 12:49 am
Richard, I’ll take RGB’s statement at face value. The Nimbus-satellite graph of total upward-radiating IR shows this: a big “gouge” of CO2 radiating upward at a lower temp than the surface. If you increase the “gouge” by increasing CO2, the total IR upward has to stay the same (1st Law), and, all else being equal, the surface IR through the atmospheric window would increase, which translates to higher surface temps.
That said, notice I said “all else being equal”. That’s the $64K question. We don’t know how the earth will react. More/less cloudiness? More/less water vapor? Increase/decrease in convection? Change in air flow patterns? Change in albedo? Change in the tropopause level?
Offhand, pretty much all natural, complicated systems have overall negative feedbacks, and there’s plenty of evidence the earth is no exception. So, IMHO, the “real” temp rise would prb’ly be less than the simplistic CO2-doubling warming equation of 1.2C or so.

Reply to  rgbatduke
December 21, 2014 11:21 am

beng1
Thank you for your interest.
Sorry, but your response ignores the coherence evidence. For your argument to stand you need to explain the fact that changes to CO2 are observed to FOLLOW changes to the temperature at all time scales.
Please note that I do not know if the recent rise in atmospheric CO2 concentration (as measured e.g. at Mauna Loa) has an anthropogenic cause, a natural cause, or some combination of anthropogenic and natural causes. But I want to know.
The rise can be modeled as being either anthropogenic or natural
(ref. Rorsch A, Courtney RS & Thoenes D, ‘The Interaction of Climate Change and the Carbon Dioxide Cycle’ E&E v16no2 (2005))
and I have been arguing with both ‘sides’ of the argument for over a decade.
The OCO data may resolve the issue and preliminary data seems to suggest a natural cause
(see Tim Ball’s article)
but that could be misleading.
Richard

December 19, 2014 4:08 pm

AGU has no copyright on any poster. What they are getting at is that posters and presentations occupy a peculiar netherworld in scientific publishing, being often preliminary in nature and unpublished in the dead electron or tree world. Sometimes, the good times, people use posters to provoke discussion, scientific that is. Thus, when someone photographs a poster and puts it up for everyone to see, problems can arise.
An amusing (well, not at the time, to the authors) example of this was an encounter Eli had with Bruce Malamud and Don Turcotte at AGU. Turns out that they had given a seminar at York University, and somebunny put up the powerpoints, which were picked up by Tim Curtin, then Eli. Malamud and Turcotte had, in their own words, no idea that it was out there, and indeed it took them another couple of years to complete the study. Take a look at the link, and the links at the link, to Marohasy and Curtin and the comments at both places.

December 19, 2014 5:18 pm

With respect to the idea that humans are causing harmful changes to the climate at this very moment, I am waiting for some peer-reviewed papers that propose what the optimum climate is for our biosphere. The first question that would naturally flow would be where is our current climate and trend in relation to this finding.
Strangely, nobody seems interested in this vital comparison. Not so strangely, the solutions that are frequently demanded in the most urgent voice, all converge on a socialist worldview: statism, bigger government, higher taxes, less personal liberty. That bigger picture tells me all that I need to know about “climate science”.

michaelspj
December 19, 2014 5:43 pm

May I add that Bill Gail, the new Prexy of the AMS, came by to chat for about twenty minutes. Given that this poster was a speck in a sea of 3,000 others, it obviously commanded some serious attention! Gail also appeared to understand there are some systematic problems in science, and not just climate science. Perhaps we are making progress.

Donb
December 19, 2014 9:08 pm

Pat,
The scientific method is supposed to be self-correcting of bias as the process moves upward to ever higher levels. Thus, whereas all scientists (and I am one) are biased, and the reviewers of our papers (and sometimes the editors) have biases, the free debate of ideas among the many levels of science, and the interactions of many points-of-view, are supposed to winnow out such biases and permit us to arrive at the “truth”. This seems to have worked relatively well so far.
But I have concerns over that process being corrupted by biases coming from above, including professional societies, politicians, the press, etc., and especially those biases being driven by financial or political agendas toward specific goals. How does the scientific method handle such influences?
I see some evidence of this in other agencies. For example, when NASA sent spacecraft to Saturn, it did not show a preference for what “facts” were discovered or what scientific “conclusions” were drawn. With Mars it is a different story. Former administrator Dan Goldin recognized that public interest and support were important to an agency like NASA. And the public was interested in the origin of life and life outside Earth. So these themes, along with the supporting theme of finding past or present water on Mars, have driven NASA’s sizeable effort at Mars exploration. Never think that NASA’s focus has not influenced funding for research and the conclusions reached, especially by young scientists trying to stay in the research system.
Such efforts by science support agencies, and their political and public supporters, to bias their own science cannot become a common theme, or all science will be the lesser for it.

michaelspj
Reply to  Donb
December 19, 2014 9:58 pm

Science is nowhere near as simple as you may think. Having been in top-tier academia for thirty years, perhaps I can summarize a promotion-and-tenure exam:
1. How much money did he bring in and was it government money (a more virtuous type)?
2. What did he publish and was it any good?
3. How much more money can he bring in?
Thx.

rgbatduke
Reply to  michaelspj
December 20, 2014 4:18 pm

Don’t forget:
4. How many undergraduates did he molest? (Ignore if under 3.)
5. How many classes was he forced to stop teaching in mid-semester because students rioted outside of class? (Ignore if under 2.)
6. How many connections does he have to top-rank Universities so that he can place his postdocs well?
The person cannot be an active liability to the department. Well, unless the answers to 1 and 3 are very large numbers.
In fact, really it boils down to 3, doesn’t it?
rgb

mpainter
Reply to  michaelspj
December 20, 2014 4:41 pm

Never had an inkling academia was so rough. Glad I was a longshoreman.

Reply to  michaelspj
December 20, 2014 6:28 pm

The person cannot be an active liability to the department.

What is never considered is the liability that the administration is to academics … academics, in their minds, are just the pawns on the chessboard.

PerT
December 19, 2014 11:45 pm

The graph somehow resembles a hockey stick

PerT
Reply to  PerT
December 19, 2014 11:53 pm

… and a mirrored one

Geckko
December 19, 2014 11:55 pm

Just a point on the maths.
If the actual forcing is 40% lower than the models’ assumptions, the predicted temperature increase from a CO2 rise is likely to be about 67% more than will actually occur
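Geckko’s arithmetic checks out: if reality is 40% below the model value, the model overshoots reality by 1/(1 − 0.4) − 1 ≈ 67%. A short check, with illustrative values:

```python
# If the true value is 40% lower than the models assume, the model
# prediction overshoots reality by 1/(1 - 0.4) - 1, i.e. about 67%.
model_ecs = 3.2
true_ecs = model_ecs * (1 - 0.40)               # 40% lower than the model value
overshoot = (model_ecs / true_ecs - 1) * 100    # percent overshoot
print(f"model overshoots by {overshoot:.1f}%")  # 66.7%
```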

Steve Jones
December 20, 2014 2:15 am

As soon as any attempt is made to prevent access to results, be suspicious. Very suspicious.

December 20, 2014 3:50 am

A few months ago (26th of October) I made a similar approach: http://kaltesonne.de/wp-content/uploads/2014/10/sensi2.gif . I compared the trends (GISS) to 2004 and to 2014. It’s clear that the model tuning from 1975 to 2004 (brown for obs. and green for CMIP5) produced some kind of good agreement between the CMIP5 model mean and the observations. If one looks at the trends to 2014 (blue for obs. and purple for CMIP5), the divergence is clearly visible. The trend of the model mean is about 30% too high. This could be solved very easily: a 30% reduction of the model-mean TCR (which is now about 1.9) gives a TCR of about 1.33. That is just the value of Lewis/Curry.
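The TCR scaling in the comment above is straightforward to verify (a sketch using the quoted values):

```python
# frankclimate's scaling: the model-mean trend is ~30% too high, so
# reduce the model-mean TCR (~1.9 K) by 30%.
tcr_model = 1.9
tcr_reduced = tcr_model * (1 - 0.30)
print(round(tcr_reduced, 2))  # 1.33, roughly the Lewis/Curry value
```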

December 20, 2014 6:27 am

The continuing failure to reconcile this difference in a timely fashion is evidence not only of bias but of intentions that go beyond authentic climate science.
There can be no other interpretation. The climate models, with their current physics and math, are busted. It would be like a meteorologist who predicted sunny skies for Tuesday, after it had rained half the day on Tuesday, adjusting his forecast to partly sunny and not acknowledging that it’s actually raining.
In climate science, because the “projections” are in decades instead of days, there is no accountability since the majority of the projection time frame is always in the distant future.

Gary Pearse
December 20, 2014 7:14 am

Dr Norman Page
December 19, 2014 at 11:47 am
“It is silly to assume that the climate system is chaotic when it clearly isn’t. There are obvious periodicities in the Milankovitch cycles which have been stable for hundreds of millions of years. Similar quasi-periodicities are seen in the solar activity and temperature data.”
Yes, I’ve commented before similarly in response to rgb’s frequent plaint that it’s an intractably chaotic system. He’s correct, depending on what scale we are looking at, but understanding climate is, by definition, the integral of details. The gas laws, for example, are, at the molecular level, a chaotic system, but our objectives in employing the gas laws are not to locate every molecule in space and time. The statistical integration of this chaotic system is, of course, used unfailingly throughout a wide range of our technology. The climate system is complex because of the number of variables, but in a truly chaotic system, temperatures would not oscillate within a mere 2% (in kelvin) above and below an average.
As an engineer, looking at the grand scale, I would say that the main factor that stands out is the amazing in-built control of the system. Even Milankovitch cycles, and asteroids that from time to time wipe out 75%+ of species, do not tip anything. Rather they are countered and compensated for, keeping climate in its long-term range. An engineer with a geologically long view could build an engine that would run on these oscillations (the bearings and material fatigue would be a problem!). What we are looking for is the equivalent of the gas laws. Maybe water in its different phases quantitatively provides 75% of the answer.
A powerful support for the correctness of the paleoclimatology evidence is life itself. Example: we still have creatures in the sea related to Ordovician nautiloids from half a billion years ago. Scroll down and look at the fossil varieties.
https://www.tonmo.com/community/pages/nautiloids/

Reply to  Gary Pearse
December 20, 2014 7:20 am

Excellent points. Agreed.

Reply to  Gary Pearse
December 20, 2014 7:46 am

Gary
For an empirically based refutation of the CAGW meme see also
http://www.seipub.org/des/paperInfo.aspx?ID=21810
which states
“The planetary radiative balance is maintained by the equilibrium cloud cover which is equal to the theoretical equilibrium clear sky transfer function. The Wien temperature of the all sky emission spectrum is locked closely to the thermodynamic triple point of the water assuring the maximum radiation entropy. The stability and natural fluctuations of the global average surface temperature of the heterogeneous system are ultimately determined by the phase changes of water. Many authors have proposed a greenhouse effect due to anthropogenic carbon dioxide emissions. The present analysis shows that such an effect is impossible.”
The post at
http://climatesense-norpag.blogspot.com/2014/10/comment-on-mcleans-paper-late-twentieth.html
provides estimates of the timing and amplitude of the likely coming cooling. The main emergent phenomena for human time scales of interest is the 1000 year quasi-periodicity in the temperature data. Forecasts which ignore this obvious periodicity are really worthless and can be safely ignored.

basicstats
Reply to  Gary Pearse
December 21, 2014 2:07 am

On the issue of chaotic climate dynamics, this seems a matter of precise definition. From reading rgbatduke in the past, he means ‘endogenous’ climate (ie without external shocks) has (2?) (bounded) attractor sets within which the dynamics is chaotic. Sometimes called strange attractors, although this probably opens up more issues of definition. One assumes no one really means climate dynamics is literally chaotic over the entire ‘climate space’, whatever that might exactly be. Although a professor of climate models did recently refer to semi-chaotic dynamics somewhere! Who knows what that means? Incidentally, the definition of chaotic dynamics permits periodic dynamics of any length for some (starting) points in the relevant space.

Jim G
December 20, 2014 8:52 am

Gary Pearse says:
“Even Milankovic etc., and asteroids that from time to time wipe out 75%+ of species, do not tip anything.”
I would label variation from dense tropical growth to snowball Earth a significant “tip”. However, that said, it is not likely CO2 plays a significant role. I believe it is obvious that the 70% of the Earth that is covered with water, the atmosphere that the planet has so far maintained, and the geothermal heat of our planet, along with the relative consistency of solar radiation, all play a significant role in keeping climate suitable for life of one type or another. The relationships of these major players in climate are not, however, well established and do in many ways resemble chaos.

December 20, 2014 10:20 am

Dr. Norman Page is wrong: the climate is random and chaotic, but subject to cycles if they are extreme enough in magnitude and duration, if different factors phase together, and depending on the initial state of the climate on earth when these cycles take place.
One will never get the same climate result from given cycles; the best one could hope for is a general trend in the climate.
Dr Norman Page
December 19, 2014 at 11:47 am
“It is silly to assume that the climate system is chaotic when it clearly isn’t. There are obvious periodicities in the Milankovitch cycles which have been stable for hundreds of millions of years. Similar quasi-periodicities are seen in the solar activity and temperature data.”
I agree with the statement below to a point, but cycles superimpose themselves upon the chaotic climate.
Yes, I’ve commented before similarly in response to rgb’s frequent plaint that it’s an intractably chaotic system, which causes many of the cyclic effects to get lost in the noise of the climate system unless they are extreme.

December 20, 2014 10:51 am

For making climate policy, the general trend is a good start. Based on the 1000-year cycle, it is reasonable to suppose that the general trend for the next 600 years or so will be down once we pass the current peak. It is also reasonable to use, as the most obvious first working hypothesis, that the trends over shorter time periods may be similar to the 50-year moving average seen in Fig 9 at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
The drop in solar activity in 2005-6 seen in Fig 13 suggests that we are past the driver peak.
For a test of the general idea that the changing Neutron count is a useful proxy for changing solar activity which shows up in global temperatures with a time lag which varies according to which climate metric we use, we may consider that the sharp drop in the Ap index at about 2005-6 should presage a noticeable temperature drop in the RSS global temperatures in about 2016-17.
The lag between driver and climate has been variously estimated depending on the climate metric used. I think the RSS temps may be the canary in the coal mine, with a lag of as little as 12 years.

December 20, 2014 11:02 am

Sorry, I managed to add 12 incorrectly; the drop should be in 2017-18.

December 20, 2014 11:05 am

This solar cycle is really strange in that the maximum is still going on 7 years into the cycle, and the solar lull of 2008-2010 surprised everyone with how deep it was.
This could mean this solar cycle will be 14 years in length (the longest, or close to it, ever), which could have climatic implications if solar activity drops to very low levels and stays there going forward. I think it will.

December 20, 2014 11:33 am

Salvatore, we are not seven years into the cycle. See the neutron data in Fig 14 of my link. The minimum was right at the end of 2009, so we are now only 5 years in, just a couple of months away from the cycle 24 peak using the neutron data.

December 20, 2014 12:03 pm

http://wattsupwiththat.com/2008/01/04/solar-cycle-24-has-officially-started/
Yes we are. Read the article, one of many which confirm this.

December 20, 2014 12:07 pm

One more source, Wikipedia:
2008
On January 4, 2008, a reversed-polarity sunspot appeared, and this signaled the start of Solar Cycle 24. It was high latitude (30º N) and magnetically reversed. NOAA named the spot AR10981, or sunspot 981 for short.
Sunspot 1007 produced the first solar flare above the B-class on November 2, 2008.
Sunspot 1009 produced the first solar flare above the C-class, a C1.4, on December 11, 2008. Only a few sunspots were observed on the surface of the Sun throughout 2008.

December 20, 2014 12:13 pm

If you want to tie yourself in knots about one sunspot, feel free. For practical purposes the neutron data is more useful, and the minimum is very clearly at end-2009. The SSN peak was in Feb 2014; the neutron peak usually lags that by 10 months to a year. We are now very close.

December 20, 2014 12:17 pm

Your Wiki article says “Only a few sunspots were observed on the surface of the Sun throughout 2008.”
Not of great significance in my opinion; more useful, especially for climate matters, to go with the neutron count.

December 20, 2014 12:44 pm

Nevertheless, the same criterion was used for the start of sunspot cycle 24 as was used for other cycles; therefore this is on track to be one of the longest, in contrast to those other cycles, which could be significant.
One has to be consistent in using the same criterion for each sunspot cycle to see how it ranks.

December 20, 2014 12:46 pm

Sunspot cycle lengths are NOT based on neutron counts.

December 20, 2014 12:55 pm

http://en.wikipedia.org/wiki/List_of_solar_cycles
Dr Norman Page, here it is; these are the official records for sunspot cycle lengths.

December 20, 2014 1:24 pm

My main interest is climate forecasting. The most useful solar activity proxies for this purpose are the neutron count and the 10Be record for pre-instrumental times. I’m happy to use the time between the neutron minima as the most useful measure of solar cycle length for my purposes. If others wish to spend their valuable time counting sunspots, they are free to do so.

December 20, 2014 1:29 pm

To clarify the last comment: it is actually the solar minima, i.e. the neutron count maxima, we are talking about.

Donb
December 20, 2014 2:09 pm

On the exchange just above: IF one believes that cloud cover variations produced by variations in the cosmic ray flux in Earth’s atmosphere are the major cause of a solar influence on global temperature, then the neutron counts are superior. This is because the neutron flux directly monitors the cosmic ray flux, whereas individual sunspots are not a direct measure of the total solar output of wind and electromagnetic fields.

December 20, 2014 2:19 pm

Donb, exactly, though I would say a major cause rather than the major cause. What I say about all this in the post at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
is
“NOTE!! The connection between solar “activity” and climate is poorly understood and highly controversial. Solar “activity” encompasses changes in solar magnetic field strength, IMF, CRF, TSI, EUV, solar wind density and velocity, CMEs, proton events etc. The idea of using the neutron count and the 10Be record as the most useful proxy for changing solar activity and temperature forecasting is agnostic as to the physical mechanisms involved.
Having said that, however, it is reasonable to suggest that the three main solar activity related climate drivers are:
a) the changing GCR flux – via the changes in cloud cover and natural aerosols (optical depth)
b) the changing EUV radiation – top down effects via the Ozone layer
c) the changing TSI – especially on millennial and centennial scales.
The effect on climate of the combination of these solar drivers will vary non-linearly depending on the particular phases of the eccentricity, obliquity and precession orbital cycles at any particular time.”

December 21, 2014 11:08 am

This is where you are not quite correct. It is not just cloud cover changes due to the cosmic ray flux (which, by the way, is also influenced by the strength of the earth’s magnetic field) that will determine the climate outcome. If only it were that simple, but it is not. That is part of the puzzle, however.
Just as important are the amounts of EUV light the sun is generating, which can be shown to be tied to ozone concentrations, which impact the atmospheric circulation and thus have a big impact on the climate.
A zonal atmospheric circulation at times of high solar activity is NOT going to result in N.H. cooling at the onset.
In addition, there are many studies which show a pretty good correlation between sustained prolonged solar activity and major volcanic activity, which does/will influence the climate. The AP index is a very important indicator of how much or how little influence the sun may have upon the climate.
Then the initial state of the climate plays a big role in determining how much GIVEN solar activity may or may not change the climate. If the land/ocean arrangements are favorable (as they are now), this will promote solar changes affecting the climate. If the climate is close to interglacial/glacial conditions, this will amplify the effects of given solar changes upon the climate.
Then, for the big picture, Milankovitch cycles have to be considered, obliquity playing a big role as well as precession.
I also believe there are climate thresholds out there which could be activated if given solar conditions change enough in magnitude and duration, which could cascade the climate into a different regime through primary solar effects and all of the associated secondary solar effects.
The atmospheric circulation can impact oceanic circulations if extreme enough, which would have a major impact on the climate. One thought is that a more meridional atmospheric circulation could promote sea ice build-up in the North Atlantic (Nordic Sea), which would melt when brought southward, slowing down the thermohaline circulation and thus promoting N.H. cooling.
Also, some suggest that lunar tides may exert an influence upon oceanic circulation patterns.

December 21, 2014 11:10 am

I did not notice you had mentioned this. Correct.
b) the changing EUV radiation – top down effects via the Ozone layer
c) the changing TSI – especially on millennial and centennial scales.
The effect on climate of the combination of these solar drivers will vary non-linearly depending on the particular phases of the eccentricity, obliquity and precession orbital cycles at any particular time.”

December 21, 2014 11:19 am

Dr. Norman Page we are quite close in our thoughts.

Donb
December 21, 2014 3:29 pm

@SDP.
The Earth’s magnetic field has essentially nothing to do with the cosmic ray flux entering the atmosphere. The average cosmic ray proton energy is about 3 billion electron-volts. To significantly bend such energetic particles away from the inner solar system, as the Sun’s magnetic field does, requires that a magnetic field act over very large distances. Compared to the solar field, which acts over most of the solar system, the Earth’s field is weaker and of lesser extent.
The Earth’s field does act somewhat on solar energetic protons (up to several million eV in energy) and dramatically bends away solar wind protons of about 1 keV energy. Even if these lower-energy particles got into the atmosphere, they would be stopped by nuclear or kinetic reactions before entering the troposphere.

prjindigo
December 21, 2014 7:04 pm

The thought that an atmospheric climate model can produce a hyperbolic curve without either the removal of the heat source or a change in gravitational force is laughable. The moment the curve appears it proves the entire premise of the model wrong.

December 22, 2014 9:08 am

http://iceagenow.info/2012/10/scientists-link-magnetic-reversal-climate-change-super-volcano-time-period/
Donb, others see the earth’s magnetic field (as I do) as having a much bigger role in the climate.

December 22, 2014 9:18 am

If the earth had no magnetic field, the earth would have no atmosphere. The point is that there are degrees of weakness, and durations of that weakness, in the field which must have some significant impacts upon the earth. What level of weakness that is, I do not know.

Donb
Reply to  Salvatore Del Prete
December 22, 2014 11:48 am

@SDP
Reversal of the Earth’s magnetic field, and its likely diminished state during that process, can have a significant effect on magnetic field interactions with solar flare protons and coronal mass ejection particles, solar protons of lesser energy than cosmic ray protons. In the absence (or near-absence) of Earth’s magnetic field, energetic solar particles could enter the upper atmosphere. BUT, solar wind protons have a mass interaction distance of micrograms per square centimeter, and energetic (>5 MeV) solar protons have mass interaction distances of only several grams per square centimeter. The Earth’s atmosphere has a column mass of a bit over one kilogram per square centimeter (14.7 lbs per square inch). So even in the absence of Earth’s magnetic field, very few solar particles would penetrate to the lower atmosphere. In contrast, the secondary particles produced from nuclear interactions of cosmic ray protons do penetrate to Earth’s surface.


December 23, 2014 8:32 am

Donb, what is your general stance on the climate? What do you see as the main factors that govern it? Thanks.

Donb
December 23, 2014 3:20 pm

@SDP
I suspect that natural factors and greenhouse warming are all involved. Which natural factors are more important, I don’t know. Exactly how CO2 feedbacks (and clouds) are involved, I don’t know. I am interested in the science and try not to be biased in my thinking.
Related to this, even IF significant future global warming will occur, I do not think rapid change from fossil fuel to green energy will work without disrupting economies. The world will have to make a gradual evolution in its energy sources and take actions to mitigate negative effects of warming.