'Let’s face it. The climate has never been more boring.'

Image from an essay by Cliff Mass on “boring weather”.

Why you won’t see headlines as climate science enters the doldrums

Guest post by Dr. Robert G. Brown, Physics Department, Duke University (elevated from a comment on this thread: RSS Reaches Santer’s 17 Years)

This (17 years) is a non-event, just as 15 and 16 years were non-events, and non-events do not make headlines. Other non-events of the year: one of the lowest tornado counts (especially when corrected for under-reporting in the radar-free past) in at least the recent past, if not the remote past; the lowest number of Atlantic hurricanes since I was 2 years old (I’m 58); and the continuation of the longest stretch in recorded history without a category 3 or higher hurricane making landfall in the US. In fact, I don’t recall there being a category 3 hurricane in the North Atlantic this year, although one of the ones that spun out far from land might have gotten there for a few hours.

We (the world) didn’t have an unusual number of floods, and no major droughts seem to be under way. Total polar ice is unremarkable: arctic ice bottomed out well within the tolerances slowly being established by its absurdly short baseline, while antarctic ice set a maximum record (but just barely, hardly newsworthy) within ITS absurdly short baseline. The LTT (lower-troposphere) temperatures were downright boring. And in spite of the absurdly large spikes in GASTA in GISS versus HADCRUT4, on a so-called “temperature anomaly” relative to a GAST baseline that nobody can measure to within a whole degree centigrade, neither index did more than bounce around near neutral, however much the “trend” in GISS is amplified every second or third month by its extra-high endpoint.

The US spent months of the summer setting cold temperature records, but aside from making the summer remarkably pleasant in an anecdotal sort of way (the kind you tell your grandchildren when they experience more extreme weather: “Eh, sonny, I remember the summer of ’13, aye, that was a good one, gentle as a virgin’s kiss outdoors it was…”), it went largely unremarked at the time.

Let’s face it. The climate has never been more boring. Even the weather blogs trying to toe the party line and promote public panic — I mean “awareness” — of global warming are reduced to reporting one of GISS’s excessive spikes as being “the fourth warmest September on record” while quietly neglecting the fact that in HADCRUT4, RSS and UAH it was nothing of the sort and while even more quietly neglecting the fact that if one goes back a few months the report might have been that June was the fourth coldest in 20 years. Reduced to reporting a carefully cherry-picked fourth warmest event? Ho hum.

So, good luck in getting any news agency to report reaching 17 years in any or all of the indices — this isn’t news, it is anti-news. It is old. It is boring.

It is also irrelevant. If GASTA (Global Average Surface Temperature Anomaly) stubbornly refuses to rise for five more years, stretching the interval out to 20 to 22 years in a way that nobody can ignore, does this really disprove GW, AGW, or CAGW? It does not. The only thing that will disprove GW or CAGW is reaching 2100 without a climate catastrophe and without significantly more warming, or with net cooling. A demonstrated total climate sensitivity of zero beats all predictions or argument. The “A”(nthropogenic) part is actually easier to prove or disprove in a contingent sort of way, although it will probably take decades to do so. Contingent because, if there is no observed GW at all, AGW seems difficult to prove. But since we are in the part of the periodic climate cycle observed over the last 150 years where the climate holds neutral or cools around an overall warming trend, we might well see neutral to very slow warming even if AGW is correct, that is, even if there is an anthropogenic component to the long-term trend and oscillation that we can observe, but not really explain, over the last 150 years.

The one thing the 33 years of satellite measurements and increasingly precise surface temperature measurements have been able to prove is the one thing that the 17-year interval is truly relevant to. The GCMs used to predict CAGW suck. The GCMs in CMIP5 (the Coupled Model Intercomparison Project, phase 5) that contribute to the conclusions of AR5 are almost without exception terrible predictors of the Earth’s actual climate.

This conclusion is unavoidable. Even if they cannot all be rejected at the “95% confidence level”, almost none of them come close to predicting even GASTA alone, let alone RSS/UAH, global rainfall, frequency and violence of storms, etc. As we leave 2013’s hurricane season behind with almost no chance for another Atlantic storm this year, which GCM predicted the paucity of hurricanes and tornadoes over the last few years? Where are the droughts and floods? Which GCMs actually got the temperature distribution right (when they didn’t get the average or the average anomaly right, the answer is almost certainly “none of them”)?
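The kind of model-versus-observation comparison being described can be sketched in a few lines. Everything below is hypothetical: the “observed” series is synthetic noise around a flat trend, the model trend numbers (`model_A` etc.) are invented stand-ins rather than real CMIP5 output, and the 0.15 C/decade tolerance is an arbitrary placeholder for whatever statistical criterion one would really use.

```python
# Illustrative sketch only: the "observations" are synthetic noise around a
# flat trend, and the model trend numbers are invented stand-ins, NOT real
# CMIP5 output.  The 0.15 C/decade tolerance is likewise a placeholder.
import numpy as np

def decadal_trend(years, anomaly):
    """Least-squares slope, in degrees C per decade."""
    slope, _intercept = np.polyfit(years, anomaly, 1)
    return 10.0 * slope

rng = np.random.default_rng(0)
years = np.arange(1997, 2014)                      # the 17-year interval
observed = rng.normal(0.0, 0.05, size=years.size)  # flat "pause" plus noise

# Hypothetical ensemble-mean trends (C/decade) for three made-up models.
model_trends = {"model_A": 0.30, "model_B": 0.12, "model_C": 0.02}

obs_trend = decadal_trend(years, observed)
rejected = [name for name, trend in model_trends.items()
            if trend - obs_trend > 0.15]
print(obs_trend, rejected)
```

The point of the sketch is only that once the observed trend sits near zero, any model whose projected trend sits far above it fails such a test automatically; a real analysis would of course use the full ensemble spread, not a fixed tolerance.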

We are told “Catastrophic warming is coming; it is just around the corner.” We ask why, and without exception we are told “Because the 30 or more GCMs we carefully built in the 1990s in response to the CAGW threat, and normalized with the warming data from the ’70s and ’80s (not to mention Hansen’s initial model report from the late 1980s), all say so.” We then quite reasonably ask what they predicted for the last 20 years, and of course we can see that they all did indeed predict shockingly rapid warming. We then compare this to what actually happened, which is almost no warming over the last 20 years: a single warming pulse associated with the 1997/1998 ENSO event and then neutral ever since. We note that the warmest of the models still included in the CMIP5 data (because nobody ever rejects a model just because it doesn’t work) are a whopping 0.5 to 0.6 C warmer than reality; they are the models with a total sensitivity of 5 or 6 C by 2100, so they have to warm at 0.5 C a decade to get there.

This really is shocking. Shockingly bad science, shockingly dishonest political manipulation of policy makers on the part of scientists who participated in the creation of AR5 and permitted their names to give the report its weight.

As I’ve pointed out once and will point out again, by failing to be honest in AR5, by removing words that expressed honest doubt from the earlier draft and redrawing the figure to obscure the GCM failure, the IPCC has now gone far out on a limb that will end the careers of many scientists and politicians before AR6 if there is no significant warming by that time. And not merely significant warming, but a resumption of some sort of regular upslope in GASTA. Even if there is another ENSO-related burst of warming (which I’m sure is what they are hoping for), if it is only 0.2 C (and it is difficult to imagine that it could be much more, given evidence from the past) it will barely suffice to restore the warming trend to 0.1 C/decade, give or take a hair, roughly half of the lowest estimates of climate sensitivity. And they run the very real risk of getting to 2020 with GASTA basically the same as it was in 2000.

At that time, the hottest GCMs are going to be almost a full degree C too hot compared to reality. The people who contribute to the IPCC reports aren’t fools — most of them know perfectly well that the high sensitivity models are trash at this point, and they know equally well that it will no longer be possible to conceal this fact even from ignorant politicians by 2020 if there is no statistically significant warming by that time. Because it is an open secret that there was a cover-up that deliberately concealed this, effectively lying to policy makers, there will be a public scandal. Heads will roll.

The only way the IPCC can possibly avoid this as it proceeds is to issue a correction to AR5. Go back in and eliminate the GCMs with absurdly high sensitivity, the ones that obviously fail a hypothesis test when compared to the actual climate record. Personally I would advise eliminating at a much more generous level than 95%; a complete idiot with experience in computational modeling could go into these models and figure out what is wrong, given an additional 16 years of data. Simply retune the models until they can manage both the warming of the late 20th century AND the warming hiatus since. Models for which no tuning can reproduce the actual past go into the dustbin, period; ones that can manage it will all have a vastly lowered climate sensitivity and will produce a much larger fraction of warming from “natural” variability, and less from CO_2. Finally, insist that all models use common numbers for things like CO_2 and aerosol contributions, instead of individually tuning these largely cancelling contributions to reproduce an interpolated temperature change.

I’m guessing that over half of the participating models will simply go away at this point. They can then reconstruct figure 1.4 in the SPM and note the good news that, even though the remaining models will all still predict more warming than actually occurred, the warming they project by 2100 will be between 0.5 and 1.5 C, not 2.5 C or more. This is almost precisely in line with what was observed in the 19th and 20th centuries without CO_2, and will grant a far larger role to natural variability (and hence a smaller one to CO_2).

Why should they do this, even though it is near-suicide to do it at this point? Because it is sure thing suicide not to do it. Because it is the right thing to do. Because they have a queasy feeling in their tum-tums every time they look at figure 1.4 in the AR5 SPM and realize that the dent that they made in the car isn’t going to go away and Dad is going to be even more pissed when he finds out if they lie about it. After all, everybody knows that the worst models in CMIP5 are wrong at this point. The people that wrote the models and ran the models, they know that their models are broken at this point. It’s not like the failure of a model is difficult to detect or something.

If it were “just science”, all of this would have been happening in the literature for some time anyway. People would jump all over models that fail, because in the usual realm of science there is little money on the line and because trial and error and try try again is the normal order of business and what keeps you getting paid. Not so in climate science. Here it is all political. Hundreds of billions of dollars and the directed energy of the entire global civilization ride on the numbers. Here there is a real risk of congressional hearings where a flinty-eyed committee chair grills you by showing you GCM curves selected from figure 1.4 of the AR5 SPM and asks you “Sir, at what point was it obvious to you that this curve was not a good predictor of the future climate?” Because if the answer was “2012” — and given the REMOVED TEXT from the earlier draft of AR5 everybody knows that it was 2012 at the latest — that’s contempt of congress right there, given that AR5 directs billions of dollars in federal research money and hundreds of billions of dollars of subsidies and misdirected governmental energy at all levels from federal to state to local to personal.

We pay, pay, and pay again in the form of taxes, higher energy prices, neglect of competing services and goals — and what we pay pales to nothing compared to the terrible price paid by the third world for the amelioration of hypothetical CAGW. Millions of people die every year from respiratory diseases alone brought about because they are still cooking on animal dung and charcoal because coal burning power plants are now “unclean” and have artificially inflated price tags at every level.

If CAGW is a true hypothesis, then maybe — just maybe — it is worth sacrificing all of these people, most of them children under five, on the altar to expiate our carbon sins. But given this sort of ongoing catastrophe, this ongoing moral price we pay on the basis of the “projections” of the GCMs, how great is the obligation of the scientists who wrote AR5 towards “mere honesty”, to put down not their own beliefs but to put down the objective support for their beliefs given the data?

For some time the data has been sufficient to prove that the tools that claim the biggest, scariest AGW are simply incorrect, broken, in error, failed. Yet their predictions are still included in AR5, because without them the “catastrophe” disappears. And then we are forced to rebalance the cost of gradually accommodating the warming (while continuing to civilize and raise the standard of living of the third world) against the ongoing catastrophe of adopting measures that everybody knows will not prevent the catastrophe anyway (if the extreme models are correct), at the cost of a hundred million or more lives and unspeakable poverty, disease, and human misery perpetuated for decades along the way.

November 5, 2013 1:07 pm

The CAGW meme is built on the outputs of climate models. Many of the modelers and IPCC and Met Office scientific chiefs had a background in weather forecasting. In spite of the inability of the weather models to forecast more than about 10 days ahead, in an act of almost unbelievable hubris and stupidity, the modelers allowed themselves to believe, or at least proclaim, that they knew enough about the physical processes and climate-driving factors involved to forecast global temperatures for decades and centuries ahead. Indeed, many establishment scientists appear to think that humanity can dial up a desired global temperature by keeping CO2 within some appropriate limit. What arrant nonsense!
In practice the modelers have known for some time that their models have no skill in forecasting, and have indeed said so in the WG1 reports. Section 8.6 of the IPCC AR4 WG1 science section deals with forcings, feedbacks and climate sensitivity; the conclusions are in section 8.6.4, which deals with the reliability of the projections. It concludes:
“Moreover it is not yet clear which tests are critical for constraining the future projections, consequently a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed”
What could be clearer? The IPCC in 2007 said itself that we don’t even know what metrics to put into the models to test their reliability; i.e., we don’t know what future temperatures will be and we can’t calculate the climate sensitivity to CO2. This also raises a further question of what erroneous assumptions (e.g. that CO2 is the main climate driver) went into the “plausible” models to be tested in the first place. It means that the successive SPM uncertainty estimates take no account of the structural uncertainties in the models, and that almost the entire range of model outputs may well lie outside the range of real-world future climate variability. By the time of the AR5 report this is obviously the case, as the model outputs and reality have continued to diverge.
The key factor in making CO2 emission control policy is the climate sensitivity to CO2. By AR5 WG1 the IPCC is saying (Section 9.7.3.3):
“The assessed literature suggests that the range of climate sensitivities and transient responses covered by CMIP3/5 cannot be narrowed significantly by constraining the models with observations of the mean climate and variability, consistent with the difficulty of constraining the cloud feedbacks from observations ”
In plain English this means that they have no idea what the climate sensitivity is, and that therefore the politicians have no empirical scientific basis for their economically destructive climate and energy policies.
In summary, the projections of the IPCC and Met Office models, and all the impact studies which derive from them, are based on structurally flawed and inherently useless models. They deserve no place in any serious discussion of future climate trends and represent an enormous waste of time and money. As a basis for public policy their forecasts are grossly in error and therefore worse than useless.
Continuing to even discuss climate forecasting in the IPCC modeling context is a waste of time.
How then can we predict the future of a constantly changing climate? A new forecasting paradigm is required.
It is important to note that in order to make transparent and likely skillful forecasts, it is not necessary to understand or quantify the interactions of the large number of interacting, quasi-independent physical processes and variables which produce the state of the climate system as a whole, as represented by the temperature metric. I suggest a simple, rational, empirical approach to climate forecasting based on common sense and quasi-repetitive, quasi-cyclic patterns.
For an estimate of the coming cooling based on such an approach see
http://climatesense-norpag.blogspot.com

DayHay
November 5, 2013 1:33 pm

RGB, love your posts, but really, heads will roll? Simple honesty? Sheesh, you are assuming these folks are like you in some way? These are double-down pathological liars who have executive orders and MSM and large universities protecting their grant money. One bristlecone pine, please? You have no idea who and what you are dealing with. The “I” in IPCC stands for one thing: access to your pocketbook by folks who own the guns. Truth is not even part of the discussion and never has been.

rogerknights
November 5, 2013 2:18 pm

rgb says (above):
. . . if you were forced to put your own money down on your own model, you would (if you had any sense) decline. You would without any possible question doubt your own model, especially given the secular trends in the data that are clearly visible across HADCRUT4. You would probably conclude that your model is quite naive, and overfitting a linear trend on the training set that is coincidentally excessively steep.

I was doing OK on Intrade betting on annual temperatures against the warmists. Too bad Intrade is gone.
Incidentally, does AR5 cite Foster & Rahmstorf in an attempt to explain (away) the divergence between projections and reality?

rogerknights
November 5, 2013 2:30 pm

DayHay says:
November 5, 2013 at 1:33 pm
RGB, love your posts, but really, heads will roll? Simple honesty? Sheesh, you are assuming these folks are like you in some way? . . . Truth is not even part of the discussion and never has been.

“A mad world, my masters.”

November 5, 2013 3:34 pm

rgbatduke says:
November 5, 2013 at 6:13 am
*
Just my opinion – This really should be posted as an article. Everyone should read this. Brilliantly put, Rgb.

1sky1
November 5, 2013 4:02 pm

rgb:
Kudos for a many-faceted, realistic overview of the situation. In light of the demands of your duties at Duke, unless you type at 1000wpm, it leaves me wondering how you find time for essay-length comments. I only wish you had said “recurrent” instead of “periodic” climate cycle in your presentation, which minor inaccuracy deflects attention away from the major issues.

rogerknights
November 5, 2013 4:20 pm

@rgb: Maybe put your posts here into an e-book? And/or maybe Anthony could set up a tab at the top containing links to all your threads and comments.

November 5, 2013 4:48 pm

Taking the main post by rgbatduke and all subsequent comments by all commenters on this thread, including rgbatduke’s follow-up comments, can one reasonably find the primary fault that is both necessary and sufficient (N&S) for the entire 20+ years of the IPCC that culminated in the AR5? Is it within the environmentalist community, within certain types of media, within certain sets of politicians, within certain departments in academia’s halls, somewhere within the scientific community, or somewhere else entirely?
I find the N&S primary fault to be within the portion of the scientific community that supports a philosophy of science that views science primarily as an adaptable expedient in support of a predetermined climate answer, where the answer is supplied by an ideology. Without their complicity I do not see how the IPCC AR5 could exist.
John

John West
November 5, 2013 6:50 pm

John Whitman says:
”can one reasonably find the primary fault that is both necessary and sufficient (N&S) for the entire 20+ years of the IPCC that culminated in the AR5?”
One could argue that it all hinges on the failure of the educational system to instill good critical-thinking skills in the general public, together with the environmental movement’s encouragement of impassioned personalities to pursue science in order to “make a difference” instead of dispassionately and objectively searching for answers.

November 5, 2013 10:26 pm

@Steven Mosher:
Another fly by trolling from Mosher… who gets off on feeling important because we pay attention to his nonsensical musings.
I used to think you understood physics and math, but it seems you don’t. I’ve said this before: Kelvin is not the same as Celsius, nor is either the same as an anomaly in Celsius. They can be orders of magnitude apart.
Suppose the average test score is 100%, one student scores 98% and another scores 99%. As anomalies, one score would be −2, the other −1. In your world the score of −1 would be “50% lower” than the score of −2, when you know that’s simply not true.
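The test-score point above can be made concrete with a toy calculation. The 288 K baseline and the two temperatures below are round illustrative numbers, not measurements: the same half-degree change is a huge percentage of the anomaly but a tiny percentage of the absolute temperature.

```python
# Toy version of the test-score example: percentage comparisons of
# anomalies are baseline-dependent and essentially meaningless.
baseline_k = 288.0      # a round illustrative global-mean temperature, in K
t1, t2 = 287.0, 287.5   # two hypothetical absolute temperatures, in K

a1, a2 = t1 - baseline_k, t2 - baseline_k   # anomalies: -1.0 and -0.5

pct_of_anomaly = (a2 - a1) / abs(a1) * 100  # a "50%" change in the anomaly
pct_of_absolute = (t2 - t1) / t1 * 100      # ~0.17% change on the K scale
print(pct_of_anomaly, pct_of_absolute)
```

Shifting the baseline by even one degree changes `pct_of_anomaly` wildly while leaving `pct_of_absolute` essentially untouched, which is the whole objection to quoting percentage changes of anomalies.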

Richard Heyn
November 6, 2013 2:40 am

[snip – you are welcome to resubmit this without the ad homs -mod]

Craig King
November 6, 2013 9:51 am

Gary Hladik says:
November 4, 2013 at 1:57 pm (Edit)
—————————
There is nothing wrong about Mr Mosher. He is rigorous in his approach and unconcerned with being popular. Many may not agree with him but he always fights his corner and I have never seen him run away or try and defend something when he has been proved wrong. A right royal PITA and the better for all of us that he is.

Gary Hladik
November 6, 2013 10:33 am

Craig King says (November 6, 2013 at 9:51 am): “There is nothing wrong about Mr Mosher.”
I’m a bit baffled by your comment, as it doesn’t contradict anything I wrote.

1sky1
November 6, 2013 2:50 pm

rgb:
Allow me to anticipate how the promoters of the AGW meme will attack the realistic views that you eloquently express here. The first thing they will say is that, while absolute levels of global temperatures may be difficult to pin down accurately prior to the satellite era, the trend of “anomalies” is what really matters, since it reveals the effect of increasing CO2. Then they will point to your own graph of HADCRUT4 as incontrovertible empirical evidence, while soft-shoeing the enormous gaps and systematic biases involved in the manufacture of that “historical” index. Yes, they will admit the presence of natural multidecadal cycles (which have been progressively diminished in all published GASTA time-series by data manipulation via “scalpel” corrections and various other “adjustments”), but they will insist that the “physics-based” trend will continue into the foreseeable future after the present “pause” runs its course.
What may bring about a stronger sobering is the recognition that GHGs introduce no energy of their own into Earth’s climate system; they merely redistribute and modulate, in the case of cloud-producing H2O, the effects of thermalized insolation. You can press that fundamental point very effectively.

November 6, 2013 8:45 pm

Craig King says:
November 6, 2013 at 9:51 am
Gary Hladik says:
November 4, 2013 at 1:57 pm (Edit)
—————————
There is nothing wrong about Mr Mosher. He is rigorous in his approach and unconcerned with being popular. Many may not agree with him but he always fights his corner and I have never seen him run away or try and defend something when he has been proved wrong. A right royal PITA and the better for all of us that he is.
+++++++++++++
Really – look up his name in posts. He flies by – drops drivel and hides away.

rgbatduke
November 7, 2013 3:00 pm

There is no need to disprove GW – it has been happening recently (since end of LIA) just as GC has happened many times in the past.
I believe, if Dr. Brown is specifically talking about proving or disproving the models, then his assertions seem correct.

I was specifically referring to proving or disproving the CAGW hypothesis, which is contingent upon the AGW hypothesis. That is, that humans are causing global warming (not that GW has or is occurring) and the contingent proposition that the GW caused by us Anthropes will be Catastrophic.
Those are specific hypotheses. On the time frame of 2100, disproof of AGW would be no observed global warming, anthropogenic or otherwise, by 2100, while proof would be observed global warming by 2100 AND a positive connection of that warming to human activity. Ditto “catastrophic” AGW: disproof is the failure of any GW at all, the failure of any proven anthropogenic connection to whatever GW occurs, or the failure of any GW (anthropogenic or not) to be catastrophic. Proof would be catastrophe caused by GW that has been sufficiently strongly connected to human activity by experiments.
rgb

rgbatduke
November 7, 2013 3:12 pm

What may bring about a stronger sobering is the recognition that GHGs introduce no energy of their own into Earth’s climate system; they merely redistribute and modulate, in the case of cloud-producing H2O, the effects of thermalized insolation. You can press that fundamental point very effectively.
Sure, but bear in mind that I personally have absolutely no doubt in the reality of the GHE. It is actually very simple physics, and there is no doubt at all that it occurs. You are welcome to buy Grant Perry’s book if you do not understand this, and work through it. The reality of the effect, however, does not suffice to prove the specific pattern of feedbacks that go into the overall climate sensitivity, and even the model for the variation of the GHE with CO_2 concentration itself is open to some doubt. The CO_2-linked GHE we observe is overwhelmingly dominated by the first 100 ppm of concentration. That does not mean that the GCMs are all correct, however, just because they all implement some variation of CO_2-based radiative physics, especially by the time one gets through adding feedback from water vapor, albedo variation from clouds, cloud variation from aerosols, the direct albedo effect of aerosols, the modulation of radiation due to heat redistribution (or the lack thereof), buffering due to the ocean, latent heat transport, variations in atmospheric chemistry due to variations in solar state, variations of insolation with orbital mechanics, and more. This is reflected by the fact that the GCMs are not in good agreement even amongst themselves.
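The diminishing-returns behavior described above (the first increments of CO_2 concentration doing most of the work) is conventionally captured by the simplified logarithmic forcing approximation ΔF ≈ 5.35 ln(C/C0) W/m². This is a widely quoted textbook-level approximation, not code taken from any GCM; a minimal sketch:

```python
# A standard simplified approximation, NOT anything extracted from a GCM:
# radiative forcing grows only logarithmically with CO2 concentration,
# so each doubling contributes the same increment.
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate CO2 radiative forcing in W/m^2: 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

first_doubling = co2_forcing(560.0) - co2_forcing(280.0)    # ~3.71 W/m^2
second_doubling = co2_forcing(1120.0) - co2_forcing(560.0)  # same again
print(first_doubling, second_doubling)
```

The logarithm makes the point directly: going from 280 to 560 ppm buys exactly as much forcing as going from 560 all the way to 1120 ppm, which is why the low-concentration end dominates.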
So I don’t think any climate researchers will be impressed by the assertion that GHGs introduce no energy of their own, as that is a straw man. That isn’t how they work and everybody knows it, so reasserting an irrelevant fact changes nothing. The insulation of my house doesn’t introduce additional heat, but the house sure gets warmer with it than without it once one takes the furnace into account.
rgb

rgbatduke
November 7, 2013 3:18 pm

Kudos for a many-faceted, realistic overview of the situation. In light of the demands of your duties at Duke, unless you type at 1000wpm, it leaves me wondering how you find time for essay-length comments. I only wish you had said “recurrent” instead of “periodic” climate cycle in your presentation, which minor inaccuracy deflects attention away from the major issues.
My friends and colleagues have sometimes referred to me as the “rgbbot”, suspecting that I’m actually a beowulf-based AI.
And you are quite right — recurrent would be a far better and more precise term than periodic. It expresses the reservations that I’ve (of course) been holding in my own mind, and is why I often introduce this as numerology, not science per se.
rgb

1sky1
November 7, 2013 5:21 pm

rgb:
I agree that the atmosphere provides an insulating layer between the surface and space, which keeps the surface warmer than it would be in its absence. Earth’s atmosphere, however, is warmed primarily by heat transfer via moist convection, rather than by radiative transfer. This has been shown by numerous experiments around the globe, in which some of my former colleagues participated. [You may google “Bowen Ratio” for some references.]
The customary academic explanation nowadays is much-too-simple GHE physics, which leads to unrealistically high temperature calculations. In fact, based on radiation-only reasoning, Adrian Gill’s “Ocean-Atmosphere Dynamics” reaches the aphysical conclusion that, with enough absorbing layers in the atmosphere, the surface temperature can exceed that of the Sun.
Having digested R.M. Goody’s “Atmospheric Radiation” and John A. Dutton’s “The Ceaseless Wind” many decades ago, I dispute the basis of such conclusions. And I doubt that “Grant Perry’s book”, which is nowhere identified on the web, would persuade any field-experienced geophysicist otherwise. That there always is a radiative exchange between surface and atmospheric matter is as boring as the exchange between the floor and the insulated attic in a heated room. It’s the heat source and the net transfer by all mechanisms, not just directional radiative intensity, that really matters. That was the thrust of my remarks.
In any event, I don’t wish to detract in any way from your excellent presentation. And, being an incompetent typist, I simply can’t spare time from my professional duties for prolonged blog discussions.

rgbatduke
November 8, 2013 6:21 am

Having digested R.M. Goody’s “Atmospheric Radiation” and John A. Dutton’s “The Ceaseless Wind” many decades ago, I dispute the basis of such conclusions. And I doubt that “Grant Perry’s book”, which is nowhere identified on the web, would persuade any field-experienced geophysicist otherwise. That there always is a radiative exchange between surface and atmospheric matter is as boring as the exchange between the floor and the insulated attic in a heated room. It’s the heat source and the net transfer by all mechanisms, not just directional radiative intensity, that really matters. That was the thrust of my remarks.

Sorry, I should have included the link:
http://www.amazon.com/First-Course-Atmospheric-Radiation-2nd/dp/0972903313
If you are a geophysicist (and hence physicist) then I doubt that you will argue with much of the book. I teach graduate-level electrodynamics and quantum mechanics from time to time (at the moment I’m teaching an endless series of intro physics, but that’s the way our department rotates teaching) and the physics seems quite sound. In fact, it seems quite elementary. Nowhere in the book does Petty make a concrete assertion about e.g. the total climate sensitivity, BTW. It does, however, contain the TOA and BOA spectra that to any physicist are rather concrete proof of the GHE.
As for heat transfer mechanisms, I certainly don’t disagree that heat transfer by all mechanisms is important, but the point of the GHE is that without the atmospheric opacity in the LWIR bands, the troposphere would descend more or less to the surface of the Earth. But I’m guessing that we agree on all of this. I just don’t think there is a smoking gun disproof of CAGW and also don’t think that one can likely point to any particular subroutine in a GCM and say “aha, this is the wrong physics!” The errors are more subtle than that — omission, incorrect initialization, inadequate granularity, and a touch of good old fashioned code tuning to get a desired result.
rgb

Gail Combs
November 8, 2013 6:56 am

rgbatduke says: November 8, 2013 at 6:21 am
… I just don’t think there is a smoking gun disproof of CAGW and also don’t think that one can likely point to any particular subroutine in a GCM and say “aha, this is the wrong physics!” The errors are more subtle than that — omission, incorrect initialization, inadequate granularity, and a touch of good old fashioned code tuning to get a desired result.
>>>>>>>>>>>>
Dr. Brown, could you answer the question of whether the Climate Models are projections for the different storylines and scenarios proposed by Ged Davis? If the ensemble is not for business as usual but for an array of futures including a “much reduced greenhouse gas emissions alternate scenario”, then that spaghetti graph is a monumental lie, since none of the ‘Projections’ fit reality, including their desired scenario.
I cannot tell if the climate models in AR5 much discussed here at WUWT are the same models Ged Davis prepared the scenarios for, or if they are for a completely different section of the IPCC’s report.
Link to Climategate e-mail: http://assassinationscience.com/climategate/1/FOIA/mail/0889554019.txt
Excerpt:
Dear Colleagues:
I am sending you a copy of Ged Davis’ IPCC-SRES Zero Order Draft on
storylines and scenarios….
Draft Paper for the IPCC Special Report on Emissions Scenarios…
Contents
1. Introduction
2. Scenarios – overview
3. Golden Economic Age (A1)
4. Sustainable Development (B1)
5. Divided World (A2)
6. Regional Stewardship (B2)
7. Scenario comparisons
8. Conclusions
Appendix 1: Scenario quantification
1. Introduction
The IS99 scenarios have been constructed to explore future developments in the global environment with special reference to the production of GHGs….
1.1 What are scenarios?
Scenarios are pertinent, plausible, alternative futures. Their pertinence, in this case, is derived from the need for climate change modelers to have a basis for assessing the implications of future possible paths for Greenhouse Gas Emissions (GHGs). Their plausibility is tested by peer review, in an open process, which includes their publication on the World Wide Web.
There are clearly an infinite number of possible alternative futures to explore. We have consciously applied the principle of Occam’s Razor, seeking the minimum number of scenarios to provide an adequate basis for climate modelling and challenge to policy makers…
2.1 Scenarios: key questions and dimensions
Developing scenarios for a period of one hundred years is a relatively new field. Within that period we might expect two major technological discontinuities, a major shift in societal values and a change in the balance of geopolitical power….

rgbatduke
November 8, 2013 2:41 pm

Dr. Brown, could you answer the question of whether the Climate Models are projections for the different storylines and scenarios proposed by Ged Davis? If the ensemble is not for business as usual but for an array of futures including a “much reduced greenhouse gas emissions alternate scenario”, then that spaghetti graph is a monumental lie, since none of the ‘Projections’ fit reality, including their desired scenario.
Hi Gail,
I suppose the best answer I could give would be “No”. Both no, I can’t answer the question, and no, the GCMs are not really for that purpose, AFAIK, although there have been figures published in the past for zero emissions, continued emissions, and increased emissions that probably are. But the GCMs (however they were USED in AR5) are for the purpose of modelling the climate. They can be judged on the basis of how well they modelled the climate over the trial period that began as soon as the models were initialized in such a way that they did well over the training period. Since they were initialized to show excessive CO_2-driven warming in what was likely a natural spike that had little to do with CO_2, as soon as the natural process responsible for that warming shifted, the models all rapidly diverged from nature. That’s by far the simplest interpretation, and one that at this point I think a lot of even the most ardent warmists would agree with.
The solution is simple enough. In the short run, throw out all of the models that failed to reproduce this shift, as obviously they cannot actually predict what nature does, let alone what nature plus CO_2 does. This alone would significantly lower climate sensitivity estimates, and we are starting to see papers that are doing precisely that. We will see more, every year the climate fails to warm, because every year without significant warming further constrains feedbacks and overall sensitivity and even though the researchers are struggling to overcome a really serious bias towards their Bayesian prior assumption of huge sensitivity, data talks, bullshit walks, and those estimates must come down eventually in the face of the data.
In the longer run, build better models. The idea of building good climate models is not itself flawed. The problem has been building climate models that politically interpreted the late 20th century warming as being somehow different from the nearly identical early 20th century warming and blaming the former entirely on CO_2. This was then built right into the models in the form of maximum direct CO_2 forcing and strong water vapor feedback, and compensating for this by cranking up the (mostly unknown) contribution of aerosols until they got a good fit to the 1970s-1990s warming.
The problem with multivariate systems is there is usually more than one way to get a model to agree with training data, but those different ways usually diverge once you get outside of the training set, and at most one of them will be actually correct. Discovering the actually correct climate description almost certainly requires more than a stretch of monotonic warming to use as a training set.
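That training-set pitfall is easy to demonstrate with a toy fit (hypothetical data, nothing to do with any actual GCM): two models that agree closely over a stretch of monotonic “warming” can diverge sharply the moment they are extrapolated past it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "training period": 26 points of noisy, monotonic warming
t_train = np.linspace(0.0, 1.0, 26)
y_train = 0.5 * t_train + 0.05 * rng.standard_normal(t_train.size)

# Two models that both fit the training data about equally well
lin = np.polyfit(t_train, y_train, 1)    # 2 free parameters
high = np.polyfit(t_train, y_train, 9)   # 10 free parameters

inside = np.linspace(0.0, 1.0, 200)      # within the training window
outside = np.linspace(1.0, 1.5, 200)     # extrapolation region

gap_in = np.max(np.abs(np.polyval(lin, inside) - np.polyval(high, inside)))
gap_out = np.max(np.abs(np.polyval(lin, outside) - np.polyval(high, outside)))
# gap_in is small; gap_out is many times larger -- agreement over the
# training set says little about agreement outside it
```

At most one of the two fits can be the right description, and the training window alone cannot tell you which.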
If they trained on data from 1940 to 2013, OTOH, they’d have at least two periods of neutral to negative temperature changes AND a period of positive change (assuming, of course, that the various GASTA/GAST estimates are reasonably correct over this interval, which is far from certain). A model that could correctly describe those changes would likely be more robust than the ones built and initialized to produce monotonic warming. None of this is really surprising — except that the writers of AR5 and its SPM would completely neglect mentioning it and indeed, would treat this divergence almost as if it is a problem with the actual global temperature instead of a problem with the models.
And it may not be a problem with the models — it is important to remember that this is a possibility too. They could still be right. In a year, two years, global temperature COULD spike up a half a degree C or whatever and the curves could rejoin one another. The hard thing about prediction is that it is about the future, and it isn’t terribly easy to prove or disprove a prediction except by waiting, and imprecise or multivariate predictions are difficult to check even by waiting. It worked over and over again for Biblical prophets — make an obscure prophecy and wait — sooner or later an event occurs that you can point at as proof that your prophecy was correct. So if I prophesy “There will be an enormous storm, the strongest storm in 100 years” I have zero chance of being proven incorrect, and every chance of being proven correct. Eventually. If I say “There will be an enormous storm this year and it will strike central Florida”, well, only really stupid prophets would prophesy that. Too easy to get wrong, too “final” when you do.
So it really doesn’t matter how many years the climate stays flat and boring. It could always get exciting next year, and since the GCM “prophecies” are non-specific, any excitement at all can be counted as evidence that they are right, or will eventually be. All we can say is it hasn’t warmed the way they predict yet. We could say more, of course, if we used probability theory and statistics theory to assess the models as null hypotheses — we could then say that the probability that the models are correct is smoothly decaying in time as long as the climate remains neutral or fails to warm as they describe. How long should we wait before we reject the null hypothesis — or to put it more precisely, given the probability that this model is correct AND we have the observed climate different from its predictions, how small does that probability have to get before we conclude that the theory given the data is very, very implausible? The usual rule (lacking Bayesian priors that bias the estimate) is less than 0.05, but really this is pretty arbitrary. Less than 0.01 is pretty safe — it is asserting that one would observe the prediction AND the data in only one Universe in 100 that were nearly identically prepared, so we’d have to be pretty unlucky to be in that one. Less than 0.001 and there is only one chance in a thousand that the model is right, given the data.
However, given many independent models (not independent runs of a single model), all of them too warm, the odds change. It becomes much less stringent to reject a model because you have so many chances to get the climate right and the models themselves have some noise and variance — you’d expect even a bad model to have a chance of getting the observed climate right at least some of the time. If you roll a 20-sided die looking for 0.05 acceptance 20 times, you can’t be surprised if you get it around one time, and GETTING it one time doesn’t mean you should accept the model, it means you should reject it!
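That multiple-comparisons point can be checked with a short simulation (a sketch under the assumption that each wrong model independently “matches” the observed climate with probability 0.05 by pure chance):

```python
import numpy as np

rng = np.random.default_rng(42)
n_models, n_trials, alpha = 20, 100_000, 0.05

# Suppose all 20 models are wrong, but each one independently happens to
# "match" the observed climate with probability alpha purely by chance.
matches = rng.uniform(size=(n_trials, n_models)) < alpha

expected_matches = matches.sum(axis=1).mean()       # about 1 lucky model per trial
prob_at_least_one = np.any(matches, axis=1).mean()  # about 1 - 0.95**20, i.e. ~0.64
```

So with an ensemble of twenty bad models, one of them agreeing with the data is roughly what chance alone predicts, and is no evidence at all in the ensemble’s favor.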
rgb

1sky1
November 8, 2013 5:25 pm

rgb:
Without even glancing at Petty’s text, I offer a few brief caveats and clarifications:
Much caution needs to be exercised in interpreting widely published emission
spectra. They are often selected from ultra-dry regions (e.g. Sahara,
Antarctica) to de-emphasize the usually dominant role of water-vapor, while
casting CO2 in a disproportionate role. Inasmuch as molecular
collision with “inert” atmospheric constituents is what most frequently
grounds the excited state of CO2, there is energy transfer from CO2’s
characteristic spectral lines to the broad continuum (extending into the
microwave region) emitted by those constituents. Such transfer is seldom
properly accounted for; in fact, it is effectively concealed by displaying
the spectra only in a much narrower range.
Contrary to the impression thus created, total atmospheric radiation is
nowhere near as dependent upon trace gases. I would argue that they cannot
store more energy than their minuscule total mass allows. To the extent
that dry convection can heat the vastly greater atmospheric bulk, GHG’s
strike me as hardly indispensable in maintaining a tropopause at an
altitude roughly equal to maximum convective reach. Furthermore, LWIR
atmospheric backscattering is not on equal thermodynamic footing with
high-entropy insolation. Its low-entropy radiation is almost totally
absorbed in the surface skin of the oceans, thereby sustaining
temperature-lowering evaporation. Instead of being an external “forcing,”
“backradiation” merely recirculates energy within the system.
Perhaps we’ll have an opportunity to take up these issues again in a more
appropriate forum. Have a good weekend!

rgbatduke
November 9, 2013 7:17 am

I would argue that they cannot
store more energy than their minuscule total mass allows. To the extent
that dry convection can heat the vastly greater atmospheric bulk, GHG’s
strike me as hardly indispensable in maintaining a tropopause at an
altitude roughly equal to maximum convective reach. Furthermore, LWIR
atmospheric backscattering is not on equal thermodynamic footing with
high-entropy insolation. Its low-entropy radiation is almost totally
absorbed in the surface skin of the oceans, thereby sustaining
temperature-lowering evaporation. Instead of being an external “forcing,”
“backradiation” merely recirculates energy within the system.

Not sure what you mean by a “more appropriate forum”, so I’ll reply here. First, the mass fraction of the GHG’s is almost totally irrelevant to the GHE. The simplest single-slab model of the GHE doesn’t have a stated heat capacity for the atmosphere at all — the atmospheric mass/heat capacity determines the relaxation rates but not the steady state temperature. What matters is the opacity of the atmosphere in certain bands of LWIR. Second, nobody suggests that LWIR backscattering is on an “equal thermodynamic footing” with insolation — the latter is the actual energy source for the system, just as outer space at 3 K is the actual energy sink, to within more or less irrelevant perturbations. However, I fail to see your point.
Again, in my house in winter the heat source is burning natural gas, the heat sink is the cold outside air, but insulation in between the two, while producing no energy whatsoever, can and does substantially modify the achievable temperature difference between the inside of the house and the outside by modulating the rate of energy flow from the inside (once delivered) to the outside. In the case of the Earth, the sun delivers energy at the TOA at some rate peaked at SW wavelengths. In a single slab model, this energy is partially absorbed by the atmosphere, hits the surface, is partially reflected (with the reflected component again being partially absorbed by the atmosphere on the way out), is partially absorbed by the surface, is reradiated by the surface as LWIR, is partially absorbed by the atmosphere on ITS way out. The atmosphere gains energy from direct insolation and LWIR surface emission and symmetrically reradiates it, with half escaping and half being returned to the surface. All radiation occurs at rates proportional to temperature to the fourth. This constitutes a system that one can easily solve for the surface equilibrium temperature as a function of insolation rate, albedo, and the atmospheric SW and LW absorptivity. In the limit that albedo is zero, SW absorptivity is zero and LW is unity, one gets the usual 2^{1/4} factor between the steady state temperature of the surface with a perfect LW absorber shell and the temperature without any shell at all. I don’t see what the “footing” of LW vs SW radiation has to do with an energy balance formula. Energy is conserved, period.
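For concreteness, the single-slab balance described above can be solved numerically (a minimal sketch; the 239 W/m² absorbed-insolation figure is the standard Earth-like value after albedo, supplied here for illustration, not taken from the comment):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def slab_surface_temp(S_abs, eps_lw):
    """Steady-state surface temperature under a single-slab atmosphere that is
    transparent to SW and has LW absorptivity/emissivity eps_lw.

    Surface balance:  S_abs + eps_lw*sigma*Ta^4 = sigma*Ts^4
    Slab balance:     eps_lw*sigma*Ts^4 = 2*eps_lw*sigma*Ta^4  ->  Ta^4 = Ts^4/2
    Eliminating Ta:   sigma*Ts^4*(1 - eps_lw/2) = S_abs
    """
    T_bare = (S_abs / SIGMA) ** 0.25                        # no atmosphere at all
    T_s = (S_abs / (SIGMA * (1.0 - eps_lw / 2.0))) ** 0.25  # with the slab
    return T_bare, T_s

# Earth-like absorbed insolation after albedo, ~239 W/m^2 (assumed figure)
T_bare, T_s = slab_surface_temp(239.0, 1.0)
# T_bare ~ 255 K; with a perfect LW absorber, T_s = 2**(1/4) * T_bare ~ 303 K
```

With eps_lw = 1 this reproduces the usual 2^{1/4} factor exactly; intermediate absorptivities interpolate smoothly between the bare and fully shelled limits.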
Can one dress this up and make it more complex? Sure. That’s almost precisely what GCMs are, models that are more complex. And no, they do not neglect LW absorption at the surface of the ocean, surface evaporation and heat transport in the form of water vapor, the substantial GHE from water vapor alone, and so on. They may or may not do the physics of this all correctly, they may or may not have the correct parametric values for the various rates involved, but in fact one of several reasons that the models run too warm is that they do incorporate a substantial water vapor GHE as a net positive feedback, leading to climate sensitivity some 2x (or more) what one expects from CO_2 alone. I think there is some evidence that this is incorrectly treated in models, that water vapor feedback is anywhere from net neutral to slightly negative, because water vapor is tied to a complex water cycle involving latent heat transfer of surface energy vertically, modulating albedo at different vertical heights, and losing heat to space similarly at different heights and wavelengths and rates (after being largely lofted through the CO_2 “barrier”). But it is not the case that models do not have this stuff going on at all, that they leave the physics out.
If you want to look at this to see for yourself, there is an open source GCM here:
http://www.cesm.ucar.edu/models/atm-cam/docs/description/
The documentation carefully details all of the physics accounted for in the model, how it is initialized, how it is run, and so on. Since the source is open (although a total PITA to build — it is terribly organized and requires modules from multiple sources/sites with separate — but still open — licensing) you can check the physics, add your own physics, modify the code in any way you like, change the parametric initialization, publicly criticize and/or correct any errors you find or wish to assert, and see if you can do better. As can I. With the catch, of course, being that neither of us have time to do so, or the computational resources handy required to integrate the model forward over decades in five minute timesteps. But you look at the documentation and tell me — what is being omitted? There are some real candidates for omission — GCR modulation of albedo, for example — but note well that the model explicitly treats the (single slab) ocean and sea ice. Should they use a multiple slab ocean model? Probably. With ARGO, one might eventually have the data needed to construct one, and of course the ocean is where some climate scientists are looking for the “missing heat” (and if so, good luck getting it out again as the ocean is a heat sink with century-scale heat capacity at the rate of the supposed net radiation anomaly assuming that there is NOT sufficient negative feedback at the surface to eliminate most of its heating). But note well, this is an issue of the model being broken or inadequate or incorrectly parametrized, not the model failing to include the effect.
Third, I do not understand your remark that backradiation “merely recirculates energy within the system”. As noted above, in one sense this is quite irrelevant to the question of whether or not it is a substantial source of power delivered to the surface. If you can measure 200+ W/m^2 in downwelling atmospheric radiation integrated over all wavelengths at nighttime (no sun at all in the sky) the fact that this energy originally came from the sun and is downwelling only after following a complex path is utterly immaterial to the fact that the energy it delivers partially cancels radiative energy outflow from the surface (and, in the event that there is an inversion, can actually MORE than cancel outflow). Nobody is asserting that the atmosphere “produces” this energy, but nobody could possibly argue that this energy is not a substantial factor in the rate that the surface cools and hence in the average surface temperature.
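The arithmetic behind this is straightforward Stefan-Boltzmann bookkeeping (a sketch with assumed illustrative numbers: 288 K is a typical surface temperature, and 200 W/m² is the nighttime downwelling figure quoted above):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Assumed illustrative numbers: a 288 K surface (emissivity taken as 1)
# under a nighttime sky delivering ~200 W/m^2 of downwelling LWIR.
T_surface = 288.0
downwelling = 200.0

up = SIGMA * T_surface**4   # ~390 W/m^2 radiated upward by the surface
net = up - downwelling      # ~190 W/m^2 net radiative loss

# Backradiation roughly halves the surface's net radiative cooling rate;
# where the energy originally came from is immaterial to this balance.
```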
Finally, although I did not directly quote you above, you do remark that often people only look at dry air TOA/BOA spectrographs to demonstrate the GHE. Petty certainly presents several figures of this sort, because without the complication of water vapor one can most easily and cleanly see the relevant absorptive bands associated with e.g. CO_2 and Ozone as well as the “windows” in LWIR where cooling does occur. But Petty also presents several wet air spectrographs and sure, they show a much stronger GHE. All that this means is that the GHE is most certainly real, that water vapor is the most important GHG, and (as noted) the bulk of the effect from CO_2 was obtained when it first became saturated so that the atmosphere became optically opaque in its primary absorptive bands. It doesn’t mean that CO_2 is unimportant — the Earth teeters along in this particular ice age on the edge of a return to glaciation, and without CO_2 in the atmosphere at all I think there is little doubt that the Earth would devolve to being a permanent iceball everywhere except possibly a narrow band near the equator in a matter of centuries and never emerge. Water vapor alone has a dangerous instability on the cold side and water ice has a huge surface albedo. Even with CO_2 the Earth is at best borderline unstable as evidenced by the glacial record over the last 3 million years.
Do the GCMs get the physics of this sort of feedback and the additional multichannel feedback from aerosols as they play the dual role of nucleation sites for cloud formation and a high-albedo component to the atmosphere in their own right correct? Quite possibly not. This is a really difficult problem. Petty avoids making any positive statements about it — explaining the physical arguments for line broadening and the various methods used to approximate what happens in the absorptive bands and how they might vary with concentration and so on, but this is quite possibly a place where the GCMs don’t do the physics correctly, where they use a smoothed, separable approximation that for one reason or another breaks down in reality because it is non-separable. One cannot just say that they do make this or any other particular error, however, without building a GCM and changing the physics or parameters in a defensible way and showing that one gets a better description of the climate when one does. All one can do without this is what I am doing — note that the agreement between model predictions and actual climate outside of the era used to initialize the models and determine their parametrization is poor, so that it is probable that each GCM in poor agreement contains at least one error responsible for the difference. And IMO, almost certainly more than one. It is a hard problem.
I don’t always agree with Steve Mosher — I think he is too uncritical of the GCMs, and too quick to jump to their defense when there is little benefit in doing so, when the only way to improve them is to acknowledge the fact that they probably aren’t working accurately enough to be used to divert trillions of dollars and negatively impact every human being alive right now on the basis of their predictions of future catastrophe when they manifestly are not doing a good job on the present. But I do agree with him that if it weren’t for the political burden and the fact that an entire community of scientists have essentially bet their careers, their reputations, and a hell of a lot of everybody’s money on what should originally have been considered a very, very preliminary conclusion, we’d be saying that GCMs aren’t badly conceived or implemented and that in some ways they do surprisingly well given the complexity of the system.
But surprisingly well is not well enough to trust them to direct the course of civilization itself, especially not if they are in error by a factor of two or even more in their estimates of climate sensitivity, as I think that the mounting evidence suggests that they are. As I point out above, there are real human lives — millions of them — being lost as we dick around with “wood burning electricity generation”. We might as well pile our money up and shovel it into the furnaces and use that to make electricity when there is comparatively cheap and abundant coal to use instead, when lowering energy prices by using the least expensive resources improves human health, wealth and happiness worldwide. The only reason not to burn coal to generate electricity in the short run (over the next 20 to 40 years) is the GCM’s prediction of global catastrophe, which then becomes a highly nonlinear, Pascal’s Wager style hidden cost.
Otherwise we’d just burn coal while gradually improving the cost-benefit and safety of nuclear including Thorium, while working on fusion, while perfecting e.g. PV solar and while developing efficient energy storage and transport mechanisms so that we can store solar and use it over 24 hours or more, so that we can transport solar generated electricity from dry tropical and subtropical deserts to moist wet snowy temperate and polar zones. Inside a decade (or at most two), PV solar is going to become one of the least expensive energy resources for a broad band of the Earth’s surface, and if we figure out how to store PV solar energy efficiently to buffer solar energy over (say) 48 to 72 hours (or even more) then nobody will have to advocate solar power as world-saving policy, people will implement it out of simple human greed, a desire to have the cheapest possible energy. By 2050, burning coal will disappear not because it produces dangerous CO_2 but because we have cheaper, cleaner ways of making electricity.
This sort of technological discontinuity is almost completely predictable, although people rarely do make this sort of prediction. Thirty years ago, nearly every household in America and Europe had a rooftop television antenna. The entire planet radiated substantial amounts of electromagnetic energy in certain bands, and we started to look for distant civilizations by means of looking for the “signal” of electromagnetic radiation in those bands, reasoning that any advanced civilization would have TV. Look how well that assumption turned out: no houses have external TV antennae at all any more, and the satellite dishes that many houses do have receive far weaker signals directed straight down. I’ve read proposals to look for remote civilizations by looking for some sort of “pollution signature” in the atmospheres of exoplanets when and as resolution permits, but of course any such signature would also very likely be a short term transient as the civilization evolves to cleaner steady state resources.
I actually think it is perfectly lovely to encourage our global civilization along these pathways, to cleaner energy that doesn’t involve burning coal or oil, since in the long run both of these substances will be far more valuable to us unburned and not used to make energy we can get in other ways without facing long term scarcity issues. We shouldn’t burn diamonds, either. We shouldn’t use Helium in kid’s balloons, or treat Thorium as a toxic waste byproduct of mining rare earth metals. However, the Earth has 7 billion people and is a social and political powderkeg as long as 3 billion of those people are still living in the 17th century, 2 billion of them are living in the 20th century, and only 2 billion of them get the full benefits of living in the 21st century as far as wealth, access to resources, comfort, health, happiness, freedom and all that are concerned. We must choose a path to a steady state civilization that does not advance on the scarred backs and corpses (mostly children) of the poorest people in the world. That means that we should implement political measures that directly raise the cost of energy worldwide only when compelled by near certainty that the only alternative is an even greater catastrophe than the ongoing catastrophe of living in the most peaceful, wealthiest period in the history of mankind, with almost unimaginable wealth and knowledge to bring to bear on any given global issue, and then expend all of this wealth not to bring the gift of civilization to all of the world’s people but rather to prevent it, to keep them energy poor and in the invisible chains of ignorance, poverty, disease, misery, and totalitarian religious beliefs.
rgb

Slartibartfast
November 11, 2013 5:45 am

I think he is too uncritical of the GCMs

While being overly critical of a cartoon-version of DoD model validation, I would add. One that he’s gotten horribly, embarrassingly wrong.
FWIW, DoD system test trumps model predictions.
The large, smelly turd in the punchbowl of this dispute is that there is no way for experimental validation of climate models to occur. There’s only one experiment: the one that is happening right now. You can’t ever change the conditions or indeed any of the variables and run a new experiment. Contrast that with DoD missile defense testing, where you by definition have a different missile and different target on a different day having different wind shear, etc. Sometimes you even have different radars supporting the test. This makes DoD simulation models accurate on a micro-scale that climate models cannot hope to emulate in any foreseeable future. When’s the last time you saw a major climate-model prediction anomaly get resolved?
I liken it to economics, in that way.