Under the radar – the NAS Report

Guest Post by Willis Eschenbach

Under the radar, and unnoticed by many climate scientists, there was a recent study by the National Academy of Sciences (NAS), commissioned by the US Government, regarding climate change. Here is the remit under which they were supposed to operate:

Specifically, our charge was

1. To identify the principal premises on which our current understanding of the question [of the climate effects of CO2] is based,

2. To assess quantitatively the adequacy and uncertainty of our knowledge of these factors and processes, and

3. To summarize in concise and objective terms our best present understanding of the carbon dioxide/climate issue for the benefit of policymakers.

Now, that all sounds quite reasonable. In fact, if we knew the answers to those questions, we’d be a long ways ahead of where we are now.

Figure 1. The new Cray supercomputer called “Gaea”, which was recently installed at the National Oceanic and Atmospheric Administration. It will be used to run climate models.

But as it turned out, being AGW-supporting climate scientists, the NAS study group decided that they knew better. They decided that to answer the actual question they had been asked would be too difficult, that it would take too long.

Now that’s OK. Sometimes scientists are asked for stuff that might take a decade to figure out. And that’s just what they should have told their political masters: can’t do it, takes too long. But noooo … they knew better, so they decided that instead, they should answer a different question entirely. After listing the reasons that it was too hard to answer the questions they were actually asked, they say (emphasis mine):

A complete assessment of all the issues will be a long and difficult task.

It seemed feasible, however, to start with a single basic question:  If we were indeed certain that atmospheric carbon dioxide would increase on a known schedule, how well could we project the climatic consequences?

Oooookaaaay … I guess that’s now the modern post-normal science method. First, you assume that there will be “climatic consequences” from increasing CO2. Then you see if you can “project the consequences”.

They are right that it is easier to do that than to actually establish IF there will be climatic consequences. It makes it so much simpler if you just assume that CO2 drives the climate. Once you have the answer, the questions get much easier …

However, they did at least try to answer their own question. And what are their findings? Well, they started out with this:

We estimate the most probable global warming for a doubling of CO2 to be near 3°C with a probable error of ± 1.5°C.

No surprise there. They point out that this estimate, of course, comes from climate models. Surprisingly, however, they have no question and are in no doubt about whether climate models are tuned or not. They say (emphasis mine):

Since individual clouds are below the grid scale of the general circulation models, ways must be found to relate the total cloud amount in a grid box to the grid-point variables. Existing parameterizations of cloud amounts in general circulation models are physically very crude. When empirical adjustments of parameters are made to achieve verisimilitude, the model may appear to be validated against the present climate. But such tuning by itself does not guarantee that the response of clouds to a change in the CO2 concentration is also tuned. It must thus be emphasized that the modeling of clouds is one of the weakest links in the general circulation modeling efforts.

Modeling of clouds is one of the weakest links … can’t disagree with that.
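To make concrete what “parameterization” and “tuning” mean in that quote, here’s a deliberately toy sketch of the kind of thing the report is describing. The functional form and the numbers below are purely my own illustration, not anything taken from an actual GCM:

```python
import numpy as np

def cloud_fraction(relative_humidity, rh_crit=0.8, slope=2.5):
    """Toy grid-box cloud-amount parameterization.

    Relates the (unresolved, sub-grid) cloud amount to a resolved
    grid-point variable, here relative humidity. rh_crit and slope are
    the "empirical adjustments of parameters" the report talks about:
    they get tuned until present-day cloudiness looks right, which by
    itself says nothing about how clouds respond to a change in CO2.
    """
    cf = slope * (relative_humidity - rh_crit) / (1.0 - rh_crit)
    return np.clip(cf, 0.0, 1.0)

# "Tuning" in practice: nudge rh_crit and slope until the model's
# global-mean cloud cover matches the observed present climate, then stop.
rh = np.array([0.55, 0.75, 0.85, 0.95])
print(cloud_fraction(rh))   # [0.    0.    0.625 1.   ]
```

Tune a handful of knobs like those until the model “appears to be validated against the present climate,” and you have exactly the situation the report warns about.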

So what is the current state of play regarding the climate feedback? The authors say that the positive water vapor feedback overrules any possible negative feedbacks:

We have examined with care all known negative feedback mechanisms, such as increases in low or middle cloud amount, and have concluded that the oversimplifications and inaccuracies in the models are not likely to have vitiated the principal conclusion that there will be appreciable warming. The known negative feedback mechanisms can reduce the warming, but they do not appear to be so strong as the positive moisture feedback.

However, as has been the case for years, when you get to the actual section of the report where they discuss the clouds (the main negative feedback), the report merely reiterates that the clouds are poorly understood and poorly represented … how does that work, that they are sure the net feedback is positive, but they don’t understand and can only poorly represent the negative feedbacks? They say, for example:

How important the overall cloud effects are is, however, an extremely difficult question to answer. The cloud distribution is a product of the entire climate system, in which many other feedbacks are involved. Trustworthy answers can be obtained only through comprehensive numerical modeling of the general circulations of the atmosphere and oceans together with validation by comparison of the observed with the model-produced cloud types and amounts.

In other words, they don’t know but they’re sure the net is positive.

Regarding whether the models are able to accurately replicate regional climates, the report says:

At present, we cannot simulate accurately the details of regional climate and thus cannot predict the locations and intensities of regional climate changes with confidence. This situation may be expected to improve gradually as greater scientific understanding is acquired and faster computers are built.

So there you have it, folks. The climate sensitivity is 3°C per doubling of CO2, with an error of about ± 1.5°C. Net feedback is positive, although we don’t understand the clouds. The models are not yet able to simulate regional climates. No surprises in any of that. It’s just what you’d expect a NAS panel to say.

Now, before going forwards, since the NAS report is based on computer models, let me take a slight diversion to list a few facts about computers, which are a long-time fascination of mine. As long as I can remember, I wanted a computer of my own. When I was a little kid I dreamed about having one. I speak a half dozen computer languages reasonably well, and there are more that I’ve forgotten. I wrote my first computer program in 1963.

Watching the changes in computer power has been astounding. In 1979, the fastest computer in the world was the Cray-1 supercomputer, a machine far beyond anything that most scientists might have dreamed of having. It had 8 MB of memory, 10 GB of hard disk space, and ran at 100 MFLOPS (million floating point operations per second). The computer I’m writing this on has a thousand times the memory, fifty times the disk space, and two hundred times the speed of the Cray-1.

And that’s just my desktop computer. The new NOAA climate supercomputer “Gaea” shown in Figure 1 runs two and a half million times as fast as a Cray-1. This means that a one-day run on “Gaea” would take a Cray-1 about seven thousand years to complete …
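If you want to check that claim, the arithmetic is simple enough. Here’s a minimal back-of-the-envelope sketch using only the figures quoted above (the implied ~250 teraflops for “Gaea” follows from those figures; it is not an independently sourced spec):

```python
# Back-of-the-envelope check of the Cray-1 vs. "Gaea" comparison,
# using only the numbers quoted in the text above.

cray1_flops = 100e6            # Cray-1: 100 MFLOPS
speedup     = 2.5e6            # "Gaea" quoted as ~2.5 million times faster
gaea_flops  = cray1_flops * speedup   # ~2.5e14 FLOPS, i.e. ~250 teraflops

# A one-day run on "Gaea", redone on a Cray-1:
days_on_cray1  = 1.0 * speedup        # 2,500,000 days
years_on_cray1 = days_on_cray1 / 365.25

print(f"Implied Gaea throughput: ~{gaea_flops / 1e12:.0f} TFLOPS")
print(f"One Gaea-day on a Cray-1: ~{years_on_cray1:,.0f} years")
# -> about 6,800 years, i.e. "about seven thousand years"
```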

Now, why is the speed of a Cray-1 computer relevant to the NAS report I quoted from above?

It is relevant because as some of you may have realized, the NAS report I quoted from above is called the “Charney Report”. As far as I know, it was the first official National Academy of Sciences statement on the CO2 question. And when I said it was a “recent report”, I was thinking about it in historical terms. It was published in 1979.

Here’s the bizarre part, the elephant in the climate science room. The Charney Report could have been written yesterday. AGW supporters are still making exactly the same claims, as if no time had passed at all. For example, AGW supporters are still saying the same thing about the clouds now as they were back in 1979—they admit they don’t understand them, that it’s the biggest problem in the models, but all the same they’re sure the net feedback is positive. I’m not clear how that works, but it’s been that way since 1979.

That’s the oddity to me—when you read the Charney Report, it is obvious that almost nothing of significance has changed in the field since 1979. There have been no scientific breakthroughs, no new deep understandings. People are still making the same claims about climate sensitivity, with almost no change in the huge error limits. The range still varies by a factor of three, from about 1.5 to about 4.5°C per doubling of CO2.

Meanwhile, the computer horsepower has increased beyond anyone’s wildest expectations. The size of the climate models has done the same. The climate models of 1979 were thousands of lines of code. The modern models are more like millions of lines of code. Back then they were atmosphere-only models with a few layers and large gridcells. Now we have fully coupled ocean-atmosphere-cryosphere-biosphere-lithosphere models, with much smaller gridcells and dozens of both oceanic and atmospheric layers.

And since 1979, an entire climate industry has grown up that has spent millions of human-hours applying that constantly increasing computer horsepower to studying the climate.

And after the millions of hours of human effort, after the millions and millions of dollars that have gone into research, after all of those million-fold increases in computer speed and size, and after the phenomenal increase in model sophistication and detail … the guesstimated range of climate sensitivity hasn’t narrowed in any significant fashion. It’s still right around 3 ± 1.5°C per doubling of CO2, just like it was in 1979.

And the same thing is true on most fronts in climate science. We still don’t understand the things that were mysteries a third of a century ago.  After all of the gigantic advances in model speed, size, and detail, we still can say nothing definitive about the clouds. We still don’t have a handle on the net feedback. It’s like the whole realm of climate science got stuck in a 1979 time warp, and has basically gone nowhere since then. The models are thousands of times bigger, and thousands of times faster, and thousands of times more complex, but they are still useless for regional predictions.

How can we understand this stupendous lack of progress, a third of a century of intensive work with very little to show for it?

For me, there is only one answer. The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.

Now we can debate what that fundamental misunderstanding might be.

But I see no other explanation that makes sense. Every other field of science has seen huge advances since 1979. New fields have opened up, old fields have moved ahead. Genomics and nanotechnology and proteomics and optics and carbon chemistry and all the rest, everyone has ridden the computer revolution to heights undreamed of … except climate science.

That’s the elephant in the room—the incredible lack of progress in the field despite a third of a century of intense study.

Now me, I think the fundamental misunderstanding is the idea that the surface air temperature is a linear function of forcing. That’s why it was lethal for the Charney folks to answer the wrong question. They started with the assumption that a change in forcing would change the temperature, and wondered “how well could we project the climatic consequences?”

Once you’ve done that, once you’ve assumed that CO2 is the culprit, you’ve ruled out the understanding of the climate as a heat engine.

Once you’ve done that, you’ve ruled out the idea that like all flow systems, the climate has preferential states, and that it evolves to maximize entropy.

Once you’ve done that, you’ve ruled out all of the various thermostatic and homeostatic climate mechanisms that are operating at a host of spatial and temporal scales.

And as it turns out, once you’ve done that, once you’ve made the assumption that surface temperature is a linear function of forcing, you’ve ruled out any progress in the field until that error is rectified.
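For the record, the linear assumption I’m talking about is usually written in the form below. The 5.35 coefficient and the value of λ are the commonly cited textbook numbers, not figures taken from the Charney Report, but with them the relation reproduces the familiar 3°C-per-doubling estimate:

```latex
\Delta T \;=\; \lambda\,\Delta F,
\qquad
\Delta F \;\approx\; 5.35\,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}

% For a doubling of CO2:
\Delta F_{2\times} \;\approx\; 5.35\,\ln 2 \;\approx\; 3.7\ \mathrm{W\,m^{-2}},
\qquad
\lambda \;\approx\; 0.8\ \mathrm{K\,(W\,m^{-2})^{-1}}
\;\;\Rightarrow\;\;
\Delta T_{2\times} \;\approx\; 3\ \mathrm{K}
```

The question raised here is whether a single, fixed λ like that is a fair description of how the climate actually responds.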

But that’s just me. You may have some other explanation for the almost total lack of progress in climate science in the last third of a century, and if so, all cordial comments gladly accepted. Allow me to recommend that your comments be brief, clear and interesting.

w.

PS—Please do not compare this to the lack of progress in something like achieving nuclear fusion. Unlike climate science, that is a practical problem, and a devilishly complex one. The challenge there is to build something never seen in nature—a bottle that can contain the sun here on earth.

Climate, on the other hand, is a theoretical question, not a building challenge.

PPS—Please don’t come in and start off with version number 45,122,164 of the “Willis, you’re an ignorant jerk” meme. I know that. I was born yesterday, and my background music is Tom o’Bedlam’s song:

By a host of furious fancies

Whereof I am commander

With a sword of fire, and a steed of air

Through the universe I wander.

By a ghost of rags and shadows

I summoned am to tourney

Ten leagues beyond the wild world's end

Methinks it is no journey.

So let’s just take my ignorance and my non compos mentation and my general jerkitude as established facts, consider them read into the record, and stick to the science, OK?

Bob Johnston
March 8, 2012 7:37 am

“An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning.” – Max Planck
With the human mind being what it is, I have little hope that the current warmist crop will ever have the ability to conceive that their CO2 hypothesis could ever be wrong. The brain doesn’t work like that in most people, it’s why you see the warmist side resort to name calling and distortions in an effort to resolve their cognitive dissonance. People get entrenched (What, me wrong???) and they simply cannot shift course after having taken a stand.
The CAGW situation and its lack of forward movement is very similar to another field of study that interests me and that has not moved forward in 40 years; that field being cardiovascular health and obesity as it relates to diet. This field had its own Michael Mann, that being Ancel Keys back in the ’50s. His flawed studies and overbearing manner set this field of study down a path that has led to incorrect conventional wisdom that just can’t seem to be broken by logic or even studies showing the contrary.
People develop blind spots and no amount of persuasion will ever get them to change. It’s only after they die or leave the field that new ideas can take hold.

Hector M.
March 8, 2012 7:44 am

@Judith Curry (March 8, 2012 at 4:37 am): The link to the NAS report is in the post, where it mentions the Charney report.
More specifically, it is at http://www.atmos.ucla.edu/~brianpm/download/charney_report.pdf.

March 8, 2012 7:45 am

OT but I was listening to a March 8th CBC radio program “Quirks and Quarks” and W. Richard Peltier, the most recent winner of the Gerhard Herzberg Medal and prize, was being interviewed. Peltier is the founding director of the University of Toronto’s Centre for Global Change Science, a 2010 winner of the Bower Award and prize for achievement in Science from the Franklin Institute in Philadelphia, the 2002 winner of the Vetlesen Prize for Earth Sciences, and a mentor of more than 30 doctoral students and an equal number of postdoctoral fellows. His use, more than 6 times in a short interview, of the term “denier” caught my ear. Here we have a very prestigious award, worth a million dollars, awarded to a man with obvious talents and he spends most of his interview time whining about “deniers”.

Frank K.
March 8, 2012 7:48 am

Willis – you had me going there for a minute – 1979!! Heh :^)
It is interesting to note that one of the two models discussed by Charney is none other than Jim Hansen’s early GCM.

RockyRoad
March 8, 2012 7:53 am

pwl says:
March 8, 2012 at 6:54 am

A fundamental mistake that the climate scientists are making is to assume that the climate can be modeled at all, let alone modeled using traditional mathematics.

Actually, they do a pretty good job of it–for example the “Map” procedure at Weather.com. That gathers radar data using “traditional mathematics”, interpolates it, and presents it in a series of frames–a “model” if you will of the past six hours or so. They even have a “Next 6 Hours” feature that projects the trends out for an equal duration (apparently predictions beyond 6 hours don’t have any validity or they may just be too resource intensive). And they have a 10-day forecast, but admit today is 80% accurate, tomorrow is 60% accurate, while the next day 40% accurate, and so on. (They recently added a “5-day” forecast apparently recognizing that after 5 days, their predictive ability is zero.)
So I submit there is a “climate model” based on actual data, but most of you would argue that weather isn’t climate, while one could argue it is a glimpse of the climate. And yes, I’m stretching this assertion quite a bit. (Thank goodness I’ve not seen a CO2 meter anywhere on Weather.com yet, but I fear I may have given them an idea.)

DocMartyn
March 8, 2012 8:04 am

The date is quite interesting, six years after H. Kacser and J. A. Burns, “The control of flux,” Symp. Soc. Exp. Biol., 32:65–104, 1973.
The biochemists had realized that box models, using known rate constants, didn’t describe multi-component systems. They failed when tested against real data. This led to metabolic control theory, which is now part of canonical control theory. Interestingly, economics has also gone in this direction, although using a different formalism.
In layman’s terms, metabolic control theory shows you that complex systems have both inertia and elasticity. Tipping points are few and far between.
A review for those who like math is here:-
http://www.siliconcell.net/sica/NWO-CLS/CellMath/OiOvoer/Hofmeyr_nutshell.pdf
The nice thing about MCT is that you can actually do experiments to test the analysis, thus testing the hypothesis.

March 8, 2012 8:04 am

@Willis — Do we have any documentation on the “millions of lines” of code? If you include all the libraries used to make the application happen, I am sure that is easy to get to. Most of us don’t monkey in the libraries to create our code. Include them and use them. Do I get to count the lines in the library as part of my “program size”?
If it is millions of lines of code, it begs for a re-eval. Might I suggest using minecraft as their modeling engine… Maybe Roblox…

Nate_OH
March 8, 2012 8:10 am

“Jason Calley says:
March 8, 2012 at 5:17 am
You asked that we not draw analogies with fusion research, but please forgive me if I point out what may be the single most pertinent similarity. There are HUGE sums of money and power involved in NOT solving both questions.
I wish I could attribute the quote, but someone said, “Science + politics = politics.””
Exactly!
Fusion, space flight, climate research, etc have become political institutions, not R&D goals.
Pournelle’s Iron Law of Bureaucracy goes into effect and we end up where we are.
“…in any bureaucratic organization there will be two kinds of people: those who work to further the actual goals of the organization, and those who work for the organization itself. Examples in education would be teachers who work and sacrifice to teach children, vs. union representatives who work to protect any teacher including the most incompetent. The Iron Law states that in all cases, the second type of person will always gain control of the organization, and will always write the rules under which the organization functions.”

kim
March 8, 2012 8:17 am

Gaea looks like shadows on the wall of the cave.
=================

Michael Palmer
March 8, 2012 8:28 am

“Now me, I think the fundamental misunderstanding is the idea that the surface air temperature is a linear function of forcing.”
While this is a charming idea—just an innocent scientific mistake, correct it, and all will be well—it doesn’t ring remotely true to me.
Firstly, in a trivial, tautological sense, the assumption must be true—a forcing is anything that changes the surface temperature, and the greater the change, the stronger the forcing.
Secondly, with respect to the influence of specific observable forcings, I don’t believe that no one ever thought of removing the assumption that their effects will be linear. The idea of feedbacks between different observables naturally leads to the expectation of non-linearity.
It is worth noting that the climate of the earth is not the only complex problem that so far has withstood the onslaught of increasing computing power. Take, for example, the function and the development of the brain. We understand the function of individual nerve cells, and the interactions of small assemblies of nerve cells, in a qualitative to semi-quantitative fashion; however, no one has a valid working model approaching anything like a bird, or even insect, brain.
Complexity continues to elude us even with something comparatively simple such as the folding of single protein molecules. For background: Each protein molecule is initially synthesized in the cell as an inert, linear strand of amino acids. This strand then spontaneously bends, twists and curls into a certain, specific folded shape; only in this folded state does the molecule assume some function useful to the cell.
When artificially unfolded back to a linear strand in vitro and then left alone, most proteins will spontaneously revert to their folded, functional structure. This tells us that all the information required for reaching that structure must be contained in the amino acid sequence of the linear strand. Therefore, it should be possible to predict the folded structure from the amino acid sequence alone. However, while some heuristics exist to predict some likely features of the folded structure, we are very far away from accurate, complete and meaningful prediction, and therefore we continue to require X-rays and NMR to study folded structures.
The protein folding problem is many orders of magnitude simpler than the climate; and importantly, unlike the latter, it is also amenable to extensive experimentation. If we cannot even understand the choreography of a single molecule with a few thousand atoms, is it really surprising that we have failed to understand something as humongously complex as the climate?
Of course, any focus on scientific reasons for the lack of progress ignores the fact climate science has become so politicized that free, open, disinterested debate has been disrupted. So, even if the physical problem had been a simple one, this lack of openness would have ensured failure, much like Lysenko’s politically endorsed dogma assured failure to understand the comparatively trivial question of genetic variation and inheritance.

March 8, 2012 8:31 am

Lack of progress on understanding “climate change” is easy to explain. When I read the garbage published by the so-called “climate scientists” that is presented here on WUWT and on other sites, it’s easy to see that those doing the research are far from being the “best and the brightest”.

Joachim Seifert
March 8, 2012 8:33 am

Willis, great article…..good to read…..the climate MISERY since 1979!
I agree profoundly with your major quote:
“””For me, there is only one answer. The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error…..””””
There are 2 major errors concerning fundamentals/underlying paradigms:
(1) To CO2: The trace gas does not warm the climate, see papers of G. Kramm & Tscheuschner and G. Gerlich on the role of CO2
(2) The global climate warming/cooling cause on centennial scale is not atmospheric but ASTRONOMICAL, thus only HARMONIC models can show the cause….
I just finished the last numbers on it, after being inspired by Nic Scafetta’s harmonic model vs. IPCC GCMs; the underlying warming mechanics is absolutely clear now, no doubt left, and results show (for the keen reader) that each of the 3 coming decades will shed 0.1°C per decade in GMT…..
Cheers to all….
JS

michael hart
March 8, 2012 8:41 am

I see the fundamental problem as one of kinetics. A synthetic chemist can go into the lab to test their great new idea. From bitter experience I can tell you that it will usually fail the first time. And it fails again. That comes with the territory. Wash the glassware. Maybe change something, and try again. But at least the chemist will learn that rapidly. Possibly within hours. Reality can be a tough disciplinarian in science.
Now, what if the experiment takes several years, or even decades? The student may have completed their Ph.D. and be building a successful career before the results come in. If the results are not as expected there may be no time or resources to start again from scratch. What are they going to do? Retract publications, say they may have been wrong and resign? Of course not. Press on. More analysis, faster computers. Even when there may be reasons to think that can’t help. Sure the student had a good advisor, but that advisor likely faced the same problems with the same unsatisfactory solutions.
Every endeavour has its associated hype and marketing. When a new drug or medical treatment fails, it will be in the papers. An engineer sure gets to know what people think if their bridge falls down, or their satellite doesn’t work, or the rocket blows up on the launch pad. Right now I can’t see where the failure point is in climate science. It is not necessarily the fault of someone studying it that they have to wait a whole career to be proved wrong. But it’s a fundamental question that a science has to address.

March 8, 2012 8:46 am

Willis,
On the positive side of things, from my perspective, two quotes from the report were something that might lead to a reevaluation of what areas of research one could/should focus our limited resources on in the field of climate science.
1) “Trustworthy answers can be obtained only through comprehensive numerical modeling of the general circulations of the atmosphere and oceans together with validation by comparison of the observed with the model-produced cloud types and amounts.”
2) “At present, we cannot simulate accurately the details of regional climate and thus cannot predict the locations and intensities of regional climate changes with confidence. This situation may be expected to improve gradually as greater scientific understanding is acquired and faster computers are built.”
The second quote is the one that interests me the most. As you have noted the development of computers (information processing) has increased dramatically over the years in terms of computations per second and in the field we can expect further improvements in the scientific and technological development over time in computations per second.
My suggestion would be for climate science research to focus a bit more, ok a lot more, on “greater scientific understanding” and a lot less on running the models that we already know cannot “predict the locations and intensities of regional (let alone local) climate change with confidence.”
Thanks for the post.

March 8, 2012 8:51 am

Rick says:
March 8, 2012 at 7:45 am
OT but I was listening to a March 8th CBC radio program “Quirks and Quarks” and W. Richard Peltier, the most recent winner of the Gerhard Herzberg Medal and prize, was being interviewed. . . Here we have a very prestigious award, worth a million dollars, awarded to a man with obvious talents and he spends most of his interview time whining about “deniers”.

And that is the problem. For all the cheering here and elsewhere at the tattered and embarrassing state of self-styled ‘climate science’, for all the signs that governments in Europe and America are backing away from additional ‘climate’ expenditures, the institutional leviathan rolls on, dominating the discussion at every level, stifling market initiative, and skewing scientific inquiry by demanding it study nothing but ‘climate change’.
We’ll know the monster has been stopped when a prestigious academic like this Peltier character, “a mentor” (says Rick) “of more than 30 doctoral students and an equal number of post doctoral fellows” turns around and tells his chairman and the world, “This ‘climate change’ nonsense is based on utterly fallacious and untestable assumptions; my students are henceforth going to do real science, and the granting agencies had better wise up and support them.”
/Mr Lynn

March 8, 2012 8:56 am

Willis says: “Now we can debate what that fundamental misunderstanding might be.”
Let’s start with the fundamental that drives all else.
Can CO2 do what they say it does?
I say no.

Michael Palmer
March 8, 2012 9:02 am

Michael Hart,
you make some excellent points. From the way you make them, as in: “A synthetic chemist can go into the lab to test their great new idea. From bitter experience I can tell you that it will usually fail the first time,” I take it that you must be younger than me. I no longer find the experience bitter—instead, I fully expect and look forward to it; I am fully focused on observing the way it fails, and am almost disappointed if something works as planned the first time.
You correctly say, “Reality can be a tough disciplinarian in science”. This, to me, is the single most valuable aspect of my scientific education. If you have the smarts, you can easily work a spreadsheet (with the possible exception of trend lines, of course ;), whip up some code, subscribe to some dogma or even invent a new one. However, only rigorously comparing your predictions to reality will make it science, and will make you a better person.

JMW
March 8, 2012 9:28 am

I guess this is a classic example of “Policy driven Science”.
They knew the answer they needed to have and hence they had to build on the assumption of AGW and CO2 driven runaway warming.
You say it is OK for them to say “hey, the question you ask is too difficult, so we’ll answer the question we want to answer and not the question you want answered.”
I’m not sure I agree.
But then again, maybe it is right.
To answer the question they did answer, they have had plenty of time – since 1979 – shed-loads of money, a massive increase in computing power, and a dramatic expansion in the number of researchers and research projects.
So, given the nature of the question they did try to answer, resources and time were never an issue.
That being the case, they could equally have decided to answer the question they were asked and simply said “OK. We can do that but it will take time and a shed-load of tax payers money. Don’t expect an answer soon.”
But there are two things wrong with this.
Firstly, the implication is that had they tried to answer the question as stated, and had there been no fundamental flaw, the answers (and the science) should have matured a lot faster and a lot cheaper. OK, we can’t know. We might suppose and expect, but we might still come up with “we don’t know.”
Secondly they would then be answering a question instead of trying to support a hypothesis designed to support a policy. In that case, maybe the funds and researchers would not have flowed like water from the taxpayers’ pockets. In which case they probably wouldn’t actually be any further advanced. In all likelihood the question would have been asked of another organisation slightly more savvy and who would then do exactly as they have done.
These “what would happen IF….?” questions are a nice diversion but I guess all we really can do is exactly as you have done.
Ask “where’s the beef?”

March 8, 2012 9:36 am

On the very iconoclastic UK site “Number Watch” is a segment on “The Laws”. One of relevance is:

The law of computer models
The results from computer models tend towards the desires and expectations of the modellers.
Corollary
The larger the model, the closer the convergence.

So all those gigaflops of added complexity are causing convergence with the modellers’ assumptions. I wonder if it’s asymptotic, or linear …

March 8, 2012 9:44 am

Wonder if that new Cray supercomputer called “Gaea” has already been ideologically programmed.

Richard S Courtney
March 8, 2012 9:46 am

michael hart:
In your post at March 8, 2012 at 8:41 am you address a question which you pose, viz.
Now, what if the experiment takes several years, or even decades?
But that question is a ‘red herring’ because anybody who examines the outputs of climate models can see that none – not any – of the climate models emulates the climate system of the Earth. The examination takes minutes (n.b. NOT years) and the conclusion is indisputable. (This is explained in my above post at March 8, 2012 at 3:17 am).
Therefore, each of the models should be rejected as a predictive tool because its inability to emulate existing climate indicates it is very unlikely to emulate responses of existing climate to altered inputs.
The climate modellers know their models do not emulate the Earth’s climate system but wish to obscure the fact from wider circulation. So, they do “ensemble runs” and take the average of the runs. Of course, this practice is merely fakery because average wrong is wrong.
Richard

Septic Matthew/Matthew R Marler
March 8, 2012 9:51 am

Willis, I liked your quote from Tom O’Bedlam’s song. We must have at some time purchased books from the same bookstore or something.
The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.
The parts of the climate science that give readily computable answers are the simplified thermodynamic models, illustrated in great detail by Raymond T. Pierrehumbert’s book “Principles of Planetary Climate”. Most of that science was “settled” a while back, and the answers are mostly consistent decade after decade. The omissions and necessary elaborations have not changed that much. An alternative to your inference that climate science is based on a basic and far-reaching error is that the climate is a complex of many locally acting and some globally acting processes, and that it is merely hard to learn the whole system; a task requiring some more decades, at least, of patient and persistent empirical research (like the Georgia Tech paper you thrashed recently), and patient and persistent model building and testing. You could be a P-47 pilot complaining that the F-22 can’t be built because of a fundamental error in the underlying paradigm — but as we have seen, many detailed steps were required, not a fundamental paradigm shift, in between, including supercomputers to give better answers to the Navier-Stokes equations, and other improved mathematical/computer modeling of fluid flow, and better materials.
Someday someone like Isaac Held, with a more complex and detailed model running on a more powerful computer, will have an answer (tentative, a la Fred Mooten of Climate Etc) to the question of what extra CO2 will do to cloud and rain formation in the open Pacific Ocean.
Maybe something as clearly revolutionary as the Hahn-Meitner-Strassmann discovery of uranium fission will occur, but the solutions we seek may well come from persistent dedicated efforts of the kinds now underway and planned. And, like many fundamental discoveries, the discovery of fission was an unplanned outcome of a well-planned and well-executed series of experiments over a long time span. They did not start off trying to create a new weapon or new power station.

March 8, 2012 9:52 am

Great post.
However I have another idea as to what is stultifying the science: mutually-exclusive assumptions.
The whole edifice is based on the assumption that the climate was stable before human intervention. The hockey stick is the most obvious expression of this, but it is the underlying assumption, down to pre-emptive demands in secret emails that they must get rid of the MWP.
However it is at odds with the idea of positive feedback. And that is also one of the foundations of the edifice, as of course without positive feedback AGW can never be catastrophic.
However no system showing positive feedback can be stable. So there is an inherent contradiction between two basic assumptions, and work on one is necessarily pulled down by work on the other. Progress cannot be made until the researchers give up one of these.

D. J. Hawkins
March 8, 2012 9:55 am

The Pompous Git says:
March 8, 2012 at 12:24 am
Richard deSousa says March 7, 2012 at 11:30 pm
The ghost writer for NAS report has to be James Hansen.
Ah yes, the Ghost Who Talks 😉

Maybe he can help Trenberth with the “phantom” heat he’s been looking for ;-).

mojo
March 8, 2012 10:06 am

Nitpick:
“A knight of ghosts and shadows”, surely?
[Per Wiki: “Both “Tom O’ Bedlam” and “Mad Maudlin” are difficult to give a definitive form, because of the number of variant versions and the confusion between the two within the manuscripts.” -w]
