Under the radar – the NAS Report

Guest Post by Willis Eschenbach

Under the radar, and unnoticed by many climate scientists, there was a recent study by the National Academy of Sciences (NAS), commissioned by the US Government, regarding climate change. Here is the remit under which they were supposed to operate:

Specifically, our charge was

1. To identify the principal premises on which our current understanding of the question [of the climate effects of CO2] is based,

2. To assess quantitatively the adequacy and uncertainty of our knowledge of these factors and processes, and

3. To summarize in concise and objective terms our best present understanding of the carbon dioxide/climate issue for the benefit of policymakers.

Now, that all sounds quite reasonable. In fact, if we knew the answers to those questions, we’d be a long way ahead of where we are now.

Figure 1. The new Cray supercomputer called “Gaea”, which was recently installed at the National Oceanic and Atmospheric Administration. It will be used to run climate models.

But as it turned out, being AGW-supporting climate scientists, the NAS study group decided that they knew better. They decided that to answer the actual question they had been asked would be too difficult, that it would take too long.

Now that’s OK. Sometimes scientists are asked for stuff that might take a decade to figure out. And that’s just what they should have told their political masters: can’t do it, takes too long. But noooo … they knew better, so they decided that instead, they should answer a different question entirely. After listing the reasons that it was too hard to answer the questions they were actually asked, they say (emphasis mine):

A complete assessment of all the issues will be a long and difficult task.

It seemed feasible, however, to start with a single basic question:  If we were indeed certain that atmospheric carbon dioxide would increase on a known schedule, how well could we project the climatic consequences?

Oooookaaaay … I guess that’s now the modern post-normal science method. First, you assume that there will be “climatic consequences” from increasing CO2. Then you see if you can “project the consequences”.

They are right that it is easier to do that than to actually establish IF there will be climatic consequences. It makes it so much simpler if you just assume that CO2 drives the climate. Once you have the answer, the questions get much easier …

However, they did at least try to answer their own question. And what are their findings? Well, they started out with this:

We estimate the most probable global warming for a doubling of CO2 to be near 3°C with a probable error of ± 1.5°C.

No surprise there. They point out that this estimate, of course, comes from climate models. Surprisingly, however, they are in no doubt about whether climate models are tuned or not. They say (emphasis mine):

Since individual clouds are below the grid scale of the general circulation models, ways must be found to relate the total cloud amount in a grid box to the grid-point variables. Existing parameterizations of cloud amounts in general circulation models are physically very crude. When empirical adjustments of parameters are made to achieve verisimilitude, the model may appear to be validated against the present climate. But such tuning by itself does not guarantee that the response of clouds to a change in the CO2 concentration is also tuned. It must thus be emphasized that the modeling of clouds is one of the weakest links in the general circulation modeling efforts.

Modeling of clouds is one of the weakest links … can’t disagree with that.
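The point the report makes about tuning can be shown with a toy example. This sketch is mine, not the report’s, and the numbers in it are placeholders: two different “parameterizations” can each be tuned to reproduce today’s climate exactly, yet respond quite differently when the forcing changes.

```python
# Toy illustration (mine, not from the report) of the tuning problem:
# two different model forms, each tuned to match the present climate,
# can still disagree about the response to a change in forcing.

OBS_TODAY = 288.0   # assumed observed global mean temperature, K
F_TODAY = 240.0     # assumed absorbed flux, W/m^2

# Model A: temperature linear in flux, T = a * F
a = OBS_TODAY / F_TODAY            # tuned so A matches today

# Model B: temperature goes as the fourth root of flux, T = b * F**0.25
b = OBS_TODAY / F_TODAY ** 0.25    # tuned so B matches today

def model_a(f): return a * f
def model_b(f): return b * f ** 0.25

# Both appear "validated" against the present climate ...
print(model_a(F_TODAY), model_b(F_TODAY))   # both ≈ 288.0

# ... but they predict different responses to a 4 W/m^2 forcing change:
print(model_a(F_TODAY + 4) - OBS_TODAY)     # ~4.8 K
print(model_b(F_TODAY + 4) - OBS_TODAY)     # ~1.2 K
```

Matching the present climate, in other words, tells you little about which response to a perturbation is the right one, which is exactly the report’s caveat.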

So what is the current state of play regarding the climate feedback? The authors say that the positive water vapor feedback overrules any possible negative feedbacks:

We have examined with care all known negative feedback mechanisms, such as increases in low or middle cloud amount, and have concluded that the oversimplifications and inaccuracies in the models are not likely to have vitiated the principal conclusion that there will be appreciable warming. The known negative feedback mechanisms can reduce the warming, but they do not appear to be so strong as the positive moisture feedback.

However, as has been the case for years, when you get to the section of the report that actually discusses the clouds (the main negative feedback), it merely reiterates that clouds are poorly understood and poorly represented … how does that work? How can they be sure the net feedback is positive when they don’t understand, and can only poorly represent, the negative feedbacks? They say, for example:

How important the overall cloud effects are is, however, an extremely difficult question to answer. The cloud distribution is a product of the entire climate system, in which many other feedbacks are involved. Trustworthy answers can be obtained only through comprehensive numerical modeling of the general circulations of the atmosphere and oceans together with validation by comparison of the observed with the model-produced cloud types and amounts.

In other words, they don’t know but they’re sure the net is positive.

Regarding whether the models are able to accurately replicate regional climates, the report says:

At present, we cannot simulate accurately the details of regional climate and thus cannot predict the locations and intensities of regional climate changes with confidence. This situation may be expected to improve gradually as greater scientific understanding is acquired and faster computers are built.

So there you have it, folks. The climate sensitivity is 3°C per doubling of CO2, with an error of about ± 1.5°C. Net feedback is positive, although we don’t understand the clouds. The models are not yet able to simulate regional climates. No surprises in any of that. It’s just what you’d expect a NAS panel to say.

Now, before going forwards, since the NAS report is based on computer models, let me take a slight diversion to list a few facts about computers, which are a long-time fascination of mine. As long as I can remember, I wanted a computer of my own. When I was a little kid I dreamed about having one. I speak a half dozen computer languages reasonably well, and there are more that I’ve forgotten. I wrote my first computer program in 1963.

Watching the changes in computer power has been astounding. In 1979, the fastest computer in the world was the Cray-1 supercomputer. A Cray-1, a machine far beyond anything that most scientists might have dreamed of having, had 8 MB of memory, 10 GB of hard disk space, and ran at 100 MFLOPS (million floating-point operations per second). The computer I’m writing this on has a thousand times the memory, fifty times the disk space, and two hundred times the speed of the Cray-1.

And that’s just my desktop computer. The new NOAA climate supercomputer “Gaea” shown in Figure 1 runs two and a half million times as fast as a Cray-1. This means that a one-day run on “Gaea” would take a Cray-1 about seven thousand years to complete …
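For anyone who wants to check that arithmetic, here is the back-of-the-envelope version. The Cray-1 speed is the figure given above; the speed ratio is the one I quoted, so treat the Gaea number as derived from that ratio rather than a measured benchmark.

```python
# Back-of-the-envelope check of the Cray-1 vs. "Gaea" comparison.
# The speed ratio is the one quoted in the text, not a benchmark result.

CRAY1_FLOPS = 100e6       # Cray-1: ~100 MFLOPS
GAEA_SPEEDUP = 2.5e6      # quoted ratio of Gaea to Cray-1 speed

gaea_flops = CRAY1_FLOPS * GAEA_SPEEDUP

# How long would a one-day Gaea run take on a Cray-1?
seconds_per_day = 86_400
cray1_seconds = seconds_per_day * GAEA_SPEEDUP
cray1_years = cray1_seconds / (365.25 * 24 * 3600)

print(f"Gaea equivalent speed: {gaea_flops:.3g} FLOPS")
print(f"One Gaea-day on a Cray-1: {cray1_years:,.0f} years")
```

Which works out to just under seven thousand years, as stated.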

Now, why is the speed of a Cray-1 computer relevant to the NAS report I quoted from above?

It is relevant because, as some of you may have realized, the NAS report I quoted from above is called the “Charney Report”. As far as I know, it was the first official National Academy of Sciences statement on the CO2 question. And when I said it was a “recent report”, I was thinking about it in historical terms. It was published in 1979.

Here’s the bizarre part, the elephant in the climate science room. The Charney Report could have been written yesterday. AGW supporters are still making exactly the same claims, as if no time had passed at all. For example, AGW supporters are still saying the same thing about the clouds now as they were back in 1979—they admit they don’t understand them, that it’s the biggest problem in the models, but all the same they’re sure the net feedback is positive. I’m not clear how that works, but it’s been that way since 1979.

That’s the oddity to me—when you read the Charney Report, it is obvious that almost nothing of significance has changed in the field since 1979. There have been no scientific breakthroughs, no new deep understandings. People are still making the same claims about climate sensitivity, with almost no change in the huge error limits. The range still varies by a factor of three, from about 1.5 to about 4.5°C per doubling of CO2.

Meanwhile, the computer horsepower has increased beyond anyone’s wildest expectations. The size of the climate models has done the same. The climate models of 1979 were thousands of lines of code. The modern models are more like millions of lines of code. Back then it was atmosphere only models with a few layers and large gridcells. Now we have fully coupled ocean-atmosphere-cryosphere-biosphere-lithosphere models, with much smaller gridcells and dozens of both oceanic and atmospheric layers.

And since 1979, an entire climate industry has grown up that has spent millions of human-hours applying that constantly increasing computer horsepower to studying the climate.

And after the millions of hours of human effort, after the millions and millions of dollars gone into research, after all of those million-fold increases in computer speed and size, and after the phenomenal increase in model sophistication and detail … the guesstimated range of climate sensitivity hasn’t narrowed in any significant fashion. It’s still right around 3 ± 1.5°C per doubling of CO2, just like it was in 1979.

And the same thing is true on most fronts in climate science. We still don’t understand the things that were mysteries a third of a century ago.  After all of the gigantic advances in model speed, size, and detail, we still can say nothing definitive about the clouds. We still don’t have a handle on the net feedback. It’s like the whole realm of climate science got stuck in a 1979 time warp, and has basically gone nowhere since then. The models are thousands of times bigger, and thousands of times faster, and thousands of times more complex, but they are still useless for regional predictions.

How can we understand this stupendous lack of progress, a third of a century of intensive work with very little to show for it?

For me, there is only one answer. The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.

Now we can debate what that fundamental misunderstanding might be.

But I see no other explanation that makes sense. Every other field of science has seen huge advances since 1979. New fields have opened up, old fields have moved ahead. Genomics and nanotechnology and proteomics and optics and carbon chemistry and all the rest, everyone has ridden the computer revolution to heights undreamed of … except climate science.

That’s the elephant in the room—the incredible lack of progress in the field despite a third of a century of intense study.

Now me, I think the fundamental misunderstanding is the idea that the surface air temperature is a linear function of forcing. That’s why it was lethal for the Charney folks to answer the wrong question. They started with the assumption that a change in forcing would change the temperature, and wondered “how well could we project the climatic consequences?”

Once you’ve done that, once you’ve assumed that CO2 is the culprit, you’ve ruled out the understanding of the climate as a heat engine.

Once you’ve done that, you’ve ruled out the idea that like all flow systems, the climate has preferential states, and that it evolves to maximize entropy.

Once you’ve done that, you’ve ruled out all of the various thermostatic and homeostatic climate mechanisms that are operating at a host of spatial and temporal scales.

And as it turns out, once you’ve done that, once you make the assumption that surface temperature is a linear function of forcing, you’ve ruled out any progress in the field until that error is rectified.

But that’s just me. You may have some other explanation for the almost total lack of progress in climate science in the last third of a century, and if so, all cordial comments gladly accepted. Allow me to recommend that your comments be brief, clear and interesting.

w.

PS—Please do not compare this to the lack of progress in something like achieving nuclear fusion. Unlike climate science, that is a practical problem, and a devilishly complex one. The challenge there is to build something never seen in nature—a bottle that can contain the sun here on earth.

Climate, on the other hand, is a theoretical question, not a building challenge.

PPS—Please don’t come in and start off with version number 45,122,164 of the “Willis, you’re an ignorant jerk” meme. I know that. I was born yesterday, and my background music is Tom o’Bedlam’s song:

By a host of furious fancies

Whereof I am commander

With a sword of fire, and a steed of air

Through the universe I wander.

By a ghost of rags and shadows

I summoned am to tourney

Ten leagues beyond the wild world's end

Methinks it is no journey.

So let’s just take my ignorance and my non compos mentation and my general jerkitude as established facts, consider them read into the record, and stick to the science, OK?

March 8, 2012 1:12 pm

“Greetings, Professor Falken. Want to play a game???”

Al Gored
March 8, 2012 1:13 pm

For me this lack of progress in Consensus “climate science” just reveals that:
1) It is not science but ideology. Given all the new discoveries which have been made – despite the obstacles to that – and the questions they raise, the Consensus opinion has not budged. Real science would have moved but ideology doesn’t… because it is ideology that must be defended.
2) These ideologues ‘found’ the ‘answer’ they wanted before the whole project started and have too much based on it to change. On a much smaller scale the same effect happened in the great ‘Clovis First’ controversy in archaeology when researchers would simply stop digging when they hit the Clovis-dated era lest they find something older that upset that orthodoxy. It took a long time to get past that, with similar groupthinking smears of the early ‘heretics.’ And that field has real solid evidence to work with (or deny).
They have been working on this for a long time:
http://inthesenewtimes.com/2009/11/29/1975-endangered-atmosphere-conference-where-the-global-warming-hoax-was-born/

March 8, 2012 1:14 pm

Willis, you say: “For me, there is only one answer. The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.”
=========
That’s exactly my point too. – I do not believe the theory of the Atmospheric Greenhouse Effect (AGHE) to be a “misnomer”. I think it is just plain wrong and has been so ever since the day in 1896 when Arrhenius invented “back-radiation”. – A belief in “Back-radiation” is, of course, quite reasonable – so I am not knocking it per se. –
However CAGW enthusiasts and “Skeptics” alike all believe in this “Misnomer”. – The difference is, the skeptics say, that any “warming” attributed to CO2 concentration is likely to be so small it can quite easily be lost “in the noise”. (At least that was the case before they put a “circa 1 deg. C” number on it).
I am fine with that – cool – in fact. – But, my observation is; “Just because both sides cannot be correct – does not, automatically, mean that one side has got it completely right. – My argument is that both sides and their beloved GHG “misnomer”- may all be in the wrong. – This means that skeptics cannot “win the argument” until Hell – and the rest – freezes over. – By which time it will all be too late.
The AGHE theory, seems to me, to be reliant on the belief, or assumption, that “Heat” is radiated from the surface to be absorbed by atmospheric GHGases which – radiate them straight back. – Thus conserving energy and raising the surface temperature at the same time. –
Unless, there are more types of radiation than the electromagnetic (EM) one, then all “the evidence” is that heat cannot be radiated. – So let’s look at just a few things:
The Sun radiates energy to the earth, in the form of light, in less than 9 minutes. – (At least that’s the story I have heard).
Electric energy can flow through the appropriate cable (from + to – or vice versa) similarly, it happens at light speed. – At a break in the cable where electricity meets the air, the flow stops. – You can force the electric energy to jump across that break in the cable (through the air) by for example creating an arc with a couple of bits of carbon. The “Arc” created, always emits visible light. – My advice to anyone, is not to touch that light as it will severely burn your fingers. – And, by the way an electric lead, or cable always creates a magnetic field around itself as long as the current is flowing through it.
– Furthermore, with the appropriate equipment setup you can talk into a microphone in America and someone with the correct receiver in Africa can hear your words – instantly. —– These are just a few examples of energy-flow, light and heat creation.
Heat never moves all that fast, (at least, not at light speed) and it probably would be “static” if it was not for Conduction and Convection. ——- Now then, – think of this: If the Sun is a ball using its potential energy to produce DC positive electricity and all the planets are “overall” negatively charged, then the Solar System would be like a “max. Atom” – and thinking about it – compared to the universe – we are not all that big, at all.
————–
And lastly you say: “PPS—Please don’t come in and start off with version number 45,122,164 of the “Willis, you’re an ignorant jerk” meme. I know that. I was born yesterday, and my background music is Tom o’Bedlam’s song:
I hope I didn’t, but I too am often described by many as “an ignorant jerk” and I am proud of it, as it tells me I am still alive. — sorry, I’m not a singer

March 8, 2012 1:19 pm

I am skeptical of skepticism! – Hurrah – That’s my niche.

March 8, 2012 1:35 pm

“Willis Eschenbach says:
March 8, 2012 at 11:01 am
Michael Palmer says:
Firstly, in a trivial, tautological sense, the assumption must be true—a forcing is anything that changes the surface temperature, and the greater the change, the stronger the forcing.

That is assuredly not the definition of a forcing that is used in climate science.”
True. This is why I prefixed my paragraph with a caveat.
“In climate science, a ‘forcing’ is generally taken to be a change in the downwelling radiation at the top of the atmosphere.”
That, on the other hand, seems an overly narrow definition of the term “forcing”.
“Whether this change in downwelling radiation ends up changing the temperature, or whether it is simply balanced out by e.g. a change in the cloud albedo, is the huge unanswered question in climate science. And the fact that you think this question has been answered means you are not following the story. You can’t simply claim that the biggest unanswered question in the field is answered.”
I made no such claim, of course.
PS Willis, quite apart from this question: I can’t find your email address anywhere—I would like to discuss a technical aspect that might benefit your research. If you are interested, please email me – I promise to not misuse your email address.

Richard S Courtney
March 8, 2012 1:44 pm

Leigh B. Kelley:
Your post at March 8, 2012 at 12:30 pm asks:
“There is something that puzzles me I wish that you (or anyone else) might address. Why has the climate sensitivity range in the (what is it?) 23 AOGCM’s used by the IPCC not been “tuned”, “fudged” or adjusted to show more convergence rather than being left just where it was in 1979? “
I answer that this was politically impossible. Whose values of climate sensitivity and aerosol forcing should be preferred?
There is no justification for any of the climate sensitivities used in the models. Indeed, all empirical studies indicate much lower climate sensitivities. So, adoption of any particular value would be an agreed preference for the model which used that value and, therefore, each model Team would argue for adoption of the value it used. An average of all the used values would not solve this because the average would be nearest to the value used by one model.
Simply, any attempt to adopt an agreed value for use in all models would initiate a continuous squabble between the model Teams which could only harm them all. But the existing situation benefits them all.
Richard

IAmDigitap
March 8, 2012 1:45 pm

In climate science a forcing is leveraging temperature or any other variable, one way or the other – not only radiative.

Colin in BC
March 8, 2012 1:49 pm

anticlimactic says:
March 8, 2012 at 3:23 am
AGW adherents remind me of those who thought that the Earth was at the centre of the solar system.

Bingo!
I’ve made this very analogy myself. The hubris displayed in both examples regarding the significance of Man is extraordinary.

Gary Hladik
March 8, 2012 1:58 pm

Well done, Willis!
And kudos to the commenters on this thread for their added insights. Special thanks to Bill Hunter (March 8, 2012 at 3:56 am) for his eloquent comment on accountability.

Big Bob
March 8, 2012 1:59 pm

It seems to me that if feedbacks were at all positive the earth would have burnt to a crisp long ago. I don’t see that it matters whether the initial source is CO2 or anything else. Positive is positive. Any increase in temp for any reason whatever would cause thermal runaway. Obviously it has not.

johanna
March 8, 2012 2:03 pm

Jim Turner says:
March 8, 2012 at 6:27 am
The point about the rate of scientific progress is an interesting one – it is certainly not ‘even’, some areas have advanced enormously and others not. I think that this can only partly be explained by the effort expended. In my own area (pharmaceutical research) there is much effort, and continual impressive progress is being made in the understanding of underlying processes. In say, the last fifty years we have gained a huge amount of understanding of genetics, biochemical processes and cell signalling; however our actual ability to treat diseases like cancer has improved rather modestly by comparison.
I think part of the explanation may be that how much more we know than we did before is less important than how much less ignorant we are. For example, our knowledge of something may double – but it may be that our knowledge of all that it is possible to know has actually only increased from 0.1% to 0.2%, so we are still largely ignorant.
————————————————————————
I thought of ‘the war on cancer’ as well when I read this post. However much has been spent on unproductive climate research, it is a small fraction of what has been spent on unproductive cancer research.
For a long time, it was assumed that cancer is one disease, not many, and that if we could only crack the magic code, we could find a universal cure. Real progress was not made until it was understood that it is not one disease, just one set of symptoms (to over-simplify for the purpose of discussion). So, we now know that many cervical cancer cases are triggered by a genital wart virus, and a vaccine has been developed. I think it is likely that further biological triggers for other types of cancer will be found. We know that age is probably the biggest risk factor, as our cells gradually become less functional. We know that some chemicals, and high levels of radiation exposure, may be triggers in some people. Detection, surgery, chemotherapy and radiotherapy have improved greatly, but a ‘cure’ is essentially as elusive as ever.
These prosaic facts are a long way away from the ‘cure for cancer’ objective that characterised research for many decades, with negligible results.
I think that most climate scientists are still stuck in the same universal remedy mindset, and like the cancer researchers of the past, keep doing the same thing over and over, but in more high-tech ways, and expecting different results.
In tackling a complex problem, I have always found that the best way is to break it down to bite-size chunks and start with compiling the knowns as a basis for working on the unknowns. As Willis’ post points out, the foundations of modern climatology are exactly the other way around.

David A
March 8, 2012 2:07 pm

Septic Matthew/Matthew R Marler says:
March 8, 2012 at 9:51 am
“The lack of progress means that there is some fundamental misunderstanding at the very base of the modern climate edifice. It means that the underlying paradigm that the whole field is built on must contain some basic and far-reaching theoretical error.”
==========================================================
The parts of the climate science that give readily computable answers are the simplified thermodynamic models, illustrated in great detail by Raymond T. Pierrehumbert’s book “Principles of Planetary Climate”. Most of that science was “settled” a while back, and the answers are mostly consistent decade after decade. The omissions and necessary elaborations have not changed that much.
====================================================
Curious, thermodynamic models that are consistently not reflected in real world observations “are mostly CONSISTENT decade after decade.”
Perhaps Emerson would have something to say on that: “A foolish consistency is the hobgoblin of little minds.”

1DandyTroll
March 8, 2012 2:21 pm

Hudson: Stop your grinnin’ and drop your linen!
I found it, the reason for the time warp.
The year was 1979, electricity was in the air, oil and coal were again abundant after all, economies were booming, the ice age never came, probably a department or two were downsized. But did this stop the industrious climate-to-be entrepreneurs? After all, they felt they were on the right track with the ice age; they just needed enough computer power to really push those numbers up and up.
One day, “whom” ever it was, saw the ad, that would change everything, so powerful was it ripped the very fabric of time and space, and it read:
*Fast and powerful
*Systems grow with your needs
*Easy to operate
*Affordable
http://www.trs-80.com/covers/cat-rs-trs80model2(1979).jpg
But, alas, they needed good, predictable software to go with it, to really complete the two-to-one wondrous digital soul.
As the story is told, a group of people, not a whole lot different from the ice age mongers themselves and in fact, for all intents and purposes, literally in the same field, like soul friends even, invented what was to become the one side that would complete the machine into one coded super soul, enter: A S T R O L A B E ‘ s A S T R O L O G Y S O F T W A R E ! ! !
The rest is, like they say, a history worthy of a recap: By 1988, with their newly acquired, almost too expensive, “64-bit wink wink nudge nudge” system labeled C-64, and with the permanent UN-based memory module installed, they’ve been literally PEEKing and POKEing the IPCC-ROM ever since.
(No pun intended to the good folks of astrolabe or the dandy Tandy.)

IAmDigitap
March 8, 2012 2:24 pm

[SNIP: Please address the issues. There is no point in provoking him. -REP]

March 8, 2012 2:27 pm

O H Dahlsveen says:
March 8, 2012 at 1:14 pm
I do not believe the theory of the Atmospheric Greenhouse Effect (AGHE) to be a “misnomer”. I think it is just plain wrong
_______________________________________
Basically you are correct. There can be no transfer of thermal energy from the cooler atmosphere to the warmer surface by any physical process, radiation or otherwise. However, we have to acknowledge that radiation from the atmosphere does slow the rate of radiative energy transfer from the surface to the atmosphere. This is why it can be warmer on moist nights. However, on balance, other processes, mostly evaporation and diffusion (conduction) will make up for any reduction in radiative flux, because of the stabilising effect of the massive store of thermal energy beneath the outer crust, which is not due to the very slow rate of terrestrial energy flow.
There is also a cooling effect due to water vapour and CO2 etc as these absorb downwelling IR radiation from the Sun and send upward backradiation to space.
The temperature gradient in the atmosphere is determined by the mass of the atmosphere and the acceleration due to gravity, both close enough to being constants. All the claims about 255K are based on the false assumption that the surface is anything like a blackbody. It’s not, because it’s not insulated from losses by diffusion and evaporation. Less than half the energy exits by radiation. So, not only is that 33 degree figure based on a totally incorrect 255K figure, but it also ignores the fact that there is an adiabatic lapse rate that has nothing to do with backradiation.
This is a very brief summary of my peer-reviewed paper being published next week.
[Please take it elsewhere, Doug. Your ideas are not welcome here, the thread is about something completely different. If you want to discuss your fantasies about the climate, please take it to Tallblokes. It is not welcome here. -w]

Septic Matthew/Matthew R Marler
March 8, 2012 2:43 pm

Willis: You’re missing my point. You seem to think that “the answers are consistent decade after decade” means something other than that they are asking the wrong question. My point is, the answers are just the same now as they were in 1979.
Your statement that “The answers are consistent decade after decade” is merely another way of saying what I said, that there has been very little progress in the field for a third of a century.
w.
PS—Ray Pierrehumbert is one of the most committed of the AGW alarmists, and one of the people behind RealClimate. Believe anything he says at your own peril.

I did not intend to defend consistency of the thermodynamics-based forecasts, merely to account for why they have remained consistent. Pierrehumbert’s book “Principles of Planetary Climate” should be read, in my opinion, by everyone who wishes to understand the skeptics’ case because he articulates it so well, but in passing and not intending to support skepticism. Repeatedly he draws attention to omitted details and the inaccuracies of mathematical approximations to shared-world relationships. I have pointed out a few of these to readers of Climate Etc. The book is a good example of much rich science which is not quite accurate enough and complete enough to substantiate long term predictions.
A good complementary book is “Dynamic Analysis of Weather and Climate” by Marcel Leroux. It’s full of presentations of energy flows such as were presented in the Georgia Tech paper that you critiqued. Denying the intellectual content of Pierrehumbert’s book because he is a warmer is a mistake, in my opinion. In my days as “Septic Matthew” I had a few good interchanges with him at RealClimate. I respect his work, though I think his conclusion CO2 induced future warming is inadequately supported by the evidence..
Another example of a complex science full of partial knowledge and mathematical approximations over several time scales and spatial scales is brain science. Like atmospheric science, of which I take climate science to be a subset, I think it is an example of an area of research in need of much more study, but no new “paradigm” in the Kuhnian sense (I say “Kuhnian” sense because according to Kuhn new paradigms are rare; nowadays, even the invention of a new measuring instrument may be called a “paradigm shift”). I think that there has been much progress in the field of atmospheric science, but the equilibrium thermodynamic arguments are about the same as ever. It is a fundamentally important question whether cloud changes consequent on CO2 or temp increase will provide net negative or net positive feedback (e.g. net retention of warmth at night, net increased albedo in daytime, in most but not all of the earth), but no new overarching Paradigm or Metatheory will be necessary to understand it.

braddles
March 8, 2012 2:53 pm

Imagine that you wanted to change the world, and climate alarmism offered a way. What prediction of future temperature change would suit best for this purpose? The prediction, with an average and a range, would have to:
– have an average value that is not too extreme, but high enough to imply disaster.
– have a lower limit just barely consistent with observations (with a bit of fudging), but not so low that there is nothing to worry about.
– have an upper limit that is not patently absurd.
I suggest that a prediction of 3 degrees plus or minus 1.5 would be perfect to fit these criteria. Is this why the prediction has not changed in 30 years?

Septic Matthew/Matthew R Marler
March 8, 2012 2:54 pm

Richard S. Courtney: Extreme scepticism would confront anybody who claimed to have constructed a computer model of the human brain that could predict brain behaviour,
I am glad that you mentioned brain science.

tallbloke
March 8, 2012 2:59 pm

The Pompous Git says:
March 8, 2012 at 10:55 am
The measured decrease in evapotranspiration over the last 50 yr is supposed to be due to an increase in cloud cover. The increase in cloud cover was supposedly confirmed by measurement of earthshine from the moon.

What a mine of disinformation you are. Go read Palle et al’s work and report back.

u.k.(us)
March 8, 2012 3:21 pm

So, after 30+ years of research, all that has been accomplished is the creation of a well-funded niche that is currently being exploited for political/industrial/bureaucratic gain.
Funded by taxpayers (voters), who must fight to see how their money is being spent.
!*+#%$……. /end rant.

Jurgen
March 8, 2012 3:30 pm

As for ideas “why” I was pondering this.
Any model construction presupposes some mechanisms and relations, and being a “model” (a more or less complex set of interrelations), its testability, and by implication its adaptability and flexibility for improvement, are hampered from the start.
This line of thought goes beyond your theory about the linear surface air temperature forcing assumption as a possible cause, as it states doubt about the choice of starting with a model in the first place.
This may sound like too vague an argument to be of practical value, so I’ll clarify a bit.
In astronomy there is the well-known 3 body problem – I’ll cite what scholarpedia says about it:
http://www.scholarpedia.org/article/Three_body_problem
“While the two-body problem is integrable and its solutions completely understood (see [2],[AKN],[Al],[BP]), solutions of the three-body problem may be of an arbitrary complexity and are very far from being completely understood. ”
Well, there you have it. Here all you have are “just” three bodies with known physical properties, and their initial movements are also known; yet even at this very basic level, in the ideal theoretical situation of no influences other than their own attraction, solutions “are very far from being completely understood”.
Mind you, there is an intrinsic problem here, and computers have nothing to do with it.
There seems to be a pretty naive assumption that computers, by increasing their capacity, may cross a boundary beyond which they become capable of solving a problem. You may try to square the circle with one supercomputer or with a thousand million of them: mathematically it is just not possible. It is an intrinsic problem, and calculation power has nothing to do with it.
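The three-body point can be made concrete with a short numerical experiment (my illustration, not part of the original comment; the masses and starting positions are arbitrary choices). Two copies of the same three-body system are integrated, identical except for a perturbation of one part in a billion in a single coordinate. However much computing power is applied, the two trajectories drift apart, which is the practical face of the “arbitrary complexity” the Scholarpedia article describes.

```python
# Sketch: sensitivity of a planar three-body system to initial conditions.
# Pure Python, velocity-Verlet integration, G = 1 units.

def accelerations(pos, masses, G=1.0):
    """Pairwise Newtonian gravitational accelerations in 2-D."""
    n = len(masses)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += G * masses[j] * dx / r3
            acc[i][1] += G * masses[j] * dy / r3
    return acc

def integrate(pos, vel, masses, dt=0.001, steps=20000):
    """Velocity-Verlet integration; returns the final positions."""
    pos = [p[:] for p in pos]
    vel = [v[:] for v in vel]
    acc = accelerations(pos, masses)
    for _ in range(steps):
        for i in range(len(masses)):
            pos[i][0] += vel[i][0] * dt + 0.5 * acc[i][0] * dt * dt
            pos[i][1] += vel[i][1] * dt + 0.5 * acc[i][1] * dt * dt
        new_acc = accelerations(pos, masses)
        for i in range(len(masses)):
            vel[i][0] += 0.5 * (acc[i][0] + new_acc[i][0]) * dt
            vel[i][1] += 0.5 * (acc[i][1] + new_acc[i][1]) * dt
        acc = new_acc
    return pos

masses = [1.0, 1.0, 1.0]
p0 = [[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]]
v0 = [[0.0, -0.3], [0.0, 0.3], [0.3, 0.0]]

a = integrate(p0, v0, masses)

# Same system, one coordinate nudged by one part in a billion:
p1 = [row[:] for row in p0]
p1[2][0] += 1e-9
b = integrate(p1, v0, masses)

drift = max(abs(a[i][k] - b[i][k]) for i in range(3) for k in range(2))
print(f"final separation between the two runs: {drift:.3e}")
```

The separation printed at the end is far larger than the initial nudge; shrinking the time step shrinks the numerical error but not the divergence, which is intrinsic to the dynamics.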
Now you can argue that simulation is not “problem solving” and that with simulation, capacity does matter. It matters if the model is a good and practical one, as with simulating a jet engine. There a model and a simulation are successfully applied, but an engineer knows his engine down to the tiniest part, inside and out. That is what makes it possible.
Compared with a model of a jet engine, a climate model is a wild gamble, a complete shot in the dark. Climate is an open system with many variables and non-linear relations that you can only guess at.
You can summarize the parallel in both your argument and mine: to start with some assumed knowledge about climate is the very thing that gets in the way of finding answers. You think you know what you are looking for but actually you don’t, so you will never find it.
As for a solution of this paradox, I would completely turn the strategy around. Accept that you know little, if anything, about how climate works. Just start with the data you have. There are a lot of data out there about many subjects related to climate. And start what’s called “data-mining”.
It’s a challenge (see e.g. http://www.scribd.com/doc/33728981/Applicability-of-Data-Mining-Techniques-for-Climate-Prediction-%E2%80%93-A-Survey-Approach) – but of course it is – we’re talking climate here. It won’t give spectacular answers to the public soon, but in my opinion it’s the only sensible approach.

March 8, 2012 3:46 pm

tallbloke said March 8, 2012 at 2:59 pm

The Pompous Git says:
March 8, 2012 at 10:55 am
The measured decrease in evapotranspiration over the last 50 yr is supposed to be due to an increase in cloud cover. The increase in cloud cover was supposedly confirmed by measurement of earthshine from the moon.

What a mine of disinformation you are. Go read Palle et al’s work and report back.

tallbloke, here’s a link: http://www.mindfully.org/Air/2002/Decreased-Pan-Evaporation1nov02.htm
There is no mention of Palle et al in that paper.