Love him or hate him, it is worthwhile to understand where he is coming from, so I present this video: The emergent patterns of climate change
According to TED:
You can’t understand climate change in pieces, says climate scientist Gavin Schmidt. It’s the whole, or it’s nothing. In this illuminating talk, he explains how he studies the big picture of climate change with mesmerizing models that illustrate the endlessly complex interactions of small-scale environmental events.
Video follows, comments welcome.
The transcript is here: http://www.ted.com/talks/gavin_schmidt_the_emergent_patterns_of_climate_change/transcript
Gavin has found himself a new propaganda platform and dispenses from it the same old climate fairy tale of global warming. There was no science in his talk, just inspirational messages about orders of magnitude and skillful models. I did find one useful tidbit of data, however, namely that climate models today have over a million lines of code. Now consider this: they have been refining their climate models for twenty-four years since Hansen presented his first one in 1988. Despite having switched to supercomputers with huge memories, their model predictions are no better than Hansen's, which was done on an IBM mainframe.

If you take a look at a CMIP5 output using 44 individual predictions (or “projections,” as they like to disguise them), not one of them is even close to the reality of the twenty-first century they pretend to project. It is time to admit that climate modeling simply does not work and to close it down. Those 44 runs all have the alleged volcanic cooling for El Chichon and Pinatubo built in, but they don't even get that right. The Pinatubo eruption really was followed by a cooling period, thanks to a convenient La Nina that followed. But El Chichon was followed by an El Nino peak, and yet their moronic code shows that one too as a cooling. The temperature break at the turn of the century introduces fourteen years of no warming, and that also is totally ignored by their software, whose million lines of code direct it to predict warming there.

Their entire enterprise of predicting the future climate is handicapped by having greenhouse warming built into their code. That is because Hansen announced in 1988 that he had observed the greenhouse effect. He was wrong, but nobody checked his science, and he has been getting away with it for all these years. What he did in 1988 was to show a rising temperature curve, from 1880 to 1988. Its peak in 1988, he said, was the warmest point within the last 100 years.
According to him there was only a one percent chance that it could happen by accident. Hence, there was a 99 percent probability that the greenhouse effect had been detected. The only problem is that his 100-year greenhouse warming includes the non-greenhouse warming in the early century that started in 1910 and stopped in 1940. The radiation laws of physics demand that if you are going to start an enhanced greenhouse warming, you must simultaneously increase the amount of greenhouse gas in the atmosphere. There was no increase of atmospheric carbon dioxide in 1910. Hence, the warming of 1910 cannot be greenhouse warming. It must be removed from Hansen's 100-year warming. This lops off the first 60 years of it and leaves a see-saw temperature curve, consisting of 25 years of cooling and 23 years of warming, as a remnant of his 100 years of warming. You don't have to be a rocket scientist to know that there is no way this can be used to prove the existence of the greenhouse effect.
A million lines of Fortran; the mind boggles. Glad to hear that they're not using punched cards anymore; that saved a lot of trees.
Why do I suspect that somewhere in those million lines, an actual skillful programmer has hidden a call to a random number generator? It seems to me that the problem with climate modelling is that there would be absolutely no way to tell, short of waiting 100 years.
A million lines of code – without any erroneous assumptions….
Yeah, and XP/Win7/Win8 doesn't crash…
Steve McIntyre says:
May 3, 2014 at 2:33 pm
“a mapping failure that seems to originate from a kind of academic stubbornness in the modeling community”
Corruption.
“The art of diplomacy is to say nothing so convincingly that nobody notices” (Isaac Asimov “Foundation”)
“you can’t chop it into one little bit”
“I am going to chop it into lots of little boxes”
“the climate has a scale of 14 magnitudes”
“climate models now have 4 orders of magnitude – we have 14 to go ” (sic) Maths?
“models are always wrong”
“our models are skilful”
“IT IS THE WHOLE OR NOTHING”
Well, since he is ten orders of magnitude short, he certainly hasn't the whole; therefore, by his own logic, he has nothing.
Again from “Foundation” “after eliminating meaningless statements, vague gibberish, useless qualifications – he had nothing left, everything cancelled out.”
In the few model results, such as air pressure beneath the ozone hole, that Schmidt claimed showed skill, there were significant differences from reality. If the models don't exactly mimic reality, there is no chance that they will get any closer if left to run longer. The truth is, the model results were headed off in some other direction. The models show skill in mimicking the past because they were tweaked to do so, but none in predicting the future.
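The general point here — that fitting the past perfectly says nothing about forecast skill — can be shown with a toy sketch. This is an illustration of overfitting in miniature, not of how any actual climate model works: a polynomial "tuned" to pass exactly through ten flat-ish past observations reproduces that past without error, yet its one-step-ahead extrapolation is absurd.

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the unique interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

# Ten "past observations": a flat series wobbling between 0.0 and 0.1 degrees.
xs = list(range(10))
ys = [0.0, 0.1] * 5

# The tuned "model" matches every past point exactly...
hindcast_error = max(abs(lagrange_eval(xs, ys, x) - y) for x, y in zip(xs, ys))
print(hindcast_error)             # essentially zero: the past is "skillfully" matched

# ...but its forecast one step into the "future" is wildly wrong.
print(lagrange_eval(xs, ys, 10))  # about 51.2, from data that never exceeded 0.1
```

A perfect hindcast is cheap when the fitting procedure has enough free parameters; it carries no information about predictive skill.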
Schmidt illustrated the magnitude of the problem at the beginning of his talk. He should have stopped there and admitted that there is no way to duplicate the chaos and complexity of the climate in computer code. His conclusion was that we had better take drastic action to stop emissions of CO2 because his models, which we know have little skill, might be correct. He is a fool and hopes that the rest of us are fool enough to follow him.
Mosh
So in 1938 we had a prediction from your scientists of a 1 degree rise by 2014?
It's currently at 0.6. According to Hansen's 1987 paper it was just over 0.2 in 1938.
https://groups.google.com/d/topic/ccc-gistemp-discuss/A23tgVfYQmE
So that's a difference of 0.4 °C.
So the sceptic who predicted no change is surely closer to the right answer than the scientist?
Tonyb
Gavin discovers that climatology is a generalist discipline. What he must now acknowledge is that it is being dismembered and misused by specialists who call themselves climate scientists. The adage about not seeing the forest for the trees comes to mind.
My cursory view is that each model is testing a different hypothesis so how can they be combined into a multi-model mean?
His lips are moving. Worse, his models are running. I agree with Steve Mc that not all models are crap. I'm prepared to believe, though, that all climate models are crap. I get a lot of help in that belief from the climate models themselves, and it is supported by observed phenomena.
The models illustrate a fantasy world of make believe. Here is Gavin at an earlier time. Note the date and what he said in reply to a question. Karl Popper would have been proud, but maybe not today. Maybe he is no longer worried about the state of understanding.
Now understand this from less than a year ago.
They are still fiddling in the dark.
I'm a software developer; that is, I actually write code and have for 35 years. Computer models in science DO NOT contribute to the data; they only express a hypothesis. Computer models in science can in NO WAY be considered experiments, because models do not do science.
Computer models in ENGINEERING are an entirely different thing. I think the cAGW science modeller is a very confused or deliberately misleading person when they portray climate models as science.
I know it is only a very small start, but should we club together to buy a real see-through window for Gavin and his office?
I hate the idiot. They've never made a prediction that works, yet they have managed to *somehow* make their models predict the past. Oh, I wish I could get paid to predict the past whilst claiming skill. What an absolute idiot, as we already knew.
The models really are fascinating. The problems for the IPCC’s AR5 projections were seen earlier than we previously thought.
I read the transcript. Boy those TED talks can really suck.
I'm sorry, but there is absolutely no defence for the rubbish spouted by Schmidt. And no matter how many times Mosh calls models good for other things (usually stuff with, like, two (wow!) variables!), there is absofrigginglutely no way to model climate in anything like a realistic way; there are just too many variables and too many scenarios. One day, these muppets who think computers CAN do everything will realise that they actually CAN'T, and no amount of sales pitch will convince me otherwise.
Steve McIntyre: ” I do not share the kneejerk antagonism to “models” of many commenters”
To dismiss the "antagonism" as "kneejerk" is insulting to the many commenters whose knowledge of physics and numerical methods enables them to see clearly that the climate models could not possibly model the major climate determinants well enough to rule out everything but CO2 as the cause of most warming. The claim that the models can do so is an extraordinary claim that requires extraordinary proof.
If you’ve seen that proof, please share it with us.
The oceans, combined with cloud cover driven by oceanic/atmospheric oscillations, especially over the tropics, have a FAR greater capacity for soaking up shortwave radiation and keeping hold of it than atmospheric anthropogenic CO2 has for absorbing in the emitted longwave IR spectrum. It is, at the very LEAST, just as likely that intrinsic natural variables, ones with decades-long oscillations, are the reason for the decadal energy imbalance Gavin speaks of. AND HE KNOWS IT! But he beats the CO2 drum because he has to sing whatever tune they give him for his dinner.
After over 50 years of fiddling what do we have?
And all the time new papers come out questioning the models. This is an exercise in failure. The more they fail, the more money they want to continue failing. The UK Met Office's latest computers are a prime example of failure: 12 out of 13 runs over-projected warming. GIGO.
Steve Mosher says:
“In 1938 imagine two people were asked the following question: how warm will it be in 2014?
One, a skeptic, said: my best prediction is that the temperature will be unchanged. This is a naive forecast.
The other, a climate scientist using a model, said: if we increase CO2 at the current rate, the temperature will be 1 C warmer.
You then measure how much closer the modelled answer is to the truth than the naive answer.”
Steve: In 1930, imagine two people were asked the following question: how warm will it be in 1979?
Now how much closer is the modelled answer to the truth than the naive answer?
In 1878, imagine asking the question: how warm will it be in 1977?
The global temperature rise over those 100 years was approximately zero according to HadCRUT4, but CO2 had risen by 45 ppm.
In 1977, imagine asking the question: how warm will it be in 1998?
The temperature rise over those 20 years was approximately 0.5 deg C according to HadCRUT4, and CO2 had risen by 20 ppm.
In each case the sceptic would have said “it can't be predicted,” but what would the climate scientist have said? 0 deg C and 0.5 deg C? I doubt it, no matter how much he took account of CO2.
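The test Mosher describes is essentially a forecast skill score: one minus the ratio of the model's error to the naive forecast's error. A minimal sketch of that arithmetic follows; the 0.2 and 0.6 anomaly figures are the ones quoted in this thread, used purely for illustration.

```python
def skill_score(observed, model_forecast, naive_forecast):
    """Skill of a model forecast relative to a naive (no-change) forecast.

    1.0  = perfect forecast
    0.0  = no better than the naive forecast
    <0.0 = worse than the naive forecast
    """
    model_error = abs(observed - model_forecast)
    naive_error = abs(observed - naive_forecast)
    return 1.0 - model_error / naive_error

# Mosher's 1938 -> 2014 example, with the anomalies quoted in this thread:
# roughly 0.2 C in 1938 and 0.6 C observed in 2014.
baseline = 0.2
observed = 0.6
naive = baseline         # skeptic: "the temperature will be unchanged"
model = baseline + 1.0   # climate scientist: "1 C warmer"

print(skill_score(observed, model, naive))  # negative: the naive forecast wins here
```

On these particular numbers the model's error (0.6) exceeds the naive error (0.4), so the score is negative; on a period where warming did occur, the same formula would favour the model.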
Climate Change and the Death of Science
“Even though it was concealed from those who constructed the models, the purpose of climate models was to provide the power of metaphor to political rhetoric:
…climate change models are a form of “seduction”…advocates of the models…recruit possible supporters, and then keep them on board when the inadequacy of the models becomes apparent. This is what is understood as “seduction”; but it should be observed that the process may well be directed even more to the modelers themselves, to maintain their own sense of worth in the face of disillusioning experience.
…but if they are not predictors, then what on earth are they? The models can be rescued only by being explained as having a metaphorical function, designed to teach us about ourselves and our perspectives under the guise of describing or predicting the future states of the planet…A general recognition of models as metaphors will not come easily. As metaphors, computer models are too subtle…for easy detection. And those who created them may well have been prevented…from being aware of their essential character.”
http://buythetruth.wordpress.com/2009/10/31/climate-change-and-the-death-of-science/
The “anthropogenic” models are not really models. They take pretty good general circulation models, tweak them by adding a predetermined warming-rate calculation along with a sprinkling of aerosols to mimic the training period, and then let them run.
So I am of the same opinion as McIntyre. General circulation models are a work in progress and well worth the investment. It is what biased researchers do to them that reeks of rent-seeking behavior.
Could your CO2 guy then explain the subsequent cooling? If yes, please explain in full.