Is It Time To Stop The Insanity Of Wasting Time and Money On More Climate Models?

Guest Opinion: Dr. Tim Ball

Nearly every climate model prediction, projection, or whatever else they want to call them has been wrong. Weather forecasts beyond 72 hours typically deteriorate into their error bands. The UK Met Office summer forecast was wrong again; I have lost track of the number of times they have been wrong. Apparently, the British Broadcasting Corporation had had enough, because it stopped using their services. They are not just marginally wrong. Invariably, the weather is the inverse of their forecast.

Short-, medium-, and long-term climate forecasts are wrong more than 50 percent of the time, so a correct one is no better than a random event. Global and regional forecasts are often equally incorrect. If there were a climate model that made even 60 percent accurate forecasts, everybody would use it. Since there is no single accurate climate model forecast, the IPCC resorts to averaging out their model forecasts, as if, somehow, the errors would cancel each other out and the average of the forecasts would be representative. Climate models and their forecasts have been unmitigated failures that would cause an automatic cessation in any other enterprise. Unless, of course, it was another government-funded fiasco. Daily weather forecasts have improved since modern forecasting began in World War I. However, even short-term climate forecasts appear no better than the Old Farmer’s Almanac, which appeared in 1792, using lunar, solar, and other astronomical and terrestrial indicators.

I have written and often spoken about the key role of the models in creating and perpetuating the catastrophic AGW mythology. People were shocked by the leaked emails from the Climatic Research Unit (CRU), but most don’t know that the actual instructions to “hide the decline” in the tree-ring portion of the hockey stick graph were in the computer code. It is one reason that people translate the Garbage In, Garbage Out (GIGO) acronym as Gospel In, Gospel Out when speaking of climate models.

I am tired of the continued pretense that climate models can produce accurate forecasts in a chaotic system. Sadly, the pretense occurs on both sides of the scientific debate. The reality is the models don’t work and can’t work for many reasons, including the most fundamental: lack of data, lack of knowledge of major mechanisms, lack of knowledge of basic physical processes, lack of ability to represent physical mechanisms like turbulence in mathematical form, and lack of computer capacity. Bob Tisdale summarized the problems in his 2013 book Climate Models Fail. It is time to stop wasting time and money and put people and computers to more important uses.

The only thing that keeps people working on the models is government funding, either at weather offices or in academia. Without this funding, computer modelers would not dominate the study of climate. Without the funding, the Intergovernmental Panel on Climate Change could not exist. Many of the people involved in climate modeling had little familiarity with, or training in, climatology or climate science. They were graduates of computer modeling programs looking for a challenging opportunity with large amounts of funding available and access to large computers. The atmosphere, and later the oceans, fit the bill. Now they put the two together to continue the fiasco, all at massive expense to society. Those expenses include the computers and the modeling time but, worse, the cost of applying the failed results to global energy and environmental issues.

Let’s stop pretending and wasting money and time. Remove that funding and nobody would spend private money to work on climate forecast models.

I used to argue that there was some small value in playing with climate models in a laboratory, subject only to scientific responsibility for their accuracy, feasibility, and applicability. Now I realize that position was wrong: it is clear they do not fulfill those responsibilities. When model results are used as the sole basis for government policy, there is no value. It is a massive cost and detriment to society, which is what the Intergovernmental Panel on Climate Change (IPCC) was specifically designed to do.

The IPCC has one small value. It illustrates all the problems identified in the previous comments. Laboratory-generated climate models are manipulated outside of even basic scientific rigor in government weather offices or academia, and then become the basis of public policy through the Summary for Policymakers (SPM).

Another value of the IPCC Physical Science Basis Reports is that they provide a detailed listing of why models can’t and don’t work. Too bad few read or understand them. If they did, they would realize the limitations preclude any chance of success. Even a partial examination illustrates the point.

Data

The IPCC people knew of the data limitations from the start, but it didn’t stop them from building models.

In 1993, Stephen Schneider, a primary player in the anthropogenic global warming hypothesis and in the use of models, acknowledged the problem when he said,

“Uncertainty about important feedback mechanisms is one reason why the ultimate goal of climate modeling – forecasting reliably the future of key variables such as temperature and rainfall patterns – is not realizable.”

A February 3, 1999, US National Research Council Report said,

Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.

To which Kevin Trenberth responded,

It’s very clear we do not have a climate observing system….This may come as a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.

Two Directors of the CRU, Tom Wigley, and Phil Jones said,

Many of the uncertainties surrounding the causes of climate change will never be resolved because the necessary data are lacking.

Oceans cover 70 percent of the world, yet there are virtually no stations. The Poles are critical in the dynamics of driving the atmosphere and creating climate, yet there are virtually no stations in the 15 million km2 of the Arctic Ocean or the 14 million km2 of Antarctica. Approximately 85 percent of the surface has no weather data. The IPCC acknowledges the limitation by claiming that a single station’s data are representative of conditions within a 1200 km radius. Is that a valid assumption? I don’t think it is.
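Even granting the 1200 km assumption for the sake of argument, simple geometry shows how few idealized stations the claim implies. This is a back-of-envelope sketch; the surface-area figure is the standard rounded value, and the tiling is an idealization (real coverage of a sphere by circles needs overlap, hence more stations):

```python
import math

EARTH_SURFACE_KM2 = 510_072_000      # total surface area of Earth (rounded)
radius_km = 1200                     # the IPCC's claimed representative radius

# Area one station is said to represent
circle_km2 = math.pi * radius_km ** 2

# Idealized minimum station count if those circles tiled the sphere
# with no gaps and no overlap (an impossible best case)
min_stations = EARTH_SURFACE_KM2 / circle_km2

print(f"One station 'covers' {circle_km2:,.0f} km^2")
print(f"Idealized minimum stations for full coverage: {min_stations:.0f}")
```

Under that idealization roughly 113 stations would "cover" the planet, which is precisely why the 1200 km claim does so much work in papering over the gaps.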

But it isn’t just a lack of data at the surface. Strictly speaking, the measurements are not for the surface but for a range of altitudes between 1.25 and 2 m above it, and, as researchers from Geiger (Climate Near the Ground) onward have shown, this differs markedly from actual surface temperatures as measured at the few microclimate stations that exist. Arguably, US surface stations are the best, yet Anthony Watts’s diligent study shows that only 7.9 percent of them are accurate to better than 1°C (Figure 1). To put that in perspective, in the 2001 IPCC Report Jones claimed a 0.6°C increase over 120 years was beyond a natural increase. That also underscores the fact that most instrumental-record temperatures were measured only to 0.5°C.


Figure 1

Other basic data, including precipitation, barometric pressure, and wind speed and direction, are worse than the temperature data. For example, Africa has only 1,152 weather watch stations, one-eighth of the World Meteorological Organization (WMO) recommended minimum density. As I noted in an earlier paper, the lack of data for all phases of water alone guarantees the failure of IPCC projections.

The models attempt to simulate a three-dimensional atmosphere, but there is virtually no data above the surface. The modelers think we are foolish enough to believe that more layers in the model will solve the problem, but more layers don’t matter if you have no data for them.

Major Mechanisms

During my career as a climatologist, several mechanisms of weather and climate were either discovered or measured, supposedly with sufficient accuracy for application in a model. These include El Niño/La Niña (ENSO), the Pacific Decadal Oscillation (PDO), the Atlantic Multidecadal Oscillation (AMO), the Antarctic Oscillation (AAO), the North Atlantic Oscillation (NAO), the Dansgaard-Oeschger Oscillation (D-O), the Madden-Julian Oscillation (MJO), and the Indian Ocean Dipole (IOD), among others.

Despite this, we are still unclear about the mechanisms associated with the Hadley Cell and the Intertropical Convergence Zone (ITCZ), which together constitute essentially the entire tropical climate mechanism. The Milankovitch Effect remains controversial and is not included in IPCC models. The Cosmic Theory appears to provide an answer to the relationship between sunspots, global temperature, and precipitation but is similarly ignored by the IPCC. The models do not handle the monsoon mechanism well, as the IPCC notes,

In short, most AOGCMs do not simulate the spatial or intra-seasonal variation of monsoon precipitation accurately.

There is very limited knowledge of the major oceanic circulations at the surface and in the depths. There are virtually no measures of the volumes of heat transferred or how they change over time, including measures of geothermal heat.

Physical Mechanisms

The IPCC acknowledges that,

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”

That comment is sufficient to argue for cessation of the waste of time and money. Add the second, related problem identified by Essex and McKitrick in Taken By Storm, and it is confirmed.

Climate research is anything but a routine application of classical theories like fluid mechanics, even though some may be tempted to think it is. It has to be regarded in the “exotic’ category of scientific problems in part because we are trying to look for scientifically meaningful structure that no one can see or has ever seen, and may not even exist.

In this regard it is crucial to bear in mind that there is no experimental set up for global climate, so all we really have are those first principles. You can take all the measurements you want today, fill terabytes of disk space if you want, but that does not serve as an experimental apparatus. Engineering apparatus can be controlled, and those running them can make measurements of known variables over a range of controlled physically relevant conditions. In contrast, we have only today’s climate to sample directly, provided we are clever enough to even know how to average middle realm data in a physically meaningful way to represent climate. In short, global climate is not treatable by any conventional means.

Computer Capacity

Modelers claim computers are getting better and that all they need are bigger, faster computers. It can’t make any difference, but they continue to waste money. In 2012, Cray introduced the promotionally named Gaea supercomputer (Figure 2). It has a 1.1 petaflops capacity. FLOPS means floating-point operations per second, and peta denotes 10^15, so Gaea performs roughly 1.1 quadrillion floating-point operations per second. Jagadish Shukla says the challenge is

We must be able to run climate models at the same resolution as weather prediction models, which may have horizontal resolutions of 3-5 km within the next 5 years. This will require computers with peak capability of about 100 petaflops

Regardless of the computer capacity, it is meaningless without data for the model.


Figure 2: Cray’s Gaea Computer with the environmental image.

Failed Forecasts (Predictions, Projections)

Figure 3 shows the IPCC’s failed forecasts. They call them projections, but the public believes they are forecasts. Either way, they are consistently wrong. Notice the labels added to Hayden’s graph, taken from the Summary for Policymakers. As the error range in the actual data increases, the Summary claims the models are improving. One of the computer models used for the IPCC forecast belongs to Environment Canada. Its forecasts are the worst of all those averaged results used by the IPCC (Figure 4).


Figure 3


Figure 4. Source: Ken Gregory

The Canadian disaster is not surprising, as their own one-year forecast assessment indicates. They make a one-year forecast and provide a map indicating the percentage accuracy against the average for the period 1981-2010 (Figure 5).


Figure 5

The Canadian average accuracy percentage is shown in the bottom left as 41.5 percent. That is the best they can achieve after some thirty years of developing the models. Other countries’ results are no better.

In a New Scientist report, Tim Palmer, a leading climate modeller at the European Centre for Medium-Range Weather Forecasts in Reading, England, said:

I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.

The Cost

Joanne Nova has done the most research on the cost of climate research to the US government.

In total, over the last 20 years, by the end of fiscal year 2009, the US government will have poured in $32 billion for climate research—and another $36 billion for development of climate-related technologies. These are actual dollars, obtained from government reports, and not adjusted for inflation. It does not include funding from other governments. The real total can only grow.

There is no doubt that number grew, and the world total is likely double the US amount, as this commentator claims.

However, at least I can add a reliable half-billion pounds to Joanne Nova’s $79 billion – plus we know already that the EU Framework 7 programme includes €1.9 billion on direct climate change research. Framework 6 runs to €769 million. If we take all the Annex 1 countries, the sum expended must be well over $100 billion.

These are just the computer modeling costs. The economic and social costs are much higher and virtually impossible to calculate. As Paul Driessen explains,

As with its polar counterparts, 90% of the titanic climate funding iceberg is invisible to most citizens, businessmen and politicians.

It’s no wonder Larry Bell can say,

The U.S. Government Accounting Office (GAO) can’t figure out what benefits taxpayers are getting from the many billions of dollars spent each year on policies that are purportedly aimed at addressing climate change.

If it is impossible for a supposedly sophisticated agency like the US GAO to determine the costs, then there is no hope for a global assessment. There is little doubt the direct cost is measured in trillions of dollars. That does not include the lost opportunities for development and the lives continuing in poverty. All this because of the falsified results of completely failed computer model predictions, projections, or whatever they want to call them.

Is it time to stop the insanity, which in climate science means the repetition of creating computer models that don’t and can’t work? I think so.

“Those who have knowledge don’t predict. Those who do predict don’t have knowledge.” Lao Tzu (6th century BC)


Note: this article was updated shortly after publication to fix a text formatting error.


218 Comments
Salvatore Del Prete
September 14, 2015 9:57 am

They can spend as much money as they want on climate models, but the bottom line is they will never work, because they do not put in correct and complete data.
Let them keep wasting and knocking themselves out, so when the day of reckoning comes they fall that much harder. That day of reckoning will probably come before this decade ends.

Mickey Reno
Reply to  Salvatore Del Prete
September 14, 2015 10:20 am

I’d stop them from wasting, immediately, if I had my druthers. And why do you suppose that if you allow them to continue to waste you’re money, that things will collapse? And that collapse is a manageable event, that the collapse would lead to a new direction you’d favor? What if they collapse in a direction that the Cloward’s and Piven’s of this world envision?
Let’s send Kevin and Gavin to the unemployment line, now. It galls me to think of the pensions they’re going to draw for a lifetime of “service” that helped no one.

Mickey Reno
Reply to  Mickey Reno
September 14, 2015 10:21 am

your money – not you’re money. [blush]

Reply to  Mickey Reno
September 14, 2015 3:15 pm

Are you advocating that a person in a position of authority inform the climate modelers:

“YOU’RE FIRED!”


?

schitzree
Reply to  Mickey Reno
September 14, 2015 6:21 pm

Stormy… I see what you did there. ^_^

Expat
Reply to  Salvatore Del Prete
September 14, 2015 10:38 am

Unfortunately, El Nino will most likely drive up temps in the short term and lessen skeptics’ ability to influence anything. My guess is more money will be put into climate models, not less.

Somebody
Reply to  Salvatore Del Prete
September 14, 2015 10:52 am

They will never work because of the Lyapunov exponents. End of story.
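For readers unfamiliar with the term: a positive Lyapunov exponent means nearby trajectories separate exponentially fast, so any initial error eventually swamps the forecast. A minimal numerical sketch (my illustration, using the logistic map as a toy chaotic system, not a climate model) estimates the exponent; at r = 4 the analytic value is ln 2:

```python
import math

# Logistic map x_{n+1} = r*x*(1-x); at r = 4 it is fully chaotic.
# The Lyapunov exponent is the long-run average of log|f'(x)|,
# where f'(x) = r*(1 - 2x).
r, x, n = 4.0, 0.2, 100_000
total = 0.0
for _ in range(n):
    total += math.log(abs(r * (1.0 - 2.0 * x)))  # derivative at current point
    x = r * x * (1.0 - x)                        # iterate the map

lyapunov = total / n
print(f"Estimated exponent: {lyapunov:.3f}  (analytic: ln 2 = {math.log(2):.3f})")
```

A positive result (about 0.69 here) means errors roughly double each iteration, which is the "end of story" the commenter has in mind.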

RD
Reply to  Salvatore Del Prete
September 14, 2015 11:36 am

Exactly right. Let them spend and model to their heart’s content. We will have the real world to compare and contrast with their imaginary pink unicorns.

Marcuso8
Reply to  Salvatore Del Prete
September 14, 2015 12:00 pm

I predict ( with a 99.9% accuracy ) that the Glo.Bull Warming hoax will come to a crashing halt in 2017 !!!!

csanborn
Reply to  Salvatore Del Prete
September 14, 2015 12:16 pm

As I understand it from Dr. Christopher Essex – University of Western Ontario – climate models can never be correct because of: 1) calculation residual, 2) machine epsilon, and 3) model parameterization.

Phaedrus
Reply to  csanborn
September 14, 2015 5:36 pm

If you have an hour this is a must!! Clear and concise in what is a very complex discussion!

Hivemind
Reply to  csanborn
September 14, 2015 10:25 pm

4) incorrect assumptions (assuming the Earth is like an onion – ie “radiative forcing”)
5) incomplete physics (they can’t even predict the Sahara Desert, they need to program it in instead)
6) a need to force them to fit existing (to 1978) data, instead of making genuine predictions

Reply to  Salvatore Del Prete
September 14, 2015 1:06 pm

The day of reckoning by 2020?
I wouldn’t bet on that one. They are getting away with it, and they know it. The current game plan seems to be that every weather calamity is blamed on Climate Change. The longer they keep that up, the more believable it becomes. As personal employment becomes more and more dependent on combating Climate Change, the more entrenched it will become. Any day of reckoning will only occur when the economy will no longer support the scheme.

Reply to  Steve Case
September 14, 2015 3:48 pm

I forgot to point out that a week or so ago there was a post about the Climate Change industry being a $1.5 Trillion business world-wide.
With that kind of money driving it, I wouldn’t bet on a quick and decisive demise any time soon.

george e. smith
Reply to  Salvatore Del Prete
September 14, 2015 1:36 pm

Right on Dr. Tim !!
And add this to your list of ” well we just took a wild a*** guess because we can’t possibly get enough meaningful samples (from the ground), so we just made it up and some of it is something like something that happens sometimes. ”
…. http://www.aip.org/history/climate/impacts.htm … for the full version of historian emeritus Spencer Weart’s concise history of climate science. He wrote a book called “The Discovery of Global Warming,” or something close to that, so he pushes the gospel to sell his book.
From what you are saying Tim; this isn’t even a good approximation for throwing darts at a wall to see if anything sticks.
g
PS Weart does have an impressive resume in Physics. I’m not in any way suggesting his credentials are unsound; quite the contrary.
But you will find the word ” consensus ” in his essay, more times than I care to count. And mostly it is about the hidden inner machinations of committees, rather than scientific evidence supported by peer reviewed experimental data.
It certainly adds to the pile of shameful substitute for scientific rigor.

george e. smith
Reply to  george e. smith
September 14, 2015 1:55 pm

I noticed that fig. 3 is a Dr. Roy Spencer graph, with subtitles.
I did like the fact that the further they go back in the past on their wayback machine, they get less and less confident, that they even know what happened back then.
Well that’s in line with Spencer Weart’s assertion that they basically took a guess.
I guess this whole line of research is fully in keeping with Lord Rutherford’s admonition:
” If you have to use statistics, you should have done a better experiment. ”
Or in this case, made better wild arse guesses.
g

george e. smith
Reply to  Salvatore Del Prete
September 14, 2015 2:17 pm

I always thought that statistics could only tell you what fraction of some large number of experiments would meet some criterion.
And most of the math is only valid if the variables show a normal (izzat Gaussian ??) distribution.
As far as I know any single experiment could come up with ANY physically possible result; which cannot be predicted.
Hey news flash, earth to scientists : This is not a dress rehearsal; we are only going to get ONE SHOT at the future; there will not be any instant replay , and no refunds will be issued if you don’t like the outcome.
So stop wasting our time with statistics. Observe and report what happens; and don’t do any 13 month running averages, or nine point filters, or any other information destroying filtering.
Read the meter, and write down the reading !!
g

indefatigablefrog
Reply to  Salvatore Del Prete
September 14, 2015 3:33 pm

Re: “they do not put in correct, and complete data”.
But, but, but…those are the official global temperatures as measured by real professional temperature measuring sailors in the open ocean, using real buckets on ropes. As then assessed, interpreted and suitably adjusted by the very trustworthy Phil Jones et al in the bucket modelling dept at East Angular.
Then readjusted later for various spurious reasons.
It is fortunate for the individual sailor who performed this bizarre duty that he was never told that the output of his diligently (or not so diligently) performed duty would one day form the basis for an attempt to exactly assess the average temperature of the globe during the first half of the 20th century to within ONE TENTH OF A DEGREE.
Not that such an onerous responsibility would necessarily have bothered the sailor in question.
But, simply because he would have become, in that moment, aware that his own descendants were destined to be a bunch of complete idiots.
I maintain some temperature recording sheets as a part of my job. Since we do not regard the temperature recording to be of any great significance, the numbers recorded are mostly fictional. It is easier to concoct a realistic looking number and write that down, than to actually go to the trouble to reading the usually faulty thermometer.
Occasionally, for my own amusement I have attempted to check the accuracy of the thermometers by laying two of them side by side. When I have done this I have usually discovered a very significant discrepancy.
Sometimes I attempt to point this out. But nobody is at all interested.
I expect that a sailor in the Southern Ocean was beset by a similar dilemma. Whether to diligently record the real ocean temperature or whether to just guess something and write it in the log – thereby saving himself the bother of slinging a bucket overboard and hauling it back onto the deck etc.
The poor man never imagined that the future of the human race lay in his hands.

Brett Keane
Reply to  indefatigablefrog
September 15, 2015 5:14 am

Well put! Having done that job, I doubt the samples would be too far wrong, on average. Water doesn’t change temperature quickly, and the drying outside of the bucket will supply slight evaporative cooling. But, ahem, much depends on Officer and Petty Officer vigilance. Not much reason for crewmen to care much, otherwise. Yes, real malevolent stupidity is left for the present-day climate clowns like Karl and Coy.

Latitude
Reply to  Salvatore Del Prete
September 14, 2015 5:06 pm

Salvatore, they will never work because they keep adjusting the past data…
BTW…modeling climate should be magnitudes simpler than modeling weather

E.M.Smith
Editor
Reply to  Salvatore Del Prete
September 14, 2015 5:15 pm

It is not just the wrong data, but the wrong process coded into the programming as well. Crap assumptions in crap programs will just give more crap faster on bigger computers.

Vboring
Reply to  Salvatore Del Prete
September 14, 2015 5:44 pm

If it were play money, I would agree.
But that kind of cash could do real good. Total funding for novel fusion reactors in the US this year is less than $100M. For a technology that could bring virtually free energy to everyone and make space exploration practical, this is a travesty.

James Francisco
Reply to  Salvatore Del Prete
September 14, 2015 6:41 pm

Maybe they can use the climate modeling computers to figure out the total amount of money that was wasted on this fiasco.

george e. smith
Reply to  Salvatore Del Prete
September 14, 2015 6:42 pm

Tim poses a lot of arguments as to the sparsity of the “surface / near-surface / wherever” data.
No stations for 85% of the earth’s surface; few stations in the Arctic.
I seem to recall from some years back, when I started following the climate horror story at Tech Central Station, reading an account that said that around the turn of the century (which at that time meant 19 going on 30) there were something like 76 “weather stations” in the Arctic, AKA > +60 deg. lat, but that number had grown to a bit less than 100, and stations had moved; but with the collapse and implosion of the Soviet Union, the number of Arctic weather stations had dropped to about 12 today.
Those numbers may not be the right ones, but it was that order of a progression that I recall.
But it is actually much worse than Dr. Ball suggests; because most of that ” data ” is not data at all, but simply noise.
The assumption of sampled data theory, is that a multivariable continuous but band limited (in all variables) function, can be represented by a set of ” snapshots ” of the state of the function; the contents of that snapshot being the value of each variable at the point of recording the snapshot.
Now time is not necessarily a variable in all sampled data systems; but it certainly is one of the most common variables.
In the case of global weather which integrates over time to climate, the independent variables are essentially time and position.
So a ” datum ” of this two independent variable continuous function consists of the value of the dependent variable ( maybe Temperature ) at a specific instant of time for ALL sample values of the spatial variable (measuring locations), and also for each measuring station, is needed a reading for each value of the time variable.
That means you must get simultaneous readings for all sampled locations all at the same time, and for all stations their readings all need to be made at the same set of time intervals.
If you think about the ” weather ” in your back yard, and the weather over at your uncle’s house at the same instant, it is differences in the weather variables such as Temperature, that will determine from the laws of physics, just what energy exchanges are going on between those locations. Is the wind blowing from your house over to his, or verse vicea.
If you don’t both look simultaneously, it is meaningless. So your uncle got rained on last week , but today you got some sprinkles. Can’t tell a damn thing about what is happening from that. Your house may have burned down since last week, so rain today isn’t going to help you.
A photograph is a map that shows you everything that is in a certain physical space all at the same moment. A video will show you a sequence of photographs each showing everything that is wherever, all at the same instant when the frame was recorded.
So this hodgepodge of each station recording a temperature whenever they feel like it, so long as they get two numbers some time each day, is simply garbage; it isn’t a video of anything. It’s more like the neighborhood garbage dump, with all kinds of bric-a-brac at a landfill and nobody knowing where all the junk came from or when it was dumped.
Prof John Christy reported in Jan 2001 the results of some simultaneous ocean water and ocean near surface air temperatures ( from about -1 m and + 3 m respectively ) over about 20 years from some ocean buoys, and found that water and air Temperatures aren’t the same, and they aren’t correlated so you can’t regenerate one from the other.
So prior to about 1980, none of the ocean water based temperature numbers are of any use for plotting on a global lower troposphere map; it’s just garbage at a land fill.
So it is time that all these otherwise unemployable statisticians, started boning up on the general theory of sampled data systems, and then try to comply with those rules.
g

Alan the Brit
Reply to  Salvatore Del Prete
September 15, 2015 12:12 am

The problem is this: because they are in the public employ, paid by taxpayers’ generous funding, they are answerable to no one, therefore responsible to no one, and therefore they carry on as they do. Until the money is withdrawn, they continue. The system is geared up so that they will be allowed to take early retirement on great pensions, or allowed to move sideways into less prominent positions, cozy, cushy, stress free, with the only riposte available to them: “We did what we thought was right, based on the best available science!” The really big lie!

Reply to  Salvatore Del Prete
September 15, 2015 3:06 pm

It doesn’t matter that the data are sparse and inaccurate, because even if the data were perfect, the computer would convert the numbers into an approximation called floating-point arithmetic, and once the numerical representation is corrupted in even the most minute way, you fly headlong into chaos.
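The representation error the commenter describes is easy to exhibit in any language with IEEE 754 doubles. A minimal sketch (my illustration; the specific values are standard floating-point behavior, not anything from a climate model):

```python
from decimal import Decimal

# 0.1 and 0.2 have no finite binary expansions, so the stored
# sum already differs from the true decimal value 0.3.
x = 0.1 + 0.2
print(repr(x))      # 0.30000000000000004
print(x == 0.3)     # False

# The exact size of the discrepancy, via exact decimal conversion
# of the stored binary value:
print(Decimal(x) - Decimal("0.3"))
```

The discrepancy is about 4e-17, which sounds harmless; but in a chaotic iteration with a positive Lyapunov exponent, an error of that size is amplified exponentially, which is the "headlong into chaos" the commenter means.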

catweazle666
Reply to  Salvatore Del Prete
September 16, 2015 4:54 pm

It doesn’t matter what data they put in, the models will always fail.
Anyone who claims that an effectively infinitely large open-ended non-linear feedback-driven (where we don’t know all the feedbacks, and even the ones we do know, we are unsure of the signs of some critical ones) chaotic system – hence subject to inter alia extreme sensitivity to initial conditions – is capable of making meaningful predictions over any significant time period is either a charlatan or a computer salesman.
Ironically, the first person to point this out was Edward Lorenz – a climate scientist.
You can add as much computing power as you like, the result is purely to produce the wrong answer faster.
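Lorenz’s own 1963 toy system makes catweazle666’s point concrete: two runs differing by one part in a billion in the initial state end up completely different after a modest integration time. A rough sketch (my illustration; forward Euler with a small step is crude but adequate to show the divergence):

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the classic Lorenz (1963) system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)       # perturb x by one part in a billion
for _ in range(50_000):           # integrate 50 model time units
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(f"Separation after 50 time units: {separation:.3f}")
```

The final separation is on the order of the attractor itself, i.e. the two "forecasts" share nothing but climatology, exactly the sensitivity to initial conditions Lorenz identified.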

September 14, 2015 9:59 am

Reblogged this on WeatherAction News and commented:
When model results are used as the sole basis for government policy, there is no value. It is a massive cost and detriment to society

Resourceguy
September 14, 2015 9:59 am

The Great Wall of China was a similarly scaled failure of trust and public policy.

Bill 2
September 14, 2015 10:07 am

Dr. Ball, why is your twitter account a spambot? https://twitter.com/pplonia

Reply to  Bill 2
September 14, 2015 10:55 am

My wife set it up and operates it. Thanks for advising of the problem, we will get it corrected. Thanks

Reply to  Bill 2
September 14, 2015 11:09 am

We use Apple and Spambot is not supposed to be a problem, although some are now reporting the problem. Will continue to resolve the problem.

KTM
September 14, 2015 10:10 am

“Remove that funding and nobody would spend private money to work on climate forecast models.”
We should follow the human embryonic stem cell model for funding this. When the federal government reinforced its moratorium, California stepped forward with $3 billion to fund the research, saying it would make California the world leader for research and innovation in the area.
California is already vowing to spend billions, and to inflict billions more dollars of damage on its own economy, to support and promote the global warming ideology. Let them pick up the banner and devote a few billion to modeling.
As a note about the whole hESC issue, that work is now scientifically obsolete due to the development of iPS cells, which raise no ethical concerns and have much more clinical promise.

MarkW
Reply to  KTM
September 14, 2015 10:24 am

There was never a federal moratorium on Human Embryonic Stem Cell research.
There was a ban on federal funding for research using any embryonic stem cell lines created after a certain date. Funding for stem cell lines created prior to that date was always allowed.

September 14, 2015 10:11 am

The IPCC Figure 4 (the first graph in this post) has a problem in that new measurements are being obscured by the (Observations) dialogue box … this setup will remain in effect until ~2035.

Crispin in Waterloo
September 14, 2015 10:13 am

The Canadian model living virtually on Vancouver Island is an embarrassment to science and industry. Its predictions are laughable, sophomoric. It is ‘kept’ to keep up the IPCC’s ‘average’ predictions. We are paying for this junk science. Eliminating (defunding) the worst climate model each year would bring a sense of rigour to the forecasting industry and a sensitivity number below 1 degree per doubling of CO2.
If any P.Eng. ran their operations like that, they would be defrocked. Instead, the fabricators of junk climate science are canonized.

September 14, 2015 10:18 am

yes

September 14, 2015 10:20 am

I think the issue with climate models is that they are tuned to hindcast the past (from 2005 IIRC for the CMIP5 models) as best as they can, with the assumption that none of the rapid warming from the early 1970s to 2005 or so is from any natural cycles other than changes in volcanic emissions. If a contribution to that warming period by multidecadal cycles is determined and accounted for, and the models retuned accordingly, then I think they will become accurate.
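The tuning risk described in that comment can be sketched with invented numbers. If a series is really a modest trend plus a rising multidecadal cycle, fitting a straight line alone over 1970–2005 attributes the cycle’s upswing to the trend and overstates it (all figures below are hypothetical, chosen only to show the mechanism):

```python
import math

TRUE_TREND = 0.010  # hypothetical underlying trend, deg C per year
CYCLE_AMP = 0.10    # hypothetical multidecadal oscillation amplitude, deg C
PERIOD = 70.0       # years; 1970-2005 then sits on the rising half-cycle

years = list(range(1970, 2006))
temps = [TRUE_TREND * (t - 1970)
         - CYCLE_AMP * math.cos(2 * math.pi * (t - 1970) / PERIOD)
         for t in years]

# Ordinary least-squares slope: cov(t, y) / var(t)
n = len(years)
tbar = sum(years) / n
ybar = sum(temps) / n
slope = (sum((t - tbar) * (y - ybar) for t, y in zip(years, temps))
         / sum((t - tbar) ** 2 for t in years))

print(f"true trend: {TRUE_TREND:.4f} deg C/yr, fitted trend: {slope:.4f} deg C/yr")
```

A model tuned to reproduce the fitted trend as if it were all forced would then carry that overestimate forward into its projections.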

September 14, 2015 10:25 am

Regarding: “It has a 1.1 petaflops capacity. FLOPS means Floating-Point Operations per Second, and peta is 1016 (or a thousand) million floating-point operations per second.” A petaflop is 10^15 floating point operations per second.

urederra
Reply to  Donald L. Klipstein
September 14, 2015 12:27 pm

I might be wrong, but that is not capacity. It is speed, or computing power.

george e. smith
Reply to  Donald L. Klipstein
September 14, 2015 2:03 pm

Peta is a thousand trillion; that is with a T, not a thousand million with an m, or even with a B.
g
Why did I always think that the climate models started with the known laws of physics, including the supposition (which becomes obvious every morning when the sun rises in the east) that the earth rotates on its axis every 24 hours or so??
Just asking!

MarkW
Reply to  Donald L. Klipstein
September 14, 2015 2:53 pm

I’m pretty sure it’s 1024 million flops, not 1016. 2 to the 10th power is 1024.

Reply to  MarkW
September 15, 2015 4:02 pm

It’s 1×10^15 FLOPS; your PC is probably hitting 1024 million FLOPS (10^9 FLOPS), 4 cores doing 4 FLOPs per cycle at 2.4 GHz. I believe it’s fairly straightforward to download a GCM, compile it, and run it on your PC; it would just run really slowly.
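For what it’s worth, theoretical peak throughput is just cores × FLOPs per cycle × clock rate; the desktop figures below are illustrative assumptions taken from the comment above, not a benchmark of any real machine.

```python
def peak_flops(cores, flops_per_cycle, clock_hz):
    """Theoretical peak floating-point throughput (assumes no memory stalls)."""
    return cores * flops_per_cycle * clock_hz

# A hypothetical 4-core desktop doing 4 FLOPs per cycle at 2.4 GHz:
desktop = peak_flops(4, 4, 2.4e9)
gaea = 1.1e15  # NOAA's Gaea: 1.1 petaflops

print(f"desktop peak: {desktop / 1e9:.1f} GFLOPS")
print(f"Gaea is roughly {gaea / desktop:,.0f}x the desktop's peak")
```

By that formula the 4-core example actually works out to about 38 GFLOPS rather than about 1 GFLOPS, and sustained throughput on real code is usually far below peak in any case.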

Sam Prather
September 14, 2015 10:26 am

This is an excellent article!! One minor correction: a petaflop is a quadrillion (thousand trillion) floating point operations per second (FLOPS) which is a thousand teraflops, or 10 to the 15th power FLOPS.

MarkW
Reply to  Sam Prather
September 14, 2015 2:54 pm

When the 8080 first came out, its clock speed was around 100 KHz, and it took several seconds to do a single FLOP.

neutronman2014
Reply to  MarkW
September 14, 2015 4:11 pm

(When the 8080 first came out, its clock speed was around 100 KHz, and it took several seconds to do a single FLOP.)
Mark, the 8080 was the successor to the 8008. It ran at a 2 MHz clock speed and performed close to 300,000 FLOPS. Microprocessor architecture and bus width were different in those early chips, and it is difficult to compare their operation with today’s designs.

Reply to  MarkW
September 14, 2015 8:16 pm

Neutronman. An 8080 had no native floating point abilities whatsoever. You had to code them yourself, and a floating point divide was a brute of a thing.

JoeJ
Reply to  MarkW
September 14, 2015 8:38 pm

Not sure why I can’t reply to neutronman2014 directly, but you are both about three orders of magnitude off, in opposite directions. The 8080 did not have any floating-point instructions, so it required the use of a software library. A single-precision add or subtract took about 0.7 milliseconds, a multiply about 1.5 msec, and a divide about 3.6 msec (that last one works out to nearly 278 FLOPS!).
Times are taken from the Intel 8080/8085 Floating Point Arithmetic Library User’s Manual, Appendix C.
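Those latencies convert directly to operation rates; a quick check (timing figures taken from the comment above, not independently verified):

```python
# Software floating-point timings for the Intel 8080, in milliseconds per
# operation, as quoted from Intel's FP library manual in the comment above.
TIMINGS_MS = {"add": 0.7, "multiply": 1.5, "divide": 3.6}

def ops_per_second(ms_per_op):
    """Convert a per-operation latency in milliseconds to operations/second."""
    return 1000.0 / ms_per_op

rates = {op: ops_per_second(ms) for op, ms in TIMINGS_MS.items()}
for op, rate in rates.items():
    print(f"{op}: ~{rate:.0f} FLOPS")
```

The divide rate, 1000/3.6, comes out just under 278 FLOPS, matching the figure in the comment.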

MarkW
Reply to  MarkW
September 16, 2015 11:18 am

Nuetronman2014, the 8008 was the successor to the 4004, which was built to run a 4 function calculator.

MarkW
Reply to  MarkW
September 16, 2015 11:20 am

Leo, the 8080 had the ability to do add and subtract directly. There was no native multiply or divide op-code, you had to build up those functions from the add and subtract functions.

george e. smith
Reply to  Sam Prather
September 15, 2015 1:06 pm

Where do people come up with these numbers? I seem to recall that the original IBM PC had a microprocessor with a 4.7 MHz clock frequency.
Before there was an 8080 or an 8008 or a 4004, there were already calculators that could do floating point math; even the CORDIC algorithm of the HP-35 hand-held calculator.
I used to use a Wang calculator that did multiplication using logs (1965). So it had all the math functions that were on the HP-35, in a weird desktop package. It had a weird magnetic-core ROM made by stringing a bunch of wires through (or not through) a stack of ferrite cores. The sequence of cores that a wire went through or bypassed determined the 1-0 pattern of the word stored on that wire.
I never was able to determine whether Wang had figured out the CORDIC algorithm or not, because I never could find any literature that described the process until it appeared in the HP-35 (and in the HP Journal).
g

MarkW
Reply to  george e. smith
September 16, 2015 11:25 am

The first IBM PC used the 8088, two generations past the 8080.
The 4004 was the first chip designed specifically for a calculator, and it was a simple four-function calculator with no memory.
The HP-35 was introduced in 1972, about two years before the 8080 came out.

chapprg1
Reply to  Sam Prather
September 20, 2015 2:26 pm

Although Dr. Ball was too polite to say it succinctly, more petaflops just means “garbage in, garbage out faster”.

F. Ross
September 14, 2015 10:35 am

Excellent article Dr. Ball.

Jim
September 14, 2015 10:37 am

The $ value of being able to predict the growing season weather a few months in advance is an astronomical number.

Jimbo
September 14, 2015 10:39 am

No comment.

Abstract
The Key Role of Heavy Precipitation Events in Climate Model Disagreements of Future Annual Precipitation Changes in California
Climate model simulations disagree on whether future precipitation will increase or decrease over California, which has impeded efforts to anticipate and adapt to human-induced climate change……..Between these conflicting tendencies, 12 projections show drier annual conditions by the 2060s and 13 show wetter. These results are obtained from 16 global general circulation models downscaled with different combinations of dynamical methods…
http://dx.doi.org/10.1175/JCLI-D-12-00766.1

Fred Singer: “Successive IPCC summaries have claimed increasing certainty [from 50% in 1996, rising to >95% in 2013] about a human cause of global warming — even as the disparity between observations and IPCC models continues to grow year by year –now for more than 18 years. This is becoming somewhat ridiculous…
http://www.energyadvocate.com/gc1.jpg

Alx
Reply to  Jimbo
September 15, 2015 1:29 pm

Well, obviously that is why the climate models are 100% correct: since they forecast all possibilities, they can never be wrong.
In other words, I will bet any climate modeler $100 that an independently supervised coin flip will come up either heads or tails.

Jimbo
Reply to  Alx
September 16, 2015 4:32 am

Some people may not take what you said seriously, but it is a truism. Every time we get a result that contradicts the narrative, they say “but the models predicted it.” Well, of course they did!

September 14, 2015 10:41 am

Well, it’s one thing to argue that studying the climate through models should be discontinued because it hasn’t produced verifiable predictions yet (or that the predictions the models have produced have failed verification). Problem is, as soon as you start torturing your reasoning by conflating climate modelling and weather forecasts, you’ve lost all credibility for the rest of your argument.

Reply to  bregmata
September 14, 2015 10:53 am

No you haven’t

Reply to  bregmata
September 14, 2015 11:44 am

bregmata, you beg the question.
Climate can only be more predictable than the weather it is composed of if, and only if, a small number of understood factors dominates the weather over the long term.
That’s never happened in the past, so why believe it will happen now? We’ve had forest fires and volcanoes, and CO2 hasn’t dominated.
You need to justify why you believe climate will be simpler than the weather that happens within it.

Reply to  MCourtney
September 14, 2015 8:12 pm

Climate forecasting vs. weather forecasting: weather forecasting is like giving a microsecond-by-microsecond forecast of the states of the switching transistors in a class D amplifier. Climate forecasting is like predicting the duty cycle of the switching transistors as a function of input voltage and component changes. This means that a 50-year climate forecast should be much easier than a one-month, day-by-day weather forecast. The biggest problem I see now with climate forecasting (such as with a composite of many models) is tuning the models to hindcast the past, with an incorrect understanding of the amounts of past temperature change caused by each of several factors, such as changes of manmade aerosols, volcanic emissions, greenhouse gases, any cloud effects from changes of solar activity such as indirectly through change of cosmic rays, other effects from changes of solar activity, and multidecadal oscillations. Using an incorrect understanding of how much past warming was due to increase of greenhouse gases leads to incorrect determinations of at least some of the feedbacks. Another issue is incorrect consideration (or none at all) of the feedback magnitudes changing with temperature and/or greenhouse gas presence.
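The duty-cycle analogy, that individual states are unpredictable while long-run statistics can still be stable, can be illustrated with a toy chaotic map. This is a property of the toy system under the parameters chosen here, not a claim about the real climate:

```python
def iterate_logistic(x0, steps, r=3.99):
    """Return the full trajectory of the logistic map from seed x0."""
    traj = [x0]
    for _ in range(steps):
        traj.append(r * traj[-1] * (1.0 - traj[-1]))
    return traj

a = iterate_logistic(0.300000, 100_000)
b = iterate_logistic(0.300001, 100_000)  # seed differs by one part in 300,000

# State by state, the two runs soon disagree completely...
max_gap = max(abs(x - y) for x, y in zip(a[:200], b[:200]))
# ...but their long-run averages agree closely.
mean_a = sum(a) / len(a)
mean_b = sum(b) / len(b)

print(f"max pointwise gap in first 200 steps: {max_gap:.2f}")
print(f"long-run means: {mean_a:.4f} vs {mean_b:.4f}")
```

Whether the real climate system has statistics this well-behaved is, of course, exactly what is in dispute in this thread.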

Reply to  MCourtney
September 15, 2015 4:13 am

The biggest problem I see now with climate forecasting (such as with a composite of many models) is tuning the models to hindcast the past, with an incorrect understanding of the amounts of past temperature change caused by each of several factors, such as changes of manmade aerosols, volcanic emissions, greenhouse gases, any cloud effects from changes of solar activity such as indirectly through change of cosmic rays, other effects from changes of solar activity, and multidecadal oscillations.

“With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” – John von Neumann
You have substantially more than five parameters on your list, none of which we can measure very accurately.
From an information-theory viewpoint, the entire idea of climate modeling is doomed from the start…
Peter
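Von Neumann’s elephant is easy to reproduce: a polynomial with as many free parameters as data points hindcasts a short record exactly and then extrapolates wildly. The five “observations” below are invented purely for illustration:

```python
def lagrange(points, x):
    """Evaluate the unique polynomial through `points` at x (Lagrange form)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for xj, _ in (p for j, p in enumerate(points) if j != i):
            term *= (x - xj) / (xi - xj)
        total += term
    return total

# Five invented (year, anomaly) points; five parameters = a degree-4 fit.
record = [(2000, 0.10), (2001, 0.25), (2002, 0.05), (2003, 0.30), (2004, 0.20)]

hindcast_err = max(abs(lagrange(record, yr) - t) for yr, t in record)
forecast_2020 = lagrange(record, 2020)

print(f"worst hindcast error: {hindcast_err:.1e}")  # essentially zero
print(f"'forecast' for 2020: {forecast_2020:.1f}")  # far outside any plausible range
```

A perfect hindcast, in other words, says nothing at all about forecast skill.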

Reply to  MCourtney
September 16, 2015 10:33 am

I cannot predict where an individual bird will fly. I cannot even predict the exact location where a skein of geese will fly on a given day for each of the next 5 days. I can, however, predict the pattern: the geese will fly south each fall and return each spring.
I’m not saying computerized climate models are right (or wrong). I’m saying that to argue they’re wrong because some other, unrelated computerized models do not give completely accurate predictions is just a straw man, and a very poor one that in this case only taps into people’s deep-seated ignorance of the fundamental concepts involved (i.e. confounding climate and weather).
One certainly does not do rational argument any favours by demonstrating one’s preference for avoiding it.

RACookPE1978
Editor
Reply to  bregmata
September 16, 2015 11:27 am

bregmata

I’m not saying computerized climate models are right (or wrong). I’m saying that to argue they’re wrong because some other, unrelated computerized models do not give completely accurate predictions is just a straw man, and a very poor one that in this case only taps into people’s deep-seated ignorance of the fundamental concepts involved (i.e. confounding climate and weather).

But consider the following. EVERY YEAR since 1988, 23 of 23 “geese flying models” have predicted that the geese will fly southeast towards the Azores and Bermuda, winter there over the cold months, then fly back north. And every fall, the geese actually fly south-southwest towards the Texas, Mississippi, and Louisiana gulf coast swamps.
Now, should we trust those “geese flying models”, just because they are “consistent” and “almost right”, and spend 92 billion dollars to bring our guns and cameras to the Azores to look for geese? The geese did, after all, on average, “fly south.”

MarkW
Reply to  MCourtney
September 16, 2015 11:28 am

Weather models can ignore most of the problems that break the climate models.

MarkW
Reply to  bregmata
September 14, 2015 2:55 pm

Not that tired old line again? Sheesh, can’t you guys come up with new material?

chapprg1
Reply to  MarkW
September 20, 2015 2:34 pm

Global climate models disagree among themselves by 600% and have not improved in 27 years of gargantuan, not to mention incredibly expensive, effort. Is this not prima facie evidence that further progress should not be expected?
Dr Ball details the reasons why.

Alx
Reply to  bregmata
September 15, 2015 1:56 pm

Yes, we know the story: weather is not climate and climate is not weather, except when an individual catastrophic weather event like Sandy is used as proof of climate change.
You can play the same game with forecasts by calling them trends, projections, forecasts, futurecast anomaly patterns, or whatever. It doesn’t matter what you call it or the context: a prediction is a prediction, one that can be proven accurate or not, and in the case of climate models they are consistently wrong.
How they are consistently wrong is itself an issue. That climate models have wildly varying results and yet consistently deliver a warming result only proves built-in biases. If there were no biases, the trends would err both above and below reality.
Regardless, even if the modelers got lucky and their predictions or projections actually occurred, they would still be highly suspect, due to a lack of understanding of what is being modeled and the uncertainty and scarcity of global data.

September 14, 2015 10:45 am

“peta is 1016 (or a thousand) million floating-point operations per second”
Actually a petaflop is a million billion floating-point operations per second, not a “thousand million” (which is a billion). The Gaea supercomputer that NOAA uses can do 1.1 petaflops, which is 1.1 million times faster than a “thousand million” (or billion).
1,000,000,000 is a billion (“giga”)
1,000,000,000,000 is a trillion (“tera”) or 1,000 billion
1,000,000,000,000,000 is a quadrillion (“peta”) or 1,000,000 (a million) billion
1,000,000,000,000,000,000 is a quintillion (“exa”) or 1,000,000,000 (a billion) billion
https://en.wikipedia.org/wiki/Names_of_large_numbers
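Since the prefixes keep tripping people up in this thread, a quick check using the standard SI powers of ten:

```python
# Standard SI prefixes as powers of ten (decimal, not the binary 2^10 kind).
SI = {"kilo": 1e3, "mega": 1e6, "giga": 1e9, "tera": 1e12,
      "peta": 1e15, "exa": 1e18}

# One petaflop is a million gigaflops, i.e. a million "thousand million" FLOPS.
ratio = SI["peta"] / SI["giga"]
print(f"1.1 petaflops = {1.1 * SI['peta']:.3g} FLOPS "
      f"= {1.1 * SI['peta'] / SI['giga']:,.0f} gigaflops")
```

The factor of 2^10 = 1024 raised elsewhere in the thread belongs to binary prefixes (kibi, mebi, …), which are not what FLOPS figures use.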

MarkW
Reply to  Lauren R.
September 14, 2015 2:57 pm

1024, not 1016

Magma
September 14, 2015 10:46 am

[Fake email address. ~mod.]

Reply to  Magma
September 14, 2015 11:47 am

What about this article?
Can you find a fault with this?
Or can we assume that he is now so expert you can’t find any fault with his current work?
Friendly advice. The science isn’t Dr Ball’s weakness. Attack his political understanding if you want to have a go.

Magma
Reply to  MCourtney
September 14, 2015 1:50 pm

[Fake email address. ~mod.]

Stephen Richards
Reply to  Magma
September 14, 2015 12:25 pm

So bloody what. What is your point? How does it relate to this post?

Jimbo
Reply to  Magma
September 14, 2015 1:01 pm

Magma, forget Tim Ball and his expertise. How accurate have the IPCC surface temperature projections been? FAIL. That is the only qualification one needs: the ability to see projections vs. observations.
Here is someone with extensive model training and teaching experience. He is even more stinging than Ball.
http://wattsupwiththat.com/2014/10/06/real-science-debates-are-not-rare/
Here is a friend.

Abstract – 1994
Naomi Oreskes et al
Verification, validation, and confirmation of numerical models in the earth sciences
Verification and validation of numerical models of natural systems is impossible. This is because natural systems are never closed and because model results are always non-unique. Models can be confirmed by the demonstration of agreement between observation and prediction, but confirmation is inherently partial. Complete confirmation is logically precluded by the fallacy of affirming the consequent and by incomplete access to natural phenomena. Models can only be evaluated in relative terms, and their predictive value is always open to question. The primary value of models is heuristic…….
In some cases, the predictions generated by these models are considered as a basis for public policy decisions: Global circulation models are being used to predict the behavior of the Earth’s climate in response to increased CO2 concentrations;…….
Finally, we must admit that a model may confirm our biases and support incorrect intuitions. Therefore, models are most useful when they are used to challenge existing formulations, rather than to validate or verify them. Any scientist who is asked to use a model to verify or validate a predetermined result should be suspicious.
http://acmg.seas.harvard.edu/students/Oreskes_1994.pdf

4 eyes
Reply to  Magma
September 14, 2015 3:13 pm

Magma,
Attack and dispute the article, not the author. I used to think AGW was something to worry about, but about 15 years ago it became apparent the science was not rigorous. And things haven’t improved despite all the money spent. If it were your money, wouldn’t you occasionally audit the progress being made? And if no progress had been made, I am dead sure you would consider withdrawing support.

Svend Ferdinandsen
September 14, 2015 10:51 am

The problem with lack of data is made worse by the fact that the available data is constantly adjusted to show warming. If you base a climate theory on fabricated warming, you get a useless theory.
Most papers try to correlate the warming with observations in nature, but when there is hardly any warming, the results point in all directions. Try to correlate anything with a flat line that just wiggles a bit.

Bulldust
Reply to  Svend Ferdinandsen
September 14, 2015 6:33 pm

This aspect has been bothering me for years. The fact that the input data are constantly being revised means previous work is in need of review. I don’t see how you can take climate science seriously if the core data are in question. Another thing I find hilarious is the desire to model ever-smaller cells, despite the temperature data being tortured to represent areas orders of magnitude larger than those modelled. I mean, if you want to make pretty pictures, just run a simple Mandelbrot program. Essentially that is all these multi-billion-dollar projects are: an expensive way of making abstract graphics that have no practical use whatsoever, unless you count their use in pushing policy agendas.

September 14, 2015 11:02 am

Reblogged this on The Arts Mechanical and commented:
Using computers to model anything chaotic is a crapshoot at best. A computer model is at best a guess at boundary conditions, and chaotic, turbulent systems don’t behave in a linear, boundary-contained fashion.

KTWO
September 14, 2015 11:04 am

The increasing confidence noted in Figure 3 indicates that by 2020 they should be ‘Absolutely Sure’. And once AS is achieved much bigger budgets will be needed as it becomes harder to become even more sure.
Being able to accurately forecast the growing season would be of great $ value if it can be done. If not, not.

Gamecock
September 14, 2015 11:07 am

“Short, medium, and long-term climate forecasts are wrong more than 50 percent of the time so that a correct one is a no better than a random event.”
With all due respect, Dr. Ball, a short-term climate forecast seems an oxymoron.

Berényi Péter
September 14, 2015 11:17 am

Computational general circulation climate models are clearly nothing but a waste of time and money, but they are actually worse than that. They divert attention and resources from fundamental scientific questions and attract the wrong kind of minds to the field: minds unable to do the experimental work needed to solve basic riddles, and whose lack of understanding prohibits any further progress.
We have a fairly comprehensive understanding of reproducible non-equilibrium thermodynamic systems. Unfortunately, the terrestrial climate system does not belong to this class. It is not reproducible in the thermodynamic sense, i.e. microstates belonging to the same macrostate can evolve into different macrostates in a short time. This is, of course, nothing but the other side of the coin called “chaos”, a.k.a. the “butterfly effect”.
Such systems are not understood at all, their Jaynes entropy can’t even be defined.
If the climate system were reproducible, it would lend itself to the Maximum Entropy Production Principle. However, it clearly does not.
As energy exchange between the climate system and its cosmic environment goes exclusively by electromagnetic radiation, one hardly has to do more than count incoming vs. outgoing photons, with a small correction for the different angular distribution of their momenta and for deviations of their spectra from pure thermal radiation, to calculate the net entropy production of the system. It turns out it would be very easy to increase entropy production by making Earth darker, that is, by decreasing its albedo.
However, Earth’s albedo is what it is; the planet is very far from being pitch black. In spite of this, albedo is strictly regulated, only its sweet spot is different. Due to a peculiar property of Keplerian orbits, the annual average insolation of the two hemispheres is the same. It is a curious fact that the annual average reflected shortwave radiation is also the same, which can’t be explained by simple astronomy. It is an emergent property of the climate system: the clear-sky albedo of the Southern Hemisphere is much lower due to the prevalence of oceans there, yet its all-sky albedo is the same. The difference is, of course, due to clouds.
This simple symmetry property is neither understood nor replicated by computational general circulation climate models.
Under these circumstances any rational scientist would leave the climate system alone for a while, go back to the lab and study the behavior of non reproducible (chaotic) non equilibrium thermodynamic systems in general, until a breakthrough is achieved.
Plenty of such systems would fit into a lab happily, the terrestrial climate system being an exception in this respect.
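The photon-counting estimate described above can be roughed out with the standard (4/3)F/T expression for the entropy flux of blackbody radiation; every number below is a round illustrative value, not a precise radiative budget.

```python
SOLAR_T = 5778.0     # effective emission temperature of the Sun, K
EARTH_T = 255.0      # effective emission temperature of the Earth, K
TSI_QUARTER = 340.0  # incoming solar flux averaged over the sphere, W/m^2

def entropy_flux(flux_w_m2, temp_k):
    """Entropy flux carried by (approximately) blackbody radiation: (4/3) F / T."""
    return (4.0 / 3.0) * flux_w_m2 / temp_k

def entropy_production(albedo):
    """Net entropy exported per m^2: high-entropy IR out minus low-entropy sunlight in."""
    absorbed = TSI_QUARTER * (1.0 - albedo)
    s_in = entropy_flux(absorbed, SOLAR_T)   # low-entropy sunlight
    s_out = entropy_flux(absorbed, EARTH_T)  # high-entropy infrared
    return s_out - s_in

current = entropy_production(0.30)  # Earth's albedo is roughly 0.3
darker = entropy_production(0.10)   # a hypothetical darker Earth

print(f"entropy production at albedo 0.3: {current:.3f} W/m^2/K")
print(f"entropy production at albedo 0.1: {darker:.3f} W/m^2/K")
```

As the comment argues, the darker planet would export more entropy, yet the observed albedo sits well above that "optimum", which is exactly the puzzle being posed.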

Reply to  Berényi Péter
September 14, 2015 2:34 pm

BP, Feynman did that for a significant portion of his career (the middle third). Lectures on Physics, V2, Chapter 41, delightfully titled “The Flow of Wet Water”. (Insider joke: Chapter 40 is “The Flow of Dry Water”, which omits only one simple, small ‘detail’, viscosity; see paragraph 1 of V2:41 for confirmation.) The last paragraph of this very famous chapter is also known as his sermon on the Mysteries of Mathematics, as famous in some circles as the Sermon on the Mount. We await the next Feynman.
Both V2 chapters should be required reading for every climate modeler. To their shame.
Just a little historical context; none of this is new. The Lectures were Caltech’s introductory physics course, 1961-1962 (and, of course, much more: Feynman’s statement of everything he then understood about everything then existing in physics). Pity the poor Caltech freshmen and sophomores of the time. It is said that by the second year, only genius undergrads, grad students, and post-docs were following along; the other Caltech physics professors were too terrified to show up. The Lectures are also the origin of his aphorism, “If you cannot explain it simply, then you do not understand it yourself.”
Regards
Regards

Victor
September 14, 2015 11:18 am

It doesn’t matter how much money they sink into their high tech crystal ball, it still just shows what the developers want to see.

September 14, 2015 11:20 am

The IPCC AR5 report itself has graphics showing the divergence of climate models from measurements.
From the AR5 Technical Summary, pg. 64:
http://www.climatechange2013.org/images/figures/WGI_AR5_FigTS_TFE.3-1.jpg
And detail of figure 1-4:
http://www.climatechange2013.org/images/figures/WGI_AR5_Fig1-4.jpg
And here: [comparison image embedded in the comment]
When a climate alarmists tells you the climate models are accurate, point them to these 3 graphics from the IPCC’s own report. A picture is worth a thousand words.

SAMURAI
September 14, 2015 11:22 am

What do you mean, Dr. Ball?
Adjusted hind casting and raw-data manipulation show the models are working great!
Everyone knows empirical evidence must be adjusted to match CAGW hypothetical projections, because the empirical evidence is obviously off; after all, 97% of scientists agree the models are correct.
How can you fault that logic?
Always remember: war is peace, freedom is slavery, and ignorance is strength, because the consensus says so.
Again, how can you fault that impeccable logic?
