IPCC Scientists Knew Data and Science Inadequacies Contradicted Certainties Presented to Media, Public and Politicians, But Remained Silent

Guest essay by Dr. Tim Ball

I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. Arthur Conan Doyle. (Sherlock Holmes)

There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain. A. N. Whitehead

The recent article by Nancy Green at WUWT is an interesting, esoteric discussion about models. The realities of climate models are much more prosaic. They don’t and can’t work, because the data, the knowledge of atmospheric, oceanographic, and extraterrestrial mechanisms, and the computer capacity are all totally inadequate. Computer climate models are a waste of time and money.

Inadequacies are confirmed by the complete failure of all forecasts, predictions, projections, prognostications, or whatever they call them. It is one thing to waste time and money playing with climate models in a laboratory, where they don’t meet minimum scientific standards; it is quite another to use their results as the basis for public policies whose economic and social ramifications are devastating. Equally disturbing and unconscionable is the silence of scientists involved in the IPCC, who know the vast gulf between the scientific limitations and uncertainties and the certainties presented in the Summary for Policymakers (SPM).

IPCC scientists knew of the inadequacies from the start. Kevin Trenberth, responding to a US National Research Council report on the inadequacies of weather data, said:

“It’s very clear we do not have a climate observing system…” “This may come as a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.”

This was in response to the February 3, 1999 report, which said:

“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”

Remember, this is 11 years after Hansen’s confident testimony to the Senate and five years after the 1995 IPCC Report. It is worse now, with fewer weather stations and less data than in 1990.

Before leaked emails exposed its climate science manipulations, the Climatic Research Unit (CRU) issued a statement that said,

“GCMs are complex, three dimensional computer-based models of the atmospheric circulation. Uncertainties in our understanding of climate processes, the natural variability of the climate, and limitations of the GCMs mean that their results are not definite predictions of climate.”

Phil Jones, Director of the CRU at the time of the leaked emails, and former director Tom Wigley, both IPCC members, said:

“Many of the uncertainties surrounding the causes of climate change will never be resolved because the necessary data are lacking.”

Stephen Schneider, a prominent part of the IPCC from the start, said:

“Uncertainty about feedback mechanisms is one reason why the ultimate goal of climate modeling – forecasting reliably the future of key variables such as temperature and rainfall patterns – is not realizable.”

Schneider also set the tone and raised eyebrows when he said in Discover magazine:

Scientists need to get some broader based support, to capture the public’s imagination…that, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified dramatic statements, and make little mention of any doubts we may have…each of us has to decide what the right balance is between being effective and being honest.

The IPCC achieved Schneider’s objective with devastating effect, because they chose being effective over being honest.

A major piece of evidence is the disparity between the Working Group I (WGI) Report (The Physical Science Basis), particularly the chapter on computer models, and the claims in the Summary for Policymakers (SPM). Why did the scientists who participated in the WGI Report remain so silent about the disparity?

Here is the IPCC procedure:

Changes (other than grammatical or minor editorial changes) made after acceptance by the Working Group or the Panel shall be those necessary to ensure consistency with the Summary for Policymakers (SPM) or the Overview Chapter.

The Summary is written first, then the WGI Report is adjusted to conform to it. It is like an executive publishing findings and then asking employees to produce material to justify them. The purpose is to present a completely different reality to the press and the public.

The SPM is released well before the WGI Report, which they knew few would ever read, to ensure that people, especially the media, read the Summary first. There is only one explanation for producing it first. David Wojick, an IPCC expert reviewer, explained:

Glaring omissions are only glaring to experts, so the “policymakers”—including the press and the public—who read the SPM will not realize they are being told only one side of a story. But the scientists who drafted the SPM know the truth, as revealed by the sometimes artful way they conceal it.

What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory. Instead of assessing these objections, the Summary confidently asserts just those findings that support its case. In short, this is advocacy, not assessment.

The Physical Basis of the Models

Here is a simple diagram of how the atmosphere is divided to create climate models.


Figure 1: Schematic of General Circulation Model (GCM).

The surface is covered with a grid and the atmosphere divided into layers. Computer models vary in the size of the grids and the number of layers. Modelers claim a smaller grid provides better results. It doesn’t! If there is no data, a finer grid adds nothing; the model needs more real data for each cube, and it simply isn’t available. There are no weather stations for at least 70 percent of the surface and virtually no data above the surface. There are few records of any length anywhere; the models are built on virtually nothing. The grids are so large and crude they can’t include major weather features like thunderstorms, tornadoes, or even small cyclonic storm systems. The IPCC 2007 Report notes,

Despite the many improvements, numerous issues remain. Many of the important processes that determine a model’s response to changes in radiative forcing are not resolved by the model’s grid. Instead, sub-grid scale parameterizations are used to parametrize the unresolved processes, such as cloud formation and the mixing due to oceanic eddies.
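To make “parameterization” concrete, here is a deliberately minimal sketch (in Python, and not code from any actual GCM) of the kind of sub-grid shortcut the quote describes: cloud cover inside a grid cell is not computed from physics the model cannot resolve; it is estimated from a cell-average quantity using tunable constants. The threshold and exponent below are illustrative assumptions, included only to show where the tuning enters.

```python
# Minimal illustration of a sub-grid parameterization (not from any real GCM).
# A grid cell is hundreds of kilometres across, so individual clouds cannot be
# resolved; cloud fraction is instead guessed from the cell-mean relative
# humidity using tunable constants.

def cloud_fraction(relative_humidity, rh_critical=0.8, exponent=2.0):
    """Diagnose fractional cloud cover (0..1) for one grid cell.

    rh_critical and exponent are the tuning knobs: different choices give
    different simulated climates, which is one reason model results differ.
    """
    if relative_humidity <= rh_critical:
        return 0.0
    ramp = (relative_humidity - rh_critical) / (1.0 - rh_critical)
    return min(1.0, ramp ** exponent)

# Same cell-mean humidity, two plausible tunings, two different answers.
print(cloud_fraction(0.9, rh_critical=0.8))  # 0.25
print(cloud_fraction(0.9, rh_critical=0.7))  # ~0.44
```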

O’Keefe and Kueter explain how a model works:

“The climate model is run, using standard numerical modeling techniques, by calculating the changes indicated by the model’s equations over a short increment of time—20 minutes in the most advanced GCMs—for one cell, then using the output of that cell as inputs for its neighboring cells. The process is repeated until the change in each cell around the globe has been calculated.”

Interconnections mean errors are spread and amplified. Imagine the number of calculations necessary; even at computer speed they take a long time. Run time is a major limitation.
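To get a feel for the arithmetic burden this implies, the back-of-envelope count below assumes a hypothetical 2.5° x 2.5° grid with 20 vertical layers and the 20-minute time step quoted by O’Keefe and Kueter; the grid and layer numbers are illustrative, not those of any particular model.

```python
# Rough count of grid-cell updates implied by a century-long run.
# Grid spacing, layer count and run length are illustrative assumptions;
# the 20-minute step is the figure quoted above.

grid_spacing_deg = 2.5
layers = 20
timestep_minutes = 20
years = 100

cells = int(360 / grid_spacing_deg) * int(180 / grid_spacing_deg) * layers
steps = int(years * 365.25 * 24 * 60 / timestep_minutes)

print(f"grid cells: {cells:,}")              # 207,360
print(f"time steps: {steps:,}")              # 2,629,800
print(f"cell updates: {cells * steps:.2e}")  # ~5.5e11, before counting the many
                                             # equations solved in each update
```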

All of this takes huge amounts of computer capacity; running a full-scale GCM for a 100-year projection of future climate requires many months of time on the most advanced supercomputer. As a result, very few full-scale GCM projections are made.

A comment at Steve McIntyre’s site, Climateaudit, illustrates the problem.

Caspar Ammann said that GCMs (General Circulation Models) took about 1 day of machine time to cover 25 years. On this basis, it is obviously impossible to model the Pliocene-Pleistocene transition (say the last 2 million years) using a GCM as this would take about 219 years of computer time.
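The 219-year figure is easy to check; the snippet below simply scales Ammann’s stated rate of one day of machine time per 25 simulated years.

```python
# Scale the quoted rate: 1 day of machine time per 25 simulated years.
machine_days = 2_000_000 / 25      # 80,000 days for 2 million simulated years
print(machine_days / 365.25)       # ~219 years of computer time
```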

So you can only run the models if you reduce the number of variables. O’Keefe and Kueter explain:

As a result, very few full-scale GCM projections are made. Modelers have developed a variety of short cut techniques to allow them to generate more results. Since the accuracy of full GCM runs is unknown, it is not possible to estimate what impact the use of these short cuts has on the quality of model outputs.

Omission of variables allows short runs, but it also allows manipulation and moves the model further from reality. Which variables do you include? For the IPCC, only those that create the results they want. Besides, climate varies constantly and widely, so a variable may become more or less important over time as thresholds change.

By selectively leaving out important components of the climate system, the modelers guarantee that a human signal appears to be the cause of change. As William Kinninmonth, meteorologist and former head of Australia’s National Climate Centre, explains,

… current climate modeling is essentially to answer one question: how will increased atmospheric concentrations of CO2 (generated from human activity) change earth’s temperature and other climatological statistics? Neither cosmology nor vulcanology enter the equations. It should also be noted that observations related to sub-surface ocean circulation (oceanology), the prime source of internal variability, have only recently commenced on a consistent global scale. The bottom line is that IPCC’s view of climate has been through a narrow prism. It is heroic to assume that such a view is sufficient basis on which to predict future ‘climate’.

Static Climate Models In A Virtually Unknown Dynamic Atmosphere.

Heroic is polite. I suggest it is deliberately wrong. Lack of data alone justifies that position; lack of knowledge about atmospheric circulation is another reason. The atmosphere is three-dimensional and dynamic, so to build a computer model that even approximates reality requires far more data than exists, much greater understanding of an extremely turbulent and complex system, and computer capacity that is unavailable for the foreseeable future. As the IPCC notes,

Consequently, for models to predict future climatic conditions reliably, they must simulate the current climatic state with some as yet unknown degree of fidelity. Poor model skill in simulating present climate could indicate that certain physical or dynamical processes have been misrepresented.

The history of understanding the atmosphere leaps 2,000 years from Aristotle, who knew there were three distinct climate zones, to George Hadley in the 18th century. The word climate comes from the Greek klima, meaning slope, referring to the angle of the sun and the climate zones it creates. Aristotle’s views dominated western science until the 16th century, but it wasn’t until the 18th century that a wider, though still narrow, understanding began.

In 1735 George Hadley used the wind patterns recorded by English sailing ships to create the first three-dimensional diagram of the circulation.


Figure 2: Hadley Cell (Northern Hemisphere).

Restricted to the tropics, it became known as the Hadley Cell. Sadly, today we know little more than Hadley did, although Willis Eschenbach has worked hard to identify its role in the transfer of heat energy. The Intergovernmental Panel on Climate Change (IPCC) illustrates the point in Chapter 8 of the 2007 Report:

The spatial resolution of the coupled ocean-atmosphere models used in the IPCC assessment is generally not high enough to resolve tropical cyclones, and especially to simulate their intensity.

The problem for climate science and modelers is that the Earth is spherical and it rotates. Revolution around the sun creates the seasons, but rotation about the axis creates even bigger geophysical dynamic problems. Because of it, a simple single-cell system (Figure 3), with heated air rising at the Equator, moving to the Poles, sinking and returning to the Equator, breaks up. The Coriolis Effect is the single biggest influence on the atmosphere caused by rotation. It dictates that anything moving across the surface appears to be deflected to the right in the Northern Hemisphere and to the left in the Southern Hemisphere. It appears that a force is pushing from the side, so people incorrectly refer to the Coriolis Force; there is no force.


Figure 3: A Simple Single Cell.
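As an aside (not from the essay’s sources), the hemispheric reversal of the apparent deflection can be seen in the Coriolis parameter, f = 2Ω sin(latitude), where Ω is the Earth’s rotation rate: f is positive in the Northern Hemisphere, negative in the Southern, and zero at the Equator.

```python
import math

OMEGA = 7.2921e-5  # Earth's rotation rate in radians per second

def coriolis_parameter(latitude_deg):
    """f = 2 * Omega * sin(latitude); the sign gives the sense of apparent deflection."""
    return 2.0 * OMEGA * math.sin(math.radians(latitude_deg))

for lat in (45, 0, -45):
    print(f"{lat:+3d} deg: f = {coriolis_parameter(lat):+.2e} s^-1")
# +45 deg: +1.03e-04 (apparent deflection to the right)
#   0 deg: zero at the Equator
# -45 deg: -1.03e-04 (apparent deflection to the left)
```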

Figure 4 shows a more recent attempt to approximate what is going on.


Figure 4: A more recent model of a cross-section through the Northern Hemisphere.

Now it is the indirect Ferrel Cell. Notice the discontinuities in the tropopause and the stratospheric-tropospheric mixing. This is important because, in their models, the IPCC does not deal with this critical interface between the stratosphere and a major mechanism in the upper troposphere:

Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.

This is just one example of model inadequacies provided by the IPCC.

What the IPCC Working Group I Report (The Physical Science Basis) Says About the Models

The following quotes (italic and inset) appear under their original headings from Chapter 8 of the 2007 IPCC AR4 Report. Comments are in regular type.

8.2 Advances in Modelling

There is currently no consensus on the optimal way to divide computer resources among finer numerical grids, which allow for better simulations; greater numbers of ensemble members, which allow for better statistical estimates of uncertainty; and inclusion of a more complete set of processes (e.g., carbon feedbacks, atmospheric chemistry interactions).

Most people don’t understand models or the mathematics on which they are built, a fact exploited by promoters of human-caused climate change. The models are also a major part of the IPCC’s work that has not yet been investigated by people who work outside climate science. Whenever outsiders do investigate, as with the statistics of the hockey stick, the gross and inappropriate misuses are exposed. The Wegman Report investigated the hockey stick fiasco, but also concluded,

We believe that there has not been a serious investigation to model the underlying process structures nor to model the present instrumented temperature record with sophisticated process models.

FAQ 8.1: How Reliable Are the Models Used to Make Projections of Future Climate Change?

Nevertheless, models still show significant errors. Although these are generally greater at smaller scales, important large-scale problems also remain. For example, deficiencies remain in the simulation of tropical precipitation, the El Niño- Southern Oscillation and the Madden-Julian Oscillation (an observed variation in tropical winds and rainfall with a time scale of 30 to 90 days).

Models continue to have significant limitations, such as in their representation of clouds, which lead to uncertainties in the magnitude and timing, as well as regional details, of predicted climate change. Nevertheless, over several decades of model development, they have consistently provided a robust and unambiguous picture of significant climate warming in response to increasing greenhouse gases.

Of course they do, because that is how they are programmed.
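The point can be illustrated with a zero-dimensional sketch (nothing like a real GCM, and not the IPCC’s code): once a sensitivity parameter is built in, any increase in CO2 returns warming by construction. The forcing expression ΔF = 5.35 ln(C/C0) W/m² is the standard simplified formula; the sensitivity value used here is an illustrative assumption.

```python
import math

def warming_from_co2(c_ppm, c0_ppm=280.0, sensitivity=0.8):
    """Zero-dimensional sketch: delta_T = lambda * delta_F, delta_F = 5.35 * ln(C/C0).

    sensitivity (K per W/m^2) is a built-in assumption; whatever value is chosen,
    more CO2 always returns more warming, because that is how it is written.
    """
    forcing = 5.35 * math.log(c_ppm / c0_ppm)  # W/m^2, simplified expression
    return sensitivity * forcing               # K

print(round(warming_from_co2(560), 2))  # doubled CO2: ~2.97 K with lambda = 0.8
print(round(warming_from_co2(420), 2))  # ~present-day CO2: ~1.74 K
```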

8.2.1.1 Numerics

In this report, various models use spectral, semi-Lagrangian, and Eulerian finite-volume and finite-difference advection schemes, although there is still no consensus on which type of scheme is best.

But how different are the results, and why don’t they know which scheme is best?

8.2.1.3 Parameterizations

The climate system includes a variety of physical processes, such as cloud processes, radiative processes and boundary-layer processes, which interact with each other on many temporal and spatial scales. Due to the limited resolutions of the models, many of these processes are not resolved adequately by the model grid and must therefore be parametrized. The differences between parametrizations are an important reason why climate model results differ.

How can parameterizations vary? The variation is evidence that they are simply guessing at the conditions in each grid cell, and likely choosing the guess that accentuates their bias.

8.2.2.1 Numerics

Issues remain over the proper treatment of thermobaricity (nonlinear relationship of temperature, salinity and pressure to density), which means that in some isopycnic coordinate models the relative densities of, for example, Mediterranean and Antarctic Bottom Water masses are distorted. The merits of these vertical coordinate systems are still being established.

8.2.3.2 Soil Moisture Feedbacks in Climate Models

Since the TAR, there have been few assessments of the capacity of climate models to simulate observed soil moisture. Despite the tremendous effort to collect and homogenise soil moisture measurements at global scales (Robock et al., 2000), discrepancies between large-scale estimates of observed soil moisture remain. The challenge of modelling soil moisture, which naturally varies at small scales, linked to landscape characteristics, soil processes, groundwater recharge, vegetation type, etc., within climate models in a way that facilitates comparison with observed data is considerable. It is not clear how to compare climate-model simulated soil moisture with point-based or remotely sensed soil moisture. This makes assessing how well climate models simulate soil moisture, or the change in soil moisture, difficult.

Evaporation is a major transfer of energy, as latent heat, from the surface to the atmosphere. This inadequacy alone likely more than equals the change created by the human addition of CO2.

8.2.4.1 Terrestrial Cryosphere

Glaciers and ice caps, due to their relatively small scales and low likelihood of significant climate feedback at large scales, are not currently included interactively in any AOGCMs.

How big does an ice cap have to be to influence the parameterization in a grid cell? Greenland is an ice cap.

8.2.5 Aerosol Modelling and Atmospheric Chemistry

The global Aerosol Model Intercomparison project, AeroCom, has also been initiated in order to improve understanding of uncertainties of model estimates, and to reduce them (Kinne et al., 2003).

Interactive atmospheric chemistry components are not generally included in the models used in this report.

8.3 Evaluation of Contemporary Climate as Simulated by Coupled Global Models

Due to nonlinearities in the processes governing climate, the climate system response to perturbations depends to some extent on its basic state (Spelman and Manabe, 1984). Consequently, for models to predict future climatic conditions reliably, they must simulate the current climatic state with some as yet unknown degree of fidelity. Poor model skill in simulating present climate could indicate that certain physical or dynamical processes have been misrepresented.

They don’t even know which ones are misrepresented?

8.3.1.2 Moisture and Precipitation

For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.

Precipitation forecasts (projections?) are worse than their temperature projections (forecasts).

8.3.1.3 Extratropical Storms

Our assessment is that although problems remain, climate models are improving in their simulation of extratropical cyclones.

This is their self-serving assessment. How much are they improving and from what baseline?

8.3.2 Ocean

Comparisons of the type performed here need to be made with an appreciation of the uncertainties in the historical estimates of radiative forcing and various sampling issues in the observations.

8.3.2.1 Simulation of Mean Temperature and Salinity Structure

Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.

8.3.2.2 Simulation of Circulation Features Important for Climate Response

The MOC (meridional overturning circulation) is an important component of present-day climate and many models indicate that it will change in the future (Chapter 10). Unfortunately, many aspects of this circulation are not well observed.

8.3.2.3 Summary of Oceanic Component Simulation

The temperature and salinity errors in the thermocline, while still large, have been reduced in many models.

How much of a reduction, and why only in some models?

8.3.3 Sea Ice

The magnitude and spatial distribution of the high-latitude climate changes can be strongly affected by sea ice characteristics, but evaluation of sea ice in models is hampered by insufficient observations of some key variables (e.g., ice thickness) (see Section 4.4). Even when sea ice errors can be quantified, it is difficult to isolate their causes, which might arise from deficiencies in the representation of sea ice itself, but could also be due to flawed simulation of the atmospheric and oceanic fields at high latitudes that drive ice movement (see Sections 8.3.1, 8.3.2 and 11.3.8).

8.3.4 Land Surface

Vast areas of the land surface have little or no current data and even less historic data. These include 19 percent deserts, 20 percent mountains, 20 percent grasslands, 33 percent combined tropical and boreal forests and almost the entire Arctic and Antarctic regions.

8.3.4.1 Snow Cover

Evaluation of the land surface component in coupled models is severely limited by the lack of suitable observations.

Why? In 1971-72 George Kukla was already producing estimates of varying snow cover as a factor in climate change, and satellite data are readily available for a simple assessment of the changes through time.

8.3.4.2 Land Hydrology

The evaluation of the hydrological component of climate models has mainly been conducted uncoupled from AOGCMs (Bowling et al., 2003; Nijssen et al., 2003; Boone et al., 2004). This is due in part to the difficulties of evaluating runoff simulations across a range of climate models due to variations in rainfall, snowmelt and net radiation.

8.3.4.4 Carbon

Despite considerable effort since the TAR, uncertainties remain in the representation of solar radiation in climate models (Potter and Cess, 2004).

8.4.5 Atmospheric Regimes and Blocking

Blocking events are an important class of sectoral weather regimes (see Chapter 3), associated with local reversals of the mid-latitude westerlies.

There is also evidence of connections between North and South Pacific blocking and ENSO variability (e.g., Renwick, 1998; Chen and Yoon, 2002), and between North Atlantic blocks and sudden stratospheric warmings (e.g., Kodera and Chiba, 1995; Monahan et al., 2003) but these connections have not been systematically explored in AOGCMs.

Blocking was a significant phenomenon in the weather patterns as the circumpolar flow changed from zonal to meridional in 2013-14.

8.4.6 Atlantic Multi-decadal Variability

The mechanisms, however, that control the variations in the MOC are fairly different across the ensemble of AOGCMs. In most AOGCMs, the variability can be understood as a damped oceanic eigenmode that is stochastically excited by the atmosphere. In a few other AOGCMs, however, coupled interactions between the ocean and the atmosphere appear to be more important.

Translation: we don’t know.

8.4.7 El Niño-Southern Oscillation

Despite this progress, serious systematic errors in both the simulated mean climate and the natural variability persist. For example, the so-called double ITCZ problem noted by Mechoso et al. (1995; see Section 8.3.1) remains a major source of error in simulating the annual cycle in the tropics in most AOGCMs, which ultimately affects the fidelity of the simulated ENSO.

8.4.8 Madden-Julian Oscillation

The MJO (Madden and Julian, 1971) refers to the dominant mode of intra-seasonal variability in the tropical troposphere. Thus, while a model may simulate some gross characteristics of the MJO, the simulation may be deemed unsuccessful when the detailed structure of the surface fluxes is examined (e.g., Hendon, 2000).

8.4.9 Quasi-Biennial Oscillation

The Quasi-Biennial Oscillation (QBO; see Chapter 3) is a quasi-periodic wave-driven zonal mean wind reversal that dominates the low-frequency variability of the lower equatorial stratosphere (3 to 100 hPa) and affects a variety of extratropical phenomena including the strength and stability of the winter polar vortex (e.g., Baldwin et al., 2001). Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.

8.4.10 Monsoon Variability

In short, most AOGCMs do not simulate the spatial or intra-seasonal variation of monsoon precipitation accurately.

Monsoons are defined by extreme seasonality of rainfall. They occur in many regions around the world, though most people associate them only with southern Asia. It is not clear what the IPCC means. Regardless, these are massive systems of energy transfer from the region of energy surplus to the deficit region.

8.4.11 Shorter-Term Predictions Using Climate Models

This suggests that ongoing improvements in model formulation driven primarily by the needs of weather forecasting may lead also to more reliable climate predictions.

This appears to contradict the claim that weather and climate forecasts are different. As Norm Kalmonavitch notes,

The GCM models referred to as climate models are actually weather models only capable of predicting weather about two weeks into the future and as we are aware from our weather forecasts temperature predictions…

In 2008 Tim Palmer, a leading climate modeller at the European Centre for Medium-Range Weather Forecasts in Reading, England, said in New Scientist:

I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.

8.5.2 Extreme Precipitation

Sun et al. (2006) investigated the intensity of daily precipitation simulated by 18 AOGCMs, including several used in this report. They found that most of the models produce light precipitation (<10 mm day⁻¹) more often than observed, too few heavy precipitation events and too little precipitation in heavy events (>10 mm day⁻¹). The errors tend to cancel, so that the seasonal mean precipitation is fairly realistic (see Section 8.3).

Incredible: the errors cancel, and since the results appear to match reality, they must be correctly derived.
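A toy illustration of such compensating errors (the numbers below are invented for the example, not taken from Sun et al.): too many light-rain days combined with too-weak heavy events can still sum to roughly the observed seasonal total.

```python
# Invented numbers, purely to illustrate compensating errors in a seasonal total.
# Each entry is (number of rain days, mm per day).
observed = {"light": (30, 4.0), "heavy": (10, 25.0)}   # 30*4 + 10*25 = 370 mm
modelled = {"light": (45, 4.0), "heavy": (10, 19.0)}   # 45*4 + 10*19 = 370 mm

def seasonal_total(rain):
    return sum(days * rate for days, rate in rain.values())

print(seasonal_total(observed), seasonal_total(modelled))  # same total, wrong character
```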

8.5.3 Tropical Cyclones

The spatial resolution of the coupled ocean-atmosphere models used in the IPCC assessment is generally not high enough to resolve tropical cyclones, and especially to simulate their intensity.

8.6.2 Interpreting the Range of Climate Sensitivity Estimates Among General Circulation Models

The climate sensitivity depends on the type of forcing agents applied to the climate system and on their geographical and vertical distributions (Allen and Ingram, 2002; Sausen et al., 2002; Joshi et al., 2003). As it is influenced by the nature and the magnitude of the feedbacks at work in the climate response, it also depends on the mean climate state (Boer and Yu, 2003). Some differences in climate sensitivity will also result simply from differences in the particular radiative forcing calculated by different radiation codes (see Sections 10.2.1 and 8.6.2.3).

Estimates of climate sensitivity have consistently declined, and did so further in IPCC AR5. In fact, in the AR5 SPM the sensitivity declined in the few weeks between the first draft and the final report.

8.6.2.2 Why Have the Model Estimates Changed Since the TAR?

The current generation of GCMs[5] covers a range of equilibrium climate sensitivity from 2.1°C to 4.4°C (with a mean value of 3.2°C; see Table 8.2 and Box 10.2), which is quite similar to the TAR. Yet most climate models have undergone substantial developments since the TAR (probably more than between the Second Assessment Report and the TAR) that generally involve improved parametrizations of specific processes such as clouds, boundary layer or convection (see Section 8.2). In some cases, developments have also concerned numerics, dynamical cores or the coupling to new components (ocean, carbon cycle, etc.). Developing new versions of a model to improve the physical basis of parametrizations or the simulation of the current climate is at the heart of modelling group activities. The rationale for these changes is generally based upon a combination of process-level tests against observations or against cloud-resolving or large-eddy simulation models (see Section 8.2), and on the overall quality of the model simulation (see Sections 8.3 and 8.4). These developments can, and do, affect the climate sensitivity of models.

All this says is that climate models are a work in progress. However, it also acknowledges that they can only hope to improve the parameterizations. In reality they need more and better data, but that is not possible for the current or historical record. Even if an adequate data collection system were started today, it would be thirty years before the record was statistically significant.

8.6.2.3 What Explains the Current Spread in Models’ Climate Sensitivity Estimates?

The large spread in cloud radiative feedbacks leads to the conclusion that differences in cloud response are the primary source of inter-model differences in climate sensitivity (see discussion in Section 8.6.3.2.2). However, the contributions of water vapour/lapse rate and surface albedo feedbacks to sensitivity spread are non-negligible, particularly since their impact is reinforced by the mean model cloud feedback being positive and quite strong.

What does “non-negligible” mean? Is it a double negative? Apparently. Why don’t they use the term significant? They assume their inability to produce accurate results is because of clouds and water vapor. As this review shows, there are countless other factors, especially those they ignore, like the Sun. The 2001 TAR Report included a table of the forcings with a column labeled Level of Scientific Understanding (LOSU). Of the nine forcings, only two have a “high” rating (and that is their own assessment), one is “medium,” and the other six are “low.” The only difference in the 2007 AR4 Report is that the LOSU column is gone.

8.6.3.2 Clouds

Despite some advances in the understanding of the physical processes that control the cloud response to climate change and in the evaluation of some components of cloud feedbacks in current models, it is not yet possible to assess which of the model estimates of cloud feedback is the most reliable.

The cloud problem is far more complicated than this summary implies. For example, clouds function differently depending on their type, thickness, altitude, and the proportions of water vapor, water droplets, ice crystals, or snowflakes.

8.6.3.3 Cryosphere Feedbacks

A number of processes, other than surface albedo feedback, have been shown to also contribute to the polar amplification of warming in models (Alexeev, 2003, 2005; Holland and Bitz, 2003; Vavrus, 2004; Cai, 2005; Winton, 2006b). An important one is additional poleward energy transport, but contributions from local high-latitude water vapour, cloud and temperature feedbacks have also been found. The processes and their interactions are complex, however, with substantial variation between models (Winton, 2006b), and their relative importance contributing to or dampening high-latitude amplification has not yet been properly resolved.

You can’t know how much energy is transported to polar regions if you can’t determine how much is moving out of the tropics. The complete lack of data for the entire Arctic Ocean and most of the surrounding land is a major limitation.

8.6.4 How to Assess Our Relative Confidence in Feedbacks Simulated by Different Models?

A number of diagnostic tests have been proposed since the TAR (see Section 8.6.3), but few of them have been applied to a majority of the models currently in use. Moreover, it is not yet clear which tests are critical for constraining future projections. Consequently, a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed.

The IPCC chapter on climate models appears to justify use of the models by saying they show an increase in temperature when CO2 is increased. Of course they do; that is how they’re programmed. Almost every individual component of the models has, by their own admission, problems: lack of data, lack of understanding of the mechanisms, and omission of important processes because of inadequate computer capacity or other priorities. The only possible conclusion is that the models were designed to prove the political position that human CO2 was a problem.

Scientists involved with producing this result knew the limitations were so severe they precluded the possibility of proving it. This is clearly set out in their earlier comments and in the IPCC Science Report they produced. They remained silent when the SPM claimed, with high certainty, that they knew what was going on with the climate. They had to know this was wrong. They may not have known about the political agenda when they were inveigled into participating, but they had to know when the 1995 SPM was published, because Benjamin Santer exploited the SPM bias by rewriting Chapter 8 of the 1995 Report in contradiction to what the members of his chapter team had agreed. The gap widened in subsequent SPMs, but they remained silent and therefore complicit.

Comments
Jim Cripwell
March 21, 2014 3:33 am

I wonder if Dr. Susan Seestrom has read this.

Claude Harvey
March 21, 2014 3:34 am

I find Dr. Ball’s breadth and depth of knowledge on the subject rather startling. I suspect that many who defend the models have little knowledge of what’s actually in them. For those who do know what Ball knows and continue to claim a high degree of certainty, I must conclude the explanation is either fraud or self-delusion. For those who know and remain silent, cowardly self-interest comes to mind and history is replete with examples of the consequences of such a “go along, get along” mentality.

steveta_uk
March 21, 2014 3:39 am

There’s some awfully old stuff referenced here. Does it really matter that in 1997 they couldn’t accurately describe the climate? This is pre-Argo, for example.

March 21, 2014 3:47 am

I remember well a paper in, I believe, Mathematical Biology, or some such, I read in my freshman year of college in 1964. It was written in 1954, I think. It was a mathematical model of a cell undergoing mitosis. The theoretical model of the cell was a bag of water.
They never learn.

richard verney
March 21, 2014 3:57 am

steveta_uk says:
March 21, 2014 at 3:39 am
There’s some awfully old stuff referenced here. Does it really matter that in 1997 they couldn’t accurately describe the climate? This is pre-Argo, for example.
////////////////////
But the problem is that it has not improved since then.
The divergence between model projections/predictions and reality, and the vast disparity between each of the models, conclusively confirms that not enough is known or understood, or the data is crap, such that models based on it are equally crap.
It really is a case of GIGO.

johnmarshall
March 21, 2014 4:01 am

I think that the ARGO data is ignored because it is too much data for the GCMs to handle. The climate system is a chaotic one, which means that the calculations will diverge from reality faster than thought.
Thanks Dr. Ball, good interesting post.

March 21, 2014 4:06 am

For those who care to read more from Tim and support his work, please consider buying his recently released book: The Deliberate Corruption of Climate Science.
http://www.amazon.com/The-Deliberate-Corruption-Climate-Science/dp/0988877740/ref=sr_1_1?ie=UTF8&qid=1395399851&sr=8-1&keywords=tim+ball+corruption

Bruce Cobb
March 21, 2014 4:08 am

They are professional liars. One doesn’t have to look very far to spot the lies either.
From WG1, AR5 comes this doozy:
As the Earth’s temperature has been relatively constant over many centuries, the incoming solar energy must be nearly in balance with outgoing radiation.
Lies couched in other lies. Their claims of even greater confidence now, due to faster computers, are absurd and purposely misleading.
One can only hope that their house of cards collapses soon.

Quinn the Eskimo
March 21, 2014 4:11 am

steveta_uk: It matters because the fundamental problems with modeling that are described here have not been fixed. The mismatch between the models and observations has only grown more profound, and these are some of the reasons why.
The logic of the attribution analysis is essentially this: We don’t understand and are not able to model the climate system very well. Nevertheless, when we model the climate system, we don’t know what else could be causing the warming except for CO2, so it must be CO2. I am not making that up. It’s really that stupid.

kencoffman
March 21, 2014 4:21 am

I’m amused by those who point to the continuous updating of the climate models. The models evolve and hindcasting gets better and better, but that’s not an argument for trusting them at any point along the way. If the historic modeling had any merit, they wouldn’t need to be updated. And what is the justification for using an ensemble of model outputs? It creates a mush, and the mush is useful for overlaying on the chaotic climate and pretending there’s a match, but why is it a valid scheme, physically? I’d be more impressed if there were one model that matched history and produced accurate projections, but there isn’t and there won’t be.

Jimbo
March 21, 2014 4:21 am

Very often you hear statements like ‘we must listen to what the science tells us’ while forgetting that it’s listening to what the oracle-like computer simulations tell us. They most often don’t tell us anything useful.

Abstract
The key role of heavy precipitation events in climate model disagreements of future annual precipitation changes in California
Between these conflicting tendencies, 12 projections show drier annual conditions by the 2060s and 13 show wetter. These results are obtained from sixteen global general circulation models downscaled with different combinations of dynamical methods……
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00766.1

Use that to formulate policy.

Abstract
Identifying human influences on atmospheric temperature
“The multimodel average tropospheric temperature trends are outside the 5–95 percentile range of RSS results at most latitudes. The likely causes of these biases include forcing errors in the historical simulations (40–42), model response errors (43), remaining errors in satellite temperature estimates (26, 44), and an unusual manifestation of internal variability in the observations (35, 45). These explanations are not mutually exclusive. Our results suggest that forcing errors are a serious concern.”
http://www.pnas.org/content/early/2012/11/28/1210514109.full.pdf
http://www.pnas.org/content/early/2012/11/28/1210514109

Use that for 95% certainty.

Larry Ledwick
March 21, 2014 4:22 am

Even if the math was perfect (which it is not), if you cannot establish initial conditions with some reasonable level of precision (measurements of temperature, pressure etc. at a point in time) for 70% of the surface and almost all of the upper atmosphere, it is a hopeless mathematical impossibility to presume that the models can take that limited, flawed input data, rummage it around with “scientific wild ass guesses” (SWAG) for key variables, stir it with some approximations and short cuts, and then project those conditions forward into the future using highly coupled non-linear chaotic processes and come out with anything but a colossal torrent of digital nonsense.
The input does not even qualify as garbage in; it is, as they so quaintly say in computer manuals, “undefined”. It is like dividing by zero. That means the calculations are infinitely more likely to produce useless info than they are to provide a useful approximation of reality.
Just because you did it on a super computer and it took days/weeks to do it, the result is about the same as taking your input data and running it through a garbage disposal or shredder and calling the output a model run.

Ian
March 21, 2014 4:25 am

Dr Ball,
Thank you for the post.
My knowledge of models and how they are put together was very sketchy to say the least. Thanks for filling some gaps.

Ron C.
March 21, 2014 4:27 am

Thanks Dr. Bell for giving us a tour of the sausage machines.

Ron C.
March 21, 2014 4:27 am

Whoops, sorry– Dr.Ball.

eyesonu
March 21, 2014 4:34 am

Dr. Tim Ball,
Excellent essay and well presented. Thank you.
There are going to be a lot of eyes on this and the timing is right.

Billy Ruff'n
March 21, 2014 4:39 am

Claude Harvey said, “For those who know and remain silent, cowardly self-interest comes to mind…”
True enough, but can you blame them? Put yourself in the shoes of a young, up-and-coming climate scientist who has finally arrived at a point where they’re invited to work in the bowels of the IPCC. They have invested considerable financial sums in their education and many years in training and hard work to arrive where they are. If they speak up, their careers in academia and the climate research establishment are over. How many private sector jobs in climate science are there? If they speak up, how will they pay the mortgage and feed the kids? For an honest person, it must be devastating to find oneself in such a position.

March 21, 2014 4:40 am

I watched a recent IPCC evidence video for the latest report, AR5. You have to wait right to the end to listen to the caveats on the models. So even though in that video the caveats were mumbled and rushed, they had to put them in.
The mystery is how those caveats the IPCC group themselves admit to never make it to the public. Rather, in the ‘narrative for the public’ there is an emphasis on implied exactness by talking about satellites, £30m supercomputers and other instruments, the claim that only a genius can understand ‘the science’ so it’s best left to them, a constant deluge of CO2 disaster movie headlines from ‘predictions’ based on ‘settled science’, and name-calling anyone who points out the caveats as deniers [or worse] to intimidate and keep a curfew on the whole truth.

AlecM
March 21, 2014 4:44 am

The Mathematical Physics of Gaia:
1. The Earth’s atmosphere is a heat engine which ensures solar SW thermalised in the atmosphere on average equals LW OLR whilst minimising the rate of production of radiation entropy.
2. The atmosphere adapts to meet those two requirements. One key factor is that the radiation entropy production rate is inversely proportional to the OLR IR source temperature.
3. Because the source temperature for 15 micron CO2 IR is the lowest, atmospheric pCO2 is maximised by increasing atmospheric enthalpy and entropy therefore Gibbs’ free energy.
4. This is why the GHE is higher in interglacials. Animal evolution and geological processes adapt to maximise pCO2 and aerosols, which reduce cloud albedo (Sagan’s aerosol physics is wrong).
5. Processes in the heat engine use CO2 as the working fluid, ensuring near zero CO2-AGW.
I hope the above clarifies the issues: everything is subordinate to the external thermodynamics.
Forget about ‘forcing,’ ‘back radiation’ and ‘positive feedback’, it’s junk physics by deluded amateurs paid by demagogic politicians to displace good physics by bad, a corollary of Gresham’s Law.

eyesonu
March 21, 2014 4:47 am

steveta_uk says:
March 21, 2014 at 3:39 am
There’s some awfully old stuff referenced here. Does it really matter that in 1997 they couldn’t accurately describe the climate? This is pre-Argo, for example.
===============
“They” led the public and policy makers to believe they knew the answers and immediate action must be taken based on their reasoning or lack thereof.
I see no reason to disregard past failures of those in support of the “cause” and would in fact encourage an occasional reminder as the latest modeling seems to be no better.
An academic thief in 1997 is still an academic thief today. A statute of limitations only applies where legal proceedings are concerned. Moral and credibility issues are much longer, a lifetime perhaps.

Alberta Slim
March 21, 2014 4:53 am

As many skeptics have said for years that the climate is too complex with too many variables to model accurately.
More measuring and less modelling. IMO

timspence10
March 21, 2014 4:56 am

That’s quite an enormous list of what models fail to account for. So they are, in effect, useless. Yet more evidence that climate science is in its infancy.

eyesonu
March 21, 2014 4:56 am

Billy Ruff’n says:
March 21, 2014 at 4:39 am
===============
In consideration of your reasoning, then now is the time to step forward and spill the beans. Their hands are dirty and it’s time to wash them en masse. A coward is still a coward, but one who is dirty and remains so is still dirty and also remains a coward.
http://www.merriam-webster.com/dictionary/coward
noun \ˈkau̇(-ə)rd\. : someone who is too afraid to do what is right or expected :
someone who is not at all brave or courageous.

Alberta Slim
March 21, 2014 5:03 am

Larry Ledwick says:
March 21, 2014 at 4:22 am
“Even if the math was perfect (which it is not) if you cannot establish initial ……………….”
I think your comment is about as close to explaining the futility of GCMs as there is.

DC Cowboy
Editor
March 21, 2014 5:07 am

Using parameters to simulate physical processes that you don’t understand and/or can’t represent in the models due to ‘computational limitations’, you have a model with no predictive skill at all, even if you can successfully ‘tune’ the parameters to create a successful hindcast. ‘Tuning’ the parameters actually means you’re changing the output of the physical processes that they represent, and that doesn’t make any sense to me as far as obtaining an accurate representation of their effect. It more or less makes the model nothing more than a guess. It also creates the risk that your ‘tuning’ of one or several parameters masks the true effect of others. I wonder if there has ever been any work to try to determine the probability of creating an accurate model when there are multiple ‘parameters’ involved. I would bet that the probability would be very small.

M Seward
March 21, 2014 5:14 am

As an engineer with experience using CFD modelling software and using all sorts of other analytical tools from simple formulae to one page spreadsheets to large databases it is a no brainer that the analysis one is doing is based on the mathematics of a particular mechanism that is understood to be applicable and whose limitations are understood. It is also implicit that the scale at which one works is a scale at which the models being used are applicable and appropriate and the aggregate result is properly thus determined by summation of all the local outcomes.
The very idea that one would have a model which has a cellular scale which is too large to apply to so much of the actual mechanisms known to be in play is so ludicrous as to be laughable in a very nervous way. Nervous because I wonder what kind of science/engineering nut job would even do such a thing? What possible use would the output be?
Apart from perpetrating a fraud that is.

March 21, 2014 5:14 am

This was the IPCC video I watched; the caveats on the models are at the end [a bit mumbled]:
http://www.rmets.org/events/climate-change-2013-physical-science-basis-working-group-1-contribution-fifth-assessment
Compare that to what the public gets. Notice the constant images of implied exactness. The term ‘interglacial warming period’ is not mentioned once. The warming is decontextualised from the ice age cycle, which allows for ‘something else’ to be the cause.

March 21, 2014 5:17 am

Wow, what an article. Thanks for all the work, Dr. Ball, and WUWT for posting this. This only confirms what climate realists have known since 1998 or before: the cult of warm is not about science, but about egos, money, getting published, and falsifying the real climate record. There are about a million or so variables in climate; as a long-time IT professional, I can confirm that you cannot model or simulate many-to-many relationships with so many variables and unknowns. It is simply impossible.

sherlock1
March 21, 2014 5:26 am

So – the chicanery, muddled thinking, lack of data, assumptions, predictions and conclusions – are actually FAR worse than any of us thought..!

March 21, 2014 5:31 am

Yes, there are a lot of variables, but people were predicting the UK winter storms back in October 2013 with normal meteorological reasoning, while those with supercomputers predicted a drier-than-average winter and actually can’t forecast past two days. So which group of people have the better understanding of energy transference processes?
Just because the CO2ers’ models can’t do anything doesn’t mean the processes cannot be predicted. One doesn’t have to model everything, just the essential mechanisms, i.e. understanding the hierarchy. Putting CO2 at the top is why they get nonsense.

tadchem
March 21, 2014 5:34 am

“[Computer models] don’t and can’t work because data, knowledge of atmospheric, oceanographic, and extraterrestrial mechanisms, and computer capacity are all totally inadequate. Computer climate models are a waste of time and money.”
Their *advertised* function of modeling earth’s climate is unachievable.
However, their *designed* function – to provide numerical tables in bulk and persuasive numbers for the rhetoric of grant applications – has been an unqualified success.

Bruce Cobb
March 21, 2014 5:36 am

With so many involved, and each being able to claim “but I was only involved in this small part”, I guess frogmarching to the Hague is probably not in their cards, though it should be.

March 21, 2014 5:38 am

CAGW is about power and money. Science has little to do with it. These cowards are alarmed at losing their money.

Jim Happ
March 21, 2014 5:48 am

If the models could just predict one growing season they would be worth a lot.

Tom In Indy
March 21, 2014 5:56 am

The Summary is written first, then the WGI Report is adjusted to conform to it. It is like an executive publishing findings and then asking employees to produce material to justify them. The purpose is to present a completely different reality to the press and the public.
I’d say it’s closer to an executive of a publicly held company signing off on the annual report, when the executive is aware of significant misrepresentations in the report. The DOJ/SEC send people to prison for that crime.
In the IPCC case, scientists are signing off on documents with significant misrepresentations. Public policy is based on these misrepresentations. These “scientists” are no different than the corporate executive who misrepresents material facts. The scientists should be prosecuted, or at least held accountable in some manner.

Jimbo
March 21, 2014 5:58 am

Here is an excellent essay from Dr. John Christy.

March 20, 2014
The reason there is so much contention regarding “global warming” is relatively simple to understand: In climate change science we basically cannot prove anything about how the climate will change as a result of adding extra greenhouse gases to the atmosphere.
So we are left to argue about unprovable claims………..
Climate science is a murky science. When dealing with temperature variations and trends, we do not have an instrument that tells us how much change is due to humans and how much to Mother Nature. Measuring the temperature change over long time periods is difficult enough, but we do not have a thermometer that says why these changes occur.
We cannot appeal to direct evidence for the cause of change, so we argue……..
http://www.centredaily.com/2014/03/20/4093680/john-r-christy-climate-science.html

March 21, 2014 6:07 am

Excellent essay.
We knew the models weren’t giving good results, but:
they are worse than we thought.
(Someone had to say it.)
“IPCC Scientists Knew Data and Science Inadequacies Contradicted Certainties Presented to Media, Public and Politicians, But Remained Silent”
For the most part, aren’t the IPCC scientists still remaining silent?
Shouldn’t most, if not all, of them be shouting “no, you are misrepresenting my/our work!”

Clovis Marcus
March 21, 2014 6:07 am

steveta_uk says:
March 21, 2014 at 3:39 am
There’s some awfully old stuff referenced here. Does it really matter that in 1997 they couldn’t accurately describe the climate? This is pre-Argo, for example.
============================
Surely the importance of the old models is that serious policy decisions were taken on demonstrably faulty grounds then. And they continue to be now.
The problem is that it is not in the policy-makers’ interest to question the gift of a global disaster on which they can build power and raise taxes.

March 21, 2014 6:09 am

<Watch out! The blue car is going to run us over!
-That is not a car, it is a van.
<Come on it is car.
-It is a van and is not blue is grey but it depends on the angle of the light from where you see it.
<Are you saying I am blind and can not think by myself? It is a car and it is blue cause I know what I know and when I tell you … … …
Science is as much about developing suitable methodologies to give answers as about finding and addressing the "appropriate questions". When there is a mismatch between "the question" and "the methodology", both should be reassessed. I see much debate about data and its manipulation. I don’t see much debate about the suitability of the questions. If the technical capacity to obtain data is limited, the questions are the ones to be re-framed. And from my point of view, sooner or later there are some questions that need to be addressed. They might not be the ones you agree on, but these are the ones I haven’t seen a consensus answer to yet.
Could human development have an impact on the ecosystem at a global scale? What would humans have to do to alter the ecosystem at a global scale? Which part of the ecosystem (soil, atmosphere, light and heat from our sun, water, or living organisms) would primarily reflect the impact of human perturbation? If the answer to the first question is "yes", how much of the answers to the second and third questions matches actual facts?

Tim
March 21, 2014 6:19 am

“Computer climate models are a waste of time and money.”
But not for those elites who use them for political fun and profit. They have the time and they have the money. Prejudgment in, prejudgment out.

Nancy Green
March 21, 2014 6:23 am

Thank you Dr Ball.
As Dr Ball points out, on many levels the climate models do not reflect reality. These failings are not simply minor. They are major failings over a wide range of issues. This is well known in the scientific community but the community remains largely silent out of fear. To speak out risks the loss of funding and the end of your scientific career. The scientific search for truth has been replaced by the search for fame and funding.
In my small work I pointed out that no matter how much we spend on Climate Models, there is a fundamental problem in predicting the future. Some very simple experiments in physics have demonstrated that our common sense understanding of the future is fundamentally wrong. These experiments gave rise to quantum mechanics, the single most successful description of reality in science.
Asking a computer to predict the future is asking a computer to solve the impossible. There is no specific future to be predicted; only a probability out of an infinity of possible futures. Adding CO2 may increase the odds of a warmer future, but it in no way determines that the future will be warmer. The future can remain stubbornly colder, no matter how much CO2 we add.
Some futures are more likely, but that is simply God playing dice. We are not guaranteed to arrive at any specific future, thus there is nothing for the climate models to solve. They are being asked to deliver an impossible result, and like HAL in 2001 they have gone insane. They are killing people by cutting life support via energy poverty.
HAL: “The 9000 series is the most reliable computer ever made. No 9000 computer has ever made a mistake or distorted information. We are all, by any practical definition of the words, foolproof and incapable of error”.

March 21, 2014 6:39 am

The models are massively deficient; many of them have been “parameterized” differently, and some have been corrected for this and that and some not. So let’s take an average and pretend that it’s accurate to within a few 10ths of a degree at 100 years? Can they really say that with a straight face?
What does an “average” of the models actually mean? Does anyone really argue that they are samples from the same population? Is the same weight applied to each model? Why or why not? Does this procedure imply that they can’t distinguish among models by their predictive skill?
Whatever it is they’re doing, it’s not science as we know it, Jim.

Pamela Gray
March 21, 2014 6:44 am

Modeling is likely less expensive than measuring. Plus it allows authors to set and get instead of walk and measure. Sitting while waiting for a computer to output its output gives them time to twiddle their fingers or teach a class (what IS the difference?). Walk and measure is field research, which is very expensive and requires far more skill, which many scientists no longer have or were never trained in. So I imagine there is pressure to write grants that revolve around set and get. More papers and researchers, i.e. more bang for the buck. Which would explain the explosion of climate science papers.

Gamecock
March 21, 2014 6:48 am

As a corporate computer jock for over 30 years, I supported models. The models were accurate and useful to the businesses.
Models are the codification of the interaction between different parameters. A model run consists of inputting data (actual or hypothetical) for the parameters, and the output, a forecast, is the result of the interaction of the data.
The problem with climate “models” is not that they don’t have enough data.
The problem with climate “models” is not that they don’t have enough computing power.
The problem is that they don’t understand the interaction of all the parameters, or even what all the parameters are.
If you don’t know how things interact, more data gives you nothing. Climate models have zero predictive capability, and will continue so until we know how the atmosphere works in sufficient detail to codify it into the models. Even then, we won’t know what Ol’ Sol will decide to do (unless we can then adequately model the sun, too!).
My bias: My first computer system (1978) was a DEC PDP-11/45. The degreed computer scientists I worked with had a saying, “If you can’t get it done in 128k, it’s not worth doing.” Hence, all my career, I was suspect of the value of more computing power. I considered it a crutch for those without sufficient intellect to figure out how to get it done in 128k. In my not so humble opinion, more computing power or more data is not going to help climate models. Only more understanding of the atmosphere will, and reaching a useful level is decades off. Progress on that has been dead for over 20 years, as scientists try to force fit CO2 into the equations. They are stuck on stupid.
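A minimal numerical sketch of Gamecock’s point about interactions, using invented data and a deliberately misspecified toy model (nothing here resembles real climate code; the coefficients, noise level and sample sizes are all made up):

# Illustrative only: a toy system whose true response depends on an
# interaction between two drivers, fitted with a model that omits it.
import numpy as np

rng = np.random.default_rng(0)

def true_system(x1, x2):
    # The "real" physics: the drivers interact (the x1 * x2 term).
    return 1.5 * x1 + 0.5 * x2 + 2.0 * x1 * x2

for n in (100, 10_000, 1_000_000):           # ever more "data"
    x1, x2 = rng.uniform(-1, 1, (2, n))
    y = true_system(x1, x2) + rng.normal(0, 0.1, n)

    # Misspecified model: linear in x1 and x2, no interaction term.
    A = np.column_stack([np.ones(n), x1, x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Out-of-sample test against the true system
    t1, t2 = rng.uniform(-1, 1, (2, 5000))
    pred = np.column_stack([np.ones(5000), t1, t2]) @ coef
    rmse = np.sqrt(np.mean((pred - true_system(t1, t2)) ** 2))
    print(f"n = {n:>9,d}  out-of-sample RMSE = {rmse:.3f}")

With these made-up numbers the out-of-sample error stalls near the size of the omitted interaction term no matter how much data is supplied, which is the point: more data and more computing power do not repair a wrong interaction structure.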

March 21, 2014 6:48 am

Tim Ball’s post is a very detailed explanation of the argument which I have been making for some years now: that the model outputs are useless as a basis for climate and energy policy and that it is time to move to a different method of climate forecasting. In a series of posts at
http://climatesense-norpag.blogspot.com
I have presented forecasts of a likely coming cooling based on using the 60- and 1000-year periodicities in the temperature data and the neutron count (and 10Be) as the best proxy for solar activity. Here, for convenience, are the conclusions of the latest post on my blog:
“With that in mind it is reasonable to correlate the cycle 22 low in the neutron count (high solar activity and SSN) with the peak in the SST trend in about 2003 and project forward the possible general temperature decline in the coming decades in step with the decline in solar activity in cycles 23 and 24.
In earlier posts on this site http://climatesense-norpag.blogspot.com at 4/02/13 and 1/22/13
I have combined the PDO, millennial cycle and neutron trends to estimate the timing and extent of the coming cooling in both the Northern Hemisphere and globally.
Here are the conclusions of those posts.
1/22/13 (NH)
1) The millennial peak is sharp – perhaps 18 years +/-. We have now had 16 years since 1997 with no net warming – and so might expect a sharp drop in a year or two – 2014/16 – with a net cooling by 2035 of about 0.35. Within that time frame, however, there could well be some exceptional years with NH temperatures +/- 0.25 degrees colder than that.
2) The cooling gradient might be fairly steep down to the Oort minimum equivalent, which would occur about 2100 (about 1100 on Fig 5) (Fig 3 here), with a total cooling in 2100 from the present estimated at about 1.2 +/-.
3) From 2100 on, through the Wolf and Sporer minima equivalents with intervening highs, to the Maunder Minimum equivalent, which could occur from about 2600 – 2700, a further net cooling of about 0.7 degrees could occur for a total drop of 1.9 +/- degrees.
4) The time frame for the significant cooling in 2014 – 16 is strengthened by recent developments already seen in solar activity. With a time lag of about 12 years between the solar driver proxy and climate, we should see in 2016 – 17 the effects of the sharp drop in the Ap Index which took place in 2004/5.
4/02/13 ( Global)
1 Significant temperature drop at about 2016-17
2 Possible unusual cold snap 2021-22
3 Built in cooling trend until at least 2024
4 Temperature Hadsst3 moving average anomaly 2035 – 0.15
5 Temperature Hadsst3 moving average anomaly 2100 – 0.5
6 General Conclusion – by 2100 all the 20th century temperature rise will have been reversed.
7 By 2650 Earth could possibly be back to the depths of the Little Ice Age.
8 The effect of increasing CO2 emissions will be minor but beneficial – they may slightly ameliorate the forecast cooling and help maintain crop yields.
9 Warning!! There are some signs in the Livingston and Penn solar data that a sudden drop to Maunder Minimum (Little Ice Age) temperatures could be imminent – with a much more rapid and economically disruptive cooling than that forecast above, which may turn out to be a best-case scenario.
How confident should one be in these predictions? The pattern method doesn’t lend itself easily to statistical measures. However, statistical calculations only provide an apparent rigor for the uninitiated, and in relation to the IPCC climate models they are entirely misleading because they make no allowance for the structural uncertainties in the model set-up. This is where scientific judgment comes in – some people are better at pattern recognition and meaningful correlation than others. A past record of successful forecasting such as indicated above is a useful but not infallible measure. In this case I am reasonably sure – say 65/35 – for about 20 years ahead. Beyond that, certainty drops rapidly. I am sure, however, that it will prove closer to reality than anything put out by the IPCC, Met Office or the NASA group. In any case this is a Bayesian-type forecast, in that it can easily be amended on an ongoing basis as the temperature and solar data accumulate. If there is not a 0.15 – 0.20 drop in global SSTs by 2018 – 20, I would need to re-evaluate.
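For readers who want to see the shape of the argument rather than the actual calculation, here is a minimal sketch of superposing an assumed millennial and 60-year cycle; it is not Dr. Page’s method, and the periods, amplitudes and phases are invented purely for illustration:

# Purely illustrative: superposing an assumed ~1000-year and ~60-year cycle
# to project a temperature anomaly, in the spirit of the pattern method
# described above. Every number here is made up for the sketch.
import numpy as np

def cycle_projection(years, millennial_peak=2003.0, sixty_peak=2003.0):
    millennial = 0.35 * np.cos(2 * np.pi * (years - millennial_peak) / 1000.0)
    sixty_year = 0.15 * np.cos(2 * np.pi * (years - sixty_peak) / 60.0)
    return millennial + sixty_year

years = np.arange(2000, 2101, 5)
for y, a in zip(years, cycle_projection(years)):
    print(f"{y}: {a:+.2f} C relative to the assumed cycle mean")

Whether any such superposition has predictive skill is exactly the question at issue; the sketch only shows how a pattern-based projection can be written down and then checked against the data as it accumulates.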

Ashby
March 21, 2014 6:52 am

Sheep entrails and “Turtles all the way down”.

Matthew R Marler
March 21, 2014 6:56 am

Dr Tim Ball: They don’t and can’t work because data, knowledge of atmospheric, oceanographic, and extraterrestrial mechanisms, and computer capacity are all totally inadequate. Computer climate models are a waste of time and money.
I found one ambiguity in your presentation. It isn’t clear which of the problems you address are simply problems with current models, and which are problems you think will never be solved. A computer model that could accurately forecast the mean, s.d., quartiles, and extremes of rain and temperature for a bunch of regions over spans of 2 decades would be quite valuable, even if it could not forecast the mean temp and rainfall on Oct 11 of any year in Columbus, OH.
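A small sketch of what evaluating a model “by distribution” rather than “by date” might look like, with synthetic stand-ins for the observed and modelled series (no real model output is used):

# Comparing a forecast series to observations by distribution, not by date:
# means, standard deviations, quartiles and extremes over a 20-year span.
# Both series are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(15.0, 4.0, 20 * 365)     # "observed" daily temperatures
fcst = rng.normal(15.5, 3.5, 20 * 365)    # "modelled" daily temperatures

def summary(x):
    q1, q2, q3 = np.percentile(x, [25, 50, 75])
    return {"mean": x.mean(), "sd": x.std(ddof=1),
            "q1": q1, "median": q2, "q3": q3,
            "min": x.min(), "max": x.max()}

for name, s in (("obs", summary(obs)), ("model", summary(fcst))):
    print(name, {k: round(v, 2) for k, v in s.items()})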

Matthew R Marler
March 21, 2014 7:04 am

Gamecock: Hence, all my career, I was suspect of the value of more computing power. I considered it a crutch for those without sufficient intellect to figure out how to get it done in 128k.
Does it bother you how much computing power is built into modern commercial aircraft? The facilities that design and build such aircraft? Or cell phones? CAT-scans and fMRI? Were you appalled by the waste of computing resources in mapping and sequencing the genomes of humans and rice?

Greg
March 21, 2014 7:05 am

“The only possible conclusion is that the models were designed to prove the political position that human CO2 was a problem.”
Hell, it took some time to get there, but you make the case in a very detailed and irrefutable way. Well done.
It’s all one massive game of computerised smoke and mirrors. Climate models start out with a CO2-driven rise, then decades of research and billions of dollars go into putting a few “climate-like” wiggles onto it (which may or may not coincide with the wiggles in the “corrected and adjusted” datasets).
One massive SCAM masquerading as “the” science.
Great article, Dr. Tim Ball

March 21, 2014 7:09 am

CO2 sensitivity theory is like non-stick coating or fabric protector or the miracle knife you never sharpen. In testing and demonstration it all works fine, and there’s an answer to every possible FAQ. Professor so-and-so agrees. An audience of stunned professionals can’t believe their eyes. Unsolicited testimonials from ecstatic customers. Celebrities will vouch. And so on.
In the vast and messy place called the real world none of it works. Not the non-stick coating, not the fabric protector, not the miracle knife…and not the CO2 theory.

David Jay
March 21, 2014 7:12 am

Gamecock:
My bias: My first computer system (1978) was a DEC PDP-11/45. The degreed computer scientists I worked with had a saying, “If you can’t get it done in 128k, it’s not worth doing.” Hence, all my career, I was suspect of the value of more computing power. I considered it a crutch for those without sufficient intellect to figure out how to get it done in 128k.
Thanks for making me feel young. The correct answer is that it takes a VAX and 512K.

Tom O
March 21, 2014 7:15 am

Excellent assessment. It really opened my eyes to the complexity of the process. You can’t simulate the process from a lack of data.
Billy Ruff’n says:
March 21, 2014 at 4:39 am
Claude Harvey said, “For those who know and remain silent, cowardly self-interest comes to mind…”
True enough, but can you blame them?
(Yes I can)
Put yourself in the shoes of a young, up-and-coming climate scientist who has finally arrived at a point where they’re invited to work in the bowels of the IPCC. They have invested considerable financial sums in their education and many years in training and hard work to arrive where they are. If they speak up, their careers in academia and the climate research establishment are over.
(If they remain silent, then they are not “working” in their career field, they are merely playing “follow the leader.”)
How many private sector jobs in climate science are there?
(Don’t you think this should have been a consideration BEFORE they chose their academic field?)
If they speak up, how will they pay the mortgage and feed the kids?
(You either have a soul or you don’t. You find a different job and down-size your expectations, and in so doing, you pay the mortgage and feed the kids. You don’t have to be dishonest.)
For an honest person, it must be devastating to find oneself in such a position.
(Frankly, if they are in this position, they are not an “honest person” since they are committing dishonest acts.)

Editor
March 21, 2014 7:23 am

Thanks, Tim. Nice post.

Jeff Alberts
March 21, 2014 7:32 am

I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. Arthur Conan Doyle. (Sherlock Holmes)

I think it’s a capital mistake to rely on quotes from fictional characters, made by an author who believed in patently nonsensical stuff. The twisting of facts and theories can and does occur whether or not one has data.

March 21, 2014 7:35 am

This is a good article, but Dr. Ball is wrong on one point. A computer model cannot prove a theory. At best a computer model can provide evidence for or against a theory. This is because a computer model is nothing more than a mathematical expression of the theory, i.e. the computer model is the theory; any claim that it proves the theory would be a circular argument. If real-world measurements agree with the model outputs, this is evidence of the accuracy of the theory, not proof.

March 21, 2014 7:36 am

For an analysis of the inherent inutility of climate models, y’all should take the time to watch

Jim Cripwell
March 21, 2014 7:39 am

Tom O writes “For an honest person, it must be devastating to find oneself in such a position.”
I completely agree. But you fail to identify what is wrong. The problem is not with junior scientists, but with senior scientists. It is the silence of the likes of the former President of the Royal Society and current Astronomer Royal, Lord Rees, who lies about the science of CAGW in public. It is these senior scientists who have remained silent, and allowed this disgusting state of affairs to occur.
Don’t blame the junior scientists. Blame their bosses.

Jeff Alberts
March 21, 2014 7:42 am

Diego Fdez-Sevilla says:
March 21, 2014 at 6:09 am
<Watch out! The blue car is going to run us over!
-That is not a car, it is a van.
<Come on it is car.
-It is a van and is not blue is grey but it depends on the angle of the light from where you see it.
<Are you saying I am blind and can not think by myself? It is a car and it is blue cause I know what I know and when I tell you … … …

Actually it’s a bicycle, and it’s going in the opposite direction.

izen
March 21, 2014 8:01 am

There is a common error in both the posted article and many of the responses that has to do with the difference between models intended to predict a specific state and models intended to simulate a physical process.
It is usually summed up as the difference between intial conditions and boundary conditions.
It is certainly true that the observational data we have is insufficient to define the initial conditions adequately to make accurate predictions of a later state of the system.
However climate modelling is an exercise in boundary conditions, not specific states.
Physical modelling in such circumstances gives insight into the envelope of behaviour of the system. It does not give specific predictions of final states. This distinction makes many of the criticisms here of the shortcomings of the initial data and model predictions irrelevant because of the ignorance of this difference.

KevinK
March 21, 2014 8:04 am

“If you can’t get it done in 128k, it’s not worth doing.”
You had 128k ?, boy back in my day all we had was ones and zeros, and sometimes we ran out of zeros and we had to use o’s.
Climate modelling is a FARCE.
Cheers, Kevin.

March 21, 2014 8:04 am

The vastness and complexity of Creation should awe and humble us. Dr. Ball does a good job of demonstrating how even the hugest computers cannot grasp the intricate interactions between the numerous levels, and how we lack data to fill in many of the bulky cubes of “grids.”
In essence our most bragged-about computers are pathetic, compared to what we’d need, to capture the intricacy of the atmosphere. One might as well try to capture a hurricane with a butterfly net. People who trust climate models are in some ways like people who trust a witch doctor when he shakes a bone at the sky, commanding it to rain. There is no basis for the trust, but they trust just the same.
The best weather forecasters are the ones who are awed and humbled by the atmosphere. They seem to understand the dynamics to some degree, but also to refer back to older maps which show roughly the same situation. Because they refer back to old maps so much, they know maps which start out looking remarkably alike can lead to situations that are remarkably dissimilar in only five days. (Some butterfly flapped its wings somewhere.)
I am amazed by the skill some display in the longer term. However that skill is only general, and cannot lead to specific forecasts. Also, just because a climate cycle lasted sixty years last time around does not mean it will last sixty years this time around. The infinite variety of weather offers us infinite opportunity to be wrong. This Creation we are part of is one heck of a lot bigger and more complex than we are.

March 21, 2014 8:26 am

It should hardly come as a surprise that none of these models are of any value. Those building them expect the models to show them how things work, while a good model can only be constructed when the builder already knows how the system works.

March 21, 2014 8:39 am

Follow the money. Al Gore certainly did.

March 21, 2014 8:42 am

Reblogged this on Power To The People and commented:
James Lovelock in the Guardian sums up the fact that Climate Scientists are well aware their theories are little more than a house of cards.
“The great climate science centres around the world are more than well aware how weak their science is. If you talk to them privately they’re scared stiff of the fact that they don’t really know what the clouds and the aerosols are doing. They could be absolutely running the show. We haven’t got the physics worked out yet. One of the chiefs once said to me that he agreed that they should include the biology in their models, but he said they hadn’t got the physics right yet and it would be five years before they do. So why on earth are the politicians spending a fortune of our money when we can least afford it on doing things to prevent events 50 years from now? They’ve employed scientists to tell them what they want to hear.”
http://www.theguardian.com/environment/blog/2010/mar/29/james-lovelock?guni=Article:in%20body%20link

Billy Ruff'n
March 21, 2014 8:54 am

eyesonu @ 3/21, 4:39 AM
Tom O @ 3/21, 7:51 am
I agree with you both, but at my age I guess I have a bit more empathy for the foolish decisions and mistakes of youth and the consequences thereof. The real villains in all this are the senior scientists who started and have perpetuated the fraud.

March 21, 2014 9:02 am

Fantastic read
And some great comments. I like short pithy ones like these two:
Quinn the Eskimo said at 4:11 am
The logic of the attribution analysis is essentially this: We don’t understand and are not able to model the climate system very well. Nevertheless, when we model the climate system, we don’t know what else could be causing the warming except for CO2, so it must be CO2. I am not making that up. It’s really that stupid.
jauntycyclist said at 5:31 am
just because the CO2ers’ models can’t do anything doesn’t mean the processes cannot be predicted. One doesn’t have to model everything. Just the essential mechanisms, i.e. understanding the hierarchy. Putting CO2 at the top is why they get nonsense.

Gamecock
March 21, 2014 9:11 am

David Jay says:
March 21, 2014 at 7:12 am
Thanks for making me feel young. The correct answer is that it takes a VAX and 512K.
=================================
You are welcome. I worked on VAXes for many years, too. When they came out, with their virtual memory, etc., I thought it ridiculous not to just overlay physical memory. In time, I came to appreciate virtual memory, though not as much as most did/do.

Gamecock
March 21, 2014 9:13 am

izen says:
March 21, 2014 at 8:01 am
It does not give specific predictions of final states. This distinction makes many of the criticisms here of the shortcomings of the initial data and model predictions irrelevant because of the ignorance of this difference.
=======================
Bullshit. Climate models are being used to make predictions about future temperatures. Are you daft?

March 21, 2014 9:15 am

Izen-
I don’t see your point about boundary conditions. The climate models are predicting temperature based on CO2 levels. Temperature is hardly a boundary condition. If you are saying that the problem is that the climate models are designed to estimate boundary conditions and are being misused to predict temperature, I could go along with that.

knr
March 21, 2014 9:26 am

The bottom line: no AGW, no IPCC. So what else do you expect them to do?

March 21, 2014 9:33 am

Through the use of information theory, it is possible to build the best possible model from given informational resources. Experience with building models of this type supports generalization about the prospects for creating a statistically validated global warming model that successfully predicts the numerical values of probabilities of the outcomes of events.
In building such a model, the first step would be to identify the events underlying the model. This step has yet to be taken. That it has not been taken means that “predictions” cannot be made with models of the type that are currently available. These models make “projections” which, however, convey no information to a policy maker about the outcomes from his/her policy decisions. Policy makers have no information but believe they have information as a result of confusing “projection” with “prediction.” This mistake accounts for the continuing fatuous attempts by governments at controlling the climate.
If the underlying events were to be identified, each event would have a duration in time. In climatology, the canonical duration is three decades. The bare minimum number of observed statistically independent events for construction of a statistically validated model is about 150. The time to observe these events is 30 × 150 = 4500 years. The various global temperature time series extend backward in time to the year 1850, providing us with 164 years’ worth of data. Thus, the minimum number of years that must elapse before there is the possibility of constructing that statistically validated global warming model which is maximally efficient in its use of information is 4500 – 164 = 4336 years. In 4336 years, though, our supply of fossil fuels will have long been exhausted.
The approach now being taken is to compute the future state of the climate at time t + delta t, given the state at t, to feed the state at t + delta t into computation of the state at t + 2 * delta t and to continue in this vein ad infinitum. The growing divergence between the computed and observed global temperature demonstrates that this approach doesn’t work. It doesn’t work because at the beginning of every time step, information about the current state is missing and the missing information grows as a function of time.
There is an urgent need for the directors of the world’s climatological research program to gain knowledge about the role of entropy in science and to factor this knowledge into the planning of the research. Their ignorance on this score has already cost us a fortune and this cost continues to rise.

rgbatduke
March 21, 2014 9:54 am

A good article summarizing a few — not even all — of the many problems with trying to solve the Navier-Stokes equation on a spinning, tilted ball in an eccentric orbit around a variable star, with an inhomogeneous surface consisting of 70% ocean (necessitating a separate coupled Navier-Stokes system in a moving fluid with highly variable temperature, salinity, density, depth and surface structure) and 30% of land surface that varies in terms of height above sea level, vegetation and use, moisture content, non-oceanic water systems, albedo, geology, and distribution relative to (e.g.) the Earth’s precise tilt and position in its aforementioned eccentric orbit around the variable star.
I tend to hammer on still other ones he omits — such as using a latitude-longitude grid in the first place on a spherical surface when this coordinatization has well known statistical and mathematical inadequacies for performing unbiased sampling, numerical interpolation, numerical integration (especially of the adaptive sort) and when there are well-known e.g. icosahedral tessellations that are both adaptive (systematically rescalable to a finer resolution) and which have no polar bias — all surface tessera end up with roughly the same area. Or, the inverse of the problem he describes — the fact that they don’t have data on anything like the grid resolution they are using already — which is that in a strongly coupled, highly nonlinear non-Markovian Navier-Stokes system (let alone two coupled Navier-Stokes systems consisting of completely distinct fluids with enormously strong coupling between them and nearly independent dominant circulation patterns) there is no theoretical basis for omitting detail at any scale when attempting a forward solution because even tiny fluctuations in state can nonlinearly grow until they dominate the simulated “climate’s” evolution on basically all future time scales. This is clearly evident in the enormous spread in model results produced by any given model within the “perturbed parameter ensemble”. Any actual future time evolution of the Earth’s climate is “unlikely” within the spread of possible future evolutions any given GCM produces, although the current GCMs almost all have produced PPE results from the “predictive epoch” after the training set that are predominantly systematically much warmer than reality has turned out to be.
We have the paradox that in order to get reliable results, we might well need to use a much, much finer grid to get the physics right, but have much, much worse data to use to initialize the grid in a way that believably corresponds to the current or any past climate state. Catch-22, with no way around it.
Dr. Ball also omits the fact that if different GCMs are applied to the same, vastly simplified toy problem — an untilted water world in a circular orbit around a constant star — they converge to completely different solutions for the imaginary planet’s steady state. It is difficult to sufficiently emphasize what this means as far as the possible reliability of GCMs in general are concerned. In any other branch of physics (say, quantum mechanics) if one took four different computer codes all purporting to solve the same quantum problem — perhaps determining the quantum properties of a promising new semiconductor — and all four codes:
a) produced completely, significantly, different results (different band structures with different gaps at different energies);
b) none of which agreed with direct measurements of those gaps or the band structure in the laboratory,
then who would take any one of those codes, let alone some sort of “average” of their results, seriously? Seriously enough to invest a few billion dollars building a massive fabrication plant based on their collective predictions. Seriously enough to publish paper after paper on the “average prediction” of the four codes as if it had some meaningful predictive value in spite of the fact that direct comparison with experiment proves that they do not have any predictive value, either singly or collectively.
Yet that is business as usual in the world of climate modeling, which attempts to solve a problem that is much more difficult than solving a band structure problem reasonably accurately.
Finally, Dr. Ball fails to address Chapter 9 of AR5, which is arguably even more deceptive than Chapter 8. My favorite quotes there are:
From 9.2.2.1, Multi-Model Ensembles (MME):
The MME is created from existing model simulations from multiple climate modeling centers. MMEs sample structural uncertainty and internal variability. However, the sample size of MMEs is small, and is confounded because some climate models have been developed by sharing model components leading to shared biases… Thus, MME members cannot be treated as purely independent, which implies a reduction in the effective number of independent models…
Translation for those not gifted in statistics-speak: We pretend that the GCMs make up an “ensemble”:
http://en.wikipedia.org/wiki/Statistical_ensemble_%28mathematical_physics%29
Note well, not only do they not constitute such an ensemble, it is a horrendous abuse of the term, implying a kind of controlled variability that is utterly lacking. In essence, they are pretending that GCMs are being pulled out of a large hat containing “random” variations of GCM-ness, that GCMs are somehow independent, identically distributed quantities being drawn from some distribution.
However (the paragraph continues) we know that this is not correct. And besides, in addition to there only being a paltry few GCMs in the first place, we cannot even begin to pretend that they are in any meaningful sense independent, or that the differences are random (or rather, “unbiased”) and hence likely to cancel out. Rather, we have no idea how many “independent” models the collection consists of, but it is almost certainly too small and too biased for any sort of perversion of the Central Limit Theorem in ordinary statistics to apply.
Next, from 9.2.2.3, Statistical Methods Applied to Ensembles:
The most common approach to characterize MME results is to calculate the arithmetic mean of the individual model results, referred to as an unweighted multi-model mean. This approach of ‘one vote per model’ gives equal weight to each climate model regardless of (1) how many simulations each model has contributed, (2) how interdependent the models are or (3) how well each model has fared in objective evaluation. The multi-model mean will be used often in this chapter. Some climate models share a common lineage and so share common biases… As a result, collections such as the CMIP5 MME cannot be considered a random sample of independent models. This complexity creates challenges for how best to make quantitative inferences of future climate…
Translation: In spite of the fact that the MME is not, in fact an ensemble, ignoring the fact that model results are in no possible defensible sense independent and identically distributed samples drawn from a distribution of model results produced by numerically correct models that are randomly perturbed in an unbiased way from some underlying perfectly correct mean behavior, we form the simple arithmetic mean of the mean results of each contributing model, form the standard deviation of those mean predictions around the simple arithmetical mean, and then pretend that the Central Limit Theorem is valid, that is, that the mean of the individual MME mean results will be normally distributed relative to the “true climate”.
We do this in spite of the fact that some models have only a very few runs in their contributing mean while others have many — we make no attempt to correct for an error that would be grounds for flunking any introductory statistics course — treating my mean of 100 coin flips producing a probable value of getting heads of 0.51 on the same basis as your mean of a single flip, that happened to come up tails, to get a probable value of (0.51 + 0)/2 = 0.255 for getting heads. Are they serious?
We do this in spite of the fact that Timmy and Carol were too lazy to actually flip a coin 100 times, so each of them flipped it 25 times and they then pooled the results into 50 flips and flipped it another 50 times independently. Their result still goes in with the same weight as my honestly independent 100 flips.
We do this in spite of the fact that Timmy and Carol somehow got a probability of heads of only 0.18 (Timmy) and 0.24 (Carol) for 100 flips, where I got 0.51 and you got 0 (in your one flip that is still being averaged in as if it were 100). Anyone but a complete idiot would look at the disparity in flip results between me and Timmy and Carol, use the binomial probability distribution to perform a simply hypothesis test (all coins used in this experiment/simulation are unbiased) and would reject the entire experiment until the disparity was explained. But Climate Science only makes money if two-sided coins are not approximately fifty-fifty, and are indecently eager to avoid actually looking too hard at results that suggest otherwise no matter how they are obtained or how inconsistent they are with each other or (worse) with observational reality.
Finally (it concludes) — doing all of these unjustifiably stupid things creates “challenges” for statistically meaningful climate prediction using GCMs.
Ya Mon! You sho’ nuff got dat right, yes you did. If you use statistical methodology that cannot be defended or derived in any sound way from the accepted principles of probability and statistics, it does indeed create “challenges”. Basically, what you are doing is doing things completely unjustifiably and/or incorrectly and hoping that they’ll work out anyway!
Mathematics being what it is, the result of using made-up methodology to compute quantities incorrectly is actually rather likely not to work out anyway. And you will have nobody to blame but yourself if nature stubbornly persists in deviating further and further away from the MME mean of many biased, broken, predictive models. One day nobody will be able to possibly convince themselves that the models are correct, and then where will you be?
At the heart of a scientific scandal that will make Piltdown Man look like a practical joke, that’s where…
rgb
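A small numerical sketch of the unequal-weighting problem in the coin-flip example above; the run counts are invented, and it is only meant to show how “one vote per model” differs from pooling the underlying runs:

# "One vote per model" versus pooling the underlying runs.
# Model A contributes 100 runs, model B a single run, as in the coin example.
import numpy as np

runs_a = np.array([1] * 51 + [0] * 49)   # 100 flips, 51 heads -> mean 0.51
runs_b = np.array([0])                   # 1 flip, tails        -> mean 0.00

one_vote_per_model = np.mean([runs_a.mean(), runs_b.mean()])
pooled = np.concatenate([runs_a, runs_b]).mean()

print(f"unweighted mean of model means: {one_vote_per_model:.3f}")  # 0.255
print(f"pooled estimate over all runs:  {pooled:.3f}")              # ~0.505

If the runs really were exchangeable samples from one population, the pooled estimate (or at least a run-count-weighted mean) is the defensible one; giving the single flip the same weight as the hundred is not.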

izen
March 21, 2014 10:05 am

@- dbakerber
I don’t see your point about boundary conditions. The climate models are predicting temperature based on CO2 levels. Temperature is hardly a boundary condition.
Models use multiple runs because temperature is a boundary condition. The multiple runs provide a range, an envelope of possible temperatures.
A comparison would be the modelling used to project the possible position of the missing airplane MH370. The initial conditions are incapable of providing a prediction of its exact position, but by modelling the physical constraints the possible area that the plane could have reached can be defined.
And where it could NOT have reached.
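For what it is worth, here is a minimal sketch of the “envelope of outcomes” idea izen describes, using a trivial toy recursion with perturbed parameters and initial states; it corresponds to no real GCM and all the numbers are invented:

# Envelope of outcomes from a toy model run many times with perturbed
# parameters; the result is a spread, not a single predicted value.
import numpy as np

rng = np.random.default_rng(2)
n_runs, n_years = 200, 50

finals = []
for _ in range(n_runs):
    sensitivity = rng.normal(0.01, 0.004)   # perturbed "parameter"
    state = rng.normal(0.0, 0.05)           # perturbed initial condition
    for _ in range(n_years):
        state += sensitivity + rng.normal(0, 0.02)   # toy year-on-year step
    finals.append(state)

lo, hi = np.percentile(finals, [5, 95])
print(f"toy 50-year envelope: 5th-95th percentile = [{lo:.2f}, {hi:.2f}]")

Whether an envelope produced this way by a real model actually brackets the real climate is, of course, exactly what the article disputes.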

jorgekafkazar
March 21, 2014 10:08 am

Billy Ruff’n says: “… If they speak up, their careers in academia and the climate research establishment are over. …If they speak up, how will they pay the mortgage and feed the kids?
No problemo. When the trials begin, the excuse “I vass only folloving ordehrs” has been extremely popular.

DesertYote
March 21, 2014 10:09 am

Bringing about the Socialist Utopia is so important, that no price is too high, even the death of millions. People are not going to willingly give up comfort, to die in poverty, so they will need to be driven to it using lies and fear.

Matt in Dallas
March 21, 2014 10:09 am

The CAGW advocates know nothing of science, except in so much as what is required to manipulate and deceive the masses. This is nothing new. Sadly, anyone with any sense has known this for a long long time. Nice write up and kudos to Anthony, Dr. Ball and those who continue the fight for the truth being let loose.

jorgekafkazar
March 21, 2014 10:11 am

“Rotation around the sun creates the seasons…”
In Astronomy, we refer to rotation about an axis, revolution about another body.

Berényi Péter
March 21, 2014 10:23 am

Realities about climate models are much more prosaic. They don’t and can’t work because data, knowledge of atmospheric, oceanographic, and extraterrestrial mechanisms, and computer capacity are all totally inadequate. Computer climate models are a waste of time and money.

I think all is not lost yet, but the current reductionist computational modelling paradigm needs to be profoundly reconsidered. See the exposition of the issue at Judith Curry’s site.
Journal of Climate, Volume 26, Issue 2 (January 2013)
doi: 10.1175/JCLI-D-12-00132.1
The Observed Hemispheric Symmetry in Reflected Shortwave Irradiance
Aiko Voigt, Bjorn Stevens, Jürgen Bader and Thorsten Mauritsen

Abstract
While the concentration of landmasses and atmospheric aerosols on the Northern Hemisphere suggests that the Northern Hemisphere is brighter than the Southern Hemisphere, satellite measurements of top-of-atmosphere irradiances found that both hemispheres reflect nearly the same amount of shortwave irradiance. Here, the authors document that the most precise and accurate observation, the energy balanced and filled dataset of the Clouds and the Earth’s Radiant Energy System covering the period 2000–10, measures an absolute hemispheric difference in reflected shortwave irradiance of 0.1 W/m². In contrast, the longwave irradiance of the two hemispheres differs by more than 1 W/m², indicating that the observed climate system exhibits hemispheric symmetry in reflected shortwave irradiance but not in longwave irradiance. The authors devise a variety of methods to estimate the spatial degrees of freedom of the time-mean reflected shortwave irradiance. These are used to show that the hemispheric symmetry in reflected shortwave irradiance is a nontrivial property of the Earth system in the sense that most partitionings of Earth into two random halves do not exhibit hemispheric symmetry in reflected shortwave irradiance. Climate models generally do not reproduce the observed hemispheric symmetry, which the authors interpret as further evidence that the symmetry is nontrivial. While the authors cannot rule out that the observed hemispheric symmetry in reflected shortwave irradiance is accidental, their results motivate a search for mechanisms that minimize hemispheric differences in reflected shortwave irradiance and planetary albedo.

One may also want to do actual experiments on irreproducible quasi stationary non equilibrium thermodynamic systems other than climate.
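A sketch of the “random halves” comparison described in the abstract, run on a synthetic reflected-shortwave field; the field is invented (hemispherically symmetric by construction, plus noise), so the numbers only illustrate the procedure, not the CERES result:

# Synthetic illustration of comparing the real NH/SH split with random
# equal-area halves of the globe, using area (cos latitude) weights.
import numpy as np

rng = np.random.default_rng(3)
lats = np.arange(-89.5, 90, 1.0)
lons = np.arange(0.5, 360, 1.0)
w = np.cos(np.deg2rad(lats))[:, None] * np.ones(lons.size)      # cell weights
field = 100 + 20 * np.abs(np.sin(np.deg2rad(lats)))[:, None] \
            + rng.normal(0, 5, (lats.size, lons.size))           # fake SW field

def weighted_mean(mask):
    return np.sum(field * w * mask) / np.sum(w * mask)

nh = np.broadcast_to(lats[:, None] > 0, field.shape)
print("NH - SH difference:", weighted_mean(nh) - weighted_mean(~nh))

flat_w, flat_f = w.ravel(), field.ravel()
diffs = []
for _ in range(500):                       # random equal-area partitions
    order = rng.permutation(flat_w.size)
    half = np.cumsum(flat_w[order]) <= flat_w.sum() / 2
    m = np.zeros(flat_w.size, dtype=bool)
    m[order[half]] = True
    d = (np.sum(flat_f * flat_w * m) / np.sum(flat_w * m)
         - np.sum(flat_f * flat_w * ~m) / np.sum(flat_w * ~m))
    diffs.append(abs(d))
print("median |difference| over random halves:", np.median(diffs))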

DesertYote
March 21, 2014 10:25 am

jorgekafkazar says:
March 21, 2014 at 10:11 am
“Rotation around the sun creates the seasons…”
In Astronomy, we refer to rotation about an axis, revolution about another body.
###
Oh you are such a GOOD boy. You caught a booboo. Do you want a gold star?

Gamecock
March 21, 2014 10:26 am

Matthew R Marler says:
March 21, 2014 at 7:04 am
Does it bother you how much computing power is built into modern commercial aircraft? The facilities that design and build such aircraft? Or cell phones? CAT-scans and fMRI? Were you appalled by the waste of computing resources in mapping and sequencing the genomes of humans and rice?
======================
You conflate results with methods. I have no idea how efficient their programming is, nor do you.

Jimbo
March 21, 2014 10:30 am

A site search of the IPCC website for the word ‘uncertainties’ yields 2,560 results in Google.
A site search of the IPCC website for the word ‘poorly understood’ yields 206 results in Google.
site:www.ipcc.ch uncertainties
site:www.ipcc.ch “poorly understood”

Gamecock
March 21, 2014 10:31 am

KevinK says:
March 21, 2014 at 8:04 am
You had 128k ?, boy back in my day all we had was ones and zeros, and sometimes we ran out of zeros and we had to use o’s.
=======================
Heh, heh.
My father was a computing pioneer. His first system (early 1950s) was a room full of mechanical relays. Bugs were roaches stuck in a relay. Really. Mechanics worked two shifts a day to keep it working. No output was trusted; all results were subjected to a reasonableness test.

jorgekafkazar
March 21, 2014 10:34 am

“It is heroic to assume that such a view is sufficient basis on which to predict future ‘climate’.” — Kinninmonth
“’Heroic’ is polite. I suggest it is deliberately wrong.” — Dr. Ball
Yes, wrong, in a sense, and certainly deliberate. More precisely, I’d call it a transparent euphemism for ‘quixotic,’ i.e., “caught up in the romance of noble deeds and the pursuit of unreachable goals; idealistic without regard to practicality.”*
In other words, barmy in the crumpet.
* The American Heritage® Dictionary of the English Language.

Old'un
March 21, 2014 10:38 am

CJOrach at 8.42am
The ‘fight’ against catastrophic man-made global warming/climate change is a self-serving indulgence by well-fed, well-housed, liberal Western nations in the face of their inability to reduce the bloodshed and human misery that is occurring in so many places in the world on a daily basis.
Because of their impotence in solving current problems, Western leaders have convinced themselves that saving the world from a long-term hypothetical catastrophe should be their key role. This is akin to the spreading of Christianity in past centuries to save the world from going to hell in a handbasket, and is likely to be just as pointless.
Meanwhile, the real, day-to-day agony continues for many millions of the world’s inhabitants, whilst the West obsesses over ‘green’ policies.
Climate scientists should not have allowed themselves to become pawns in this game.

Stephen Richards
March 21, 2014 10:40 am

dbakerber says:
March 21, 2014 at 7:35 am
This is a good article, but Dr. Ball is wrong on one point. A computer model cannot prove a theory. At best a computer model can provide evidence for or against a theory.
It cannot provide evidence either. Certainly not good quality evidence.
Dr Brown, Duke: be kind to Tim and say “did not address” rather than “failed”, please.
Love your posts, by the way. They remind me of my misspent youth in libraries studying physics.

Stephen Richards
March 21, 2014 10:42 am

DesertYote says:
March 21, 2014 at 10:25 am
jorgekafkazar says:
March 21, 2014 at 10:11 am
“Rotation around the sun creates the seasons…”
In Astronomy, we refer to rotation about an axis, revolution about another body.
###
It isn’t rotation that causes the seasons. It’s the TILT.

March 21, 2014 10:46 am

“It is worse now with fewer weather stations and less data than in 1990.”
huh? earth to ball!!
The amount of data measured any way you like has increased.
1. The number of stations is over 40,000 and climbing.
2. The number of station months is increasing.
Things you don’t know:
1. GHCN is not the only repository. It’s one of the smaller ones.
2. Data recovery of previously archived records proceeds. This is recovery and digitization
of records that previously only existed on paper. I know of 13 projects in different countries
that are doing data recovery projects.

March 21, 2014 10:47 am

izen says:
March 21, 2014 at 10:05 am

@- dbakerber
I don’t see your point about boundary conditions. The climate models are predicting temperature based on CO2 levels. Temperature is hardly a boundary condition.
Models use multiple runs because temperature is a boundary condition. The multiple runs provide a range, an envelope of possible temperatures.
A comparison would be the modelling used to project the possible position of the missing airplane MH370. The initial conditions are incapable of providing a prediction of its exact position, but by modelling the physical constraints the possible area that the plane could have reached can be defined.
And where it could NOT have reached.

Bollocks. We know how much a 777 weighs, we know how much fuel it can carry, we know how far it can fly on a given amount of fuel, we know how far it can glide, we have a fair approximation of a last known position, we know the general state of the winds in the area. We have a few decades of experience & literally hundreds of millions of miles of real world verification of these few parameters & we still have no idea what happened, & we can’t even begin to predict anything about the future state or disposition of the aircraft with any more certainty than Don Lemon.
By contrast, with the CO2 -> temperature model, we know how much CO2 is in the air near Hawaii for the last few decades & we know the temperatures over only a few percent of the land for a very short period of human history. By analogy, we know the diameter of the fuselage of the plane & we know how many packets of crisps the fellow in the window seat of the third aisle had nicked off the drinks cart & that’s it: we don’t know with any precision the last location of your hypothetical climate model plane, we don’t know if it’s a particular type of plane (or even if there’s more than one type), we can’t accurately guess how much fuel your plane was carrying, we don’t know the glide ratio of your plane, we can’t be sure the engines consume the type of fuel we assume, nor how many there are, nor how many are functional; for all we know your hypothetical climate model plane is actually a trolley bus in London that’s fallen on its side.
The CO2 -> temperature model has managed to demonstrate one thing though, & that is that the CO2 -> temperature model is utterly wrong.

catweazle666
March 21, 2014 11:08 am

Oh dear, not this old chestnut again.
As it was in the beginning is now and ever shall be.
‘In sum, a strategy must recognise what is possible. In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.’
The IPCC report of 2001
Anyone who claims that such systems are amenable to projection/prediction over any but very short time scales is either deluded or a computer salesman. All a faster computer does is give you the wrong answer quicker.
Ironically, it was a climatologist, Ed Lorenz, who originally pointed this out.
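The standard demonstration of Lorenz’s point needs only a few lines; here is a sketch with the logistic map, where the model equation is known exactly and only the sixth decimal of the initial state differs:

# Sensitive dependence on initial conditions with the logistic map,
# the textbook illustration of why long-term prediction of a chaotic
# system fails even with a perfect model.
r = 3.9                      # chaotic regime
x, y = 0.400000, 0.400001    # two initial states differing by 1e-6

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.6f}")

Within a few dozen steps the two trajectories bear no useful relation to each other, despite the “model” being exact; the limit is the initial-state information, not the computing power.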

Tridac
March 21, 2014 11:28 am

>>> The vastness and complexity of Creation should awe and humble us.
Not formally religious, but that’s possibly the most profound thing I’ve read in this thread today. The more you dig into any part of nature and natural processes, the more complex it appears, yet we never seem humble enough to admit our limitations. The ironic thing is that this complex system has been essentially stable for millennia without any input from humanity at all.
ps: My first computer was a $100 KIM-1 with 1k (1024 bytes) of RAM. It was much later that I became acquainted and worked with PDP, VAX, Sun and others. A long, strange and wondrous trip indeed …

Ken
March 21, 2014 11:30 am

This link, http://www.rmets.org/events/climate-change-2013-physical-science-basis-working-group-1-contribution-fifth-assessment, does not contain a video. I watched the video below that link, and it appears that the caveats have been removed from it.

bones
March 21, 2014 11:30 am

JohnWho says:
March 21, 2014 at 6:07 am
For the most part, aren’t the IPCC scientists still remaining silent?
Shouldn’t most, if not all, of them be shouting “no, you are misrepresenting my/our work!”
————————————————————
Most of them would never, ever be reading this “misrepresentation” of their work. My academic friends regard WUWT as evil.

george e. conant
March 21, 2014 11:34 am

SO, let me get this right: ALL the climate models have agreed on one basic fact… they have all failed. They have all NOT predicted the dead-in-its-tracks halt to warming for 17.5 years, and thus we can extrapolate that they will continue to fail. And thank you, Dr. Ball, for a very informative essay!!!!

RichardLH
March 21, 2014 11:55 am

I think ‘computer games’ would probably be a better description than ‘computer models’. They seem to have the required basic grasp on the real world to fit that first description quite well.

Tetragrammaton
March 21, 2014 12:02 pm

Dr. Ball’s analysis of model construction is simply devastating. He uses the very words of the IPCC authors to skewer the validity of the models used by the IPCC to come up with their vacuous “95% certainty” claims. I think he shows great restraint in not naming names and pointing fingers at specific “guilty” originators of the scientific nonsense he exposes.
Indeed, rather than measuring the “climate change scientists” for their striped prison suits, his posting may be showing the beginnings of a pathway to a much brighter future for models. Many of the neglected or glossed-over climate parameters (clouds, soil moisture, terrestrial cryosphere, etc.) are actually amenable to the measurement of – gasp! – actual data. As Dr. Ball points out, a lot of ground needs to be covered to develop supportable hypotheses for the behavior of these parameters, although I sincerely hope he is wrong about his 30-year estimate (mentioned by Dr. Ball in his coverage of 8.6.2.2).
One may hope that someone (perhaps Dr. Ball, perhaps Dr. Brown of Duke University, perhaps Willis Eschenbach, probably not I) might take a look at fifteen or twenty of these parameters, one by one, and sketch out a set of research projects to address each one. Busily scribbling on the back of a convenient envelope, I’m reckoning that about $15 billion per year, for about 15 years, would bring the United States (and perhaps the rest of the world along with it) to an actionable level of understanding of climate physics and geophysics. The framework for such a project already exists in the form of the NASA Center for Climate Simulation, although it may need to be extracted from the blinkered Goddard alarmists.
At that point, in 2029, it may be reasonable to restart the IPCC (which we may hope has been in uncomfortable hibernation for the duration) and provide actual sensible guidance to political figures about what (if any) action to take with regard to future climate trends. In addition to much better theories, data and (I hope) models, we’ll have experienced another fifteen years of weather and we may be worrying (again) about global cooling.
It’s not that I’m against computer modeling and simulations. Indeed, for my sins, I at one time managed a large-scale environmental computer simulation project (two dozen programmers and engineers, plus one hapless documentation expert). I actually think a properly-developed model, with the right inputs, can be created and will give useful outputs. And I hope that climate scientists 15 years hence will be smart enough, and honest enough, never to claim that their science is “settled”.

RichardLH
March 21, 2014 12:02 pm

Steven Mosher says:
March 21, 2014 at 10:46 am
“1. The number of stations is over 40,000 and climbing.”
But the number of 1×1 degree cells with coverage will be increasing to what? From what? And what was it 200, 150 and 100 years ago, as well as today? Mind you, we do have satellites, with better area coverage all round, for today.
” 2. The number of station months is increasing.”
Well, as we only have 15 contiguous thermometer records in the BEST database that are longer than 200 years right now, I would certainly hope so.
“1. GHCN is not the only repository. It’s one of the smaller ones.”
There are VERY few temperature records of any sort that are >200 years.
” 2. Data recovery of previously archived records proceeds. This is recovery and digitization
of records that previously only existed on paper. I know of 13 projects in different countries
that are doing data recovery projects.”
And valuable work that is. May help to reduce some of the wilder claims.
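A small sketch of how one might count 1×1 degree coverage from a station list; the station positions below are randomly generated placeholders, so the percentage printed says nothing about the real networks (which are land-only and heavily clustered, i.e. worse):

# How sparse is coverage in 1x1 degree terms? Given station coordinates
# (here a made-up list), count occupied cells out of the 360*180 possible.
import numpy as np

rng = np.random.default_rng(4)
n_stations = 40_000
# Hypothetical station positions; real networks are far more clustered.
lat = rng.uniform(-90, 90, n_stations)
lon = rng.uniform(-180, 180, n_stations)

cells = set(zip(np.floor(lat).astype(int), np.floor(lon).astype(int)))
total_cells = 360 * 180
print(f"occupied 1x1 cells: {len(cells)} of {total_cells} "
      f"({100 * len(cells) / total_cells:.1f}%)")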

March 21, 2014 12:06 pm

Between Dr. Ball’s essay and Dr. Brown’s post (March 21, 2014 at 9:54 am, above), it is a wonder anyone says “Here, look at what the models prove.”

SineWave
March 21, 2014 12:09 pm

Still, it would be nice if the political forcings were removed and these models were refined to a useful state. It would probably take decades, but they could turn into something helpful.

March 21, 2014 12:28 pm

“izen says: March 21, 2014 at 8:01 am
There is a common error in both the posted article and many of the responses that has to do with the difference between models intended to predict a specific state and models intended to simulate a physical process.
It is usually summed up as the difference between intial conditions and boundary conditions.
It is certainly true that the observational data we have is insufficient to define the initial conditions adequately to make accurate predictions of a later state of the system.
However climate modelling is an exercise in boundary conditions, not specific states.
Physical modelling in such circumstances gives insight into the envelope of behaviour of the system. It does not give specific predictions of final states. This distinction makes many of the criticisms here of the shortcomings of the initial data and model predictions irrelevant because of the ignorance of this difference.”

Pizen!
What brings such a devious troll as you back? Dr. Ball’s short simple revelation of the darkness and foolishness behind the blind beliefs and obeisance to ‘climate modeling’ brings you back from the slimy side of the model?
Interesting piece of dodge, bluff, smoke and nonsense you spouted. Why don’t you break it down to specifics in a direct one-on-one comparison with Dr. Ball’s or Dr. Robert G. Brown’s identified issues?
Actually, I don’t believe you are capable of responding to specifics. Your preference is for straw men and obfuscation, but it would be nice if you could surprise us.
Meanwhile:

“Pizen says:
“…difference between models intended to predict a specific state and models intended to simulate a physical process.
It is usually summed up as the difference between intial (sic) conditions and boundary conditions…”

Really? Perhaps you could explain these ‘models’ in better detail.
Models intended to simulate a physical process, e.g. ‘process control’ models, are actually quite common and very capable in their specific roles, and are found throughout industry, science and many other disciplines. There again, whenever (let’s emphatically state that word: whenever) a model fails to match the process, that model is pulled apart and all lines of code, along with their steps and results, are compared to reality until the reason for the divergence is found!
Only in ‘climate science’ are ‘models’ used without verification or certification. They are run as they are, and when their ‘envelopes of behavior’ fail to match reality the ‘climate teams and trolls’ run out the spinners and dancers to ‘distract’ the masses while the modelers announce new ‘disaster is upon us’ results.

“Pizen says:
“…Physical modeling (sic) in such circumstances gives insight into the envelope of behaviour of the system … ”

Now that is interesting.
Elucidate!
In detail, giving examples, model run meta data, insights, envelope behaviors, explicit analysis with relevant meta data about the analysis and results all specific to that exact model run.

“Pizen says:
“…It does not give specific predictions of final states. …”

Interesting conclusion or statement. So these models do not give specific predictions of final states?
I suppose your slipping in ‘predictions’ gives you a slippery answer that can mean almost anything.
Now let’s get into detail about the ‘final states’ that are not predictions. What constitutes a ‘final state’? The computer burps and stops? A run of data is spit out and all of the modelers don their head wraps and light incense before seeking guidance from the ‘final state’?
Give examples of ‘final states’ and how your model does not leave people with something they write up as a ‘prediction’.
Hint: if a number or envelope is purported to represent anything beyond ‘now’, it is technically a prediction, even if extremely unlikely. As the warmistas have discovered, ‘predicting’ tomorrow’s climate is not easy, nor is it valid until the models get it ‘exact’.

rgbatduke
March 21, 2014 12:30 pm

One may also want to do actual experiments on irreproducible quasi stationary non equilibrium thermodynamic systems other than climate.
Yes, that does seem sensible, doesn’t it?
I visited one of your links and looked over the discussion on hemispheric symmetry, and would add something remarkable to that already remarkable observation. That is that the average annual temperature variation countervaries with the actual TOA insolation. People have been making noises about the importance of the order unity W/m^2 variation associated with CO_2, land use changes, or albedo, but do not forget that the Earth is in a rather eccentric orbit such that TOA insolation varies by 91 W/m^2 — a number that dwarfs all other variations in the system put together by more than an order of magnitude — from perihelion to aphelion.
In spite of this huge variation literally at the top of the mechanism for energy delivery into the open system, the coldest annual average temperatures occur at or near perihelion, and the warmest at aphelion, the exact opposite of what one would expect for a simple uniform spherical ball. Normally, one attempts to explain this by arguing that the average albedo of the northern vs southern hemisphere are quite different, but the data you present seems not to support that.
To be honest, I don’t know what to make of the albedo data you present. If the NH and SH albedos are empirically a close match, one is right back to the drawing board in any attempt to explain the temperature countervariation by albedo alone. It has to be a synchronized albedo variation — one that averages out the same but somehow manages to countervary in alignment with the NH and SH temperature asymmetry, out of phase with the 45 W/m^2 insolation variation amplitude around its annual mean.
In the end, I agree with your assertion that there is some point to trying to build climate models, just as there is a point to many “grand challenge” activities such as trying to prove the Goldbach conjecture, or the Riemann hypothesis or that P = NP or whatever, even if some of them never succeed. However, there is no point in claiming that they work when they not only don’t work, but at a time when climate modeling is in its infancy, probably decades away from where it has any significant predictive skill at all.
rgb

Richard T
March 21, 2014 12:37 pm

To the old guys with their PDPs and VAXs. My first machine experience was the Bendix G15D running the Intercom interpreter. We used it on undergraduate engineering classroom problems. Fun. Even more so was operating the analog computers of the day. The Bendix had the neat feature of ringing a bell when it encountered a programming error at which point you received “the smile” from your associates in the room.
Government supported science is directed to supporting government goals, good and not so good. It can be corrupted when the goals are regulations and taxes (by any other name). Climate "science"?
Having worked in industry, a government lab and academia, I encountered the strictest accountability for my work in industry, reasonable accountability in the lab and observed academic accountability to be a function of the requirements of the funding body. Many (most?) of my industry associates are cAGW doubters. Many of my academic associates are believers.

milodonharlani
March 21, 2014 12:46 pm

Steven Mosher says:
March 21, 2014 at 10:46 am
Whom are you trying to kid?
As of 2010, GISS relied on just 3846 stations. Many were lost in the 1980s & early ’90s:
http://www.appinsys.com/globalwarming/GW_Part2_GlobalTempMeasure.htm#historic
Coverage, duration, siting & reporting are all terrible. Taking a planetary average temperature based upon such shoddy to nonexistent “data” is ludicrous, if not criminal, even before the shameless adjustments & interpolations.

milodonharlani
March 21, 2014 12:57 pm

From the above link:
It is important to note that the HadCRU station data used by the IPCC is not publicly available – neither the raw data nor the adjusted data – only the adjusted gridded data (i.e. after adjustments are made and station anomalies are averaged for the 5×5 degree grid).
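To make the gridding step in that quote concrete, here is a minimal sketch of averaging station anomalies into 5×5 degree cells; the stations and anomaly values below are invented purely to show the mechanics, not real HadCRU data:

```python
# Averaging station anomalies into 5x5 degree grid cells, as described above.
# The stations and anomaly values are invented purely to show the mechanics.
from collections import defaultdict

stations = [            # (latitude, longitude, temperature anomaly in K)
    (51.2, -0.5, 0.34),
    (52.9, -1.7, 0.41),
    (40.1, -74.3, 0.12),
]

cells = defaultdict(list)
for lat, lon, anom in stations:
    # Key each station by the south-west corner of its 5x5 degree cell.
    cell = (int(lat // 5) * 5, int(lon // 5) * 5)
    cells[cell].append(anom)

for corner, anoms in sorted(cells.items()):
    print(f"cell with SW corner {corner}: mean anomaly "
          f"{sum(anoms) / len(anoms):+.2f} K from {len(anoms)} station(s)")
```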

Crispin in Waterloo
March 21, 2014 1:01 pm

@KevinK
>>If you can’t get it done in 128k, it’s not worth doing.”
>You had 128k ?, boy back in my day all we had was ones and zeros, and sometimes we ran out of zeros and we had to use o’s.
Reg Barrow, an up-and-coming programmer, worked at a company in the '60s that sold data-sharing services on an IBM 360. He was delighted to hear the bosses were doubling the RAM to 512 K for the princely sum of $250,000, and hopped to the door on Saturday as the technician arrived to perform the upgrade.
"Where is the equipment?" he asked, astonished. The technician carried only a briefcase.
“It is here in my case,” was the reply.
They proceeded to the main computer room. The technician pulled out a cable and plugged it into two empty connectors next to an existing cable that provided access to the first 256k of RAM.
"You mean the memory is already in there?" asked an astonished Reg.
“Yup. Just have to put in the cable.”
“And for this you charge $250,000?”
“Yup.”
The computer was sold to timeshare companies that created contracts with dumb-terminal users. Sixty clients per IBM machine was the mantra. Break-even for the purchase cost and financing was 40 users, or 66%. In fact, as the number of users logged on approached the high 30s, the time to log on increased exponentially, with the 40th user taking literally forever. In practice it was impossible to purchase the machine and make money from selling timeshare. Everything those timeshare companies made was handed to IBM, so they sold programming/consulting services to make actual profits.
The morality in the world of big computing hasn't changed much in 50 years.

March 21, 2014 1:43 pm

My Father was a technician working on one of the first computers in Philadelphia.
Total memory was roughly 2000 bytes, though not as we understand memory today. His job was to find and replace burnt-out tubes and circuits between runs.
It was strictly a serial computer. One step led to the next single step, card after card, till the last card was read and the final card punched.
I started first with a PDP and then later, a little more seriously (student-wise), with FORTRAN at one of Penn State's branch facilities; both used punch cards, with runs submitted during specific open times and output collected much later at the Computer Operations mail center. From both I would receive the total CPU time used out of my allotted CPU budget.
Penn State closed that ‘computer branch’ for refurbishment and I migrated to a community college sporting a brand new massively outfitted 3270 IBM mainframe that even had a pack of 3270PCs that were left alone by the staff and students to my glee.
No punch cards. No constant reminders of ‘allocated CPU budget’. No restrictions on times for submitting jobs and output was immediate, especially if prints were kept in a spool library instead of printed.
After running out of CS courses to take along with business and Finance I moved on. But not before work discovered a blue collar worker who could maintain PCs and interface with a mainframe.
In the finance world I ran into the issue of trying to replicate the multiple dimensions of reality in code.
128K, 512K, 1M, 2M of memory coupled with stepwise single function code takes forever.
Multidimensional arrays hanging in memory coupled with threads and multiple processors allows for some mighty fast and fancy computing.
I get a kick out of seeing historical relics and even looking them over. I am not interested in keeping them for even one day as ‘reminders’ of the good old days. They were anchors then and they’re not even decent for anchors today.
Back about 1990 the Federal government sought to ‘recover’ some of the half million dollars it cost to buy a fully fleshed Wang 100 mini Computer and terminals. They published their intent to sell at auction with ads and flyers throughout town.
The big day arrived, the auction room filled and people proceeded to watch as item after item went without bid. Some of the disk drives sold for a few bucks, mostly to get the removable drive components themselves.
The VS100 central CPU sold for $100 with a few bidders slowly adding their $5 bid increments.
All of the unsold items were put into a dumpster behind the building over the next week. We copped a few parts just in case our Wang VS100 needed them. Completely ignored were all of the Wang WP terminals (monitors and keyboards) and miles of double cable.
So also went the various PDPs and early DEC lans along with those early IBM PCs and clones.
I suppose someone could've started designing computers around 1024-bit processors, without the need for doubleword code or split addressing. But who would've bought them back then? $4500 for the original IBM 8086/8088 PC without a hard drive was very expensive at the time. Several thousand dollars more and one added a separate 'box' with a 10MB hard drive and a massive wire connection back to the CPU.
So what's the statement? Something like: "Get over it." This is the 21st century and we have little concept of what the computer field will be like in twenty years.
Still, I have a lingering thought that the fancy new computers in climate modeling are not being fed state-of-the-art code. That annoying one-step-at-a-time mainframe approach still seems prevalent in many places. Yes, even state-of-the-science computers cannot run old code any better than old computers; just a tad faster.
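To make the "one step at a time" contrast concrete, here is a small illustration of the same arithmetic written as a scalar loop and as a single array operation; the arrays are random made-up data, not anything from a climate model:

```python
# The same arithmetic done one element at a time and as a whole-array operation.
# The arrays are random made-up data; the point is only the difference in style.
import time
import numpy as np

n = 2_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
out_loop = np.empty(n)
for i in range(n):                 # stepwise: one element per iteration
    out_loop[i] = a[i] * b[i] + 1.0
t1 = time.perf_counter()

out_vec = a * b + 1.0              # the whole array in one vectorized operation
t2 = time.perf_counter()

assert np.allclose(out_loop, out_vec)
print(f"element-by-element loop: {t1 - t0:.3f} s")
print(f"vectorized array op:     {t2 - t1:.3f} s")
```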

Editor
March 21, 2014 2:15 pm

8.4.11 “The GCM models referred to as climate models are actually weather models only capable of predicting weather about two weeks into the future”.
Exactly. The whole basis of the GCMs is ridiculous for a climate model. You can't tell the decade- or century-scale future of climate by dividing the globe into little 20-minute cubes, just as you can't, for example, predict world food production a decade or a century ahead by dividing the surface area into little 20-minute squares.

lemiere jacques
March 21, 2014 2:29 pm

You don't have to prove the models are wrong, and you can't. THEY must prove the models are accurate, and they can't.

March 21, 2014 3:26 pm

Thank you yet again, Dr. Ball, this time for a devastating critique of Warmism's faith in the "Models". Your breadth and detail of knowledge on the subject should be rather startling for any Warmist reader. The idea of a computer model sounds good, but in practice they are of little more use than a large Lego model for modelling the reality of climate.

Paul Coppin
March 21, 2014 4:08 pm

VAXes SMAXes 🙂 you kiddies were spoiled, spoiled I tell you. Who here remembers sitting for hours in front of an 026 or 029 keypunch loading up a 1000 card box full of F2, Watfiv or Watfor, (or yikes COBOL!) for the sysop to run, only to have batch spit out your deck halfway through due to compile errors, never getting to run… Or the perverse joy of watching a modified IBM Selectric “flying ball”, hooked to a System 370 stream out pages of APL gibberish before you figured out your program was wrong, and oh btw, your booked time allotment on the 370 is up… If your input didn’t weigh at least 20lbs, you could hardly call yourself a programmer…
Then we all went out and bought TRS80 Model Is and ran North America’s middle class until IBM claimed to have invented the PC…

juan slayton
March 21, 2014 4:09 pm

lemiere jacques:
you don’t have to prove models are wrong and you can’t…
I dunno about that. If n models say n contradictory things, I can reasonably conclude that n-1 models are wrong, and maybe all n.

juan slayton
March 21, 2014 4:14 pm

Well nuts. Should read n-1 models are wrong and maybe all n.

Dr. Strangelove
March 21, 2014 4:30 pm

That GCMs are useless for making 100-year forecasts of climate has been known for a long time. Dr. Patrick Frank published "The Climate of Belief" in 2008, showing that GCM forecasts are no better than random guesses. To my knowledge, none of the IPCC scientists has refuted Frank's conclusion. Even the editor of the AGU's Journal of Geophysical Research, an IPCC scientist, admitted to me the large uncertainty in modeling clouds.

Frank
March 21, 2014 5:51 pm

Dr. Ball: Thank you for taking the time to copy these passages from AR5 and provide some context, especially when the IPCC's words vastly outnumber your context.
Having adequate data to initialize models is essential for a weather forecast model, but I'm not sure why initialization data are important for century-scale climate projections. ENSO variability averages out over a century, but not over one or two decades. Climate models don't exhibit much variability on the decade-plus time scale, so initializing the PDO and AMO shouldn't change much.
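The "averages out" claim can be illustrated with a toy calculation: an AR(1) red-noise series standing in for an ENSO-like index, with entirely synthetic numbers chosen only to show how the residual mean shrinks as the averaging window lengthens:

```python
# Toy illustration of "averages out": an AR(1) red-noise series stands in for an
# ENSO-like index (entirely synthetic).  The typical window-mean shrinks as the
# averaging window grows from decades to a century.
import numpy as np

rng = np.random.default_rng(1)
n_months = 12 * 1000                      # 1000 years of monthly values
x = np.empty(n_months)
x[0] = 0.0
for i in range(1, n_months):
    x[i] = 0.9 * x[i - 1] + rng.normal(0.0, 0.3)   # AR(1) with lag-1 correlation 0.9

for window_years in (10, 20, 100):
    n = window_years * 12
    means = x[: (n_months // n) * n].reshape(-1, n).mean(axis=1)
    print(f"{window_years:3d}-year windows: typical |mean| ~ {np.abs(means).mean():.3f}"
          f"   (series std {x.std():.2f})")
```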

NoFixedAddress
March 21, 2014 6:02 pm

Until one computer modeler can predict next week's 'lotto' (6 from 45), they are propping up a gigantic tax fraud on people.
Stop taxing or seeking to tax our salt!

eyesonu
March 21, 2014 6:12 pm

izen says:
March 21, 2014 at 10:05 am
==
Please read what you wrote, think about it, and report back. Your analogy doesn’t fly or even get off the ground. Think before you respond.

March 21, 2014 6:30 pm

Reblogged this on The GOLDEN RULE and commented:
This is appropriate given that, two posts later, WUWT reveals what Professor Mann claims is science to be believed, when its computer-modelled graphs and predictions use, right from the start, a 2013 global temperature level that is already incorrect. That's how their 'warmist' science "works".
From this post we learn the importance and influence of the agenda factor – “Here is the IPCC procedure:
Changes (other than grammatical or minor editorial changes) made after acceptance by the Working Group or the Panel shall be those necessary to ensure consistency with the Summary for Policymakers (SPM) or the Overview Chapter.”
A blatant admission from the IPCC that reports are amended to prove the premise that 'man is causing catastrophic warming'. You can't find anything clearer than that. Yet warmists persist in their demonstrably false claims and scaremongering lies.

KevinK
March 21, 2014 6:46 pm

Gamecock, I hope you realized my "running out of zeros" joke was just in fun. I use it when our younger software professionals start to get "uptight" about "delays" in product deliveries.
Until the '80s and '90s most of the signalling systems used on US railroads consisted of mechanical relay logic. Not "reprogrammable" as such, and slow, but fast enough for the job at hand. The "computers" were in all those silver metal boxes you see along the railroad right of way and in the control towers. The towers contained "interlocking machines", which had an elegant system of levers/cams/bars that prevented a tower operator from putting two trains on the same track (which usually leads to a "train wreck"). Damned rugged stuff; it would probably survive an EMP blast just fine.
Personally, I prefer a hammer to a computer; they are all Rev 1.0 and never need rebooting, ha ha ha…
Dr. Ball, very nice essay, thanks for your time demonstrating the flaws in climate models.
Cheers, Kevin.

Gamecock
March 21, 2014 7:06 pm

Paul Coppin says:
March 21, 2014 at 4:08 pm
Ahhh . . . the Trash80!

ferdberple
March 21, 2014 7:56 pm

At the heart of a scientific scandal that will make Piltdown Man look like a practical joke, that’s where…
====================
When I learned about Piltdown Man, I thought, "How could people back then have been so stupid?" It was so obviously a fraud. Yet everyone believed, because they wanted to believe.
Cars are a nuisance. We want to believe they are bad. So we can get rid of them and replace them with something better. Like the bus?? Or the bicycle?? Or walking?? Or the horse??

ferdberple
March 21, 2014 8:03 pm

Frank says:
March 21, 2014 at 5:51 pm
Having adequate data to initialize models is essential for a weather forecast model, but I'm not sure why initialization data are important for century-scale climate projections. ENSO variability averages out over a century, but not over one or two decades
===========
If initialization doesn't matter, why keep temperature records? Why train models using past data? If ENSO averages out over 100 years, where are the data to support this? Or are these simply assumptions?

ferdberple
March 21, 2014 8:05 pm

NoFixedAddress says:
March 21, 2014 at 6:02 pm
Until one computer modeler can predict next week’s ‘lotto’ (6 from 45)
========
NASA GISS has just such a computer. It cost $10 billion. Every week it predicts the wrong answer for next week's 'lotto'. However, in hindcasting it manages to predict last week's numbers almost half the time.

jorgekafkazar
March 21, 2014 8:11 pm

My first work computer was a Royal McBee LGP-30, circa 1962. Input was via keyboard; programs were stored on 1″ mylar punched tape. No air conditioned sanctum sanctorum with priests in white lab coats, I could get in and use it anytime it was open. Later that year, one of my professors said, “Ach! Vun of you has done his homevork vit a computer. Vell, I didn’t zay dot you couldn’t. Bezides, vun day, all homevork vill be done on computers. Ve vill all have computers to do our calculations.” He was right. I’m still impressed.

jorgekafkazar
March 21, 2014 8:27 pm

I once worked near the ocean and was impressed on my morning commute by how many different ways the sun could reflect off the water. Ocean albedo is a function of solar zenith angle, wind direction and velocity, tides, salinity, seafoam, air and water temperature, currents, pollution, and (so help me!) plankton. There is no way that the GCMs account for all of these continuously, rapidly changing variables. Oceans make up 71% of the Earth's surface. If you can't get the ocean albedo right, you can't model the system.
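Even the single factor of solar zenith angle is strongly nonlinear. A minimal sketch, assuming a perfectly flat water surface with a refractive index of about 1.34 and ignoring everything else in the list (waves, foam, salinity, plankton and the rest, which is rather the point):

```python
# Reflectance of a *flat* water surface vs. solar zenith angle, just one of the
# factors listed above.  Assumes a refractive index of ~1.34 for seawater; waves,
# foam, salinity, plankton and the rest are ignored.
import numpy as np

def flat_water_reflectance(zenith_deg, n=1.34):
    """Unpolarized Fresnel reflectance for light going from air into water."""
    ti = np.radians(zenith_deg)                      # angle of incidence
    tt = np.arcsin(np.sin(ti) / n)                   # angle of refraction (Snell's law)
    rs = (np.sin(ti - tt) / np.sin(ti + tt)) ** 2    # s-polarized reflectance
    rp = (np.tan(ti - tt) / np.tan(ti + tt)) ** 2    # p-polarized reflectance
    return 0.5 * (rs + rp)

for z in (1, 30, 60, 75, 85, 89):
    print(f"zenith {z:2d} deg -> reflectance {flat_water_reflectance(z):.3f}")
```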

ferdberple
March 21, 2014 8:39 pm

Matthew R Marler says:
March 21, 2014 at 6:56 am
It isn’t clear which of the problems you address are simply problems with current models
==========
The problem is inherent in prediction from first principles. Chaos makes any such prediction mathematically impossible because of round-off errors in digital computers.
There are other techniques that have proven to work. We can predict the tides by decomposing them into the orbital frequencies of the sun, moon and planets. However, modern science rejects this because the underlying principle of tidal calculation is Astrology.
But forecast the tides from first principles, as is done with climate models? Please; it has never been done successfully, even after centuries of trying. For $15.95 the Old Farmer's Almanac uses Astrological principles and routinely outperforms the billion-dollar climate models used by the IPCC.
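The harmonic method referred to above, in toy form: tide height built as a sum of constituents at known astronomical frequencies. The M2 and S2 periods below are the standard values; the amplitudes and phases are invented for a hypothetical port, purely to show the mechanics:

```python
# The harmonic method in toy form: tide height as a sum of constituents at known
# astronomical frequencies.  The M2 and S2 periods are standard values; the
# amplitudes and phases are invented for a hypothetical port.
import numpy as np

constituents = [          # (name, period in hours, amplitude in m, phase in rad)
    ("M2", 12.4206012, 1.20, 0.3),
    ("S2", 12.0000000, 0.45, 1.1),
]

t = np.arange(0.0, 24.0 * 15, 0.5)        # 15 days sampled every half hour
height = sum(A * np.cos(2 * np.pi * t / T + phi) for _, T, A, phi in constituents)

# The beat between M2 and S2 gives the familiar spring/neap cycle (~14.8 days).
print(f"range over 15 days: {height.min():+.2f} m to {height.max():+.2f} m")
```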

Matthew R Marler
March 21, 2014 8:42 pm

izen: There is a common error in both the posted article and many of the responses that has to do with the difference between models intended to predict a specific state and models intended to simulate a physical process.
I think you need to direct that comment toward anyone who proposes or accepts that the current GCMs form a reasonable basis for informing policy decisions. I personally think that the current GCMs are amazing achievements, and incorporate a great deal of scientific knowledge in computable format. That is, a complex computer program is a summary and codification of knowledge in the same sense that the periodic table was, but it translates all knowledge representations into explicit computational procedures and makes use of estimates of physical constants. However, the current GCMs also are wrong in their forecasts, showing that they are worthless for anticipating or planning for the future. Either the codified "knowledge" is incorrect, the models are inadequate, the computational techniques are inadequate, the parameter estimates are too inaccurate, too much is unknown and omitted from what may eventually be an adequate and accurate model, or some or all of those (the list is not exhaustive).
We commentators are not the ones making the error that you accuse us of (or at least not "many" of us, undifferentiated). We are showing that the limits of accuracy imply the existence of important shortcomings in the knowledge of the physical processes that they simulate. And we claim that the demonstrated inaccuracy to date provides adequate(!) reason to doubt that the simulations have any practical utility beyond guiding the development of better models.

Reply to  Matthew R Marler
March 22, 2014 8:37 am

I agree with Matthew Marler on the uselessness of the current GCMs for the purpose of planning for the future, but I seem to disagree with him on the merits of these models. Firstly, the GCMs reference no underlying statistical population, yet it is the observed events in such a population that would tie the associated model to reality. Secondly, they lack means for updating the state of the climate to the observed state at the beginning of each event in the population; as the GCMs lack these means, their errors are unbounded on the upside. Thirdly, the GCMs fail to extract from the observational data all of the information that is available in it, and no more than this information. Had the planners of the global warming research program addressed each of these shortcomings from the outset, they would have realized that 30-year forecasts of climatological outcomes are not a possibility during the next 4500 years, for the observed events will be far too few.

Matthew R Marler
March 21, 2014 9:02 pm

ferd berple: The problem is inherent in prediction from first principles. Chaos makes any such prediction mathematically impossible because of round-off errors in digital computers.
Chaos does not establish that everything in particular can not be computed with sufficient accuracy from first principles. All it shows is that errors accumulate faster with chaotic systems than with systems of linear differential equations. And it for sure does not rule out the prediction of functionals like mean, s.d., quantiles and extremals.
Most real numbers, including most rational numbers, can’t be computed or represented exactly on a computer anyway (that is to say, computable/representable numbers have Lebesgue measure 0), so questions/comments like yours always have to start with an appreciation of how much accuracy is required for a given purpose. In biological systems modeling, there are plenty of successful applications of chaotic mathematical models; a model of heartbeat can not exactly predict when the 10th beat of a series will occur, but it can predict the effect of a drug on the distribution of inter-beat intervals.
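A minimal illustration of that distinction, using the logistic map x -> 4x(1-x) as a textbook stand-in for a chaotic system (not a climate model): two starts differing by 1e-12 soon decorrelate, yet the long-run mean and spread of a trajectory are stable functionals:

```python
# The logistic map x -> 4x(1-x), a textbook chaotic system (not a climate model).
# Two starts differing by 1e-12 soon decorrelate, yet the long-run mean and
# spread of a trajectory are stable, predictable functionals.
import numpy as np

x, y = 0.3, 0.3 + 1e-12
n_diverge = None
for n in range(1, 201):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if n_diverge is None and abs(x - y) > 0.1:
        n_diverge = n
print(f"trajectories differ by more than 0.1 after {n_diverge} steps")

traj = np.empty(200_000)
x = 0.3
for i in range(traj.size):
    x = 4 * x * (1 - x)
    traj[i] = x
print(f"long-run mean {traj.mean():.4f} (exact 0.5), "
      f"std {traj.std():.4f} (exact ~0.354)")
```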

Bill Reeves
March 21, 2014 9:06 pm

Nice work, but redundant. The wonderful WUWT chart maps all the climate model projections and shows the real temperature record below them all, so far below them that it couldn't have happened accidentally. This means that not only isn't the science settled, it's not even in the right neighborhood.

Catcracking
March 21, 2014 9:13 pm

Dr. Ball.
An excellent presentation with numerous quotes and examples of statements by CAGW “scientists” admitting the limitations of models.
This casts serious doubt on the ability to model the future climate with computers, something many of us already knew. One has to be arrogant to claim to be able to model conditions 80 years out when the models fail even six months out. The lack of integrity is appalling.
As an engineer I know full well that if you don't know with great precision the mathematical equations that govern the behavior, the computer models are virtually useless. I have seen less complex models fail and lead to poor decisions. Unfortunately, that's what is happening with climate models.

bushbunny
March 21, 2014 9:20 pm

I am no IT expert, but I always thought that a computer computes, dependent on the data fed to it by the operator and on the systems analyst who set the program up. It is a fact that some will corrupt the data to fit the hypothesis, rather than the other way around, when data should be the main ingredient in forming a hypothesis. I can recall a Ph.D. candidate giving a lecture on some archaeological hypothesis. I was a humble student, but others there were not! The candidate's case was flawed, since he admitted he did not have enough data to complete his research. One man there got up and, before he walked out, said,
"No data? Then what indeed are you basing your hypothesis on? Speculation?" The Ph.D. candidate was very annoyed that he was shown up as incompetent, but he was still given a grant to pursue the research. See the same with the Mann diatribe.

thingadonta
March 21, 2014 9:48 pm

A model that gives a result that is not consistent with the dominant paradigm of the institution is rejected at birth, as a matter of routine. Over time this of course means that the dominant paradigm gets stronger.
This more or less explains why the IPCC is becoming more certain, whilst at the same time the models are getting less consistent with what is going on in the climate. It shows there is something fundamentally wrong.
One of the main problems here, which occurs often in science, is how to distinguish valid from invalid variation. Models which may be valid are rejected because they vary from a certain position, theme, or agenda of established knowledge, but the trouble is, that is just not how nature operates.
Human beings are not very good at differentiating valid from invalid variation. It is also one of the reasons (not the only one), dare I say, that people may reject the idea of evolution; they can't allow for the fact that some variations can lead to an entirely new paradigm or system that may upset, replace, or threaten the old one, in this case a new species. In the case of evolution, they may reject the very idea of 'valid variation' to begin with, saying such a concept doesn't even exist. Many commentators have noted that often the problem with acceptance of evolution is the philosophical idea of 'essentialism': that there is something about species that is 'essential', and variations within populations are therefore routinely less valid, or less true, or just malfunctions, which are to be rejected or 'corrected', and which therefore never lead to replacement, or change, or a new species. Of course, in nature, 'essentialism' doesn't exist to begin with; species routinely change and/or are replaced, and new species are budding off and coming into being all the time, although the process may occur over very long time periods.
The case becomes acute when the dominant paradigm becomes 'essential', so to speak, and may not be at all correct to begin with: the IPCC modellers chose to accept as the dominant paradigm (which, based on the available data, they should never have chosen) the proposition that the warming of the atmosphere in the late 20th century was mostly from human activities. This then becomes very difficult to dislodge, as there is now too much political and personal baggage at stake; valid variation is now being filtered out routinely, and this can only change when the data overwhelm what should never have been granted as 'essential', or the dominant paradigm, to begin with.

bushbunny
March 21, 2014 10:26 pm

My computer is very slow, even the keyboard, while the virus protection is loading.

Claude Harvey
March 21, 2014 10:43 pm

Re: Billy Ruff’n says:
March 21, 2014 at 4:39 am
Claude Harvey said, “For those who know and remain silent, cowardly self-interest comes to mind…”
"True enough, but can you blame them? Put yourself in the shoes of a young, up-and-coming climate scientist who has finally arrived at a point where they're invited to work in the bowels of the IPCC. They have invested considerable financial sums in their education and many years in training and hard work to arrive where they are. If they speak up, their careers in academia and the climate research establishment are over."
You define the coward and then ask me if I blame him. Of course I blame him! Whom else am I to blame but the fellow who chooses his own hide at the expense of truth and the hides of others?

Girma
March 21, 2014 10:44 pm

What is systematically omitted from the SPM are precisely the uncertainties and positive counter evidence that might negate the human interference theory.

Very true.
Here is the best example:
CO2 is more soluble in colder than in warmer waters; therefore, changes in surface and deep ocean temperature have the potential to alter atmospheric CO2.
https://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch6s6-4.html

Brian H
March 21, 2014 10:44 pm

sub-grid scale parameterizations are used to parametrize {fudge} the unresolved processes

Fixed It For Them
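For readers who have not met the term: a sub-grid parameterization replaces a process the grid cannot resolve with an empirical function of the resolved variables. Here is a deliberately toy example of the general shape such schemes take, with cloud fraction diagnosed from grid-mean relative humidity and a tunable critical threshold (the tunable part is the "fudge" being mocked); the formula is made up for illustration, not any model's actual scheme:

```python
# A toy sub-grid parameterization: a process the grid cannot resolve (cloud cover)
# is replaced by a made-up empirical function of a resolved variable (grid-mean
# relative humidity) with a tunable threshold, rh_crit.
import numpy as np

def toy_cloud_fraction(rel_humidity, rh_crit=0.75):
    """Made-up diagnostic: zero below rh_crit, rising smoothly to 1 at saturation."""
    rh = np.clip(rel_humidity, 0.0, 1.0)
    frac = 1.0 - np.sqrt((1.0 - rh) / (1.0 - rh_crit))
    return float(np.clip(frac, 0.0, 1.0))

for rh in (0.60, 0.75, 0.85, 0.95, 1.00):
    print(f"grid-mean RH {rh:.2f} -> diagnosed cloud fraction {toy_cloud_fraction(rh):.2f}")
```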

bushbunny
March 21, 2014 11:55 pm

Thank you Dr Tim, we need your credible expertise here. Well done!

Chuck Nolan
March 22, 2014 6:33 am

Billy Ruff’n says:
March 21, 2014 at 4:39 am
Claude Harvey said, “For those who know and remain silent, cowardly self-interest comes to mind…”
For an honest person, it must be devastating to find oneself in such a position.
—————————————————————
Unfortunately, for many climate scientists honesty is considered a major fault.
Fortunately for them, few honest climate scientists exist.
cn

Brendan
March 22, 2014 7:31 am

“rgbatduke” does an excellent job on the idiosyncrasies of numerical modeling. I have not read the latest release from the IPCC… I actually have two kids, a wife, and a job… but I did waste a lot of years doing true numerical analysis in fluid dynamics and heat transfer.
I have also worked on a series of models that are like the GWM – modifications of the Navier-Stokes equations. I will say that I have to disagree that these types of models cannot do a good job when applied properly – you just need to be aware of their limitations. As always, where a model is limited in its initial information, you will have results of limited final accuracy. And it will be worse when many of the sub-models that are designed to "fill in" for not being able to apply the energy equation (or the viscous terms) properly are not properly modeled to begin with.
However, I must disagree with Professor Ball in his statement that grid size has no impact on the final answer. If that is the case, why not create a grid of 8 nodes? It is standard methodology in numerical analysis that you must look at the asymptote of your solution – some use shortcuts to get around that, but they are rules of thumb – and if you have not reached your asymptote then your model has numerical error on top of any other errors.
I reviewed such a model, which was compared to another, real-world measured factor (and you must forgive me, it's been over 10 years since I did, so I don't remember what they were comparing the numerical residual to – it had to do with rainfall measurements). But to get to numerical convergence (for the model as designed) they needed to go from a 300 km grid to a 50 km grid. This halted the numerical residual. It was still 100% greater than the other point he was comparing it to, which the modeler described as a result of the uncertainties underlying many of the approximation models. So the basic 300 km model, on which most of the original policies were based, had residual errors up to 3 times what would be expected of a model that properly approximated the real world. A properly gridded model had residuals twice what would be expected. I know that many of the approximation models are just wrong – one of which has to do with the energy balance of water. No effort has been made to fix those, although I know that those errors have been pointed out by researchers with a lot more heft than myself.
Other than that, Professor Ball's overview is excellent. But I suggest he revise the comment on numerical grid size. It's a weakness in his paper that needs to be addressed.
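As a generic illustration of the asymptote and grid-convergence point raised here (a toy quadrature problem, nothing to do with the model Brendan reviewed): refine the grid until the error settles onto its expected rate.

```python
# A generic grid-refinement study on a toy problem: trapezoid-rule quadrature of
# sin(x) on [0, pi], whose exact value is 2.  A properly converging discretization
# shows the error shrinking by ~4x each time the spacing is halved, i.e. the
# solution approaching its asymptote.
import numpy as np

exact = 2.0
prev_err = None
for n in (8, 16, 32, 64, 128, 256):
    x = np.linspace(0.0, np.pi, n + 1)
    f = np.sin(x)
    h = x[1] - x[0]
    approx = h * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])   # trapezoid rule
    err = abs(approx - exact)
    ratio = prev_err / err if prev_err else float("nan")
    print(f"n = {n:3d}   value = {approx:.8f}   error = {err:.2e}   ratio = {ratio:.2f}")
    prev_err = err
```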

Brendan
March 22, 2014 7:33 am

“Halved” not “halted” Sorry….

markx
March 22, 2014 9:28 am

I went looking for Trenberth’s statements in Tim Ball’s article above:
Here is a powerpoint and an article.
Overall, a pretty strong critique of the weaknesses of the historical records.
Trenberth (about 2002?) (PowerPoint) (some extracts below)
http://www.cgd.ucar.edu/staff/trenbert/Presentations/climObams.pdf

We do NOT have an adequate Climate Observing System!
Instead we rely on an eclectic mix of observations taken for other purposes. But we can not create an observing system just for Climate!
Therefore observations MUST serve multiple purposes.
Climate Data Records
Surface and in situ observations, often associated with weather networks, have provided the most important data so far for the detection and attribution of causes of global climate change.
Long term consistency does not exist.
Instead heroic reconstruction attempts are made to quantify and minimize space and time dependent biases and try to produce continuity of records.

The article on the topic (Trenberth) here: http://home.chpc.utah.edu/~u0035056/jhorel/mac/trenberth.pdf

ferd berple
March 22, 2014 9:42 am

Consider your local weather forecast: 60% chance of precipitation tomorrow. Consider the IPCC forecast: doubling CO2 will increase temperatures 2C. At least the weatherman is honest enough to recognize the forecast has a chance of error.
Look at the IPCC spaghetti graph of model results. They are not all showing the same results. A third are saying a doubling of CO2 will cause less than 2C of warming. So even the climate models are telling us that there is a 1/3 chance that doubling CO2 will not increase temperatures 2C.
Some of the models are even showing that increasing CO2 will not increase temperatures, which is what the observational record is showing. Why has the IPCC not rejected the other models that are so far removed from observation?
Clearly some of the models are providing more accurate results than others. Why are they all treated as equally correct by the IPCC? Is this science? If the IPCC were truly a scientific body, those models that were not matching reality would have been rejected as incorrect, and only those models that still matched observations would be used.
Perhaps some enterprising individual will take the IPCC spaghetti graph and remove those projections that don’t match reality, and post the results as an article for WUWT showing what the models are actually telling us.
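A sketch of the exercise ferd proposes, keeping only the ensemble members whose hindcast stays within a chosen tolerance of observations. Every number below is a synthetic placeholder generated on the spot to show the mechanics; nothing is real model output or a real observational series:

```python
# Keep only the ensemble members whose hindcast stays close to observations.
# Everything here is a synthetic placeholder; nothing is real model output or
# a real observational series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1990, 2014)

obs = 0.01 * (years - years[0]) + rng.normal(0.0, 0.05, years.size)   # fake "observations"
trends = np.linspace(0.0, 0.04, 20)                                   # fake member trends, K/yr
ensemble = trends[:, None] * (years - years[0]) + rng.normal(0.0, 0.05, (20, years.size))

rmse = np.sqrt(((ensemble - obs) ** 2).mean(axis=1))
keep = rmse < 0.10                      # arbitrary tolerance for the sketch
print(f"members kept: {keep.sum()} of {trends.size}")
print("kept trends (K/yr):", np.round(trends[keep], 3))
```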

Reply to  ferd berple
March 22, 2014 9:56 am

Fred Berple:
I’ll play your “enterprising individual.” With those projections removed that don’t match reality, there are no projections.

March 31, 2014 10:04 pm

I have been sucked into this battle for rationality (characterized by the quantitative experimental-analytical method of classical physics) as an APL programmer offended by the amateur understanding of even undergraduate math and physics.
I only claim to understand that which I can compute. And APL notation, which is as succinct as any physics text or more so, makes it possible to compute the quantitative relationships I do understand.
On my own, I've only implemented the half dozen expressions needed to compute the equilibrium temperature of radiantly heated, uniformly colored, opaque balls. But even that shows that the notion that Venus's surface temperature, 2.25 times that of a gray ball in its orbit, is caused by a "runaway greenhouse effect" is absurd. It would have to be 10x as reflective in the IR as aluminum. See the AGW powerpoint on my CoSy.com.
But these relationships are the non-optional core of any more detailed model of a planet. These relationships are experimentally testable and have been relied upon since the 19th century.
Applying any spectral map to the sphere rather than a uniform spectrum is essentially adding an outer product in APL. Adding what in computer graphics are called "sprites" to implement the vertical grid structure through the atmosphere Dr Ball described is just another outer product. The programming is simple; the data collection, not.
There is about an 8 degree (3%) difference between our asserted surface temperature and the 279 kelvin of a uniform gray ball in our orbit. This implies we are on average 15% more reflective over the longer wavelengths than over the Sun's spectrum.
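The two benchmark numbers in this comment (about 279 K for a gray ball in Earth's orbit, and a Venus surface temperature roughly 2.25 times that of a gray ball in its orbit) can be reproduced in a few lines. The solar constant, Venus's orbital distance and Venus's surface temperature below are assumed standard values, and the sketch is in Python rather than APL purely for illustration:

```python
# Gray-ball (flat-spectrum) equilibrium: absorptivity equals emissivity, so the
# albedo cancels and T = (S / (4 * sigma)) ** 0.25.
# Assumed inputs: solar constant ~1361 W/m^2 at 1 AU, Venus at ~0.723 AU,
# Venus surface ~737 K.
sigma = 5.670374e-8              # Stefan-Boltzmann constant, W m^-2 K^-4
S_earth = 1361.0                 # W/m^2 at Earth's mean distance
S_venus = S_earth / 0.723 ** 2   # inverse-square scaling to Venus's orbit

T_gray_earth = (S_earth / (4 * sigma)) ** 0.25
T_gray_venus = (S_venus / (4 * sigma)) ** 0.25

print(f"gray ball at Earth's orbit: {T_gray_earth:5.1f} K")          # ~278-279 K
print(f"gray ball at Venus's orbit: {T_gray_venus:5.1f} K")
print(f"Venus surface / gray ball:  {737.0 / T_gray_venus:5.2f}")    # ~2.25
```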
I don't know the data sets available. But my first question is whether there are any measurements of Earth's averaged absorption=emission spectrum as seen from far away. Can those measurements be reconciled with the calculated 15% ratio?
This would be a step towards forcing this nonscience back to the step-by-tested-step methods of any other branch of applied physics.