By Christopher Monckton of Brenchley
The IPCC’s forthcoming Fifth Assessment Report continues to suggest that the Earth will warm rapidly in the 21st century. How far are its projections short of observed reality?
A monthly benchmark graph, circulated widely to the news media, will help to dispel the costly notion that the world continues to warm at a rapid and dangerous rate.
The objective is to compare the IPCC’s projections with observed temperature changes at a glance.
The IPCC’s interval of temperature projections from 2005 is taken from the spaghetti-graph in AR5, which was based on 34 models running four anthropogenic-forcing scenarios.
Curiously, the back-projections for the training period from 2005-2013 are not centered either side of the observational record (shown in black): they are substantially above outturn. Nevertheless, I have followed the IPCC, adopting the approximate upper and lower bounds of its spaghetti-graph.
The 34 models’ central projection (in yellow below) is that warming from 2005-2050 should occur at a rate equivalent to approximately 2.3 Cº/century. This is below the IPCC’s long-established 3 Cº centennial prediction because the models expect warming to accelerate after 2050. The IPCC’s upper-bound and lower-bound projections are equivalent to 1.1 and 3.6 Cº/century respectively.
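Expressed as arithmetic, these centennial-equivalent rates are simply the projected warming divided by the length of the projection period and rescaled to 100 years. A minimal sketch (the function name `centennial_rate` is invented here purely for illustration):

```python
def centennial_rate(delta_t_c, years):
    """Warming of delta_t_c Cº over `years` years, as a Cº/century rate."""
    return delta_t_c / years * 100.0

span = 2050 - 2005  # 45 years of projection

# The rates quoted above imply these total warmings by 2050:
# 1.1 Cº/century -> 0.495 Cº; 2.3 -> 1.035 Cº; 3.6 -> 1.62 Cº
for rate in (1.1, 2.3, 3.6):
    print(f"{rate} Cº/century is {rate * span / 100.0:.3f} Cº of warming by 2050")
```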
The temperature scale at left is zeroed to the observed temperature anomaly for January 2005. Offsets from this point determine the slopes of the models’ projections.
Here is the outturn graph. The IPCC’s projections are shown in pale blue.
The monthly global mean UAH observed lower-troposphere temperature anomalies (vortex.nsstc.uah.edu/data/msu/t2lt/uahncdc.lt) are plotted from the beginning of the millennium in January 2001 to the latest available month (currently April 2013).
The satellite record is preferred because lower-troposphere measurements are somewhat less sensitive to urban heat-island effects than terrestrial measurements, and are very much less likely to have been tampered with.
January 2001 was chosen as a starting-point because it is sufficiently far from the Great El Niño of 1998 to prevent any distortion of the trend-line arising from the remarkable spike in global temperatures that year.
Since the 0.05 Cº measurement uncertainty even in satellite temperature anomalies is substantial, a simple least-squares linear regression trend is preferred to a higher-order polynomial fit.
The simplest test for statistical significance in the trend is adopted. Is the warming or cooling trend over the period of record greater than the measurement error in the dataset? On this basis, the zone of insignificance is shown in pink. At present the trend is at the upper bound of that zone and is thus barely significant.
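The trend calculation and the simple significance test just described can be sketched in a few lines of Python. This is an illustration on a synthetic noise-free series, not the real UAH data; both function names are invented here:

```python
import numpy as np

def trend_per_century(anomalies):
    """Least-squares linear trend of a monthly anomaly series, in Cº/century."""
    months = np.arange(len(anomalies))
    slope_per_month, _intercept = np.polyfit(months, anomalies, 1)
    return slope_per_month * 12 * 100  # per month -> per year -> per century

def is_significant(anomalies, measurement_error=0.05):
    """The simple test described above: is the total change along the
    trend-line over the record larger than the measurement error?"""
    slope_per_month = trend_per_century(anomalies) / 1200
    total_change = slope_per_month * len(anomalies)
    return abs(total_change) > measurement_error

# Noise-free synthetic series: 148 months warming at 0.5 Cº/century.
synthetic = 0.5 / 1200 * np.arange(148)
print(trend_per_century(synthetic))  # recovers the 0.5 Cº/century slope
print(is_significant(synthetic))     # total change ~0.06 Cº > 0.05, barely
```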
The entire trend-line is beneath the interval of IPCC projections. Though this outcome is partly an artefact of the IPCC’s unorthodox training period, the slope of the linear trend, at just 0.5 Cº/century over the past 148 months, is visibly below half the slope of the IPCC’s lower-bound estimate of 1.1 Cº/century to 2050.
The principal result, shown in the panel at top left on the graph, is that the 0.5 Cº/century equivalent observed rate of warming over the past 12 years and 4 months is below a quarter of the 2.3 Cº/century rate that is the IPCC models’ current central projection of warming to 2050.
The only moment when the temperature anomaly reached the IPCC’s central estimate was at the peak of the substantial El Niño of 2010.
The RSS dataset, for which the April anomaly is not yet available, shows statistically significant cooling since January 2001 at a rate equivalent to 0.6 Cº/century.
Combining the two satellite temperature datasets by taking their arithmetic mean is legitimate, since their spatial coverage is similar. Net outturn is a statistically insignificant cooling at a rate equivalent to 0.1 Cº/century this millennium.
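The combination described here is straightforward: average the two monthly series point-by-point, then fit the trend. A minimal sketch on synthetic series (not the real UAH/RSS data; `combined_trend` is a name invented for illustration, and with these toy slopes the combined trend is simply the average of the two):

```python
import numpy as np

def combined_trend(series_a, series_b):
    """Arithmetic mean of two anomaly series covering the same months,
    then a least-squares linear trend in Cº/century."""
    mean_series = (np.asarray(series_a) + np.asarray(series_b)) / 2.0
    months = np.arange(len(mean_series))
    slope_per_month, _ = np.polyfit(months, mean_series, 1)
    return slope_per_month * 1200

# Illustrative synthetic slopes only, echoing the figures in the text:
# one series warming at +0.5 Cº/century, the other cooling at -0.6.
t = np.arange(148)
uah_like = +0.5 / 1200 * t
rss_like = -0.6 / 1200 * t
print(combined_trend(uah_like, rss_like))  # mean of the two toy slopes, -0.05
```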
The discrepancy between the models’ projections and the observed outturn is startling. As the long period without statistically significant warming (at least 17 years on all datasets; 23 years on the RSS data) continues, even another great El Niño will do little to bring the multi-decadal warming rate up to the IPCC’s least projection, which is equivalent to 1.1 Cº/century to 2050.
Indeed, the maximum global warming rate sustained for more than a decade in the entire global instrumental record – equivalent to 1.7 Cº/century – is well below the IPCC’s mean projected warming rate of 2.3 Cº/century to 2050.
This discrepancy raises serious questions about the reliability of the models’ projections. Since theory would lead us to expect some anthropogenic warming, its absence suggests the models are undervaluing natural influences such as the Sun, whose activity is now rapidly declining following the near-Grand Maximum of 1925-1995 that peaked in 1960.
The models are also unable to predict the naturally-occurring changes in cloud cover which, according to one recent paper echoing a paper by me that was published three years ago, may have accounted for four and a half times as much warming from 1976-2001 as all other influences, including the influence of Man.
Nor can the models – or anyone else – predict El Niños more than a few months in advance. There is evidence to suggest that the ratio of El Niño to La Niña oscillations, which has declined recently, is a significant driver of medium-term temperature variation.
It is also possible that the models are inherently too sensitive to changes in radiative forcing and are taking insufficient account of the cooling effect of non-radiative transports.
Furthermore, the models, in multiplying direct forcings by 3 to allow for allegedly net-positive temperature feedbacks, are relying upon an equation which, while applicable to the process engineering of electronic amplifiers for which it was designed, has no physical meaning in the real climate.
Without the Bode equation, net feedbacks may well differ only vanishingly from zero, in which event the warming in response to a CO2 doubling, which is about the same as the centennial warming, will be equivalent to the IPCC’s currently-predicted minimum warming rate of 1.1 Cº/century.
Be that as it may, as the above graph from the draft Fifth Assessment Report shows, in each of the four previous IPCC Assessment Reports the models have wildly over-projected the warming rate compared with the observed outturn, and, as the new outturn graph shows, the Fifth Assessment Report does the same.
I should be interested in readers’ reactions to the method and output. Would you like any changes to the monthly graph? And would it be worthwhile to circulate the monthly-updated graph widely to the news media as an answer to their dim question, “Why don’t you believe in global warming?”
Because there hasn’t been any to speak of this millennium, that’s why. The trouble that many of the media have taken to conceal this fact is shameful. This single, simple monthly graph, if widely circulated, will make it very much harder for them to pretend that the rate of global warming is accelerating and we are to blame, or that the “consensus” they have lazily accepted is trustworthy.
The climate scare has only lasted as long as it has because the truth that the models have failed and the world has scarcely warmed has been artfully hidden. Let it be hidden no longer.
If I were to vote, I would vote for Dr. Spencer’s graph.
But I wish to make a note about the other graph with the 75% and 95% lines. Assume the line is at the bottom of the 95% line. Is it then correct to say that there is a 5% chance the IPCC is correct? The reason I am asking is that there is a 2.5% chance the temperatures could be at the upper 95% line and a 2.5% chance they could be at the lower 95% line. So could we say 97.4% of cherry-picked climate scientists are 97.5% wrong?
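The tail arithmetic in that question can be checked directly: for a normal distribution, a symmetric 95% envelope leaves about 2.5% of the probability in each tail, so sitting at the lower bound corresponds to roughly the 2.5th percentile, not the 5th. A minimal sketch using only the standard library:

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# A symmetric 95% interval spans roughly ±1.96 standard deviations,
# leaving about 2.5% of probability in each tail.
print(round(normal_cdf(-1.96), 3))       # lower tail, about 0.025
print(round(1.0 - normal_cdf(1.96), 3))  # upper tail, about 0.025
```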
And which of the scenarios starts where we are now and goes to catastrophe?
The only scenarios of concern are those that include the recent past unless all can flip to anything anytime.
As careeristic climbatologers climb down, one by one, in coming years, WUWT headlines might all have a suffix-phrase attached: “another bottle of beer off the wall.”
Meantime, here’s a start:
“97 bottles of beer on the wall
97 bottles of beer on the wall
If one of those bottles should happen to fall
96 bottles of beer on the wall
(Repeat until down to none, which is where we are heading)”
Graphs should also have a CO2 trend line on them to show the disconnect with temperature.
Like the 97% of Climate Scientists were 100% Wrong.
Commenters above have suggested that Lord Monckton’s graphs are too complex for public viewing. That may be true for science-ignorant journalists, but not for the readership at WUWT.
This scientific illiterate has been slowly educated by this, the very, very best website on the WWW, to the point where I am increasingly confident in perusing and dissecting quite complex scientific papers and posts. This osmosis to my and other brains may be the greatest of Anthony Watts’ many achievements.
Here’s an improved set of captions for that graph once its black line falls out of pink territory:
Well, you all missed it.
Viscount Monckton writes –
“It is also possible that the models are inherently too sensitive to changes in radiative forcing and are taking insufficient account of the cooling effect of non-radiative transports.”
That would have to be the understatement of the century.
Radiative gases are of course critical for continued convective circulation in the troposphere. This has been established science for some time. A simple explanation of the role of radiative gases in convective circulation can be found here –
http://www.st-andrews.ac.uk/~dib2/climate/tropics.html
Without radiative cooling at altitude and convective circulation below the tropopause, our atmosphere would heat dramatically.
So how did the pseudo science that adding radiative gases to the atmosphere causes warming get established? AGW supporter site Scienceofdoom has one answer –
http://scienceofdoom.com/2012/12/23/clouds-water-vapor-part-five-back-of-the-envelope-calcs-from-pierrehumbert/
– which includes the following summary of Pierrehumbert’s wholly un-empirical 1995 claims –
“So increasing the emissivity from zero (increasing “greenhouse” gases) cools the climate to begin with. Then as the emissivity increases past a certain point the warm pool surface temperatures start to increase again.”
A “certain point” was it? How many ppm is that? Empirical evidence? Not likely. This attempt to write the role of radiative gases in convective circulation out of atmospheric science will not stand up to scrutiny. It is no wonder that AGW supporters keep running back to static atmosphere two shell radiative models to justify their absurd claims. The AGW hypothesis fails for an atmosphere in which the gases are free to move.
The “97%” were 95% Certain–but 100% Wrong. Watts up with that? ; ^ )
David L,
The model outputs are expressed in year to year, or month to month changes (depending on the model). In no way do the modelers assert that any particular model, or even the ensemble, is meant to be a prediction of what will actually happen on any given year in the future. It is about climatic averages.
Who cares if the models are wrong, what counts is the answer, the answer to EVERYTHING, a powerful global government with vast powers to redistribute wealth, control all aspects of business and life.
Who cares about the QUESTION, it’s the ANSWER that counts. Obey the elite, they know what’s best for everyone. Especially themselves.
It looks to me like the obs have remained within the envelope of projections. In 1998 global temps went higher than any of the projections. In recent years temps have been near the bottom of the projections (individual model runs).
I also see individual model runs with neutral temp trends over almost 20 years that end up being warm, and cooling trends of 10-15 years here and there that also wind up warmer in the long run. So the models predict ~15-year time periods with no warming that eventually end up with warming as time progresses.
The short-term graph (from 2002) covers a ludicrously short time period. To be on the safe side, 20-year blocks should be the minimum for winnowing statistically significant results (25 years if using satellite data, which has more variance than the surface records). Model outputs for global temperature are for the surface temp, not the lower troposphere, so one must use the instrumental records, not the satellite data, to compare apples with apples.
I see nothing here that convincingly demonstrates observations bust the ensemble results.
The IPCC’s approach is completely flawed since it has all the veracity of asking 100 blokes in the same room to say how big their penises are, then forming a mean male penile size without bothering to get them all to drop their trousers and apply a ruler to their assertions.
Anyone who knows anything about group think, psychology and the like knows that no man will say in public to another bunch of blokes that he has a minuscule penis, and few will admit to having a smaller than average one.
Equally, no scientist will say they don’t think much warming will happen if they are talking amongst grant awarding bodies. It’s not the way to get funded in the current regime and their Vice Chancellors or other senior University worthies will be on their case if they don’t keep shovelling millions into the University coffers.
In my opinion, the modellers are all equivalent to ambitious young Tories wanting to get on in the party. As a result, nationalisation is per se evil, management is never to be criticised and workers are acceptable collateral damage.
You don’t need to be Einstein to see that this is bullshit.
I could use the Labour Party as a similar example, please don’t ascribe my example to implying what my political views are. I merely illuminate by highlighting situations where you can’t be dispassionate if you want preferment. Climate modelling is a scenario which fits that reality.
How about a chart that illustrates the entire 160 year instrument record,100 years of model projections and that 3.2°C CO2 Climate Sensitivity?
http://oi56.tinypic.com/f3tlb6.jpg
It is totally impossible to prove (using valid physics) that climate is in any way inherently “sensitive to changes in radiative forcing” as is claimed in this post. Radiation cannot force changes in surface temperature. All that radiation from a cooler atmosphere can do to a warmer surface is slow that one-third of its rate of cooling which is itself due to radiation. The back radiation does this by supplying electromagnetic energy for most of the radiation being emitted from the surface. This portion of the radiation is thus not transferring any thermal energy from the surface.
When you understand this, then you can understand the original NASA net energy diagram which showed (in terms of incident Solar radiation) 19% absorbed by the atmosphere and clouds on the way in, then 51% absorbed by the surface with the remaining 30% reflected. Then, the 51% exits the surface with 23% by evaporative cooling thence latent heat, 7% by conduction and thence rising convection, 6% direct to space and 15% radiated and then absorbed by the atmosphere.
So the atmosphere absorbs more thermal energy from radiation on its way in (19%) than on its way out (15%) and hence there is more of an umbrella effect than a blanket effect.
Now, that’s not all. We find that there is an underlying thermal gradient which has formed autonomously over many years due to the effect of gravity upon individual molecules in free flight between collisions. This is indisputable, because the only way that the Second Law of Thermodynamics can establish the required thermodynamic equilibrium with maximum available entropy is for there to be no entropy gradient. But there cannot be isothermal conditions and no entropy gradient in a sealed, insulated cylinder in a vertical plane. Although there has been a post on here some time back saying that a wire outside a cylinder proves the gradient cannot happen, the logic in that was false simply because a thermal gradient also develops in the wire. It doesn’t matter if the gradients are different, there will evolve a thermodynamic equilibrium (without any cyclic energy flow) in the total system – cylinder and wire.
Until people can understand what happens on the planet Uranus, they will probably not understand what happens on Earth. Uranus receives hardly any Solar radiation, and far less at the base of its troposphere where the temperature is about 320K. Yet there is radiative equilibrium and thus no evidence of any cooling off process, or radioactive decay or fission or whatever. What happens is that, when the atmosphere absorbs new energy (such as at dawn) that energy can actually move in all directions, including up the thermal gradient toward the surface. This is because the thermodynamic equilibrium is disturbed. In fact any convection, up or down or sideways happens because of an extra supply of energy, such as when the surface transmits energy by conduction to the atmosphere on Earth.
This is the only way that sufficient energy gets to heat the Uranus atmosphere at that depth. And it keeps on heating more at greater depths, following the autonomous thermal gradient which occurs in solids, liquids and gases, though weather conditions or extreme absorption (such as in the stratosphere) can over-ride the slow diffusion process that establishes the atmospheric gradient in calm conditions.
Only if you understand (and accept) that the gravitationally induced gradient does occur – only then will you understand how temperatures on Uranus, Venus and Earth reach the observed values. Even temperatures in Earth’s crust and mantle continue to follow this upward gravitational gradient, which varies with the specific heat.
However, water vapor can reduce the gradient, not so much by the release of latent heat, but by the transfer of heat to higher cooler layers by way of radiation. This has an obvious levelling effect, working against the gravity gradient.
Finally, the whole picture comes into focus when you realise that this gradient (in combination with insolation levels) actually predetermines the thermal plot in the troposphere, and thus the surface temperature. Hence, we would expect a supporting temperature which would be lower in more moist regions. The Sun could never have raised the surface temperature to what we observe without such a supporting temperature which slows the rate of cooling in the early hours before dawn. We know this happens, because the surface does not keep cooling at the same rate all night.

Furthermore, actual climate data does in fact show that moist regions have lower daily maximum and minimum temperatures than dry ones with similar latitude and altitude – contrary to what is often claimed.

But if you think about it, according to the greenhouse effect, water vapour is supposedly doing nearly all of that 33 degrees of warming. So it ought to be doing much more warming in some moist regions than in dry ones. We don’t find this happening, and so the greenhouse effect is not functioning as proposed and, in fact, does not control mean surface temperatures at all.
Konrad says:
May 5, 2013 at 6:58 pm
Well, you all missed it.
Viscount Monckton writes –
“It is also possible that the models are inherently too sensitive to changes in radiative forcing and are taking insufficient account of the cooling effect of non-radiative transports.”
That would have to be the understatement of the century.
Radiative gases are of course critical for continued convective circulation in the troposphere. This has been established science for some time. A simple explanation of the role of radiative gases in convective circulation can be found here –
http://www.st-andrews.ac.uk/~dib2/climate/tropics.html
Without radiative cooling at altitude and convective circulation below the tropopause, our atmosphere would heat dramatically.
So how did the pseudo science that adding radiative gases to the atmosphere causes warming get established? AGW supporter site Scienceofdoom has one answer –
http://scienceofdoom.com/2012/12/23/clouds-water-vapor-part-five-back-of-the-envelope-calcs-from-pierrehumbert/
– which includes the following summary of Pierrehumbert’s wholly un-empirical 1995 claims –
“So increasing the emissivity from zero (increasing “greenhouse” gases) cools the climate to begin with. Then as the emissivity increases past a certain point the warm pool surface temperatures start to increase again.”
A “certain point” was it? How many ppm is that? Empirical evidence? Not likely. This attempt to write the role of radiative gases in convective circulation out of atmospheric science will not stand up to scrutiny. It is no wonder that AGW supporters keep running back to static atmosphere two shell radiative models to justify their absurd claims. The AGW hypothesis fails for an atmosphere in which the gases are free to move.
==============================
Even their AGW static atmosphere doesn’t exist – the reason they don’t have convection, and consequently have no weather at all, is that they have substituted the imaginary “ideal” gas, pre Van der Waals, for real gas. Really, actually, they have created an “atmosphere” from an ideal-gas scenario without volume. They do not have anything to convect and their ideal gases are zooming off into outer space. There’s nothing static about that.
Their fictional ideal gas molecules are without mass, so they have no gravity in their world which is what gives gases relative weight- all their ideal gases go directly from the Earth’s surface to empty space.
They are climate scientists with no climate.
Their AGW gases are non-condensable – of course they’re not: they have no real gas molecules, only the imaginary ideal-gas hard dots of no mass, nothing, diffusing instantly by their own molecular momentum, zipping at great speeds through empty space miles apart from each other and mixing thoroughly by bouncing off each other in elastic collisions. They don’t have gases rising and sinking in air as they expand when heated and condense when cooled, which is how we get our winds and weather.
Their imaginary ideal gases are not buoyant in air – of course they’re not, they don’t have any air for a start, but their gases are not buoyant in air because their gases are hard dots of nothing with no volume to expand, so they cannot become less dense and lighter than air, and so cannot rise and cannot be buoyant in air.
Their AGW ideal gases have no attraction, they are hard dots of nothing with no mass not real molecules of gas – so they have no rain in their carbon cycle which is the attraction of water and carbon dioxide – all natural clean unpolluted rain is carbonic acid.
It is pointless explaining to them in terms of the real physical world around us, because they don’t have any of this. Your explanations don’t make sense in their ideal gas world. Their gases can’t do what your gases do.
There is no internal logic in their fisics – it cannot be called physics because their AGW world is purely imaginary: in their empty-space atmosphere scenario, without gravity and with gases without volume and attraction, all their ideal carbon dioxide continues to diffuse at great speeds into outer space, so it cannot accumulate in their empty-space atmosphere for the hundreds and thousands of years they claim it does.
They pretend to talk ‘physics’ giving example their ideal gases bouncing off the sides of a container creating pressure – where is their invisible container around the Earth keeping in their ideal gas dots of no mass nothing for gravity to pull in?
Is this the same invisible barrier at the TOA which they say stops the direct heat of thermal longwave infrared from the Sun?
[Their Sun either doesn’t produce any longwave infrared, which is the Sun’s thermal energy in transfer by radiation, radiant heat, or, their Sun’s direct radiant heat is stopped by some invisible barrier like the glass of a greenhouse.]
The cooling by the non-radiative gases in the real world – practically the whole of the atmosphere, the real nitrogen and oxygen which is our real gas air – is how we get our winds, as hot air rises and cold air sinks.
Volumes (packets) of real gas air expand when heated, becoming less dense and so lighter than air under gravity, and rise; spontaneously, volumes of colder air, heavier and denser under gravity, will sink and flow beneath the less dense.
Air will rise, taking heat away from the surface, where the heated, less dense molecules create areas of low pressure; and air will sink from colder areas of high pressure, which they create by condensing when cold and becoming more dense and heavier, in conjunction with gravity giving them weight relative to each other; so: Winds Flow from High to Low.
This is bog standard basic meteorology in the real world. Built on the understanding of the actual properties and processes of real gases under gravity.
AGWScienceFiction’s Greenhouse Effect doesn’t have real gases. It is an imaginary world, a fiction.
What Monckton isn’t taking into account is that water in the atmosphere is also a real gas, and its cooling properties in convection far exceed any radiative properties it has and exceed the convective cooling of air; and that oxygen and nitrogen are the real thermal blanket around our Earth, trapping heat.
Water has a great heat capacity, which means it stores, traps, a great deal of heat before changing temperature.
So, water heated at the surface evaporates, taking huge amounts of heat away from the surface as it expands, becomes less dense and even lighter than the real air around it under gravity, and rises.
In the cooler heights this heat-energy-laden water vapour releases its heat (heat flows spontaneously from hot to cold) and condenses back to liquid water and ice, and so, colder and heavier than air, precipitates out as rain.
AGWScienceFiction has excised the Water Cycle, the cooling cycle of the real Earth.
They have no way to get the clouds they keep rabbiting on about.
Now, it is important to note here exactly what AGWSF has done by sleight of hand to create its Greenhouse Effect Illusion in its claim that “ir imbibing greenhouse gases mostly water and carbon dioxide warm the Earth 33°C from the -18°C it would be without them.”
That -18°C figure is from real world traditional physics and is the temperature of the Earth without any atmosphere at all, not, “without these AGW greenhouse gases”, but without the rest of the whole atmosphere which is practically all nitrogen and oxygen.
The comparison in real traditional physics is with the Moon without an atmosphere, and the comparable figure for the Moon is around -23°C.
AGWSF has committed science fraud here by claiming the -18°C figure relates to absence only of their “greenhouse gases” leaving “the rest of the atmosphere in place”.
Earth with atmosphere: 15°C
Earth without any atmosphere at all: -18°C
Moon without any atmosphere: -23°C
Now, here’s the interesting bit of why they took out real gases from their ‘empty space atmosphere’:
In the real world physics, with the rest of the atmosphere in place, the Earth with atmosphere of real gas with volume weight and attraction of mainly nitrogen and oxygen would be 67°C if water was absent.
Think deserts.
The Earth with its real gas heavy atmosphere of nitrogen and oxygen weighing down on us 14lb a square inch is what is really acting as a thermal blanket around the Earth – these are real greenhouse gases warming the real world preventing the extremes of temperature cold of the Moon without this atmosphere.
So, without the “AGW ir imbibing greenhouse gases of mainly water”, the Earth would be 52°C hotter, not “33°C colder”.
AGWScienceFiction has changed the meaning of “greenhouse gases” and changed the meaning of “greenhouse”.
The Earth’s real Greenhouse is the whole of its atmosphere of real gases around it which both warm and cool the Earth preventing the extremes of the Moon without an atmosphere. Just like a real greenhouse, which is why the analogy was first used in traditional physics. Real greenhouses both heat and cool to give optimum growing conditions for the plants, AGWSF has changed this to mean “only warm”.
Earth with atmosphere: 15°C
Earth without any atmosphere: -18°C
Earth with atmosphere in place but without water: 67°C
AGWSF have taken out the Water Cycle which in the real world cools the Earth bringing the temperature down from 67°C it would be without it to 15°C.
The conclusion is obvious: the “AGW greenhouse ir imbibing gas warming of 33°C” is an illusion.
Created by sleight of hand science fraud changes to real physics, manipulating properties and processes.
barry says:
May 5, 2013 at 10:19 pm
It looks to me like the obs have remained within the envelope of projections. In 1998 global temps went higher than any of the projections. In recent years temps have been near the bottom of the projections (individual model runs).
/end quote
barry EVERYTHING before 2005 is NOT PROJECTIONS.
it is HISTORY, known at the time, so it is MEANINGLESS to look at the pre-2005 values for ANY indication as to how well the models work. You may prefer to think of the pre-2005 values as ‘training’ for the models, to see if they can recreate what was known to have ALREADY happened.
See vukcevic’s version with the training runs removed and only the PROJECTIONS shown.
barry says:
May 5, 2013 at 10:19 pm
It looks to me like the obs have remained within the envelope of projections. In 1998 global temps went higher than any of the projections. In recent years temps have been near the bottom of the projections (individual model runs).
/end quote
just another thought for you
given that the pre-2005 data is comparing the models against known facts, and these known facts were used to ‘train’ the models, wouldn’t you expect them to produce a 100% match?
I occasionally use simple models in my job, most of these are abject failures, totally u/s (unsuitable for use).
I suspect the diagram would look a lot better if only the actual emission scenario model runs were used. This would eliminate 75% of the noise as well as many of the lower lines that make it look like the models were closer to reality than they were.
Why bother showing lines from emissions scenarios that have not and will not happen?
The Hurricane makes landfall in a single location. For each, I hope that location is not my house. After landfall, the models are discarded. However, the data points are added, and over time the models have improved. This suggests at least one branch of climate science where “making spaghetti” is properly done. Perhaps, in a future, more enlightened era, the temperature folks could learn some methodology from their wind-tracking colleagues (who also note no recent increase in storm frequency or severity).
@vukcevic
Nice graphic! My only suggestion would be to substitute “actual predictions” for “true predictions” in the side note.
“True” may imply that these predictions are accurate or correct to some of our media people here in the States.
peter_dtm,
barry EVERYTHING before 2005 is NOT PROJECTIONS.
it is HISTORY, known at the time, so it is MEANINGLESS to look at the pre 2005 values for ANY indication as to how well the models work.
Yes, ‘projections’ is the wrong term, but the pre-2005 model runs are not based on temperature data. They had a much better idea of the forcings, of course, but the hindcasting is still an estimate based on the models. You can see that there are different model runs, can’t you? You’re not looking at the instrumental record there, you’re looking at the same types of models that make the projections, just with some more certain data (not actual temps).
I wonder why Monckton compares trends since 2001 with the post-2005 model ensemble? Time-frames are too short to get meaningful results. Linear trend since 2001 is definitely not statistically significant: 0.047C/decade +/- 0.281. The uncertainty is 6 times larger than the trend!
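A naive version of such a trend-plus-uncertainty calculation can be sketched as follows. This runs on synthetic data, and the ±0.281 figure quoted above almost certainly comes from a calculator that also corrects for autocorrelation, which the simple OLS standard error below does not:

```python
import numpy as np

def trend_with_uncertainty(anomalies):
    """OLS trend per decade with a naive 2-sigma standard error.
    No autocorrelation correction, so real published uncertainties
    will be wider than this sketch produces."""
    n = len(anomalies)
    x = np.arange(n) / 120.0  # month index expressed in decades
    slope, intercept = np.polyfit(x, anomalies, 1)
    residuals = anomalies - (slope * x + intercept)
    s2 = np.sum(residuals ** 2) / (n - 2)
    std_err = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))
    return slope, 2.0 * std_err

# Synthetic example only: a 0.047 Cº/decade trend plus Gaussian noise.
rng = np.random.default_rng(0)
n = 148
x = np.arange(n) / 120.0
series = 0.047 * x + rng.normal(0.0, 0.1, n)
slope, ci = trend_with_uncertainty(series)
print(f"{slope:+.3f} Cº/decade ± {ci:.3f} (2-sigma, no autocorrelation correction)")
```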
It is absolutely vital, rather, to look at how models do at hindcasting, because there we have something concrete to test the models against. Hindcasting is one of the best possible ways to improve models.
In any event, individual model runs (projections) show 10–20-year neutral trends within long-term warming, and the recent temps are still within the envelope. What we’re currently seeing shows up (at various times) in model runs that wind up with warming over the long term, so unless anyone was expecting the models to predict the temperature of every year accurately, the obs don’t bust the models. Yet.
The earth has been warming since the end of the Little Ice Age, with localized accelerations and plateaus along the way, just as the earth has done many times before, and will do again.
So the question must be asked: are the “improvements” applied to the models through hindcasting giving us a better understanding of how the earth’s climate system actually works, or are they instead giving us nothing more than a better fit to a known curve, without necessarily telling us anything more useful about what is actually happening up there in the atmosphere and down there in the ocean?
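The worry raised in that question is the familiar overfitting problem, and it can be made concrete with a toy sketch. Everything below is my own construction (hypothetical data, nothing from any actual GCM): a model with enough free parameters can match a known record perfectly, yet predict worse out of sample than a simpler model that captures the underlying process.

```python
def ols_line(xs, ys):
    """Ordinary least-squares straight line; returns a prediction function."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return lambda x: ybar + slope * (x - xbar)

def interpolant(xs, ys):
    """Lagrange polynomial through every training point: a perfect 'hindcast'."""
    def p(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return p

# The "known curve": a 0.1-per-step trend plus small alternating noise.
xs = list(range(10))
ys = [0.1 * x + (0.05 if x % 2 == 0 else -0.05) for x in xs]

simple = ols_line(xs, ys)     # two parameters, imperfect fit to history
tuned = interpolant(xs, ys)   # ten parameters, perfect fit to history

truth_at_12 = 0.1 * 12        # what the underlying process does next
print(truth_at_12, simple(12), tuned(12))
```

The tuned model reproduces every historical point exactly but extrapolates wildly, while the simple model stays close to the underlying trend; a better hindcast fit, by itself, proves nothing about predictive skill.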
Plant a Seed of Doubt in Their Minds
The true believers, and the average person who has not been following climate change, generally will not read anything that might upset their world view: there is too much conflicting information, too many competing scientists, theories and studies. I have therefore condensed the strongest argument into the short letter below. I have had amazing success opening minds with this simple message, which plants a seed of doubt and leads them to the Economist article. A simple graph like Vuk’s will also work, and even better if it has a CO2 line on it.
Climate Sensitivity May Have Been Overestimated.
The Economist Magazine has a new article on Climate Sensitivity that is a must read.
See http://www.economist.com/news/science-and-technology/21574461-climate-may-be-heating-up-less-response-greenhouse-gas-emissions
The top climate scientists in the world have acknowledged that global temperatures are trending well below their forecasts despite higher CO2 releases. The climate sensitivity to changes in CO2 may have been overestimated, which means that something may be wrong with the theories in the computer models.
Anthropogenic Global Warming (AGW) rests on three main theories, each depending on the previous one. The first theory is that a doubling of CO2 will cause about 1 C of warming due to back radiation from the increased CO2. Additional CO2 has a diminishing incremental effect, because the back radiation rises only logarithmically with concentration.
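A back-of-envelope sketch of the logarithmic relationship the first theory describes, assuming (as the letter does) about 1 C of warming per doubling of CO2. The baseline of 280 ppm and the sensitivity figure are illustrative assumptions, not model output:

```python
import math

def warming(c_ppm, baseline_ppm=280.0, per_doubling=1.0):
    """Equilibrium warming (C) for a CO2 concentration relative to a baseline."""
    return per_doubling * math.log2(c_ppm / baseline_ppm)

print(warming(560))                    # first doubling:  1.0 C
print(warming(1120))                   # second doubling: 2.0 C in total
print(warming(290) - warming(280))     # adding 10 ppm at today's level...
print(warming(1130) - warming(1120))   # ...versus the same 10 ppm added later
```

Each doubling contributes the same increment, but each additional ppm contributes less than the one before, which is the sense in which the effect of extra CO2 declines.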
The second theory is the amplification, or positive feedback, theory. The 1 C of warming should cause higher humidity and more clouds, which should trap more heat. The problem is that clouds can also reflect sunlight or condense into precipitation, which causes cooling; the net effect may even be negative, so the models may be way off. The article cites several new peer-reviewed studies that now estimate climate sensitivity at less than 2 C.
The third theory is that the estimated warming will be large enough to be bad. The world has warmed about 0.8 C, so climate sensitivity estimates totalling 2 C are very unlikely to lead to extreme weather, as there is no scientific mechanism for CO2 to influence the climate without warming. Mild warming has many benefits: less fuel use, fewer cold deaths (see Europe for the last two winters), minor sea-level rise and easier lives. Mild warming combined with higher CO2 concentrations also increases crop yields and greens the earth.
This will be great news for the world if the climate crisis has been overestimated and overstated. The 150 billion dollars the world has spent to date is gone (not counting hundreds of billions more on wind and solar), but the world may not have to spend the trillions that scientists and politicians forecast. The climate sensitivity question needs to be resolved as quickly as possible, but we may have to wait for actual temperatures to be the judge.
Beta Blocker,
The answers to your questions are in the model inputs, something you can easily look up if you’re curious. I wouldn’t presume to school anyone on that.
Notice that none of the hindcasts ‘predicted’ the 1998 anomaly: it falls outside the envelope of the hindcast spread, so clearly they aren’t just punching in the temperature data. If the modellers are tweaking parameterisations of turbulent phenomena that aren’t well constrained by physics in order to get a better fit, and testing those phenomena with multiple runs while varying other parameters, then that is a good way to bound a component of the system when the physics alone do not constrain it. Obviously they cannot account for every molecule in the climate system, so they must generalise phenomena. There is nothing wrong with training in this way if it is done honestly to improve the models (and of course it is – or what would be the point?).
I’m not sure what the LIA has to do with anything. The Earth’s climate is not some piece of elastic that bounces back from every perturbation. There must be causes that change the global temperature, and these are the forcings that are investigated. Attributing cause is a major part of the research, obviously.
This thread went totally off the rails, or OT if you will.
But let me say to Christopher Monckton: thank you. That was a very well-put letter, and hopefully the powers that be will grasp what you have presented.