Five points about climate change

Guest essay by Professor Philip Lloyd, Cape Peninsula University of Technology

Daily we are told that we are wicked to burn fossil fuels. The carbon dioxide which is inevitably emitted accumulates in the atmosphere, and the result is “climate change.” If the stories are to be believed, disaster awaits us: crops will wither, rivers will dry up, polar bears will disappear and malaria will become rampant.

It is a very big “IF”. We could waste trillions for nothing. Indeed, Lord Stern has estimated that it would be worth spending a few trillion dollars each year to avoid a possible disaster in 200 years’ time. Because he is associated with the London School of Economics he is believed – by those whose experience of insurance is limited. Those with experience know that it is not worth insuring against something that might happen in 200 years’ time – it is far better to make certain your children can cope. With any luck, they will do the same for their children, and our great-great-great-grandchildren will be fine individuals more than able to deal with Lord Stern’s little problem.

So I decided to examine the hypothesis from first principles. There are five steps to the hypothesis:

1. The carbon dioxide (CO2) content of the atmosphere is rising.

2. The rise in CO2 in the atmosphere is largely paralleled by the increase in fossil fuel combustion. Combustion of fossil fuels results in emission of CO2, so it is eminently reasonable to link the two increases.

3. CO2 scatters infra-red radiation, primarily at wavelengths around 15 µm. Infra-red of that wavelength, which should be carrying energy away from the planet, is scattered back into the lower troposphere, where the added energy input should cause an increase in temperature.

4. The expected increase in the energy of the lower troposphere may cause long-term changes in the climate and weather, which could be characterized by increasing frequency and/or magnitude of extreme weather events, an increase in sea temperatures, a reduction in ice cover and many other changes.

5. The greatest threat is that sea levels may rise and flood large areas presently densely inhabited.

Are these hypotheses sustainable in the scientific sense? Is there a solid logic linking each step in this chain?

The increase in CO2 in the atmosphere is incontrovertible. Many measurements show this. For instance, since 1958 there have been continuous measurements at the Mauna Loa observatory in Hawaii:

[Figure: Atmospheric CO2 measured at Mauna Loa, 1958 to the present]

The annual rise and fall is due to deciduous plants growing or resting, depending on the season.  But the long-term trend is ever-increasing levels of CO2 in the atmosphere.

There were only sporadic readings of CO2 before 1958, no continuous measurements. Nevertheless, there is sufficient information to construct a view back to 1850:

[Figure: Atmospheric CO2 reconstructed back to 1850]

There was a slight surge in atmospheric levels about 1900, then a period of near stasis until after 1950, when there was a strong and ongoing increase which has continued to this day. Remember this pattern – it will re-appear in a different guise.

The conclusion is clear – there has been an increase in the carbon dioxide in the atmosphere. What may have caused it?

Well, there is the same pattern in the CO2 emissions from the burning of fossil fuels and other industrial sources:

[Figure: CO2 emissions from fossil fuels and other industrial sources]

A similar pattern is no proof – correlation is not causation. But if you try to link the emissions directly to the growth in atmospheric CO2, you fail. There are many partly understood “sinks” which remove CO2 from the atmosphere. Trying to follow the dynamics of all the sinks has proved difficult, so we do not have a really good chemical balance between what is emitted and what turns up in the air.

Fortunately isotopes come to our aid. There are two primary plant chemistries, called C3 and C4. C3 plants are ancient, and they tend to prefer the 12C carbon isotope to the 13C. Plants with a C4 chemistry are comparatively recent arrivals, and they are not so picky about their isotopic diet. Fossil fuels primarily come from a time before C4 chemistry had evolved, so they are richer in 12C than today’s biomass. Injecting 12C-rich CO2 from fossil fuels into the air should therefore cause the proportion of 13C in the air to drop, which is precisely what is observed:

[Figure: Declining 13C fraction in atmospheric CO2]

So the evidence that fossil fuel burning is the underlying cause of the increase in the CO2 in the atmosphere is really conclusive.  But does it have any effect?
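The isotope argument amounts to a two-source mixing calculation, and can be sketched in a few lines. The numbers below (roughly 600 GtC of atmospheric carbon at about −6.5‰, fossil carbon at about −28‰) are illustrative textbook values, not figures taken from the chart, and real-world dilution is damped by exchange with the oceans and biosphere:

```python
# Two-source mixing: adding isotopically light fossil CO2 lowers
# the d13C of the atmosphere. Illustrative numbers only.

def mix_d13c(mass_air, d13c_air, mass_added, d13c_added):
    """Mass-weighted d13C (per mil) of the mixed reservoir."""
    total = mass_air + mass_added
    return (mass_air * d13c_air + mass_added * d13c_added) / total

# ~600 GtC of atmospheric carbon at about -6.5 per mil, diluted
# with 100 GtC of fossil carbon at about -28 per mil.
mixed = mix_d13c(600.0, -6.5, 100.0, -28.0)
print(f"d13C after mixing: {mixed:.2f} per mil")  # shifts toward lighter values
```

The observed atmospheric decline is smaller than this naive mixing suggests, precisely because of the partly understood sinks mentioned above.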

Carbon dioxide scatters infra-red over a narrow range of energies.  The infra-red photons, which should be carrying energy away from the planet, are scattered back into the lower troposphere. The retained energy should cause an increase in the temperature.

Viewing the planet from space is revealing:

[Figures: The infra-red spectrum of the Earth viewed from space, compared with an ideal 280 K emitter]

The upper grey line shows the spectrum which approximates that of a planet of Earth’s average albedo at a temperature of 280 K. That is the temperature about 5 km above the surface, where incoming and outgoing radiation are in balance. The actual spectrum is shown by the blue line. The difference between the two is the energy lost by scattering processes caused by greenhouse gases. Water vapour has by far the largest effect. CO2 contributes to the loss between about 13 and 17 µm, and ozone contributes to the loss between about 9 and 10 µm.
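The position of the CO2 band relative to the planet’s emission peak can be checked from the Planck law. The sketch below is an independent back-of-envelope calculation, not derived from the figure: Wien’s displacement law puts the 280 K peak near 2898/280 ≈ 10.4 µm, with the 15 µm CO2 band on the long-wavelength flank, where the planet still radiates strongly.

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_um, temp_k):
    """Blackbody spectral radiance (W sr^-1 m^-3) at a given wavelength."""
    lam = wavelength_um * 1e-6
    return (2.0 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * temp_k))

peak_um = 2898.0 / 280.0  # Wien displacement law, b ~= 2898 um K
ratio = planck(15.0, 280.0) / planck(peak_um, 280.0)
print(f"280 K emission peaks near {peak_um:.1f} um")
print(f"radiance at 15 um is {ratio:.0%} of the peak value")
```

The 15 µm band thus sits squarely in the thermal emission region, which is why its absorption is visible in the spectrum at all.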

The radiative effect of carbon dioxide grows only logarithmically with concentration: doubling the concentration will not double the effect. At present there is ~400 ppm in the atmosphere, and we are unlikely to see a much different world at 800 ppm. It will be greener – plants grow better on a richer diet – and it may be slightly warmer and slightly wetter, but otherwise it would look very like our present world.
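The logarithmic dependence can be illustrated with the widely used simplified forcing expression ΔF = 5.35 ln(C/C0) W/m² (an empirical fit due to Myhre and colleagues, not a formula from this essay). Equal ratios give equal increments, so each added ppm counts for progressively less:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing (W/m^2) relative to c0_ppm,
    using the common approximation dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

for c in (280, 400, 560, 800, 1120):
    print(f"{c:5d} ppm -> {co2_forcing(c):5.2f} W/m^2")

# Equal ratios give equal increments: 280 -> 560 and 560 -> 1120
# each add ~3.7 W/m^2, while each individual ppm matters less and less.
```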

However, the flip side of a logarithmic response is that the early increments of CO2 had proportionately the largest effect. If there are to be any observable effects, they should therefore already be visible in the historical records. Have we seen them?

There are “official” historical global temperature records. A recent version from the Hadley Climate Research unit is:

[Figure: Hadley Centre global temperature anomaly record]

The vertical axis gives what is known as the “temperature anomaly”, the change from the average temperature over the period 1950-1980. Recall that carbon dioxide only became significant after 1950, so we can look at this figure with that fact in mind:

* from 1870 to 1910, temperatures dropped, yet there was no significant rise in carbon dioxide;

* from 1910 to 1950, temperatures rose, yet there was still no significant rise in carbon dioxide;

* from 1950 to 1975, temperatures dropped while carbon dioxide increased;

* from 1975 to 2000, both temperature and carbon dioxide increased;

* from 2000 to 2015, temperatures rose only slowly while carbon dioxide increased strongly.

Does carbon dioxide drive temperature changes? Looking at this evidence, one would have to say that, if there is any relationship, it must be a very weak one. In one study I made of the ice core record over 8 000 years, I found that there was a 95% chance that the temperature would change naturally by as much as ±2 °C during 100 years. During the 20th century, it changed by about 0.8 °C. The conclusion? If carbon dioxide in the atmosphere does indeed cause global warming, then the signal has yet to emerge from the natural noise.
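That signal-to-noise claim can be made concrete with a back-of-envelope check. Taking the essay’s ice-core estimate at face value – natural century-scale changes normally distributed with 95% bounds of ±2 °C, i.e. a standard deviation near 1 °C – a 0.8 °C century is unremarkable. A sketch under exactly those assumptions:

```python
import math

# 95% of natural century-scale changes fall within +/-2 C
# (per the ice-core estimate in the text), so sigma ~= 2 / 1.96.
sigma = 2.0 / 1.96
observed = 0.8  # approximate 20th-century change, degrees C

z = observed / sigma
# Two-sided normal tail probability via the complementary error function.
p_natural = math.erfc(z / math.sqrt(2))
print(f"z = {z:.2f}; chance of a change at least this large from "
      f"natural variation alone ~ {p_natural:.0%}")
```

On these assumptions a change of 0.8 °C or more would occur naturally in a large fraction of centuries, which is the essay’s point about the signal not yet emerging from the noise.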

One of the problems with the “official” temperature records such as the Hadley series shown above is that the official record has been the subject of “adjustments”. While some adjustment of the raw data is obviously needed, such as that for the altitude of the measuring site, the pattern of adjustments has been such as to cool the past and warm the present, making global warming seem more serious than the raw data warrants.

It may seem unreasonable to refer to the official data as “adjusted”. However, the basis for the official data is what is known as the Global Historical Climatology Network, or GHCN, and it has been arbitrarily adjusted. For example, it is possible to compare the raw data for Cape Town, 1880-2011, to the adjustments made to the data in developing GHCN series Ver. 3:

[Figure: Raw temperature data for Cape Town, 1880-2011, compared with the GHCN Ver. 3 adjustments]

The GHCN is compiled by NOAA, and the Goddard Institute for Space Studies uses it in its global temperature analysis. The Institute was approached for the metadata underlying the adjustments. They provided a single line of data, giving the station’s geographical co-ordinates and height above mean sea-level, and a short string of meaningless data including the word “COOL”. The basis for the adjustments is therefore unknown, but the fact that about 40 successive years of data were “adjusted” by exactly 1.10 °C strongly suggests fingers rather than algorithms were involved.

There has been so much tampering with the “official” records of global warming that they have no credibility at all. That is not to say that the Earth has not warmed over the last couple of centuries.  Glaciers have retreated, snow-lines risen. There has been warming, but we do not know by how much.

Interestingly, the observed temperatures are not unique. For instance, the melting of ice on Alpine passes in Europe has revealed paths that were in regular use a thousand years and more ago. They were then covered by ice which has only melted recently. The detritus cast away beside the paths by those ancient travellers is providing a rich vein of archaeological material.

So the world was at least as warm a millennium ago as it is today. It has warmed over the past few hundred years, but the warming is primarily natural in origin, and has nothing to do with human activities.  We do not even have a firm idea as to whether there is any impact of human activities at all, and certainly cannot say whether any of the observed warming has an anthropogenic origin. The physics say we should have some effect; but we cannot yet distinguish it from the natural variation.

Those who seek to accuse us of carbon crime have therefore developed another tool – the global circulation model. This is a computer representation of the atmosphere, which calculates the conditions inside a slice of the atmosphere, typically 5 km × 5 km × 1 km, and links each to an adjacent slice (if you have a big enough computer – otherwise your slices have to be bigger).

The modellers typically start their calculations some years back, for which there is a known climate, and try to see whether they can reproduce the (known) climate from their starting point up to today. There are many adjustable parameters in the models, and by twiddling enough of these digital knobs, they can “tune” the model to history.

Once the model seems to be able to reproduce historical data well enough, it is let rip on the future. There is a hope that, while the models may not be perfect, if different people run different tunings at different times, a reasonable range of predictions will emerge, from which some idea of the future may be gained.
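The hazard of tuning to history can be shown in miniature. The sketch below is a caricature, not a climate model: a 12-parameter polynomial is “tuned” to a short noisy series, reproduces it closely, and then extrapolates to nonsense – the generic danger of judging a many-knob model only by its fit to the training period.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 'history': a gentle warming trend plus weather noise.
years = np.arange(30.0)
history = 0.01 * years + rng.normal(0.0, 0.1, years.size)

# Rescale time so the high-degree fit is numerically well conditioned.
t = (years - years.mean()) / years.std()

# 'Tune' a 12-parameter model (degree-11 polynomial) to the history.
coeffs = np.polyfit(t, history, deg=11)
fit = np.polyval(coeffs, t)
rms = float(np.sqrt(np.mean((fit - history) ** 2)))

# Now 'predict' a decade beyond the training data.
t_future = (40.0 - years.mean()) / years.std()
forecast = float(np.polyval(coeffs, t_future))

print(f"rms error inside the training period: {rms:.3f}")
print(f"'prediction' for year 40: {forecast:.1f}")  # far from the ~0.4 the trend implies
```

The in-sample fit looks excellent; the out-of-sample “prediction” is wild. Real GCMs are physically constrained and far from free polynomials, but the tuning-then-extrapolating logic is the same.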

Unfortunately the hopes have been dashed too often. The El Niño phenomenon is well understood and has a significant impact on the global climate, yet none of the models can cope with it. Similarly, the models cannot do hurricanes/typhoons – the 5 km × 5 km scale is just too coarse. They cannot do local climates: a test of two areas only 5 km apart, one of which receives at least 2 000 mm of rain annually while the other averages just 250 mm, failed badly. There were good wind and temperature data, and the local topography was well characterized. The problem was modelled with a very fine grid, but there were not enough tuning knobs to be able to match history.

Even the basic physics used in these models fails. The basic physics predicts that, between the two Tropics, the upper atmosphere should warm faster than the surface. We regularly fly weather balloons carrying thermometers into this region. There are three separate balloon data sets, and they agree that there is no sign of extra warming:

[Figure: Tropical tropospheric warming trends, balloon measurements versus 22 models]

The average of the three sets is given by the black squares. The altitude is given in terms of pressure, 100 000 Pa at ground level and 20 000 Pa at about 9 km above the surface. There are 22 different models, and their average is shown by the black line. At ground level, measurement shows warming of 0.1 °C per decade, but the models predict 0.2 °C per decade. At 9 km, measurement still shows close to 0.1 °C per decade, but the models show an average of 0.4 °C and extreme values as high as 0.6 °C per decade. Models that are wrong by a factor of 4 or more cannot be considered scientific. They should not even be accepted for publication – they are wrong.

The hypothesis that we can predict future climate on the basis of models that are already known to fail is false. International agreements to control future temperature rises to X °C above pre-industrial global averages have more to do with the clothing of emperors than reality.

So the third step in our understanding of the climate boondoggle can only conclude that yes, the world is warming, but by how much and why, we really haven’t a clue.

What might the climate effects of a warmer world be? First, what is “climate”? It is the result of averaging a climatological variable, such as rainfall or atmospheric pressure, measured typically over a month or a season, where the average is taken over several years so as to give an indication of the weather that might be expected in that month or season.

Secondly, we need to understand the meaning of “change”. In this context it clearly means that the average of a climatological variable over X number of years will differ from the same variable averaged over a different set of X years.  But it is readily observable that the weather changes from year to year, so there will be a natural variation in the climate from one period of X years to another period X years long.  One therefore needs to know how long X must be to determine the natural variability and thus to detect reliably any change in the measured climate.
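For roughly independent years with year-to-year standard deviation σ, the standard error of an X-year mean is σ/√X, so the difference between two X-year means has standard deviation σ√(2/X). The helper below is a hypothetical illustration that turns that into a required record length; it assumes independent, identically distributed years, whereas real climate series are autocorrelated, which lengthens the requirement further:

```python
import math

def years_needed(sigma, shift, z=1.96):
    """Years per averaging window so that a difference of `shift`
    between two independent window means is significant at ~95%.
    Assumes independent, identically distributed years."""
    # The difference of two window means has std sigma * sqrt(2 / X);
    # require z * sigma * sqrt(2 / X) < shift and solve for X.
    return math.ceil(2.0 * (z * sigma / shift) ** 2)

# e.g. a variable fluctuating by 15% of its mean from year to year,
# looking for a 5% shift between two periods:
print(years_needed(sigma=0.15, shift=0.05))  # -> 70 years per window
```

Two windows of that length are needed, so even this idealized case demands well over a century of data.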

This aspect of “climate change” appears to have been overlooked in all the debate.  It seems to be supposed that there was a “pre-industrial” climate, which was measured over a large number of years before industry became a significant factor in our existence, and that the climate we now observe is statistically different from that hypothetical climate.

The problem, of course, is that there is very little actual data from those pre-industrial days, so we have no means of knowing what the climate really was.  There is no baseline from which we can measure change.

Faced by this difficulty, the proponents of climate change have modified the hypothesis. It is supposed that the observed warming of the earth will change the climate in such a way as to make extreme events more frequent. This does not alter the difficulty; in fact, it makes it worse.

To illustrate, assume that an extreme event is one that falls outside the 95% probability bounds, so that in 100 years one would expect 5 extreme events on average. Whereas the average climate can be estimated from all 100 years of data, the average extreme event must now be estimated from only 5 data points, and the relative error in averaging 5 variable events is obviously much larger than the relative error in averaging 100.

The rainfall data for England and Wales demonstrate this quite convincingly:

[Figure: The 250-year England and Wales rainfall record, detrended, with 95% bounds]

The detrended data are close to normally distributed, so that it is quite reasonable to use normal statistics for this. The 5% limits are thus two standard deviations either side of the mean.  In the 250-year record, 12.5 extreme events (those outside the 95% bounds) would be expected.  In fact, there are 7 above the upper bound and 4 below the lower bound, or 11 in total. Thus it requires 250 years to get a reasonable estimate (within 12%) of only the frequency of extreme rainfall.  There is no possibility of detecting any change in this frequency, as would be needed to demonstrate “climate change”.

Indeed, a human lifespan is insufficient even to detect the frequency of the extreme events. In successive 60-year periods, there are 2, 4, 2 and 2 events, an average of 2.5 events with a standard deviation of 1.0. There is a 95% chance of seeing between 0.5 and 5.5 extreme events in 60 years, where 3 (5% of 60) are expected. Several lifetimes are necessary to determine the frequency with any accuracy, and many more to determine any change in that frequency.
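The counting statistics above follow from the binomial distribution and are easy to check by simulation. A sketch assuming independent years and a fixed 5% chance of an “extreme” year – both idealizations:

```python
import random
import statistics

random.seed(1)

def extreme_counts(window_years, trials=5000, p=0.05):
    """Simulate how many 'extreme' years (probability p each) fall in a
    window of the given length; years are assumed independent."""
    counts = [sum(random.random() < p for _ in range(window_years))
              for _ in range(trials)]
    return statistics.mean(counts), statistics.stdev(counts)

for window in (60, 100, 250):
    mean, sd = extreme_counts(window)
    print(f"{window:3d} years: {mean:5.2f} +/- {sd:4.2f} extreme years "
          f"(relative spread {sd / mean:.0%})")
```

Even a 250-year record leaves a sizeable relative spread in the count, in line with the England and Wales result: estimating a *change* in the frequency is harder still.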

It is known to have been warming for at least 150 years. If warming had resulted in more extreme weather, some evidence of an increase in extreme events might be expected over that period. The popular press certainly tries to be convincing whenever an apparently violent storm arises. But none of the climatological indicators with data going back at least 100 years shows any sign of an increase in the frequency of extreme events.

For instance, there have been many claims that tropical cyclones are increasing in their frequency and severity.  The World Meteorological Organisation reports:  “It remains uncertain whether past changes in tropical cyclone activity have exceeded the variability expected from natural causes.”

It is true that the damage from cyclones is increasing, but this is not due to more severe weather.  It is the result of there being more dwellings, and each dwelling being more valuable, than was the case 20 or more years ago.  Over a century of data was carefully analysed to reach this conclusion.  The IPCC report on extreme events agrees with this finding.

Severe weather of any kind is most unlikely to make any part of our planet uninhabitable – that includes drought, severe storms and high winds. In fact, this is not too surprising – humanity has learned how to cope with extreme weather, and human beings occupy regions from the most frigid to the most scalding, from sea level to heights where sea-level-dwellers struggle for breath. Not only are we adaptable, but we have also learned how to build structures that will shield us from the forces of nature.

Of course, such protection comes at a cost. Not everyone can afford the structures needed for their preservation. Villages are regularly flattened by storms that would leave most modern cities undamaged. Flood control measures are designed for one-in-a-hundred-year events, and they generally work – whereas low-lying areas in poor nations are regularly inundated for want of suitable defences.

Indeed, it is a tribute to the ability of engineers to protect against all manner of natural forces. For instance, the magnitude 9 Tōhoku earthquake of 2011 (which caused the tsunami that destroyed the reactors at Fukushima) caused little physical damage to buildings, whereas earlier that year the magnitude 6.3 earthquake in Christchurch, New Zealand, toppled the cathedral, which was not designed to withstand earthquakes.

We should not fear extreme weather events. There is no evidence that they are any stronger than they were in the past, and most of us have adequate defences against them. Of course, somewhere our defences will fail, but that is usually because of a design fault by man, not an excessive force of Nature. Here, on the fourth step of our journey, we can clearly see the climate change hypothesis stumble and fall.

In the same way, most of the other scare stories about “climate change” fail when tested against real data. Polar bears are not vanishing from the face of the earth; indeed, the International Union for Conservation of Nature can detect no change in the rate of loss of species over the past 400 years. Temperature has never been a strong determinant of the spread of malaria – lack of public health measures is a critical component in its propagation. Species are migrating, but whether temperature is the driver is doubtful – diurnal and seasonal temperature changes are so large that a fractional change in the average temperature is unlikely to be the cause. Glaciers are melting, but the world is warmer, so you would expect them to melt.

There remains one last question – will the seas rise and submerge our coastlines?

First it needs to be recognized that the sea level is rising, and has been for about the past 25 000 years. However, for the past seven millennia it has been rising more slowly than at any earlier time:

[Figure: Sea level rise over the past 25 000 years]

The critical question is whether the observed slow rate of rise has increased as a result of the warming climate. There are several lines of evidence that it has not.  One is the long-term data from tide gauges. These have to be treated with caution because there are areas where the land is sinking (such as the Gulf of Mexico, where the silt carried down the Mississippi is weighing down the crust), and others where it is rising (such as much of Scandinavia, relieved of a burden of a few thousand metres of ice about 10 000 years ago). A typical long-term tide gauge record is New York:

[Figure: The New York tide gauge record, 1860-2014]

The 1860-1950 trend was 2.47-3.17 mm/a; the 1950-2014 trend was 2.80-3.42 mm/a, both at the 95% confidence level. The two trends are statistically indistinguishable: any acceleration after 1950 is too small to be detected at that confidence level.
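A crude way to express the comparison is to ask whether the two confidence intervals overlap – a conservative check, since a proper test would examine the difference of the slopes directly. Using the interval endpoints quoted above:

```python
def intervals_overlap(a, b):
    """True if two (low, high) confidence intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

early = (2.47, 3.17)  # 1860-1950 trend, mm/a, 95% CI (from the text)
late = (2.80, 3.42)   # 1950-2014 trend, mm/a, 95% CI (from the text)

print(intervals_overlap(early, late))  # True: no detectable change in trend
```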

Another line of evidence comes from satellite measurements of sea level. The figure below shows the latest available satellite information – it extends back only to 1993. Nevertheless, the 3.3 ± 0.3 mm/a rise in sea level is entirely consistent with the tide gauge record:

[Figure: Satellite-measured global mean sea level since 1993]

Thus several lines of evidence point to the present rate of sea level rise being about 3 mm/a, or 30 cm per century. Our existing defences against the sea have to deal with diurnal tidal changes of several metres, and low-pressure-induced storm surges of several metres more. The average height of our defences above mean sea level is about 7 m, so a further 0.3 m of sea level over the next century would make only a marginal difference to the number of waves that occasionally overtop the barrier.
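The arithmetic behind those figures is trivial but worth spelling out, using the ~3 mm/a rate and ~7 m defence height quoted above:

```python
# Back-of-envelope: current rise rate versus a typical defence height.
rate_mm_per_year = 3.0    # from tide gauges and satellites (see text)
rise_m_per_century = rate_mm_per_year * 100 / 1000.0
defence_height_m = 7.0    # average defence height quoted in the text

fraction = rise_m_per_century / defence_height_m
print(f"{rise_m_per_century:.1f} m per century, "
      f"or {fraction:.1%} of a 7 m defence")
```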

The IPCC predicts that the sea level will rise by between 0.4 and 0.7 m during this century. Given the wide range of the prediction, there is a possibility they could be right. Importantly, even a 0.7 m rise is not likely to be a disaster, in the light of the fact that our defences are already metres high – adding 10% to them over the next 80 years would be costly, but we would have decades to make the change, and should have more than adequate warning of any significant increase in the rate of sea level rise.

To conclude, our five steps have shown:

· The combustion of ever-increasing quantities of fossil fuel has boosted the carbon dioxide concentration of the atmosphere.

· The physical impact of that increase is not demonstrable in a scientific way. There may be some warming of the atmosphere, but at present any warming is almost certainly hidden in the natural variation of temperatures.

· There is no significant evidence either for any increase in the frequency or magnitude of extreme weather phenomena, or for climate-related changes in the biosphere.

· Any sea level rise over the coming century is unlikely to present an insuperable challenge.

Attempts to influence global temperatures by controlling carbon dioxide emissions are likely to be both futile and economically disastrous.

Nicholas Schroeder
April 8, 2016 8:08 am

Per IPCC AR5 Table 6.1, the increase in CO2 is 2/3 fossil fuels and 1/3 land use, which rarely gets mentioned.

ossqss
Reply to  Nicholas Schroeder
April 8, 2016 10:06 am

Courtesy of R.P. Sr. via twitter yesterday relating to such.
http://onlinelibrary.wiley.com/doi/10.1002/2015JD024102/full

RWturner
Reply to  Nicholas Schroeder
April 8, 2016 11:12 am

Also, CO2 is always shown to be a normal greenhouse gas along with the others, but it’s not. CO2 is a non-polar molecule, unlike the others, and has only a temporary induced dipole. I think this may be why the GCMs fail so miserably.
CO2 is not the greenhouse gas they make it out to be; it absorbs no infrared radiation when not in its temporary dipole state, and the wavelengths it does absorb as a dipole were already opaque to the atmosphere in preindustrial times.
https://stevengoddard.wordpress.com/2014/01/25/ir-expert-speaks-out-after-40-years-of-silence-its-the-water-vapor-stupid-and-not-the-co2/

mark4asp
Reply to  RWturner
April 8, 2016 2:53 pm

Methane is a polar molecule? It does not look like it. It’s a symmetric tetrahedron and the C-H dipole is not very large at all. I don’t see much of a dipole for methane and notice no electron asymmetry. At least with O=C=O, oxygen has significant electronegativity. I guess methane must be “special” in some way, with its GHG effect due to some other cause?

Reply to  RWturner
April 9, 2016 9:48 am

Not true: the bending mode of CO2, which produces the wavelength we are concerned with here, is perpetually bending even in its ground state. Consequently its dipole is constantly changing and is nonzero for the vast majority of the time; only very briefly during this vibration is the molecule straight. Think of it like a pendulum, which is vertical for only a very small part of its period (the period of the CO2 vibration is of order 10^-14 s). In the case of CO2 it is further complicated by the fact that there are two orthogonal bending modes.
The post on Goddard’s site is a joke: an astronomer notes that the ‘window’ region in which he was constrained to make his observations showed no absorption by CO2, and concludes that CO2 doesn’t absorb IR – but the whole reason he was using that ‘window’ was that CO2 doesn’t emit there! Had he tried to make observations at 14-15 microns he would only have seen CO2 emission.
Mark also comments that methane is a symmetrical molecule; the same logic applies there – the individual bonds are vibrating independently, so the molecule is virtually never symmetrical!
http://www2.ess.ucla.edu/~schauble/MoleculeHTML/CH4_html/CH4_page.html
mark4asp April 8, 2016 at 2:53 pm

Ron Clutz
April 8, 2016 8:10 am

Excellent, concise argument against global warming alarms. Why are such articles not widely available for the public to appreciate and understand?
You are doing science. But “Climate Science” these days is something else altogether. For those who have forgotten how we got here, Richard Lindzen remembers:
https://rclutz.wordpress.com/2016/04/08/climate-science-was-broken/

Reply to  Ron Clutz
April 8, 2016 8:18 am

+10

Barbara
Reply to  Ron Clutz
April 8, 2016 8:22 pm

Can’t make money off what Prof. Lloyd has written!

Reply to  Barbara
April 9, 2016 5:24 am

That’s the bottom line, all right.

Ian Magness
April 8, 2016 8:14 am

Quite brilliant argument. This should be widely circulated in government and education, to name but two spheres.

mikelowe2013
Reply to  Ian Magness
April 8, 2016 2:03 pm

…and the MSM, but their “investigative journalists” would not understand even the simple parts of this reasoning. Our daily newspaper in Auckland is an extreme example, regularly printing such ridiculous garbage that one has to wonder what financial inducements are involved. Similarly, the most unscientific young reporters are permitted to spout ignorant personal opinions with no balancing fact-based comment.

Firey
Reply to  mikelowe2013
April 8, 2016 6:01 pm

Exactly the same in Sydney: one of the major daily newspapers prints any “wild” climate change claim. No investigation required – just print it. It seems some reporters are “green” activists first and investigative journalists second.
As our former PM said, “beware of socialism masquerading as environmentalism”.

Tom Yoke
Reply to  Ian Magness
April 8, 2016 6:58 pm

mikelowe, I yield to no one in cynicism towards the MSM, but I don’t really agree with your analysis of the problem.
The problem is that any reporter who went against the “Green” platitudes that pervade such places, would very likely find his/her own career in jeopardy. They would most certainly be accused of being a “denier”, with overtones of the Holocaust. They are also certain to be accused of being in the pay of the fossil fuel companies.
Furthermore, and very much to the point, these accusations would be made by people who seriously matter in determining the reporter’s rise or fall in status. This is not a mild effect. Conformity to the dictates of political correctness is an important part of the power dynamic faced by an MSM reporter.
The root problem then is political correctness. That is why the MSM is uninterested in articles like the above. Political correctness is the contemporary secular version of “praying in public”. It is how one announces to the world one’s own virtue and moral righteousness. By comparison, scientific arguments are hard, uncomfortable, and uncertain, and will always face serious difficulties in prevailing against this powerful force.

Gerard Flood
Reply to  Tom Yoke
April 8, 2016 8:09 pm

Tom Yoke, Thank you for identifying the ‘force’ which mandates that MSM ‘group speak’! However, this raises the question of what explains the overwhelming level of that illegitimate force in a supposedly open, educated, egalitarian, democratic and self-respecting society. [Incidentally, from my world view, it reminds me of the support by Western opinion leaders for Communism, from blinded English fools in the times of Lenin, to the on-going blindness to mass murderers and torturers as are Castro and [were] Che Guevara.] What marks these campaigns is the eternal war against our search for truth. The ordinary voter knows the MSM has zero credibility. Be a ‘backbone’, not a ‘jawbone’. Support your Proven Alternative Media with your public actions!!

Richard Petschauer
Reply to  Ian Magness
April 8, 2016 8:35 pm

Agree. Great paper!! I would add another point:
Was the preindustrial temperature, that we use as a baseline for warming, the ideal temperature?

Steve Case
April 8, 2016 8:18 am

The IPCC predicts that the sea level will rise by between 0.4 and 0.7m during this century. Given the wide range of the prediction, there is a possibility they could be right.
Simple arithmetic says that in order for that to happen the seas need to rise at an average rate of 4.8 or 8.3 millimeters per year respectively starting right now. Satellites say the rate today is around 3 mm/yr and tide gauges a good deal less than that. Satellite data shows no acceleration. Tide gauges show so much variation between locations it is difficult to say.

Menicholas
Reply to  Steve Case
April 8, 2016 1:08 pm

Also helpful to keep in mind a few other salient points regarding damage from rising sea level:
– Structures do not last forever, particularly housing built to modern US standards.
– Coastal storms already inundate areas of our coastlines every year, and there is much evidence that we are in a very weak point in natural hurricane cycles. In decades past, destructive coastal storms were more common. When I was a kid, the coastlines of the US were sparsely built up. Today, expensive homes line the coast nearly everywhere.
– Given the above two points, how many places will be flooded by rising seas before they are destroyed by a coastal storm? I think that number is very low.
- Despite the near constant barrage of warnings about the dangers of climate change, and the alarmist fairy tales of accelerating sea level rise, homes and businesses and entire cities and stretches of coastline are rebuilt as quickly as humanly possible after storms like Katrina and Sandy. If an acceleration of sea level rise is thought to be a reality by so many, why is this the case? Why are the politicians who issue dire warnings about climate change and rising seas also the same people who cannot throw money fast enough to the people who choose to build in risky coastal and low-lying locations?
Actions speak louder than words, and the actions of politicians and filthy rich entertainers do not match up with the words that spew from their mouths.

Reply to  Menicholas
April 10, 2016 7:16 pm

Excellent point Menicholas…. It never fails to amaze me that riverside city suburbs in my home town of Brisbane, QLD, Australia are “allowed” (if not encouraged) to rebuild in the same area with the same chance of 1-in-20-year (or whatever) floods…. These suburbs should be reserved for parks or other such zoning where the financial consequences of floods are minimal… Of course I am sure that subsidizing the relocation of such numbers of residents/businesses would be regarded as prohibitively expensive, and in our culture I doubt very much that the people could be persuaded (without force (:-)) to do so, but the role of government should be at least to make it harder to rebuild and to give incentives to move to “safer” areas….

Tom Halla
April 8, 2016 8:37 am

The post is a good overview of “global warming”.

birdynumnum
April 8, 2016 8:37 am

Just a point. The earthquake that toppled the cathedral in 2011 was in Christchurch NZ not Wellington.

HAS
Reply to  birdynumnum
April 8, 2016 12:53 pm

Ironically, if it had been in Wgtn it would have been built or strengthened to a code that would have meant it was still standing.

The_Iceman_Cometh
Reply to  birdynumnum
April 9, 2016 1:39 am

Thanks – quite correct!

April 8, 2016 8:45 am

Nice concise summary of the situation as it stands today. Here’s one observation that might reflect on the merits of the global warming story.
Climate models needed documented volcanic eruptions to put “aerosols” into the atmosphere to cause short-term cooling to bring the model predictions into approximate conformity with historical reality. Without those volcanoes, their modelled rate of warming would have been greater than the observed rate of warming.
As far as I can tell, climate models have no volcanoes in the future. This is totally unrealistic and renders the models invalid, BASED ON THEIR OWN INTERNAL LOGIC.
If anyone knows differently, please say so. I’m quite prepared to be corrected. I’ve been wrong before.

old engineer
Reply to  Smart Rock
April 8, 2016 11:56 am

Smart Rock –
I don’t know about all climate models, but Dr. Hansen’s GISS model in his 1988 paper did include future volcanos in two of his scenarios. Quote from his paper:
“4.2 Stratospheric Aerosols
….. In scenarios B and C additional large volcanoes are inserted in the year 1995 (identical in properties to El Chichon), in the year 2015 (identical to Agung) and in the year 2025 (identical to El Chichon) ….”
Note that scenario A in Hansen’s paper did not include volcanoes. It would indeed be interesting to know how many models do include future volcanoes.

Reply to  Smart Rock
April 8, 2016 1:11 pm

Terrestrial volcanoes only, versus ocean-floor volcanic & geothermal heat flux and gases, of which very little is actually known.

April 8, 2016 8:54 am

Excellent summary. While most was probably known to most regulars here, it’s a substantial service to rehearse the evidence as the author did in a clear and organized manner.
Moreover, at least the following observation had previously escaped at least my attention:

In one study I made of the ice core record over 8,000 years, I found that there was a 95% chance that the temperature would change naturally by as much as ±2 °C during 100 years.

This provides yet another basis for stating the conclusion that many had come to:

During the 20th century, it changed by about 0.8 °C. The conclusion? If carbon dioxide in the atmosphere does indeed cause global warming, then the signal has yet to emerge from the natural noise.

Great job.

Reply to  Joe Born
April 8, 2016 9:52 am

The lack of an observed tropical mid-to-upper-tropospheric warm spot, as predicted by strong CO2 greenhouse warming theory (the 8th figure in the above essay), should be seen as strong evidence to reject the theory. That mainstream climate science can't find it within itself to do that (probably for reputational, monetary and political reasons) identifies today's state of climate science as junk science.

Alastair Brickell
Reply to  Joe Born
April 9, 2016 7:37 am

This is a very important point…perhaps the author of the article could provide more detail of his study.

The_Iceman_Cometh
Reply to  Alastair Brickell
April 9, 2016 12:29 pm

The ice core study was Lloyd, Philip J. An estimate of the centennial variability of global temperatures. Energy & Environment, 26(3), pp. 417–424 2015. DOI: 10.1260/0958-305X.26.3.417
The upper troposphere story came from the American Physical Society Climate Change Statement Review Workshop in January 2014, the transcript of which is required reading, if only for the struggle Ben Santer had to stop the other participants from taking him seriously.

Reply to  Alastair Brickell
April 10, 2016 6:39 am

The_Iceman_Cometh:
Thank you for the cite.
I must confess, though, that after reviewing it I now find Dr. Lloyd’s (to me) new argument less compelling than it first appeared, for the reason Brandon Shollenberger sets forth nearby (although I see no basis for his claiming that the head post makes a “ton of errors”).
It turns out that what Dr. Lloyd studied was the records of four isolated locations. It seems clear that at least as far as annual-temperature anomalies go we would expect a single location’s variance greatly to exceed that of the average surface temperature. I haven’t thought through whether that should be as true of centennial trends, but at first blush it does seem to me that even for centennial durations a global trend is likely to be amplified in the local trends of high-latitude locations such as Vostok. That is, the fact that a local trend has varied by a certain amount in pre-industrial periods may not tell us much about how unusual a lesser global trend since then is.
Dr. Lloyd addressed a similar objection elsewhere, but, perhaps because of my statistics limitations or failure to understand the data, I was unable to see how what he said was an adequate answer.
So, while I continue to be impressed by Dr. Lloyd’s post in general, I will put off for now adding that new argument to my arsenal.

April 8, 2016 8:57 am

Well done. Here’s a way to think of these steps – a Markov chain of events, where each event depends on the previous one happening.
The Markov chain going from “CO2 is going up” to “let’s do some effective measures to counter it” is lost on the public. Each of these steps has a probability of being right, but you have to multiply the probabilities together to get the probability of the entire sequence being correct.
Sample calculation for a positive outcome for reducing CO2 emissions in order to avoid dangerous climate changes:
(1) emissions are causing CO2 to go up – I think the author demonstrates p = 1.0
(2) Physical impact of increase in CO2 is measurable in dangerous temperature changes, extreme events, or dangerous acceleration in sea level rise – I think the author demonstrates p = 0.01 of this being true.
(3) probability of success of centralized command-economy efforts to mitigate CO2 rise sufficiently to cap CO2 below, say, 500 ppm: p = 0.01. These types of efforts rarely succeed, and the side effects are usually far more costly than the benefit. Source: history of command economies.
The total probability of this chain is 1.0 * 0.01 * 0.01 = 0.0001. You can interpret this to mean that the CAGW meme has a 0.01% chance of a net positive outcome if carried out in its entire detail of attempting to mitigate CO2 emissions. The chance of a negative outcome is 1 minus this, or 99.99%.
You can compare this to the alternative proposed by the author – mitigate if the worst happens. The probability of a net positive outcome is far higher, because mitigation is local, i.e. distributed, and there’s no chance of bad command economy effects in the common case where nothing happens.
You have to calculate two Markov chains here: the probability of a positive outcome for “nothing happens” and for “something happens, mitigate”:
(1) emissions cause CO2 to go up: p = 1.0
(2) physical impact requiring mitigation: p = 1 – 0.01 = 0.99; note this is the inverse of the above, since we are calculating positive outcomes here.
Do-nothing positive outcome: 0.99
For the mitigation chain:
(1) emissions cause CO2 to go up: p = 1.0
(2) physical impact requires mitigation: p = 0.01
(3) mitigation effective: a guess, but let’s say p = 0.5
Probability of a positive outcome is 1 * 0.01 * 0.5 = 0.005. You add this to the “do nothing” outcome to get a 99.5% chance of a positive outcome.
A more detailed analysis would assign $ or some other metric of value to these rather than comparing probabilities.
Peter
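Peter's chained probabilities above can be sketched in code. The step probabilities are his illustrative guesses, not measured values, and the function name is made up for the sketch.

```python
# Sketch of the chained-probability argument above: the chance of a net
# positive outcome is the product of the probabilities of each step.
def chain_probability(*step_probs):
    """Multiply the probabilities of sequential dependent steps."""
    p = 1.0
    for step in step_probs:
        p *= step
    return p

# Mitigation chain: emissions raise CO2, the impact is dangerous,
# and centralized mitigation succeeds.
p_mitigate = chain_probability(1.0, 0.01, 0.01)  # 0.0001, i.e. 0.01%

# Adaptation: the "nothing happens" branch, plus the branch where
# something happens and local mitigation works.
p_adapt = chain_probability(1.0, 0.99) + chain_probability(1.0, 0.01, 0.5)  # 0.995
```

The comparison is only as good as the guessed inputs, which is the commenter's own caveat about assigning dollar values instead.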

Marcus
April 8, 2016 9:02 am

..WOW, this is definitely the best and most informative analysis of the whole Climate Change / Glo.Bull Warming picture !! Thank You, I will spread it around !!

chaamjamal
April 8, 2016 9:15 am

The only empirical evidence that relates warming to fossil fuel emissions is a correlation between cumulative values. This correlation is spurious.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2725743
Also, no correlation between the rate of change of atmos co2 and the rate of fossil fuel emissions:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2642639
I think the fundamental issue is that we don’t know natural flows well enough to measure the effect of fossil fuel emissions.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2654191

Reply to  chaamjamal
April 8, 2016 11:40 am

Chaamjamal,
“also no correlation between the rate of change of atmos co2 and the rate of fossil fuel emissions
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2642639
This paper is superb. I think it will make Engelbeen’s head explode…

Reply to  Michael Moon
April 8, 2016 1:09 pm

Michael,
No reason at all to explode, only to shake my head for such stupidity…
Take the second paper:
“is likely to be spurious because it vanishes when the two series are detrended.”
What happens if you detrend the emissions and the increase in the atmosphere?
You remove the correlation and the causation in this case, which lies in the common trend of the emissions and of the increase in the atmosphere…
All that remains is a huge correlation between temperature variability and CO2 rate-of-change variability, which is a correlation of only the noise around the trend: +/- 1.5 ppmv around a trend of 80+ ppmv over the past 55 years, a variability which zeroes out after a few years…

CaligulaJones
April 8, 2016 9:19 am

Thank you for a brilliantly clear view of why I call myself a “lukewarmer”.

Reply to  CaligulaJones
April 10, 2016 7:27 pm

!(:-)!

Jpatrick
April 8, 2016 9:25 am

“The annual rise and fall [of CO2 within a year] is due to deciduous plants growing or resting, depending on the season. ”
I used to presume this was true, but really? For this to happen, deciduous plant activity would have to be considerably greater in either the Northern Hemisphere or the Southern Hemisphere. Is that the case?

ECB
Reply to  Jpatrick
April 8, 2016 9:39 am

Yes, the Arctic tundra is not duplicated in the southern hemisphere.

Reply to  ECB
April 8, 2016 10:00 am

Neither is the boreal forest. I’m looking at it now. It’s quite extensive, and it’s tough as heck to walk through.

Reply to  ECB
April 8, 2016 10:22 am

Indeed, the forests of Siberia, Canada, Alaska, and the northern US are not duplicated in the Southern Hemisphere. And the tropical forests of Amazonia, the Congo, Malaysia-PNG, and Indonesia exhibit a photosynthesis cycle not linked to the seasonal component.
A recent Science article on Amazonia photosynthesis describes this well:

Abstract:
In evergreen tropical forests, the extent, magnitude, and controls on photosynthetic seasonality are poorly resolved and inadequately represented in Earth system models. Combining camera observations with ecosystem carbon dioxide fluxes at forests across rainfall gradients in Amazônia, we show that aggregate canopy phenology, not seasonality of climate drivers, is the primary cause of photosynthetic seasonality in these forests. Specifically, synchronization of new leaf growth with dry season litterfall shifts canopy composition toward younger, more light-use efficient leaves, explaining large seasonal increases (~27%) in ecosystem photosynthesis. Coordinated leaf development and demography thus reconcile seemingly disparate observations at different scales and indicate that accounting for leaf-level phenology is critical for accurately simulating ecosystem-scale responses to climate change.
http://science.sciencemag.org/content/351/6276/972

ExNOAAman
Reply to  Jpatrick
April 8, 2016 10:43 am

That’s right. The NH is largely land mass (with plants), and the SH is mostly water (ocean), so you’ve figured it out.

CaligulaJones
Reply to  ExNOAAman
April 8, 2016 10:51 am

“That’s right. The NH is largely land mass (with plants), and the SH is mostly water (ocean), so you’ve figured it out.”
Which is why I get a bit ornery when the warmists try to raise the old “the Medieval Warming Period and Little Ice Age were only local/regional/NH” crap.

Jpatrick
Reply to  Jpatrick
April 8, 2016 10:47 am

Joel. Many thanks for that.

Reply to  Jpatrick
April 8, 2016 2:18 pm

Look at a globe, and note whether there is a big difference between the land area in the Northern vs. Southern Hemisphere. Subtract out the completely barren continent of Antarctica, and there is very little land surface outside of the tropics in the S.H.

Reply to  Menicholas
April 10, 2016 7:31 pm

& additionally one of the largest SH land surfaces (Australia) is pretty “plant poor” vs. your average NH surface….

Philip Lloyd
Reply to  Jpatrick
April 9, 2016 3:56 am

There is far more land in the Northern hemisphere.

Reply to  Jpatrick
April 10, 2016 7:50 pm

Is it more due to the temperature-change influence on the oceans (the biggest sink/source of CO2, correct?)? How much CO2 is gained/released by the ocean per degree of temperature difference between summer and winter, does anyone know? That should cause some seasonal fluctuation in CO2 (and in both hemispheres). Also, I presume NH forests (at least above the tropics) are net sinks/absorbers of CO2 in summer (depending on growth rates, rainfall events, and other seasonal patterns as per another post here…) but more neutral in winter… Then factor in more emissions from e.g. air conditioners in summer, but presumably a lot more fires burning in NH winter… And all with extreme differences between NH and SH, between ocean vs. land uses, and a plethora of other factors… Another extremely complex system…. How the hell could you model that with any hope of success?

Reply to  llydon2015
April 12, 2016 7:09 am

llydon2015,
The seasonal in/out fluxes are reasonably well known, based on O2 and δ13C changes: changes in CO2 uptake/release by vegetation make a lot of difference, because O2 is released with CO2 uptake, and as that uptake is preferentially of 12CO2, the δ13C level increases at the same time. For the oceans, these changes are much smaller. Over a full seasonal cycle, the quantities involved are:
~50 GtC out and in the ocean surface
~60 GtC in and out the biosphere.
As these fluxes are countercurrent over the seasons and the seasons are opposite between NH and SH, the remaining changes in CO2 and δ13C are rather modest and mainly in the NH, where the changes in the NH biosphere are dominant. The changes in the SH are far smaller:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/month_2002_2004_4s.jpg

Trebla
April 8, 2016 9:34 am

A beautifully written and well reasoned article. We are an adaptive species. Far better to cope with any change that might come our way than to try to reverse a century of human progress and well being.

markl
April 8, 2016 9:40 am

Well said, especially for scientifically challenged people such as I. The last statement …”Attempts to influence global temperatures by controlling carbon dioxide emissions are likely to be both futile and economically disastrous.”…..brings us to the reason why AGW is being pushed. Economics, not temperature, is the reason.

Robert Barry
April 8, 2016 9:41 am

Most compelling and uplifting! Sadly, egos are winning the day…

ossqss
April 8, 2016 9:48 am

Thanks Professor Lloyd, nicely done.
I am hopeful this is viewed by some who have never gotten past the headlines.
Well worth the read for those activist types to better understand the strength of their arguments.
Where are all the resident naysayers trying to poke holes in this?

April 8, 2016 9:56 am

Excellent discussion!
The increase in our global temperature has affected the highest latitudes, the coldest times of year and the coldest times of day the most (you can see this in several metrics, like record high minimum temperatures far exceeding record high maximum temperatures).
Meteorology 101 tells us that when the warming is greater at higher latitudes, decreasing the meridional (north-to-south) temperature gradient, many types of extreme weather decrease.
The pressure gradient is reduced, along with wind. Jet streams are weaker. Energy for mid-latitude cyclones is reduced. The number of violent tornadoes (and the most damaging severe thunderstorms) is reduced.
Observations confirm this.
Increasing the global temperature has, however, allowed the atmosphere to hold more moisture. This HAS increased extremely heavy rain amounts and flooding for many events with a synoptic-scale setup similar to the past.
With additional low-level moisture and precipitable water, many rain events, even smaller-scale ones, have the potential to dump more rain, and faster. Again, this is meteorology 101.
Overall, the effects of increasing CO2 on our atmosphere, the weather and life on this planet, viewed objectively, are a net positive. In fact, the past 3 decades have featured the best weather/climate and growing conditions on this planet in almost 1,000 years (since the Medieval Warm Period, which was warmer than this in many places).
If you dial in the contribution from the effects of photosynthesis, the current atmosphere is benefiting life on this greening planet more than the Medieval Warm Period by a wide margin.

John W. Garrett
April 8, 2016 10:05 am

Fantastic!
Well done !!

D Lake
April 8, 2016 10:14 am

Good article. I have a question. Is it possible that an increase in CO2 in the upper atmosphere creates a slight cooling by re-radiating the sun's long-wave radiation back into space, allowing less to reach the surface of the earth?

Reply to  D Lake
April 8, 2016 1:18 pm

A basic application of the S-B equation shows that at -40C (lower troposphere) the theoretical emission of a black body is a fraction of that of a body at 15C (surface). That T^4 makes a big difference. And when applied to a grey body with emissivity included, the essentially-zero emissivity of CO2 turns the entire re-radiation story into nonsense. It's the water vapor that runs the greenhouse, not the pitiful ppm of GHGs.
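The T^4 comparison in this reply is easy to check numerically. A sketch assuming ideal blackbody emissivity of 1 (the commenter's further claims about CO2 emissivity are not modeled here; the function name is illustrative):

```python
# Sketch: Stefan-Boltzmann blackbody emission at -40 C vs +15 C,
# illustrating the T^4 scaling the commenter points to.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(temp_c):
    """Ideal blackbody radiative flux (W/m^2) for a Celsius temperature."""
    t_k = temp_c + 273.15
    return SIGMA * t_k ** 4

cold = blackbody_flux(-40.0)  # ~168 W/m^2 (lower-troposphere temperature)
warm = blackbody_flux(15.0)   # ~391 W/m^2 (mean surface temperature)
ratio = cold / warm           # ~0.43: the cold layer emits well under half
```

So the "fraction" in the comment works out to roughly 43% for an ideal blackbody, before any emissivity argument is applied.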

Reply to  D Lake
April 8, 2016 1:21 pm

D Lake,
The sun’s IR is a lot nearer the visible spectrum than the IR which is emitted at the earth’s temperature. That means CO2 is transparent to the sun’s (IR) spectrum, while opaque (in some narrow bands) to the earth’s radiation… Water, on the other hand, may absorb and re-emit to space some of the incoming near-IR from the sun. See the spectra here:
http://earthguide.ucsd.edu/eoc/special_topics/teach/sp_climate_change/p_solar_radiation.html

jorgekafkazar
Reply to  Ferdinand Engelbeen
April 8, 2016 2:05 pm

Thanks, F.E.

April 8, 2016 10:15 am

Re: Five points about climate change, 4/8/16, says,
“The carbon dioxide which is inevitably emitted accumulates in the atmosphere and the result is ‘climate change.’”
The assumption that CO2 accumulates in the atmosphere is essential to the AGW story, but it is invalid. It is essential in order that the Keeling Curve, which is not CO2 measurements at MLO but a doubly smoothed representation of those measurements, be attributed to a global phenomenon. For the story, CO2 must be a long-lived greenhouse gas, and be well-mixed in the atmosphere, two assumptions seriously challenged, if not contradicted, by satellite images of atmospheric CO2.
IPCC termed the Keeling Curve the master time series, which it then used to calibrate its global network of CO2 measuring stations into agreement. However, MLO measurements are local. The Observatory sits in the seasonal wind-modulated plume of Eastern Equatorial Pacific outgassing from ancient bottom waters, a terminus of MOC/THC, the carbon pump. The flow is massive, estimated between 15 and 50 Sv, where the sum flow of all rivers and streams on Earth is 1 Sv.
The conjecture that CO2 accumulates in the atmosphere is based on the false presumption that the surface of the ocean is in thermodynamic equilibrium so that the surface layer has the corresponding thermodynamic equilibrium distribution of carbon molecules. Instead, the surface layer comprises bubbles of gaseous CO2 and aqueous CO2, forms not found in TE. The surface layer is the buffer accumulator for CO2, not the atmosphere. The ocean is not a bottleneck to the dissolution of atmospheric CO2.
“The rise in CO2 in the atmosphere is largely paralleled by the increase in fossil fuel combustion. Combustion of fossil fuels results in emission of CO2, so it is eminently reasonable to link the two increases.”
This combustion conjecture is one of IPCC’s two human fingerprints on atmospheric CO2. IPCC’s combustion curve parallels the Keeling Curve because IPCC graphed the two records on separate ordinates, then adjusted the scales of the ordinates to make the records appear parallel. This is chartjunk, designed to deceive, and the result cannot be honestly reproduced.
“The increase in CO2 in the atmosphere is incontrovertible.”
Henry’s Law of Solubility predicts the rise in global atmospheric CO2 based on an equivalent global average surface ocean temperature. Global warming increases atmospheric CO2, not the reverse. Global warming leads atmospheric CO2, not the reverse. The ocean regulates the concentration of CO2 in the atmosphere.
“Injecting into the air 12C-rich CO2 from fossil fuels should therefore cause the 13C in the air to drop, which is precisely what is observed: [Chart]. So the evidence that fossil fuel burning is the underlying cause of the increase in the CO2 in the atmosphere is really conclusive.”
The parallel between CO2 emissions and isotopic concentration is IPCC’s second human fingerprint on the climate. It, too, is false, created again by chartjunk.
“To conclude, our five steps have shown: [¶] • the combustion of ever increasing quantities of fossil fuel has boosted the carbon dioxide concentration of the atmosphere.”
The principle that every effect has a cause is causation, and that every cause must precede all its effects is causality. No evidence exists that the AGW theoreticians have even attempted to show causality, an essential in demonstrating science literacy.
The temperature side of the debunked Climate Change conjecture fares no better than the CO2 side. Just as the ocean regulates atmospheric CO2, cloud cover is a negative feedback to global warming that mitigates warming from any cause.
At the same time, cloud cover is a positive feedback to solar radiation, amplifying TSI by the burn-off effect. The AGW models don’t have dynamic cloud cover, so they miss its two feedbacks, the largest feedbacks in all of climate. The Global Average Surface Temperature (GAST) follows solar radiation, well-represented by a simple, two-pole transfer function. The accuracy of that model is quite close to the accuracy of IPCC’s 22-year smoothed representation of GAST. And of course, there is no human fingerprint on the Sun (either).

Reply to  Jeff Glassman
April 8, 2016 12:40 pm

Agree with ‘most everything you say here.
“The Global Average Surface Temperature (GAST) follows solar radiation, well-represented by a simple, two-pole transfer function.”
Do you have a link to share?

Pauly
Reply to  Jeff Glassman
April 8, 2016 1:12 pm

Jeff Glassman,
The following link is to a paper that specifically investigated causality between atmospheric CO2 concentrations and temperature:
http://dx.doi.org/10.4236/ijg.2010.13014
Its conclusion was that increasing temperature always precedes increasing atmospheric CO2 concentrations, but that increasing levels of atmospheric CO2 did not lead to increasing temperature.

Reply to  Pauly
April 8, 2016 2:03 pm

Pauly,
That paper says nothing about the cause of the increase of CO2 in the atmosphere, besides pointing to human CO2. They talk about temperature change preceding CO2 change, not about temperature preceding CO2 increase…

Reply to  Jeff Glassman
April 8, 2016 1:27 pm

Per IPCC AR5 Table 6.1 anthropogenic CO2 output between 1750 and 2011 (this number takes some major WAGing) is twice the 133 ppm concentration increase. Uh-oh, big problem. So IPCC AR5 created entirely new spontaneous magic unicorn sinks (Table 6.1) under which to sequester/sweep 57% of the anthro contribution. That leaves 43% or 4 Gt/y residual. Figure 6.1 goes from a net 0.4 Gt/y of sinks before 1750 to 2011 ocean & vegetation sinks of 2.8 Gt/y. That’s a yuuuuge change. Even a trivial change in the 45,000 Gt of reservoirs could easily account for the change.

Reply to  Jeff Glassman
April 8, 2016 1:56 pm

Jeff,
Quite a lot of remarks, but with very little basis…
1. “Keeling Curve the master time series, which it then used to calibrate its global network of CO2 measuring stations into agreement.”
This is not based on reality: there are more than 70 CO2 monitoring stations all over the oceans, 10 of which are run by NOAA; the rest are maintained by different, independent organizations in different countries. The only calibration done (nowadays by NOAA) is of the calibration gases used to calibrate all measurement devices all over the world. But even so, other organizations like Scripps still make their own calibration gases…
2. “The surface layer is the buffer accumulator for CO2, not the atmosphere.”
Never heard of the Revelle factor? The chemistry of the ocean surface means that any change in the atmosphere is followed by an only 10% change in the ocean surface. As the respective quantities are 1000 GtC (mixed layer) and 800 GtC (atmosphere), the ocean surface is a small sink (~0.5 GtC/year) for the atmospheric increase. The main sink is in the deep oceans, but these exchanges are limited (~40 GtC/year via the MOC/THC, ~3 GtC more sink than source).
3. “IPCC graphed the two records on separate ordinates, then adjusted the scales of the ordinates to make the records appear parallel.”
That simply is not true:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/temp_emiss_increase.jpg
4. “Henry’s Law of Solubility predicts the rise in global atmospheric CO2 based on an equivalent global average surface ocean temperature.”
Henry’s law predicts no more than 16 ppmv/°C for the change in steady state between ocean surface and atmosphere. That is all. Not the 110 ppmv increase above pre-industrial for a 0.8°C temperature increase. The net CO2 flux is from the atmosphere into the oceans, not the reverse, as DIC and pH measurements also show.
5. “The parallel between CO2 emissions and isotopic concentration is … false, created again by chartjunk.”
Sorry, fully based on measurements.
6. “every cause must precede all its effects is causality.”
Human emissions precede their effects in the atmosphere for every observation:
http://www.ferdinand-engelbeen.be/klimaat/co2_origin.html
Thus Jeff, your story not only lacks evidence on about every point, it is a disservice to the many skeptics who want to tackle the AGW story where the real battle should be: the lack of a strong response of temperature to the increased CO2 levels…
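The scale argument about Henry's law is simple arithmetic and can be sketched as follows. The ~16 ppmv/°C sensitivity, the 0.8 °C of warming, and the 110 ppmv rise are the figures quoted in this thread, not independently derived; variable names are illustrative.

```python
# Arithmetic sketch: how much of the CO2 rise could ocean warming alone
# explain via Henry's law, using the figures quoted in the comments?
HENRY_SENSITIVITY_PPMV_PER_C = 16.0  # steady-state shift per degree (quoted)
warming_c = 0.8                      # 20th-century warming (quoted)
observed_rise_ppmv = 110.0           # rise above pre-industrial (quoted)

temperature_driven_ppmv = HENRY_SENSITIVITY_PPMV_PER_C * warming_c  # ~13 ppmv
unexplained_ppmv = observed_rise_ppmv - temperature_driven_ppmv     # ~97 ppmv
```

On these quoted numbers, warming accounts for roughly a tenth of the observed rise, which is the point of the rebuttal.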

jorgekafkazar
Reply to  Ferdinand Engelbeen
April 8, 2016 2:11 pm

And thanks again.

bw
Reply to  Jeff Glassman
April 8, 2016 3:27 pm

Correct. CO2 never “accumulates” in Earth’s atmosphere. All atmospheric CO2 is a small part of a global biogeochemical carbon cycle. That is, a river that flows from sources to sinks. Carbon dioxide seeping from deep ocean geology is grossly underestimated by the IPCC. The amount of anthropogenic CO2 in the global carbon cycle is only about 3 percent of the annual stream. The deep ocean CO2 sinks are also underestimated. Most of the increased CO2 in the atmosphere reflects shifts in the deep long term exchanges.
https://chiefio.wordpress.com/2009/02/25/the-trouble-with-c12-c13-ratios/
Changes in atmosphere CO2 13C/12C ratio are mostly natural. Only about 20ppm of atmospheric CO2 is due to fossil fuels, and that addition is entirely beneficial. The water cycle accounts for 95 percent of the so-called greenhouse effect. CO2 is about 4 percent according to Lindzen.

Reply to  bw
April 9, 2016 12:42 am

bw,
1. Human CO2 is not part of the cycle. It is additional. It is like adding an extra flow to a lake where a rather constant river flow goes in and out. Besides the small natural variability in the river flow, the extra flow, however small it is, will increase the lake’s level, not the river, however high its flow is.
2. Most of the deep vents CO2 remains in the deep oceans, not even measurable in the huge mass there.
3. Adding 3% to a rather constant cycle increases the amounts in circulation, which disturbs the cycle; some of the additional CO2 will be absorbed, but not all. Even at only 3%, it is responsible for nearly the full increase, as the average sink rate is only 1.5%.
4. The deep oceans are a net sink of ~3 GtC/year. Total sinks ~4.5 GtC/year. Human emissions are ~9 GtC/year…
5. The huge downward change in 13C/12C ratio in the atmosphere is all human: oceans releases, volcanoes, rock weathering,… all increase the ratio in the atmosphere and the biosphere as a whole is a net sink for CO2, thus also increasing the 13C/12C ratio by its preference for 12C. The only known huge source of low-13C is fossil fuels, besides a very small addition of natural methane.
6. About 8% of the atmosphere is originally human, but that says next to nothing about the cause of the increase, as 20% of all CO2 in the atmosphere is exchanged with CO2 from other reservoirs, thus diluting the human “fingerprint”.

Reply to  bw
April 9, 2016 1:11 pm

“1. Human CO2 is not part of the cycle. It is additional. It is like adding an extra flow to a lake where a rather constant river flow goes in and out.”
It is peeing in the Meuse. Your suggestion that it is “rather constant” is pure speculation. You base it on ice core proxies for the past but: A) the ice core measurements cannot be independently verified B) as they always warn regarding mutual funds, past performance is not indicative of future results.
“4. The deep oceans are a net sink of ~3 GtC/year. Total sinks ~4.5 GtC/year. Human emissions are ~9 GtC/year…”
Ridiculous pseudo-mass balance argument again. Your persistent failure to understand why it is so abjectly flawed reveals that you do not understand the nature of dynamic systems, and your static accounting is not up to the task.

Reply to  bw
April 9, 2016 3:14 pm

Bart,
It is peeing in the Meuse. Your suggestion that it is “rather constant” is pure speculation.
It is ~6% of the natural cycle, or the whole town of Liège peeing in the Meuse…
That it is rather constant is proven by the small variability (maximum +/- 1.5 ppmv) around the trend, and by a rather constant seasonal cycle, in CO2 changes as well as in δ13C changes, over the past 55 years. That is the period with the largest increase in CO2 amount and rate over the past 800,000 years.
You base it on ice core proxies
No, it is based on current information; ice cores can’t give any information about the carbon cycle over the seasons, which involves the largest fluxes within a year or even inter-annually.
Indeed there is little possibility to confirm the ice core data, except for a 20-year overlap with direct measurements: they are much better than any proxy…
Ridiculous pseudo-mass balance argument again
As long as you can’t produce any evidence that the natural carbon cycle increased fourfold over the past 55 years in lockstep with human emissions, the increase in the atmosphere and the net sink rate, your objections against the mass balance argument are of no value…

Reply to  Jeff Glassman
April 10, 2016 8:33 pm

“Henry’s Law of Solubility predicts the rise in global atmospheric CO2 based on an equivalent global average surface ocean temperature. Global warming increases atmospheric CO2, not the reverse. Global warming leads atmospheric CO2, not the reverse. The ocean regulates the concentration of CO2 in the atmosphere.”
Thanks for this Jeff Glassman. Would you have a reference or two that clearly demonstrate (1) that it is global temperature that drives atmospheric CO2 levels, and (2) that the oceans are the dominant driver (both as source and sink)? It would be much appreciated…

GTL
April 8, 2016 10:24 am

“Fortunately isotopes come to our aid. There are two primary plant chemistries, called C3 and C4. C3 plants are ancient, and they tend to prefer the 12C carbon isotope to the 13C. Plants with a C4 chemistry are comparatively recent arrivals, and they are not so picky about their isotopic diet. Fossil fuels primarily come from a time before C4 chemistry had evolved, so they are richer in 12C than today’s biomass. Injecting into the air 12C-rich CO2 from fossil fuels should therefore cause the 13C in the air to drop, which is precisely what is observed:”
So C3 uses 12C, C4 uses 12C and 13C; ergo emitting 12C reduces 13C. Can someone connect the dots for me? A great article, but I do not grasp this piece of the puzzle.

Reply to  GTL
April 8, 2016 10:51 am

GTL, let me try. It is the ratio of 12C to 13C. Fossil fuels release mostly 12C, because that’s what the C3 plants that produced the fossil fuels consumed in the first place. About 15% of terrestrial plant biomass is C4 (e.g. grasses and derivatives like corn and sugarcane); dunno about aquatic plants. Those C4 grasses take up 12C and 13C equally. So only 12C is being added by fossil fuels, while 12C and 13C are both being biologically sinked, and over time the 13C fraction drops relative to 12C.
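The dilution described above can be sketched as a two-source mixing calculation. The numbers are illustrative only: a textbook-magnitude atmospheric δ13C near -8 per mil and a fossil-fuel average near -28 per mil (vs the VPDB standard) are assumed, not taken from this thread.

```python
# Two-source mixing sketch of the 13C dilution: adding 12C-rich
# (13C-depleted) fossil CO2 pulls the atmospheric ratio down.
atm_co2 = 800.0    # GtC in the atmosphere (order of magnitude)
atm_d13c = -8.0    # assumed atmospheric delta-13C, per mil
ff_added = 9.0     # GtC of fossil CO2 added in one year
ff_d13c = -28.0    # assumed fossil-fuel delta-13C, per mil (13C-depleted)

# Mass-weighted average of the two sources:
mixed = (atm_co2 * atm_d13c + ff_added * ff_d13c) / (atm_co2 + ff_added)
print(round(mixed, 3))  # -8.222: more negative than -8.0, i.e. less 13C
```

One year of such additions moves δ13C only slightly, which is why the measured multi-decade decline is the relevant signal.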

GTL
Reply to  ristvan
April 8, 2016 11:15 am

So it is the ratio of 12C to 13C, not an absolute reduction of 13C. Thank you, that makes perfect sense.

Reply to  ristvan
April 8, 2016 12:34 pm

It’s just a rationalization, on which I will expand below.

Reply to  ristvan
April 8, 2016 3:36 pm

It’s a ratio change and an absolute reduction. The oceans take C-12 and C-13 out of the atmosphere.

April 8, 2016 10:29 am

Great article. One minor correction: the finest grid in CMIP5 is 110×110 km at the equator; typical is 250×250 km. Regional weather models are typically run at 5 km grids; the finest are now 1.5 to 2 km. That is because they do not model the planet, only regions, and they only run out typically 7 days, not 100 years.
Doubling resolution (halving grid sides) increases computation by about one order of magnitude, since the time steps need to be more than halved per the CFL constraint. Climate models face a 6-7 order of magnitude computational limitation.
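That scaling can be sketched in a few lines. This is a back-of-envelope toy of my own, counting only the two horizontal dimensions and an exactly proportional CFL time-step shrink, both simplifications:

```python
# Back-of-envelope for the cost of grid refinement: halving the grid
# spacing quadruples the horizontal cell count (2 dimensions), and the
# CFL condition forces the time step to shrink by at least the same
# factor, so each halving costs roughly 2*2*2 = 8x. Refining the
# vertical levels as well would make this steeper.
def relative_cost(refinement):
    """Cost multiplier for refining grid spacing by `refinement`x."""
    cells = refinement ** 2   # horizontal cell count
    steps = refinement        # CFL: proportionally more time steps
    return cells * steps

print(relative_cost(2))    # 8 -> roughly one order of magnitude per halving
print(relative_cost(125))  # 1953125 -> ~250 km down to ~2 km: ~6 orders
```

The second call shows why global models cannot simply adopt regional-weather-model grids: going from a typical 250 km grid to a 2 km grid multiplies the cost by millions.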

April 8, 2016 10:37 am

Yes, excellent article. However, considering IPCC AR5 Figure 6.1 & Table 6.1, I am of the opinion that the magnitudes of the reservoirs and sinks and the uncertainty bands are such that anthropogenic contributions are insignificant. Same with the power flux: 2 W/m^2 out of 340, which ebbs and flows with +/- 20 W/m^2 uncertainties. And the 13C/14C concentrations are too small to “prove” much of anything. Volcanic activity is another possible source, and could easily originate from ocean floor geothermal heat flow and volcanic vents, of which almost nothing is known.

April 8, 2016 10:49 am

I read your report and it is one of the best on this website so far this year.
I think your conclusion should have been first, used as an executive summary, but that’s a minor point.
I do think you didn’t directly make these points, and should have:
(1) Anyone who thinks 1750 was the “ideal climate”, and considers any changes from 1750 to be bad news, is a “warmunist” or an imbecile (I repeat myself sometimes!).
(2) The average temperature in 1750 was near the coldest since the glaciation peak about 20,000 years ago, and well below average for our planet’s entire history (assuming estimates of the prior 4.5 billion years are reasonable)
(3) CO2 levels in the 1750 atmosphere may have been near the lowest ever (assuming estimates of the prior 4.5 billion years are reasonable)
(4) Humans generally did not like the cool centuries from 1300 to 1800, and left much anecdotal evidence describing their unhappiness about the cool weather.
(5) Green plants certainly did not prefer 250 ppmv airborne plant food in 1750 (I speak for them)
(6) In summary, we have warmunists looking at unpleasant, bad news climate in 1750, declaring that climate to be normal, and getting hysterical about any changes since 1750.
In fact, the warming and additional CO2 in the air since 1750 are GREAT NEWS for people and plants.
I do not subscribe to the left-wing belief that CO2 caused the warming since the late-1600s Maunder Minimum trough, in the absence of any scientific proof; I think it is more likely that natural causes of warming led to the oceans releasing more CO2 into the air.
Hating the warming and the extra CO2 in the air since 1750 has nothing to do with common sense, or science, but there have to be some reasons for the false demonization of CO2, and the false glorification of the climate in 1750!
In my opinion, the reasons are getting attention, money, and an excuse to expand the government and more tightly control the general public, in the name of “fighting” CAGW to save the planet, the imaginary left-wing crisis!
As an aside, warmunists completely ignore real time CO2 measurements, and use ice core proxies for CO2 levels from 1750 to 1958, which may not be as accurate as the old chemical measurement methodologies.
I think you ignored those measurements too.
YOU WROTE:
“There were only sporadic readings of CO2 before 1958, no continuous measurements. Nevertheless, there is sufficient information to construct a view back to 1850:”
MY COMMENTS:
Not correct.
More than 90,000 individual chemical measurements of CO2 levels were recorded between 1812 and 1961.
Warmunists ignore all of them.
For one example:
– Wilhelm Kreutz, working at a Giessen (Germany) meteorological station, used a closed, volumetric, automatic system designed by Paul Schuftan, the father of modern gas chromatography.
Kreutz compiled over 64,000 single measurements — about 120 samples per day — in an 18-month period during 1939–1941.
Kreutz’s measurements were precise enough to capture the seasonal CO2 cycle, and weather events around the city of Giessen.
Kreutz’s real-time measurements found persistent CO2 levels above 400 ppm over most of a 2-year period … while warmunists claim 300 ppm in that two-year period per ice core air bubble proxies.
Above sentences from my climate blog post — full post located here
http://elonionbloggle.blogspot.com/2016/04/historical-or-hysterical-co2-levels.html
Source data from here:
http://www.friendsofscience.org/assets/files/documents/CO2%20Gas%20Analysis-Ernst-Georg%20Beck.pdf

CaligulaJones
Reply to  Richard Greene
April 8, 2016 10:57 am

“(2) The average temperature in 1750 was near the coldest since the glaciation peak about 20,000 years ago, and well below average for our planet’s entire history (assuming estimates of the prior 4.5 billion years are reasonable)

(4) Humans generally did not like the cool centuries from 1300 to 1800, and left much anecdotal evidence describing their unhappiness about the cool weather.”
Exactly. There is a reason why so many Christmas carols have to do with the warmth of Christmas cheer compared to the bloody cold and snowy outdoors: they were written when it was “nice” and cheerful inside, and life-threateningly dangerous out.
But seriously, when you ask any warmunist when there was a better time than now, it’s exceedingly easy to find a dozen or so indicators that they wouldn’t want to live under. It would suck to be a vegan jonesing for a soy latte in Dickens’ London during Yule.

Reply to  CaligulaJones
April 8, 2016 2:23 pm

Yeah, but them half rotten and moldy veggies were all ORGANIC!

Reply to  Richard Greene
April 8, 2016 2:30 pm

Richard Greene,
Most of the historical wet chemical CO2 measurements are of no value at all, because they were taken in places (towns, fields, forests,…) too close to huge sources and sinks. They show local values which could change by hundreds of ppmv within a full 24-hour day. That includes the Giessen data. There is a modern station, a few km from the historical one, which shows the problem:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/giessen_background.jpg
Further, the sampling in Giessen was not automatic: it was 3 times a day at 4 heights, or 12 samples a day, manually measured. The other measurements were temperature, humidity, wind speed and direction.
Only those samples taken over the oceans, or coastal with wind from the sea, are of value. These show observations around the ice core CO2 levels for the same period.
See further my comment on the work of the late Ernst Beck:
http://www.ferdinand-engelbeen.be/klimaat/beck_data.html

Richard Greene
Reply to  Ferdinand Engelbeen
April 9, 2016 12:28 pm

I don’t see why a measurement is automatically bad because it varies too much during a day.
That’s what averages are for.
Is it not true that a large majority, perhaps 75%, of Mauna Loa raw measurements are discarded because they ‘vary too much’?
There is more CO2 in the air than in the 1700s.
I think that’s great news for green plants.
But how accurate are the 1700s estimates, based on climate proxies?
After all, we have a large number of people claiming 1750 climate was “normal”, and getting hysterical about any changes since then — so claims about 1750 CO2 levels have become very important (at least to them).
When throwing out real time measurements to rely on climate proxies (ice core air bubbles), the next question is whether the proxy data are better than the real time measurements. You say so, and you are a bright man, so I will agree with you… but the more important question is whether the proxy data are good enough as a substitute for accurate real-time measurements.
In my opinion, climate proxy data are NEVER good enough to be a substitute for accurate real time measurements of average CO2 levels and the average temperature.
They are rough estimates.
How rough is the relevant question.
Warmunists using ice core data for historical CO2 “measurements” seem to reject any climate proxy data that do not benefit their CAGW theory, so I’m always suspicious.
Sometimes they twist and splice dubious climate proxy data to support CAGW (Mann Hockey Stick).

Reply to  Ferdinand Engelbeen
April 9, 2016 2:44 pm

Richard,
The main point is what you want to measure. In the case of CO2, one is mainly interested in the levels in the bulk of the atmosphere, as that influences the capture of IR in the atmosphere. Biologists may be interested in the diurnal CO2 changes, and a lot of tall towers try to register the CO2 fluxes in and out of vegetation to detail that carbon cycle.
In 95% of the atmospheric mass the CO2 levels are within +/- 2% of full scale, despite the fact that about 20% of all CO2 in the atmosphere is exchanged each year with other reservoirs over the seasons. That holds all over the oceans and from a few hundred meters over land upward. In the other 5% of the atmosphere, that is the first few hundred meters over land, CO2 is not well mixed. That is where most of the historical measurements were taken. The modern Giessen data show an average bias of +40 ppmv compared to background, and the historical data show a variability of 68 ppmv (one sigma). Compare that to Mauna Loa: 4 ppmv for all raw data.
Indeed Mauna Loa has a luxury problem: they sample so many data that they can throw out any suspect data when the sampled air comes from the volcanic vents or up from the valleys. That still leaves more than enough data to give the nice Keeling curve. It doesn’t matter whether you keep all the data or use only “clean” data; the difference in trend is less than 0.1 ppmv. Here for Mauna Loa and the South Pole in 2008 (mind the scales!):
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_mlo_spo_raw_select_2008.jpg
Ice core CO2 is not a proxy. It is CO2 as it was in the air at the bottom of the firn at the moment the bubble closed. That is not the composition of one year, but the average of 10-600 years, depending on how fast the bubbles were closed, which in turn depends on the snow accumulation rate. For the period around 1750 we have ice cores with a resolution of 20 years, sharp enough to detect a global change of 2 ppmv sustained over the full 20 years or a two-year peak of 40 ppmv.
Moreover, there is a direct overlap of three ice cores (Law Dome) with the direct measurements at the South Pole (1960-1980), which were within the variability of 1.2 ppmv (1 sigma) of the ice core CO2 measurements.
Compare that to the wild variability of the historical data: 250 ppmv in the US in the same year that 450 ppmv was measured in Giessen. Beck’s compilation shows an increase of +80 ppmv over a period of 7 years and down again in 7 years. That should be clearly visible in all high to medium resolution ice cores and in about every available proxy, but it is not.
Thus only a rigorous selection of the best situated, high quality historical data may be of help. That is what Callendar had done… His average value (around 310 ppmv) was confirmed years later by ice core CO2 data…

Reply to  Richard Greene
April 8, 2016 2:51 pm

Agree Richard,
We have rescued life on this planet from dangerously low levels of atmospheric CO2.
Incredible that a group of humans would define these low levels of CO2 and the harshness of a cooler planet as the “ideal conditions” before humans began burning fossil fuels, and declare that the changes since then from burning fossil fuels (an increase in beneficial CO2 and modest, beneficial global warming) are detrimental.
Then they generate global climate models, based on theoretical equations, that project catastrophic results which have failed to verify for two decades, and use them to hijack climate science to accomplish political objectives.

Brian H
Reply to  Mike Maguire
April 9, 2016 5:28 pm

“before humans began burning fossil fuels” — interesting that the new CO2 sensor sats see huge flows from vegetated areas, and NOTHING from urban and industrial ones.

afonzarelli
Reply to  Mike Maguire
April 9, 2016 5:48 pm

Brian, the reason that CO2 flows from vegetated areas is that trees sequester carbon and then, when they die, release that carbon back from those areas. So it looks as though trees produce carbon, but in reality they are just redistributing it…

Richard Greene
Reply to  Mike Maguire
April 11, 2016 10:18 am

Reply to Ferdinand’s post above:
YOU WROTE:
“Ice core CO2 is not a proxy.”
MY COMMENT:
I strongly disagree:
A global CO2 level measured real time with accurate instruments in 1750 is what we really want.
Since those data do not exist, something else must be used.
A “proxy” is something else (used as a substitute).
The air bubbles are a climate proxy for several reasons:
– The assumption must be made that CO2 in ice cores measured centuries later is still representative of the average global level of CO2 in 1750.
– The air bubbles may have changed over time, under pressure, or from the process of drilling the ice cores, moving them from the field to a laboratory, and then measuring them there.
The reasons I am concerned about trashing these historical real time measurements:
(1) They seem to have been dismissed very quickly, as if they had no value at all,
(2) I’ve found almost no discussion of them in the past ten years — your page from 2008 is a rare exception,
(3) I’ve found too little skepticism of the accuracy of ice core data, and
(4) Warmunists have a history of ignoring data that do not support their beliefs … and a history of ignoring or altering climate proxy data — with CO2 data from ice cores being a rare exception where they do use proxy data.
I still find it hard to believe that 100% of the Pettenkofer CO2 data are completely worthless… while ice core proxy data are treated as 100% accurate, with possible margins of error rarely discussed.
In my climate change reading since 1997, I have been skeptical the whole time, but usually not skeptical enough!
I agree there is more CO2 in the air than in 1750, and I’m glad there is.
I wonder how far the 280 ppm in 1750, from ice cores, is from what the CO2 level would have been (higher or lower) if accurate real time measurements could have been made in 1750?

Reply to  Mike Maguire
April 11, 2016 3:32 pm

Richard Greene,
One needs to make a distinction between a proxy, which is a measurement of some variable that is correlated with the variable one is actually interested in, and a direct measurement of that same variable, sampled in a locked but smoothed form.
Take e.g. stomata index data: that is a proxy for CO2 levels, as the density of stomata is inversely correlated with the local CO2 levels of the previous growing season, with all the problems of local bias and other influences (like extra drought) that may bring.
Take ice core CO2 data: these are direct CO2 measurements, made with the same equipment as is used for atmospheric measurements, with one main problem: the CO2 levels are not from one moment in time, but the average over 10 to 600 years of sampling.
The assumption must be made that CO2 in ice cores measured centuries later is still representative of the average global level of CO2 in 1750.
Based on several tested and proven assumptions (like firn densification models), CO2 in several ice cores is representative for the average CO2 level over 1740-1760 with an accuracy of +/- 5 ppmv.
The air bubbles may have changed over time, and under pressure, or from the process of drilling ice cores, moving them from the field to a laboratory, and then making measurements in a laboratory.
Many of these objections were answered by the work of Etheridge et al. (1996) on three ice cores at Law Dome: three different drilling techniques were used (wet and dry), with the same results. Different flask types were used for local firn CO2 sampling; one type was rejected.
CO2 was measured in firn, sampled in flasks on site and in ice after transport by the same apparatus (GC), which shows that in the transition zone with part firn and part ice, CO2 levels were equal in both.
CO2 levels in ice cores with extreme differences in accumulation rates and temperature, thus extreme differences in depth / pressure for the same average gas age differ not more than +/- 5 ppmv.
There is a theoretical possibility that CO2 migrates through remaining water at the inter-crystalline surfaces of the ice. That was theoretically derived from the increased CO2 levels near melt layers of relatively “warm” (-23°C) coastal ice cores. The only effect was a broadening in resolution from 20 to 22 years at medium depth and to 40 years at full depth. No such migration is possible in the inland ice cores at -40°C.
(1) They seem to have been dismissed very quickly, as if they had no value at all
The main problem remains where the measurements were taken. If you measure in or near a forest, you can have levels of 250 ppmv on a sunny day in daylight and 600 ppmv at night of the same day… Many times there were no series, only sporadic samples once in a while. Most series had one sample a day; only one had three samples a day at fixed times (but even 15 minutes earlier or later could give a difference of 40-50 ppmv!).
Thus virtually all the measurements taken over land give no clue to the real “background” CO2 levels of that time…
See it as the equivalent of taking temperature measurements from a thermometer hut above an asphalted parking lot. No “correction” can change that into a valuable series showing that the earth is warming, or not.
Only those samples taken at sea, or at the coast with wind from the sea, are of value, and these show values around the ice core levels.
I’ve found almost no discussion of them in the past ten years
That is because most scientists, warmistas and skeptics alike, accept that the ice core data are far more accurate than the historical measurements… For the recent past (the last ~10,000 years) the resolution of the ice cores is 10-40 years; any deviation of CO2 from the average temperature-CO2 ratio like the current one would be noticed in every ice core…
I’ve found too little skepticism of the accuracy of ice core data
The only objections I have read were from 1992, by the late Dr. Jaworowski, who was a specialist in the radioactive fallout of the Chernobyl accident, including in ice cores, with no experience, as far as I know, of CO2 in ice cores. Most of his objections were refuted by the 1996 work of Etheridge. And Dr. Jaworowski did make some remarks which were, to put it nicely, quite remarkable: the opposite of physical laws…
I wonder how far 280 ppm in 1750, from ice cores, is from what the CO2 level would have been (higher or lower) if accurate real time measurements could have been made in 1750?
Taking into account the ~20 year smoothing of the data and the repeatability of samples at the same depth within the same ice core of 1.2 ppmv (1 sigma), sampling in 1750 at suitable locations (like Mauna Loa) could give +/- 50 ppmv around the ice core value without being noticed in the ice core: +/- 40 ppmv for a one-year peak and +/- 10 ppmv for the best available historical technique. If they took four years of samples (even only 1 sample per year*), that would already be better: +/- 20 ppmv around the ice core, of which +/- 10 ppmv for a 4-year peak and 10 ppmv for the method. A sustained + or – 2 ppmv deviation over the full 20 years would be noticed in the ice core, where the historical measurements would be +/- 10 ppmv around that + or – 2 ppmv.
The point is that the ice cores’ best resolution in that period is ~20 years; that gives averaged values, but it doesn’t change the average ppmv measured in the ice core over that period.
Compare that to current times where the year by year global variability is not more than +/- 1.5 ppmv around the trend. I see no direct reason that the historical variability was much larger.
(*) Ignoring the average +/- 4 ppmv seasonal variation and the +/- 4 ppmv local variability.
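The effect of that ~20-year smoothing is easy to illustrate with a deliberately crude moving-average toy (not a firn densification model; the flat 280 ppmv background is an assumption for illustration only):

```python
# A one-year 40 ppmv spike, averaged over a 20-year bubble-closing
# window, survives only as a ~2 ppmv offset in the ice core record.
baseline = [280.0] * 40    # flat CO2 background, ppmv, one value per year
spiked = baseline.copy()
spiked[20] += 40.0         # single-year excursion of +40 ppmv

def ice_core_sample(series, start, window=20):
    """Average over the bubble-closing window (crude simplification)."""
    return sum(series[start:start + window]) / window

print(ice_core_sample(baseline, 10))  # 280.0
print(ice_core_sample(spiked, 10))    # 282.0 -> the spike leaves +2 ppmv
```

This is why a short sharp excursion would need to be roughly 40 ppmv before it rose above the ice core’s detection floor, while a sustained 2 ppmv offset would show up.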
