Opinion by Dr. Tim Ball
I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts. – Arthur Conan Doyle (Sherlock Holmes)
Create The Facts You Want.
In a comment on the WUWT article “The Record of recent Man-made CO2 emissions: 1965-2013”, Pamela Gray graphically but pointedly summarized the situation.
When will we finally truly do the math? The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different. Take out just the anthropogenic CO2 and rerun the past 30 years of weather. The exact same weather pattern variations would have occurred. Or maybe because of the random nature of weather we would have had it worse. Or it could have been much better. Now do something really ridiculous and take out just China’s portion. I know, the post isn’t meant to paint China as the bad guy. But. Really? Really? All this for something so tiny you can’t find it? Not even in a child’s balloon?
My only quibble is that while the amount illustrates the futility of the claims, as Gray notes, the Intergovernmental Panel on Climate Change (IPCC) and the Environmental Protection Agency (EPA) are focused on trends and attribution. The change must have a human cause and be steadily increasing or, as they prefer, getting worse.
Narrowing the Focus
It is necessary to revisit the criticisms of the CO2 record created by the IPCC over the last several years. Nowadays, one measure of the accuracy of those criticisms is the vehemence of the personal attacks designed to divert attention from the science and evidence.
From its inception, the IPCC focused on human production of CO2. It began with the UNFCCC definition of climate change, which covers only changes caused by humans. The goal was to prove the hypothesis that an increase in atmospheric CO2 would cause warming. This required evidence that the level had increased since pre-industrial times and would increase each year because of human industrial activity. How long before they start reducing the rate of CO2 increase to make it fit the declining temperatures? They are running out of guesses, 30 at latest count, to explain the continued lack of temperature increase, now at 17 years and 10 months.
The IPCC makes the bizarre claim that until 1950 human additions of CO2 were a minor driver of global temperature, but that after 1950 over 90 percent of the temperature increase is due to human CO2:
Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.
The claim that a fractional increase in CO2 from human sources, when CO2 is naturally only 4 percent of all greenhouse gases, became the dominant factor in just a couple of years is not credible. The claim comes from computer models, which are the only place in the world where a CO2 increase causes a temperature increase. It depends on human production and atmospheric levels increasing, and it assumes temperature continues to increase, as all three IPCC scenario projections imply.
Their frustration is that they control the CO2 data, but after the University of Alabama in Huntsville (UAH) began producing satellite global temperature data, their control of temperature data was curtailed. It didn’t stop them completely, as disclosures by McIntyre, Watts, Goddard, and the New Zealand Climate Science Coalition, among others, illustrated. They all showed adjustments designed to enhance and emphasize higher modern temperatures.
Now they’re confronted with T. H. Huxley’s challenge,
The Great Tragedy of Science – the slaying of a beautiful hypothesis by an ugly fact.
This article examines how the modern levels of atmospheric CO2 were determined and controlled to fit the hypothesis. They may fit a political agenda, but they don’t fit nature’s agenda.
New Deductive Method: Create the Facts to Fit the Theory
Farhad Manjoo asked in True Enough: Learning To Live In A Post-fact Society,
“Why has punditry lately overtaken news? Why do lies seem to linger so long in the cultural subconscious even after they’ve been thoroughly discredited? And why, when more people than ever before are documenting the truth with laptops and digital cameras, does fact-free spin and propaganda seem to work so well?”
Manjoo’s comments apply to society in general, but they are amplified in climate science because the public’s ability to judge scientific issues varies widely. A large majority is more easily deceived.
Manjoo argues that people create facts themselves or find someone to produce them. Creating data is the only option in climate science because, as the 1999 NRC Report found, there is virtually none. A response to the February 3, 1999, US National Research Council (NRC) report on climate data said,
“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”
The situation is worse today. The number of stations used has been dramatically reduced, and records have been adjusted to lower historic temperatures, which increases the gradient of the record. The lack of data for the oceans was recently identified:
“Two of the world’s premier ocean scientists from Harvard and MIT have addressed the data limitations that currently prevent the oceanographic community from resolving the differences among various estimates of changing ocean heat content.”
Oceans are critical to CO2 levels because of their large sink or source capacity.
The data necessary for a viable determination of climate mechanisms, and thereby climate change, are completely inadequate. This applies especially to the structure of climate models. There is no data for at least 80 percent of the grids covering the globe, so modelers guess; it’s called parameterization. The 2007 IPCC Report notes,
Due to the limited resolutions of the models, many of these processes are not resolved adequately by the model grid and must therefore be parameterized. The differences between parameterizations are an important reason why climate model results differ.
Variable results occur because of inadequate data at the most basic level and subjective choices by the people involved.
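To make concrete what parameterization means in practice, here is a minimal sketch (the bulk aerodynamic formula is a standard example of the technique; the coefficient values and numbers below are illustrative assumptions, not taken from any actual model):

```python
# Illustrative sketch of a sub-grid parameterization (hypothetical values).
# A process the grid cannot resolve (surface evaporation) is replaced by a
# bulk formula whose exchange coefficient is tuned, i.e. a subjective choice.

def parameterized_evaporation(wind_speed, q_surface, q_air, c_e):
    """Bulk aerodynamic moisture flux in kg/m^2/s for one grid cell."""
    rho_air = 1.2  # near-surface air density in kg/m^3, assumed constant
    return rho_air * c_e * wind_speed * (q_surface - q_air)

# Same physics and same inputs, two plausible tuning choices:
flux_a = parameterized_evaporation(8.0, 0.018, 0.012, c_e=1.1e-3)
flux_b = parameterized_evaporation(8.0, 0.018, 0.012, c_e=1.5e-3)
print(f"{flux_a:.2e} vs {flux_b:.2e} kg/m^2/s")  # differ by ~36%
```

Two modeling groups choosing different coefficients get different results from identical inputs, which is exactly why, as the IPCC notes above, the differences between parameterizations are an important reason why climate model results differ.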
The IPCC Produces The Human Production Numbers
The 2001 IPCC Report identified 6.5 GtC (gigatons of carbon) from human sources. The figure rose to 7.5 GtC in the 2007 Report, and by 2010 it was 9.5 GtC. Where did they get these numbers? The answer is that the IPCC has them produced and then vets them. In the FAQ section they ask, “How does the IPCC produce its Inventory Guidelines?”
Utilizing IPCC procedures, nominated experts from around the world draft the reports that are then extensively reviewed twice before approval by the IPCC.
The scenarios were called the Special Report on Emissions Scenarios (SRES) until the 2013 Report, when they became Representative Concentration Pathways (RCP). In March 2001, John Daly reported Richard Lindzen referring to the SRES and the entire IPCC process, including the SRES, as follows,
In a recent interview with James Glassman, Dr. Lindzen said that the latest report of the UN-IPCC (that he helped author), “was very much a children’s exercise of what might possibly happen” prepared by a “peculiar group” with “no technical competence.”
William Kininmonth, author of the insightful book “Climate Change: A Natural Hazard”, is a former head of Australia’s National Climate Centre and was its delegate to the WMO Commission for Climatology. He wrote the following in an email on the ClimateSceptics group page.
I was at first confused to see the RCP concept emerge in AR5. I have come to the conclusion that RCP is no more than a sleight of hand to confuse readers and hide absurdities in the previous approach.
You will recall that the previous carbon emission scenarios were supposed to be based on solid economic models. However, this basis was challenged by reputable economists and the IPCC economic modelling was left rather ragged and a huge question mark hanging over it.
I sense the RCP approach is to bypass the fraught economic modelling: prescribed radiation forcing pathways are fed into the climate models to give future temperature rise—if the radiation forcing plateaus at 8.5W/m2 sometime after 2100 then the global temperature rise will be 3C. But what does 8.5 W/m2 mean? Previously it was suggested that a doubling of CO2 would give a radiation forcing of 3.7 W/m2. To reach a radiation forcing of 7.4 W/m2 would thus require a doubling again—4 times CO2 concentration. Thus to follow RCP8.5 it is necessary for the atmospheric CO2 concentration equivalent to exceed 1120ppm after 2100.
We are left questioning the realism of a RCP 8.5 scenario. Is there any likelihood of the atmospheric CO2 reaching about 1120 ppm by 2100? IPCC has raised a straw man scenario to give a ‘dangerous’ global temperature rise of about 3C early in the 22nd century knowing full well that such a concentration has an extremely low probability of being achieved. But, of course, this is not explained to the politicians and policymakers. They are told of the dangerous outcome if the RCP8.5 is followed without being told of the low probability of it occurring.
One absurdity is replaced by another! Or have I missed something fundamental?[1]
No, nothing is missed! However, in reality, it doesn’t matter whether the change of approach alters anything; it achieves the goal of increasing CO2 and its supposed impact on global warming. The underpinning of IPCC climate science and economics depends on accurate data and knowledge of mechanisms, and neither is available.
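Kininmonth’s arithmetic is easy to check with the widely used simplified approximation for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) W/m2 (Myhre et al., 1998); the sketch below assumes a 280 ppmv pre-industrial level:

```python
# Check of Kininmonth's RCP8.5 arithmetic using the common simplified
# CO2 forcing approximation dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al., 1998).
import math

C0 = 280.0  # assumed pre-industrial CO2 concentration, ppmv

def forcing(c_ppmv):
    """Radiative forcing in W/m^2 relative to the pre-industrial level."""
    return 5.35 * math.log(c_ppmv / C0)

print(round(forcing(2 * C0), 2))  # one doubling (560 ppmv)   -> ~3.71 W/m^2
print(round(forcing(4 * C0), 2))  # two doublings (1120 ppmv) -> ~7.42 W/m^2

# Inverting gives the CO2-equivalent concentration an 8.5 W/m^2 pathway implies:
print(round(C0 * math.exp(8.5 / 5.35)))  # ~1371 ppmv, beyond the 1120 he cites
```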
We know there was insufficient weather data on which to construct climate models, and the situation deteriorated as they eliminated weather stations, ‘adjusted’ records, and cherry-picked data. We know knowledge of mechanisms is inadequate because the IPCC WGI Science Report says so:
Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.
or
For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.
Two critical situations were central to control of atmospheric CO2 levels. We know Guy Stewart Callendar, a British steam engineer, cherry-picked the low readings from 90,000 19th-century atmospheric CO2 measurements. This not only established a low pre-industrial level but also altered the trend of atmospheric levels (Figure 1).
Figure 1 (After Jaworowski; Trend lines added)
Callendar’s work was influential in the Gore-generated claims of human-induced CO2 increases. However, the most influential paper in the climate community, especially at CRU and the IPCC, was Tom Wigley’s 1983 paper “The pre-industrial carbon dioxide level” (Climatic Change, 5, 315-320). I held seminars in my graduate-level climate course on its validity and its selective use of data to establish a pre-industrial baseline.
I wrote an obituary on learning of Ernst-Georg Beck’s untimely death.
I was flattered when he asked me to review one of his early papers on the historic pattern of atmospheric CO2 and its relationship to global warming. I was struck by the precision, detail and perceptiveness of his work and urged its publication. I also warned him about the personal attacks and unscientific challenges he could expect. On 6 November 2009 he wrote to me, “In Germany the situation is comparable to the times of medieval inquisition.” Fortunately, he was not deterred. His friend Edgar Gartner explained Ernst’s contribution in his obituary. “Due to his immense specialized knowledge and his methodical rigor, Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change (IPCC). He considered the warming of the earth’s atmosphere as a result of a rise of the carbon dioxide content of the air of approximately 0.03 to 0.04 percent as impossible. And he doubted that the curve of the CO2 increase recorded on the Hawaiian volcano Mauna Loa since 1957/58 could be extrapolated linearly back to the 19th century.” (This is a translation from the German.)
Beck was the first to analyze the 19th-century data in detail. The data were collected in scientific attempts to measure precisely the amount of CO2 in the atmosphere. The effort began in 1812, triggered by Priestley’s work on atmospheric oxygen, as part of the scientific effort to quantify all atmospheric gases. There was no immediate political motive. Beck did not cherry-pick the results; he examined the method, location, and as much detail as possible for each measurement, in complete contrast to what Callendar and Wigley did.
The IPCC had to show that,
· Increases in atmospheric CO2 caused temperature increase in the historic record.
· Current levels are unusually high relative to the historic record.
· Current levels are much higher than pre-industrial levels.
· The differences between pre-industrial and current atmospheric levels are due to human additions of CO2 to the atmosphere.
Beck’s work showed the fallacy of these claims and in so doing put a big target on his back.
Again from my obituary:
Ernst-Georg Beck was a scholar and gentleman in every sense of the term. His friend wrote, “They tried to denounce Ernst-Georg Beck on the Internet as a naive amateur and data counterfeiter. Unfortunately, Ernst could hardly defend himself in the last months because of his progressive illness.” His work, determination and ethics were all directed at answering questions in the skeptical method that is true science; the antithesis of the efforts of all those who challenged and tried to block or denigrate him.
The 19th-century CO2 measures are no less accurate than those for temperature; indeed, I would argue that Beck shows they are superior. So why, for example, are his assessments any less valid than those made for the early portions of the Central England Temperatures (CET)? I spoke at length with Hubert Lamb about the early portion of Manley’s CET reconstruction because the instruments, locations, measures, records and knowledge of the observers were comparable to those in the Hudson’s Bay Company record I was dealing with.
Once the pre-industrial level was created, it became necessary to ensure the new post-industrial CO2 trend continued. That was achieved when C. D. Keeling established the Mauna Loa CO2 measuring station. As Beck notes,
Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC.
Keeling’s son operates Mauna Loa and, as Beck notes, “owns the global monopoly of calibration of all CO2 measurements.” He is also a co-author of the IPCC reports, which accept Mauna Loa and all other readings as representative of global levels. So the IPCC controls both the human production figures and the atmospheric CO2 levels, and both are constantly and consistently increasing.
This diverts attention from the real problem with the measurements and claims. The fundamental IPCC objective is to identify human causes of global warming. You can only determine the human portion and contribution if you know the natural levels and how much they vary, and we have only very crude estimates of both.
What Values Are Used for Each Component of the Carbon Cycle?
Dr. Dietrich Koelle is one of the few scientists to assess estimates of natural annual CO2 emissions.
Annual Carbon Dioxide Emissions (GtC per annum)
1. Respiration (humans, animals, phytoplankton): 45 to 52
2. Ocean out-gassing (tropical areas): 90 to 100
3. Volcanic and other ground sources: 0.5 to 2
4. Ground bacteria, rotting and decay: 50 to 60
5. Forest cutting, forest fires: 1 to 3
6. Anthropogenic emissions, fossil fuels (2010): 9.5
TOTAL: 196 to 226.5
Source: Dr. Dietrich Koelle
The IPCC estimate of human production (item 6) for 2010 was 9.5 GtC, but that is gross production. One of the early issues in the push to ratify the Kyoto Protocol was an attempt to get US ratification. The US asked for carbon credits, primarily for CO2 removed through reforestation, so that a net figure would apply to its assessment as a developed nation. The request was denied. The reality is that the net figure better represents human impact. If we use net human production at 5 GtC for 2010, it falls within the range of uncertainty of the estimates for three natural sources: (1), (2), and (4).
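As a back-of-envelope check of that comparison, the sketch below sums Koelle’s ranges from the table and compares them with the human figures discussed here (the 5 GtC net figure is this article’s assumption):

```python
# Back-of-envelope comparison of Koelle's natural CO2 sources (GtC/year)
# with the human production figures discussed in the text.
natural_sources = {
    "respiration":      (45, 52),    # item 1
    "ocean_outgassing": (90, 100),   # item 2
    "volcanic_ground":  (0.5, 2),    # item 3
    "ground_bacteria":  (50, 60),    # item 4
    "forest_fires":     (1, 3),      # item 5
}
human_gross = 9.5  # IPCC figure for 2010 (item 6)
human_net = 5.0    # net figure assumed in this article

low = sum(lo for lo, hi in natural_sources.values())
high = sum(hi for lo, hi in natural_sources.values())
print(f"natural sources total: {low} to {high} GtC")            # 186.5 to 217
print(f"grand total with humans: {low + human_gross} to {high + human_gross}")

# The net human figure is smaller than the uncertainty (range width) of
# three individual natural sources, which is the article's point:
for name in ("respiration", "ocean_outgassing", "ground_bacteria"):
    lo, hi = natural_sources[name]
    print(f"{name}: range width {hi - lo} GtC vs human net {human_net} GtC")
```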
The Truth Will Out.
How much longer will the IPCC continue to produce CO2 data with trends that fit their hypothesis that temperature will continue to rise? How much longer before the public becomes aware of Gray’s colorful observation that “The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of ‘weather’ do a damn thing different”? The almost 18-year leveling and slight reduction in global temperature is essentially impossible under IPCC assumptions. The claim is already made that the hiatus doesn’t negate their science or projections, instead of acknowledging that the hiatus, along with the failed predictions, completely rejects their fear-mongering.
IPCC and EPA have already shown that being wrong or being caught doesn’t matter. The objective is the scary headline, enhanced by the constant claim it is getting worse at an increasing rate, and time is running out. Aldous Huxley said, “Facts do not cease to exist because they are ignored.” We must make sure they are real and not ignored.
[1] Reproduced with permission of William Kininmonth.
rgbatduke says: August 5, 2014 at 5:39 am
“So much for CO_2 being a well-mixed gas.”
As with much of Beck’s data, you are seeing a daily cycle dominated by plant respiration/photosynthesis. That occurs close to the ground in Europe. If you get away from that, as at these sites, for example, you’ll avoid that daily cycle, and the measurements are in close agreement, which indicates good mixing.
Here is a graphic which shows how closely the far-separated stations agree on CO2 ppmv. There is variation in the annual cycle, but the means track well.
Steven Mosher, you must be kidding. Take out just the anthropogenic portion of CO2 radiative effects and rerun weather (or, if you prefer, climate) models over a 30-year time span (along with the necessary multiple trials). Run them just like the IPCC does. You would not be able to use the difference between the two sets of multiple spaghetti runs to say anything at all about the weather future. And you know that. In those spaghetti graphs, the ups and downs of the scenario results will have such a broad (and broadening) road, you might as well flip a coin to get better results. I stand completely behind my thought experiment and will not give an inch to you. We could have had the same weather, worse weather, or better weather. Anthropogenic CO2 radiative effects do not determine weather; therefore they cannot determine climate.
Look folks, the thing that determines weather, and thus climate, is geography and your location in it, interacting with large- and small-scale oceanic/atmospheric teleconnected pressure systems. It is the battle of pressure systems, air heated or cooled, laden or not laden with moisture, traveling over your geographic location. Which one of these could anthropogenic CO2 substantially change, let alone create a trend in? It would have to be able to get into that powerful mix and muscle it around. It’s like saying the mouse lifts the elephant and hurls him out of the room instead of the elephant leaving under its own power.
So back to you, Mosher. I am not saying that atmospheric gases are not capable of reabsorbing and re-emitting longwave infrared radiation. Of course they are. I am saying that the anthropogenic CO2 molecules (a tiny, tiny fraction of all the LWIR absorbing/re-emitting molecules present) in the atmosphere at any given time are not capable of changing the weather, thus the climate. It doesn’t have the cojones, and the noise of natural forces buries it.
From the original post:
The reality is the net figure better represents human impact. If we use human net production (6) at 5 GtC for 2010, then it falls within the range of the estimate for three natural sources, (1), (2), and (4).
Well, if the net figure is a better representation, then we should use it for the natural sources as well. Unfortunately for your thesis, the overall net is negative, i.e. about −3 GtC, which is why you don’t use it.
The point that the increase of CO2 over the past 30 years could be removed with no significant impact on weather seems valid since weather patterns over the past 30 years are basically indistinguishable from even longer historical trend records.
However, assertions that the IPCC is in control of CO2 records, and other inflammatory and easily disputed/disproven claims, only distract from the issue.
How dead is dead? It’s farcical: who doesn’t agree with the premise that if the forecast does not match reality, the forecast is wrong, as is the theory underpinning it? The question now becomes how long the post-AGW debate can stagger on.
I was tempted to add to my article a paragraph predicting who would react immediately and what they would say. They didn’t let me down.
Two comments by others expose false IPCC assumptions: first, that CO2 is evenly distributed through the atmosphere, and second, that somehow the properties of CO2 don’t apply in air near the ground; insolation and IR pass through the entire atmospheric column.
Pamela Gray,
You mean something like this?
http://i81.photobucket.com/albums/j237/hausfath/ScreenShot2014-08-05at73255AM_zps8775e38b.png
Generally model runs with and without anthropogenic forcings have pretty distinctly different trajectories over the last 30 years.
rgbatduke says:
August 5, 2014 at 5:39 am
rg, CO2 is not well mixed in 5% of the atmosphere: the first few hundred meters over land near huge sources and sinks. Plants are huge sources at night (respiring up to 60 GtC summed over a year) and huge sinks during daylight (120 GtC intake over a year, though decay from falling leaves etc. adds some 60 GtC/year back to the atmosphere).
CO2 is well mixed in 95% of the atmosphere: on mountain tops, in deserts and everywhere over the oceans or coastal with wind from the seaside.
Several tall towers measure CO2 at different heights (to calculate in/out fluxes) over land, which shows the difference in variability. Here for Cabauw (The Netherlands):
http://www.ferdinand-engelbeen.be/klimaat/klim_img/cabauw_day_week.jpg
The problem with many historical data is exactly that they were taken near ground over land: the middle of towns, under inversion, mountain valleys, forests,… mostly unsuitable to give even an idea of the background CO2 levels of that period.
Except if there was a lot of wind, then it is possible to estimate the background levels as wind mixes most differences out. Unfortunately, the longest series don’t have enough datapoints at high wind speed to make the calculation.
The before-mentioned station at Giessen (Germany) was one of the cornerstones of Beck’s data. The historical data show a 1-sigma variability of 68 ppmv. In comparison, the modern station halves that (still very high), but Mauna Loa is around 4 ppmv, including the huge seasonal variation.
Integrated modern monthly data from Giessen are not very good either:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/giessen_mlo_monthly.jpg
and show a positive bias against “background” CO2.
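Engelbeen’s vegetation numbers imply a large gross exchange but a near-zero annual net, which is why the diurnal swing near the ground is huge while the background trend stays smooth. A trivial bookkeeping sketch of his figures:

```python
# Bookkeeping of the vegetation fluxes quoted above (GtC per year).
# Sign convention: positive adds CO2 to the atmosphere, negative removes it.
daylight_photosynthesis = -120  # annual CO2 intake by plants (sink)
night_respiration = 60          # plants respiring at night (source)
litter_decay = 60               # decay of fallen leaves etc. (source)

net_annual_flux = daylight_photosynthesis + night_respiration + litter_decay
print(net_annual_flux)  # 0 GtC: huge gross fluxes, near-zero annual net
```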
Re comments about “World” CO2 being measured at Mauna Loa.
This has always struck me as asinine, really.
The Mauna Loa system, including the ongoing thirty-year-old Pu`u `O`o eruption of nearby Kilauea, is the largest, most active volcano on the planet at the moment. The magma reservoir is again refilling faster than Pu`u `O`o can erupt it, so conditions may soon be right for another large eruption from Mauna Loa itself. Meanwhile, fumaroles continue to pump out vast amounts of CO2 all across the Big Island’s active zones. Isn’t this what we are measuring? Surely it would make more sense to measure CO2 at some neutral point, like Mount Everest or Mount Kilimanjaro, or somewhere that CO2 isn’t being emitted all around the measuring instruments.
If man made CO2 is ~4%…..and it’s cumulative and what’s making CO2 levels rise….
Then you’re not going to get the straight line linear increase in CO2 that all the measurements show….
…you would have an exponential increase
Warmist Claptrap says:
August 5, 2014 at 7:57 am
Isn’t this what we are measuring?
If the wind blows down the slope from the volcanic vents at Mauna Loa, the measurements over an hour show a lot of variability. If that variability exceeds 0.25 ppmv (1 sigma), the data are not used for the daily, monthly and yearly averages. The same happens with upwind conditions in the afternoon, when slightly depleted CO2 levels arrive from the valleys.
Including or excluding the outliers doesn’t change the average or trend by more than 0.1 ppmv at the end of the year. Here are the 2008 raw hourly data plus “cleaned” averages from Mauna Loa and the South Pole:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/co2_mlo_spo_raw_select_2008.jpg
but mind the scale!
But there are lots of other places where CO2 is measured; the South Pole started even before Mauna Loa, but it misses a few years of continuous measurements (though it still had flask sampling). That is why Mauna Loa is often used as the reference. See:
http://www.esrl.noaa.gov/gmd/dv/iadv/
For the “global” CO2 average, Mauna Loa is not even used; only sea-level stations, spread over different latitudes, are used…
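A minimal sketch of the hourly screening Engelbeen describes (the 0.25 ppmv 1-sigma threshold is his figure; the station readings and function names below are invented for illustration):

```python
# Sketch of the within-hour variability screening described above: hours
# whose 1-sigma variability exceeds 0.25 ppmv are excluded from the
# daily, monthly and yearly averages.
import statistics

def screen_hour(readings_ppmv, sigma_limit=0.25):
    """Return (hourly_mean, accepted) for one hour of CO2 readings."""
    mean = statistics.mean(readings_ppmv)
    accepted = statistics.stdev(readings_ppmv) <= sigma_limit
    return mean, accepted

# Invented examples: a calm background hour vs. an hour in a volcanic plume.
calm_hour = [398.1, 398.2, 398.1, 398.3, 398.2]
plume_hour = [398.1, 399.7, 401.2, 398.4, 400.9]

for label, hour in (("calm", calm_hour), ("plume", plume_hour)):
    mean, ok = screen_hour(hour)
    print(label, round(mean, 2), "used" if ok else "flagged, not used")
```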
Warmist Claptrap,
Hate to break it to you, but it’s not just Mauna Loa: http://www.esrl.noaa.gov/gmd/ccgg/ggrn.php
There are many areas of real uncertainty in climate science that are interesting to discuss. Beck’s theories, unfortunately, are not one of them.
Ferdinand Engelbeen says:
August 5, 2014 at 7:35 am
” Plants are huge sources at night (respiring up to 60 GtC summed over a year) and huge sinks during daylight (120 GtC intake over a year, but decay from falling leaves etc. add some 60 GtC/year again to the atmosphere).”
________________________
Just considering terrestrial plants, such a balance might only be true if one considers leaf mass and other plant material that completely cycles on an annual basis, but plant CO2 uptake sequesters C in woody mass as well. Overall, the biosphere sequesters more C each year than it produces, as witnessed by such things as topsoil and tree rings or, measurably, by the known increasing sink rate. One could say that the natural course of the biosphere as a whole is to eat itself out of house and home, replenished historically on a geologic time scale by periodic glaciation events, which more or less start the whole process over again. Now here we are, with our annual emissions intervening in the slow but inexorable process of the biosphere bankrupting itself. We can’t say for certain what may result from our inadvertent fertilization of the whole life process, because we’ve never been here before.
rgbatduke, here are some Japanese measurements that show variations up to 650 ppm: http://www.terrapub.co.jp/journals/GJ/pdf/4106/41060429.pdf
If memory serves, the level in a corn field can drop toward zero at midday, since the plants use it so aggressively.
You’re correct about the missing error bars. They’re generally missing in Climate Science(tm) as far as I can tell, and the ones that do get displayed are ridiculously small. Measuring ocean temps to 0.001 K, really?
“While you denigrate Pauling with regards to the use of vitamin C to combat the common cold I ask, have you personally tried it?
Well I have, and I haven’t had a cold in 15 to 20 years.”
Sorry, but that is hardly a valid argument. I have NEVER taken vitamin C and avoid all citrus fruits, and I haven’t had a cold in 30 years. So much for your “evidence.”
Alan Robertson says:
August 5, 2014 at 8:40 am
In a mature forest, as is mostly the case in the tropics, the balance is quite neutral, except for short disturbances like an El Niño. Extra-tropical forests indeed expand and are destroyed by ice ages and interglacials. We may be of some help with our extra CO2…
The extra uptake is more or less known out of the oxygen balance:
http://www.sciencemag.org/content/287/5462/2467.short and
http://www.bowdoin.edu/~mbattle/papers_posters_and_talks/BenderGBC2005.pdf
Not quite what she said: variation. If you take a particular model and run it many times, it produces a (usually enormously wide) range of outcomes. What is very interesting is to compare and contrast the distribution of these outcomes, specifically to attempt to resolve:
a) The marginal probability of observing the actual present behavior of the climate, assuming as a null hypothesis that “this is a perfectly correct model”. The p-value is the basis of a hypothesis test — if it is a very low number (less than 0.05 traditionally) we are justified in rejecting the null hypothesis as our model is probably wrong.
b) The marginal shift in the probability distribution and/or p-value with CO_2. This is the basis of a Bayesian analysis that can actually estimate the posterior probability of the CO_2-specific component of the model being correct, for example.
Note well that one cannot legitimately average many (marginally failing) models and expect to get a successful one, in spite of the fact that this is precisely what is done, repeatedly, in climate science and specifically in AR5. Note that it is also a “capital mistake” to assume that it is nature that is in error or doing something “unlikely” rather than the models. Sure, maybe, “p happens” (to quote Marsaglia, a master of the hypothesis test) but in science and physics the second law basically states that “but don’t bet on it”. If a model marginally fails a p-value-based hypothesis test, the best you can say is “Answer cloudy, try again later”. If it decisively fails, it is time to pitch the model.
One model at a time. Not collectively. You cannot make ten Hartree models equal Hartree-Fock, or a hundred Hartree-Fock models give you the correct correlation/exchange energy for an electron in an atom. An incorrect, or approximate, model, cannot generally be corrected by using lots of equally incorrect, approximate models. The circumstances where this is not true — and they can be so corrected — are both very specific and very unlikely, and as a pure matter of fact are not realized in climate models.
So sure, Ms. Gray was speaking hastily, and Dr. Ball might have done better than to quote her, but the point is still the same. The variation of climate models in comparison with the observed climate is evidence for the assertion “CO_2 variation is empirically irrelevant to the climate’s variation”, as the actual climate is following the track indicated for no CO_2 increase while CO_2 is increasing. This is evidence for the assertion “this model is wrong” (one model at a time, for most of the CMIP5 models). It is not evidence for the assertion “this model is correct”, one model at a time, for most of the CMIP5 models.
Note well I say nothing about “proving” or “disproving” the models. Hypothesis testing is not usually that sharp (at least until p-values descend below 0.01 into the range of remote probability). However, “the pause” is certainly not evidence for the correctness of the models, and it is absurd to pretend that its continuation has (or should have) no impact on our confidence that the models are — one at a time — correct.
rgb
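As a toy illustration of the per-model hypothesis test rgbatduke describes, the sketch below treats many runs of a single hypothetical model as the null distribution and asks how probable an observed trend is under it (every number is invented):

```python
# Toy per-model hypothesis test: the runs of ONE model form the null
# distribution; a low empirical p-value justifies rejecting that model.
# All numbers are invented for illustration.
import random
random.seed(1)

# Pretend one model's 100 runs give 30-year trends around 0.20 C/decade:
model_trends = [random.gauss(0.20, 0.05) for _ in range(100)]
observed_trend = 0.05  # hypothetical observed trend during "the pause"

# One-sided empirical p-value: fraction of runs at or below the observation.
p = sum(t <= observed_trend for t in model_trends) / len(model_trends)
print(f"empirical p = {p:.2f}")  # ~0.00 here: reject this model at p < 0.05

# Note the test is applied one model at a time; averaging many failing
# models cannot manufacture a model that passes.
```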
Latitude says:
August 5, 2014 at 8:09 am
…you would have an exponential increase
It is slightly quadratic:
http://www.ferdinand-engelbeen.be/klimaat/klim_img/temp_emiss_increase.jpg
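The “slightly quadratic” shape is easy to test on any annual CO2 series by comparing linear and quadratic fits; here is a sketch with synthetic data standing in for the Mauna Loa annual means (the coefficients below are invented):

```python
# Sketch: compare linear vs. quadratic fits to an annual CO2 series.
# The series here is synthetic, standing in for the Mauna Loa annual means.
import numpy as np

years = np.arange(1959, 2014)
t = years - years[0]
rng = np.random.default_rng(0)
co2 = 315 + 0.8 * t + 0.006 * t**2 + rng.normal(0, 0.3, t.size)

for degree in (1, 2):
    coeffs = np.polyfit(years, co2, degree)
    residuals = co2 - np.polyval(coeffs, years)
    rms = np.sqrt(np.mean(residuals**2))
    print(f"degree {degree} fit: residual RMS = {rms:.2f} ppmv")
# A clearly smaller RMS for degree 2 is what "slightly quadratic" looks like.
```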
Nick Stokes said: “It’s hard to say human emissions had nothing to do with the CO2 rise.”
Apparently quite hard, since nobody said such a thing.
Which was precisely my point. This is also true of all of the other climate measures from the historical past. Where/when did they measure temperature? Near ground over land: the middle of towns, under inversion, surrounded by mountains or in forests or agricultural land and at widely variable times of day — mostly unsuitable to give even an idea of the background global average temperatures of that period. Where they did not regularly or accurately sample temperature until the very recent past includes: 70% of the Earth’s surface in one fell swoop (the oceans), 2/3 or thereabouts of the Earth’s continental surface land area (Antarctica, Siberia, much of China, much of Africa, Asia, and South America and even the US and Canadian West), and where they did sample it was corrupted with the “Human Habitation Effect” — humans alter their living environment from a “state of nature” to something that suits humans better. UHI is just one component of the HHE. HHE is overwhelmingly a source of local warming — local to the human habitations — but that is also precisely where things like temperature and CO_2 level and rainfall and wind speed/direction have historically been sampled. People don’t live so much in the middle of the South Pacific or the middle of Antarctica or in the North Atlantic at a depth of 100 meters or along ridge lines of mountains or in deserts.
The truly laughable thing is that when e.g. GISS corrects for UHI/HHE, it manages to warm the present relative to the past by some truly awe-inspiring legerdemain. HADCRUT4 doesn’t even bother; they just present the UHI/HHE-corrupted temperature series with the additional urban/human-habitation warming projected onto the entire global average.
In a way I can respect that; HADCRUT4 at least can be viewed as a time-dependent upper bound on the actual global temperature, one that is strictly increasing (relative to the expected “true” average) from the historical past to the present. So when HADCRUT4 indicates (say) 0.4-0.6 C of warming over the last fifty years, we can be certain that the actual number is smaller than this, although we cannot really say by how much. Whenever anybody tries to determine how much, it seems as though at least half of the warming observed is HHE error. That leaves only half of the warming to be explained by both CO_2 and natural variation, which might well leave only 0.1-0.2 C of actual CO_2-driven warming, with a total expected sensitivity well under 1 C for the rest of the century.
That’s one key part of AR5’s repeated SPM assertion that they are ever so certain that at least half of the warming is due to human CO_2. What they really should mean is that half of the warming is due to the HHE and is an error, with the other half attributable in an unresolvable mix to natural and anthropogenic causes. Unresolvable because to resolve it we’d have to be able to solve the Navier-Stokes equation with unknown initial conditions on an absurdly coarse grid a mere five orders of magnitude larger than the Kolmogorov scale for the dynamics, with nothing but highly biased guesses for how to project the microdynamics onto the coarse grained model solvers.
Truly, the miracle is that they get anybody to believe all of this stuff.
rgb
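Spelling out the attribution arithmetic in that argument, using its own illustrative numbers (none of these are measured quantities, and the 50/50 natural/CO2 split is just one possibility):

```python
# The attribution arithmetic from the comment above, with its own
# illustrative numbers; nothing here is a measured quantity.
reported_warming_c = (0.4, 0.6)  # HADCRUT4-style figure over ~50 years
hhe_fraction = 0.5               # asserted share that is HHE/UHI error

for w in reported_warming_c:
    real_warming = w * (1 - hhe_fraction)  # left for all physical causes
    co2_part = real_warming * 0.5          # unresolved split, one possibility
    print(f"reported {w} C -> non-HHE {real_warming:.2f} C "
          f"-> CO2-driven ~{co2_part:.2f} C")
# Output spans ~0.10 to 0.15 C, the order of the 0.1-0.2 C quoted above.
```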
It is slightly quadratic:
====
then the hypothesis is not correct
If nature is able to use a little of it……then nature can use it all
When will we finally truly do the math? The anthropogenic only portion of atmospheric CO2, let alone China’s portion, does not have the cojones necessary to make one single bit of “weather” do a damn thing different.
Remember to multiply all the factors: China’s portion of the ~3% anthropogenic share of total CO2, then factor in that water vapor is at least 60% of the GHG effect. Then remember what engineering manuals say about radiative heat transfer: at room temperatures it can generally be ignored; evaporation and convection are the major drivers.
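Carried out literally, the multiplication looks like the sketch below (the shares are the comment’s own claims taken at face value, plus an illustrative guess for China’s share of anthropogenic emissions):

```python
# The multiplication the comment calls for, using its claimed figures at
# face value; china_share is an illustrative assumption, not an audited number.
anthro_share_of_co2 = 0.03  # "3% anthro to total CO2"
co2_share_of_ghg = 0.40     # if water vapor is "at least 60%" of the GHG effect
china_share = 0.25          # illustrative share of anthropogenic emissions

china_share_of_ghg_effect = china_share * anthro_share_of_co2 * co2_share_of_ghg
print(f"{china_share_of_ghg_effect:.2%} of the total greenhouse effect")  # 0.30%
```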
Pre-industrial CO2 levels were one of the ‘facts’ that sceptics had, remarkably, not challenged in any significant way. Interestingly, on the climate facts piece on Portugal:
http://wattsupwiththat.com/2014/08/05/surprising-facts-about-climate-change-in-portugal-why-the-climate-catastrophe-is-not-happening/
I commented this before reading this current thread:
” Gary Pearse says:
August 5, 2014 at 6:22 am
I’m sceptical that CO2 levels were below 285 over the past couple of thousand years. During the MWP, wine grapes were grown in Scotland, farmsteads flourished in Greenland, etc. Low CO2 doesn’t jibe with this kind of situation. That CO2 is higher today than previously during the last 1000 years or so is the next bit of climate sophistry that is going to bite the dust.”
I wasn’t aware that it had already gotten underway, starting with Pamela’s comment and Tim’s post. (Earlier criticism had been well managed and stifled by the team, as this is the foundation of their theory; sceptics had largely been arguing the temperature aspects, but the ‘pause’ best showed the divergence between temperature and CO2, so it was a natural progression to look more closely at the CO2 ‘data’.) Yes, this is the final major item that needs rooting out. There were all the critiques raised concerning temperature leading CO2, CO2 being higher during some ice ages, etc., but the fact that CO2 was still bubbling away while temperatures stayed flat or declined for a period as long as the modern global-warming ‘era’ put it in the spotlight. I now look forward to an avalanche of papers on CO2 level proxies and the giving of the little foraminifera thermometers a well-deserved rest.
rgbatduke:
Many thanks for your superb post at August 5, 2014 at 9:38 am, which is here. It concludes by saying
I completely agree, and my agreement is not surprising, because the post I have linked can be considered an exposition of the point I made in my post at August 5, 2014 at 2:36 am, which is here and concludes by saying, with a reference,
Richard