Guest essay by Donald C. Morton
Herzberg Program in Astronomy and Astrophysics, National Research Council of Canada
ABSTRACT
The Report of the Intergovernmental Panel on Climate Change released in September 2013 continues the pattern of previous ones in raising alarm about a warming earth due to anthropogenic greenhouse gases. This paper identifies six problems with that conclusion: the mismatch between the model predictions and the temperature observations, the assumption of positive feedback, possible solar effects, the use of a global temperature, chaos in climate, and the rejection of any skepticism.
THIS IS AN ASTROPHYSICIST’S VIEW OF CURRENT CLIMATOLOGY. I WELCOME CRITICAL COMMENTS.
1. INTRODUCTION
Many climatologists have been telling us that the earth's environment is in serious danger of overheating because of the greenhouse gases humans have generated since the Industrial Revolution. Carbon dioxide (CO2) is mainly to blame, but methane (CH4), nitrous oxide (N2O) and certain chlorofluorocarbons also contribute.
“As expected, the main message is still the same: the evidence is very clear that the world is warming, and that human activities are the main cause. Natural changes and fluctuations do occur but they are relatively small.” – John Shepard in the United Kingdom, 2013 Sep 27 for the Royal Society.
“We can no longer ignore the facts: Global warming is unequivocal, it is caused by us and its consequences will be profound. But that doesn’t mean we can’t solve it.” -Andrew Weaver in Canada, 2013 Sep 28 in the Globe and Mail.
“We know without a doubt that gases we are adding to the air have caused a planetary energy imbalance and global warming, already 0.8 degrees Celsius since pre-industrial times. This warming is driving an increase in extreme weather from heat waves to droughts and wild fires and stronger storms . . .” – James Hansen in United States, 2013 Dec 6 CNN broadcast.
Are these views valid? In the past, eminent scientists have been wrong. Lord Kelvin, unaware of nuclear fusion, concluded that the sun's gravitational energy could keep it shining at its present brightness for only about 10⁷ years. Sir Arthur Eddington correctly suggested a nuclear source for the sun, but rejected Subrahmanyan Chandrasekhar's theory of degenerate matter to explain white dwarfs. In 1983 Chandrasekhar received the Nobel Prize in Physics for his insight.
My own expertise is in physics and astrophysics with experience in radiative transfer, not climatology, but looking at the discipline from outside I see some serious problems. I presume most climate scientists are aware of these inconsistencies, but they remain in the Reports of the Intergovernmental Panel on Climate Change (IPCC), including the 5th one released on 2013 Sep 27. Politicians and government officials guiding public policy consult these reports and treat them as reliable.
2. THEORY, MODELS AND OBSERVATIONS
A necessary test of any theory or model is how well it predicts new experiments or observations not used in its development. It is not sufficient just to represent the data used to produce the theory or model, particularly in the case of climate models where many physical processes too complicated to code explicitly are represented by adjustable parameters. As John von Neumann once stated “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” Four parameters will not produce all the details of an elephant, but the principle is clear. The models must have independent checks.
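To illustrate the point with something concrete (a generic toy example, not any actual climate code), the sketch below fits a four-parameter polynomial to a handful of invented observations. It reproduces the fitted data almost perfectly yet fails badly on data it has never seen, which is exactly why independent checks matter.

```python
# Illustrative sketch only: a 4-parameter model can match the data it was
# tuned to while failing on data it has never seen. Synthetic numbers; no
# connection to any real climate model.
import numpy as np

rng = np.random.default_rng(0)

def truth(x):
    # The hypothetical "real" process, unknown to the model-builder.
    return 0.5 * x + 0.3 * np.sin(3.0 * x)

x_train = np.linspace(0.0, 2.0, 5)              # data used to tune the model
x_test = np.linspace(2.2, 4.0, 5)               # later, independent data
y_train = truth(x_train) + rng.normal(0.0, 0.05, x_train.size)
y_test = truth(x_test) + rng.normal(0.0, 0.05, x_test.size)

coeffs = np.polyfit(x_train, y_train, deg=3)    # four adjustable parameters

rms_in = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
rms_out = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
print(f"RMS error on the fitted data: {rms_in:.3f}")
print(f"RMS error on unseen data:     {rms_out:.3f}")  # typically far larger
```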
Fig. 1. Global Average Temperature Anomaly (°C) upper, and CO2 concentration (ppm) lower graphs from http://www.climate.gov/maps-data by the U.S. National Oceanic and Atmospheric Administration. The extension of the CO2 data to earlier years is from the ice core data of the Antarctic Law Dome ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/law/law_co2.txt.
The upper plot in Fig. 1 shows how global temperatures have varied since 1880: a decrease to 1910, a rise until 1945, a plateau to 1977, a rise of about 0.6 °C until 1998, and then an essentially constant level for the next 16 years. Meanwhile, the concentration of CO2 in our atmosphere has steadily increased. Fig. 2, from the 5th Report of the Intergovernmental Panel on Climate Change (2013), shows that the observed temperatures follow the lower envelope of the predictions of the climate models.
Fig. 2. Model Predictions and Temperature Observations from IPCC Report 2013. RCP 4.5 (Representative Concentration Pathway 4.5) labels a set of models for a modest rise in anthropogenic greenhouse gases corresponding to an increase of 4.5 W m⁻² (1.3%) in total solar irradiance.
Already in 2009 climatologists worried about the change in slope of the temperature curve. At that time Knight et al. (2009) asked the rhetorical question “Do global temperature trends over the last decade falsify climate predictions?” Their response was “Near-zero and even negative trends are common for intervals of a decade or less in the simulations, due to the model’s internal climate variability. The simulations rule out (at the 95% level) zero trends for intervals of 15 yr or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate.”
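For readers who want to see what such a trend assessment involves, here is a minimal sketch: an ordinary least-squares trend over 15 years of synthetic monthly anomalies together with a naive 95% confidence interval. The published analyses, including Knight et al. (2009), work with model ensembles and allow for autocorrelation, which this toy version does not.

```python
# Minimal sketch of a 15-year trend test on synthetic monthly anomalies.
# Naive OLS confidence interval; real analyses also handle autocorrelation.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(180) / 12.0                      # 15 years of months, in years
anoms = rng.normal(0.0, 0.1, t.size)           # zero true trend plus noise

A = np.vstack([t, np.ones_like(t)]).T
slope, intercept = np.linalg.lstsq(A, anoms, rcond=None)[0]
resid = anoms - (slope * t + intercept)
se = np.sqrt(np.sum(resid ** 2) / (t.size - 2) / np.sum((t - t.mean()) ** 2))

print(f"trend = {10 * slope:+.3f} +/- {10 * 1.96 * se:.3f} deg C per decade (95% CI)")
```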
Now some climate scientists are saying that 16 years is too short a time to assess a change in climate, but then the rise from 1978 to 1998, which was attributed to anthropogenic CO2, also could be spurious. Other researchers are actively looking into phenomena omitted from the models to explain the discrepancy. These include
1) a strong natural South Pacific El Nino warming event in 1998, so that the plateau did not begin until 2001,
2) an overestimate of the greenhouse effect in some models,
3) inadequate inclusion of clouds and other aerosols in the models, and
4) a deep ocean reservoir for the missing heat.
Extra warming due to the 1998 El Nino seems plausible, but there have been other such events that could have caused some of the earlier warming, and there are also cooling La Nina events. Any proposed cause of the plateau must also have its effect on the earlier warming incorporated into the models, so that the resulting predictions can be tested against the following decade or two of temperature evolution.
3. THE FEEDBACK PARAMETER
There is no controversy about the basic physics that adding CO2 to our atmosphere absorbs solar energy, resulting in a little extra warming on top of the dominant effect of water vapor. The CO2 spectral absorption is saturated, so its effect grows only as the logarithm of the concentration. The estimated effect accounts for only about half the temperature rise of 0.8 °C since the Industrial Revolution. Without justification, the model makers ignored possible natural causes and assumed the rise was caused primarily by anthropogenic CO2, with reflections by clouds and other aerosols approximately cancelling absorption by the other gases noted above. Consequently they postulated a positive feedback due to hotter air holding more water vapor, which increased the absorption of radiation and the backwarming. The computer simulations represented this process and many other effects by adjustable parameters chosen to match the observations. As stated on p. 9-9 of IPCC (2013), "The complexity of each process representation is constrained by observations, computational resources, and current knowledge." Models that did not show a temperature rise would have been omitted from any ensemble, so the observed rise effectively determined the feedback parameter.
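As a back-of-the-envelope check on the size of the no-feedback effect being discussed, the sketch below uses the widely quoted simplified forcing expression ΔF ≈ 5.35 ln(C/C0) W m⁻² and a Planck-only response of roughly 0.3 K per W m⁻². Both numbers are standard approximations rather than anything taken from this essay, and the calculation ignores ocean thermal lag.

```python
# Hedged back-of-the-envelope estimate of no-feedback CO2 warming, using the
# commonly quoted approximation dF ~ 5.35 * ln(C/C0) W/m^2 and a Planck-only
# response of ~0.3 K per W/m^2. Standard rough values, not the IPCC models.
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) relative to concentration c0."""
    return 5.35 * math.log(c_ppm / c0_ppm)

PLANCK_RESPONSE = 0.3  # K per (W/m^2), no-feedback estimate

for c in (400.0, 560.0):  # roughly today's level and a doubling of 280 ppm
    dF = co2_forcing(c)
    print(f"CO2 = {c:.0f} ppm: forcing ~ {dF:.2f} W/m^2, "
          f"no-feedback warming ~ {PLANCK_RESPONSE * dF:.2f} K")
```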
Now that the temperature has stopped increasing, we see that this parameter is not valid. It could even be negative. CO2 absorption without the presumed feedback will still occur, but its effect will not be alarming. The modest warming could possibly be a net benefit, with increased crop production and fewer deaths due to cold weather.
4. THE SUN
The total solar irradiance, the flux integrated over all wavelengths, is a basic input to all climate models. Fortunately our sun is a stable star with minimal change in this output. Since satellite measurements of the whole spectrum began in 1978, the variation has been about 0.1% over the 11-year activity cycle, with occasional excursions up to 0.3%. The associated change in tropospheric temperature is about 0.1 °C.
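The arithmetic behind that figure can be sketched with a simple radiative-equilibrium estimate. The numbers below (total solar irradiance ≈ 1361 W m⁻², Bond albedo ≈ 0.3) are round textbook values rather than anything from this essay, and the effective radiating temperature is not the surface temperature, so the result is only an order-of-magnitude guide.

```python
# Sketch of the equilibrium-temperature arithmetic behind the ~0.1 C figure.
# Round textbook numbers; the real surface response also involves the
# greenhouse effect and ocean lags.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # total solar irradiance, W m^-2
ALBEDO = 0.3       # Bond albedo

def t_equilibrium(s):
    """Effective radiating temperature (K) for total irradiance s."""
    return (s * (1.0 - ALBEDO) / (4.0 * SIGMA)) ** 0.25

t0 = t_equilibrium(S0)
for frac in (0.001, 0.003):        # the 0.1% and 0.3% excursions quoted above
    dt = t_equilibrium(S0 * (1.0 + frac)) - t0
    print(f"{100 * frac:.1f}% irradiance change -> ~{dt:.2f} K "
          f"(cf. T/4 * dS/S = {t0 / 4 * frac:.2f} K)")
```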
Larger variations could explain historical warm and cold intervals such as the Medieval Warm Period (approx. 950 – 1250) and the Little Ice Age (approx. 1430 – 1850), but such variations remain speculative. The sun is a ball of gas in hydrostatic equilibrium. Any reduction in the nuclear energy source would initially be compensated by a gravitational contraction on a time scale of a few minutes. Complicating this basic picture are the variable magnetic field and the mass motions that generate it. Li et al. (2003) included these effects in a simple model and found luminosity variations of 0.1%, consistent with the measurements.
However, the sun can influence the earth in many other ways that the IPCC Report does not consider, in part because the mechanisms are not well understood. The ultraviolet irradiance changes much more with solar activity, ~10% at 200 nm in the band that forms ozone in the stratosphere and between 5% and 2% in the ozone absorption bands between 240 and 320 nm according to DeLand & Cebula (2012). Their graphs also show that these fluxes during the most recent solar minimum were lower than the previous two reducing the formation of ozone in the stratosphere and its absorption of the near UV spectrum. How this absorption can couple into the lower atmosphere is under current investigation, e.g. Haigh et al. (2010).
Fig. 3 – Monthly averages of the 10.7 cm solar radio flux measured by the National Research Council of Canada and adjusted to the mean earth-sun distance. A solar flux unit = 10⁴ Jansky = 10⁻²² W m⁻² Hz⁻¹. The maximum just past is unusually weak and the preceding minimum exceptionally broad. Graph courtesy of Dr. Ken Tapping of NRC.
Decreasing solar activity also lowers the strength of the heliospheric magnetic shield, permitting more galactic cosmic rays to reach the earth. Experiments by Kirkby et al. (2011) and Svensmark et al. (2013) have shown that these cosmic rays can seed the formation of clouds, which then reflect more sunlight and reduce the temperature, though the magnitude of the effect remains uncertain. Morton (2014) has described how the abundances of the cosmogenic isotopes ¹⁰Be and ¹⁴C in ice cores and tree rings indicate past solar activity and its anticorrelation with temperature.
Of particular interest is the recent reduction in solar activity. Fig. 3 shows the 10.7 cm solar radio flux measured by the National Research Council of Canada since 1947 (Tapping 2013) and Fig. 4 the corresponding sunspot count. Careful calibration of the radio flux permits reliable comparisons over six solar cycles even when there are no sunspots. The last minimum was unusually broad and the present maximum exceptionally weak. The sun has entered a phase of low activity. Fig. 5 shows that previous times of very low activity were the Dalton Minimum from about 1800 to 1820 and the Maunder Minimum from about 1645 to 1715, when very few spots were seen. Since these minima occurred during the Little Ice Age, when glaciers were advancing in both the Northern and Southern Hemispheres, it is possible that we are entering another cooling period. Without a physical understanding of the cause of such cool periods, we cannot be more specific. Temperatures as cold as the Little Ice Age may not recur, but there must be some cooling to offset the heating from the increasing CO2 absorption.

Fig. 4. Monthly sunspot numbers for the past 60 years from the Royal Observatory of Belgium at http://sidc.oma.be/sunspot-index-graphics/sidc_graphics.php.
Regrettably the IPCC reports scarcely mention these solar effects and the uncertainties they add to any prediction.
5. THE AVERAGE GLOBAL TEMPERATURE
Long-term temperature measurements at a given location provide an obvious test of climate change. Such data exist for many places for more than a hundred years and for a few places for much longer. With these data climatologists calculate the temperature anomaly – the deviation from a multi-year average such as 1961 to 1990, computed for each day of the year at the times a measurement is recorded. Then they average over days, nights, seasons, continents and oceans to obtain the mean global temperature anomaly for each month or year as in Fig. 1. Unfortunately many parts of the world are poorly sampled, and the oceans, which cover 71% of the earth's surface, even less so. Thus many measurements must be extrapolated to cover larger areas with different climates. Corrections are needed when a site's measurements are interrupted or terminated or a new station is established, as well as for urban heat if the meteorological station is in a city and for altitude if the station is significantly higher than sea level.
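Schematically, the bookkeeping looks like the sketch below, which uses synthetic station data and hypothetical column names rather than any real archive: subtract a 1961–1990 baseline for each station and calendar day, then average the anomalies. Real products also grid and area-weight the stations, which this sketch omits.

```python
# Schematic anomaly bookkeeping with synthetic station data (hypothetical
# column names): per-station, per-calendar-day 1961-1990 baseline, then an
# unweighted average. Real products grid and area-weight the data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
dates = pd.date_range("1951-01-01", "2013-12-31", freq="D")

frames = []
for station in ["A", "B", "C"]:
    seasonal = 10.0 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
    temps = 12.0 + seasonal + rng.normal(0.0, 2.0, dates.size)
    frames.append(pd.DataFrame({"station": station, "date": dates, "t_c": temps}))
obs = pd.concat(frames, ignore_index=True)

# Baseline: mean temperature for each station and calendar day over 1961-1990.
obs["doy"] = obs["date"].dt.dayofyear
in_base = (obs["date"].dt.year >= 1961) & (obs["date"].dt.year <= 1990)
baseline = obs[in_base].groupby(["station", "doy"])["t_c"].mean().rename("baseline")

obs = obs.join(baseline, on=["station", "doy"])
obs["anomaly"] = obs["t_c"] - obs["baseline"]

annual = obs.groupby(obs["date"].dt.year)["anomaly"].mean()
print(annual.tail())
```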
Fig. 5. This plot from the U.S. National Oceanic and Atmospheric Administration shows sunspot numbers since their first observation with telescopes in 1610. Systematic counting began soon after the discovery of the 11-year cycle in 1843. Later searching of old records provided the earlier numbers.
The IPCC Reports refer to four sources of data for the temperature anomaly: the Hadley Centre for Climate Prediction and Research and the European Centre for Medium-Range Weather Forecasting in the United Kingdom, and the Goddard Institute for Space Studies and the National Oceanic and Atmospheric Administration in the United States. For a given month they can differ by several tenths of a degree, but all show the same long-term trends of Fig. 1, a rise from 1978 to 1998 and a plateau from 1998 to the present.
These patterns continue to be a challenge for researchers to understand. Some climatologists like to put a straight line through all the data from 1978 to the present and conclude that the world is continuing to warm, just a little more slowly, but surely if these curves have any connection to reality, changes in slope mean something. Are they evidence of the chaotic nature of climate with abrupt shifts from one state to another?
Essex, McKitrick and Andresen (2007) and, in their popular book, Essex and McKitrick (2007) have criticized the use of these mean temperature data for the earth. First, temperature is an intensive thermodynamic variable, relevant to a particular location in equilibrium with the measuring device. Any average with other locations or times of day or seasons has no physical meaning. Other types of averages might be just as appropriate, such as ones based on the second, fourth or inverse power of the absolute temperature, each of which would give a different trend with time. Furthermore, it is temperature differences between two places that drive the dynamics. Climatologists have not explained what this single number for global temperature actually means. Essex and McKitrick note that it "is not a temperature. Nor is it even a proper statistic or index. It is a sequence of different statistics grafted together with ad hoc models."
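A toy version of that argument, with two invented sites in kelvin, shows how the choice of average alone can change the reported trend: the cold site warms by 2 K, the warm site cools by 2 K, and three different averaging rules then disagree about whether the "global" value rose, fell or stayed the same.

```python
# Toy illustration of the Essex-McKitrick point with two invented sites (K):
# the cold site warms 2 K, the warm site cools 2 K, and different averaging
# rules report different "global" changes.
import numpy as np

before = np.array([250.0, 300.0])   # K
after = np.array([252.0, 298.0])    # K

def arithmetic(t):
    return t.mean()

def radiative(t):
    # Temperature of a blackbody emitting the mean of the sites' T^4 fluxes.
    return np.mean(t ** 4) ** 0.25

def harmonic(t):
    return t.size / np.sum(1.0 / t)

for name, avg in [("arithmetic", arithmetic),
                  ("T^4-based ", radiative),
                  ("harmonic  ", harmonic)]:
    print(f"{name} average changes by {avg(after) - avg(before):+.2f} K")
```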
This questionable use of a global temperature, along with the problems of modeling a chaotic system discussed below, raises basic concerns about the validity of the test against observations in Section 2. Since climatologists and the IPCC insist on using this temperature number and the models in their predictions of global warming, it is still appropriate to hold them to comparisons with the observations they consider relevant.
6. CHAOS
Essex and McKitrick (2007) have provided a helpful introduction to this problem. Thanks to the pioneering investigations into the equations for convection and the associated turbulence by the meteorologist Edward Lorenz, scientists have come to realize that many dynamical systems are fundamentally chaotic. The situation is often described as the butterfly effect, because a small change in initial conditions, such as the flap of a butterfly's wing, can have large effects on later results.
Convection and turbulence in the air are central phenomena in determining weather and so must have their effect on climate too. The IPCC on p. 1-25 of the 2013 Report recognizes this with the statement "There are fundamental limits to just how precisely annual temperatures can be projected, because of the chaotic nature of the climate system," but then makes predictions with confidence. Meteorologists modeling weather find that their predictions become unstable after a week or two, and they have the advantage of refining their models by comparing predictions with observations.
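The sensitivity Lorenz found can be reproduced with his classic three-variable convection system and the standard parameters (σ = 10, ρ = 28, β = 8/3). The sketch below, a textbook exercise rather than a climate model, nudges one of two identical initial states by one part in 10⁸ and watches the trajectories diverge.

```python
# Sensitive dependence on initial conditions in Lorenz's three-variable
# convection system (standard parameters), integrated with a fixed-step
# Runge-Kutta scheme. A textbook exercise, not a climate model.
import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(s):
    x, y, z = s
    return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def rk4_step(s, dt):
    k1 = lorenz(s)
    k2 = lorenz(s + 0.5 * dt * k1)
    k3 = lorenz(s + 0.5 * dt * k2)
    k4 = lorenz(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, n_steps = 0.01, 4000                 # 40 time units
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])       # the "butterfly": a 1e-8 nudge in x

for step in range(1, n_steps + 1):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 1000 == 0:
        print(f"t = {step * dt:5.1f}  separation = {np.linalg.norm(a - b):.3e}")
```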
Why do the climate models in the IPCC reports not show these instabilities? Have they been selectively tuned to avoid them or are the chaotic physical processes not properly included? Why should we think that long-term climate predictions are possible when they are not for weather?
7. THE APPEAL TO CONSENSUS AND THE SILENCING OF SKEPTICISM
Frequently we hear that we must accept that the earth is warming at an alarming rate due to anthropogenic CO2 because 90+% of climatologists believe it. However, science is not a consensus discipline. It depends on skeptics questioning every hypothesis, every theory and every model until all rational challenges are satisfied. Any endeavor that must prove itself by appealing to consensus or demeaning skeptics is not science. Why do some proponents of climate alarm dismiss critics by implying they are like Holocaust deniers? Presumably most climatologists disapprove of these unscientific tactics, but too few speak out against them.
8. SUMMARY AND CONCLUSIONS
At least six serious problems confront the climate predictions presented in the last IPCC Report: the models do not predict the observed temperature plateau since 1998; the models adopted a feedback parameter based on the unjustified assumption that the warming prior to 1998 was caused primarily by anthropogenic CO2; the IPCC ignored possible effects of reduced solar activity during the past decade; the temperature anomaly has no physical significance; the models attempt to predict the future of a chaotic system; and there is an appeal to consensus to establish climate science.
Temperatures could start to rise again as we continue to add CO2 to the atmosphere or they could fall as suggested by the present weak solar activity. Many climatologists are trying to address the issues described here to give us a better understanding of the physical processes involved and the reliability of the predictions. One outstanding issue is the location of all the anthropogenic CO2. According to Table 6.1 in the 2013 Report, half goes into the atmosphere and a quarter into the oceans with the remaining quarter assigned to some undefined sequestering as biomass on the land.
Meanwhile what policies should a responsible citizen be advocating? We risk serious consequences from either a major change in climate or an economic recession from efforts to reduce the CO2 output. My personal view is to use this temperature plateau as a time to reassess all the relevant issues. Are there other environmental effects that are equally or more important than global warming? Are some policies like subsidizing biofuels counterproductive? Are large farms of windmills, solar cells or collecting mirrors effective investments when we are unable to store energy? How reliable is the claim that extreme weather events are more frequent because of the global warming? Is it time to admit that we do not understand climate well enough to know how to direct it?
References
DeLand, M. T., & Cebula, R. P. (2012). Solar UV variations during the decline of Cycle 23. J. Atmosph. Solar-Terrestr. Phys., 77, 225.
Essex, C., & McKitrick, R. (2007). Taken by Storm: The Troubled Science, Policy and Politics of Global Warming, rev. ed. Key Porter Books, Toronto, ON, Canada.
Essex, C., McKitrick, R., & Andresen, B. (2007). Does a Global Temperature Exist? J. Non-Equilib. Thermodyn., 32, 1.
Haigh, J. D., et al. (2010). An influence of solar spectral variations on radiative forcing of climate. Nature, 467, 696.
IPCC (2013). Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, http://www.ipcc.ch
Kirkby, J., et al. (2011). Role of sulphuric acid, ammonia and galactic cosmic rays in atmospheric aerosol nucleation. Nature, 476, 429.
Knight, J., et al. (2009). Do global temperature trends over the last decade falsify climate predictions? Bull. Amer. Meteor. Soc., 90 (8), Special Suppl., S22–S23.
Li, L. H., Basu, S., Sofia, S., Robinson, F. J., Demarque, P., & Guenther, D. B. (2003). Global parameter and helioseismic tests of solar variability models. Astrophys. J., 591, 1284.
Morton, D. C. (2014). An Astronomer's View of Climate Change. J. Roy. Astron. Soc. Canada, 108, 27. http://arXiv.org/abs/1401.8235
Svensmark, H., Enghoff, M. B., & Pedersen, J. O. P. (2013). Response of cloud condensation nuclei (> 50 nm) to changes in ion-nucleation. Phys. Lett. A, 377, 2343.
Tapping, K. F. (2013). The 10.7 cm radio flux (F10.7). Space Weather, 11, 394.
Stephen Wilde says:
“fluxes during the most recent solar minimum were lower than the previous two reducing the formation of ozone in the stratosphere and its absorption of the near UV spectrum. ”
“Apparently the sign of the ozone response was reversed above 45 km and given that the descending stratospheric polar vortices would bring that reversed response down towards the surface at the poles we should be looking for a warmer stratosphere and lower tropopause heights at the poles whilst the sun is less active.
That would then be the cause of more and larger parcels of cold polar air surging across middle latitudes in winter”
I have noted similarities between this Winter and the bitter cold ones of the 1970s, particularly 1976/77. During that Winter, the "Polar Vortex" repeatedly dropped south into the US and Southern Canada, very much like this Winter (though the upper-level ridge at higher latitudes this time has been farther west). This occurred much more frequently in the 1970s, then became rare after that … after the sun became more active.
Note the similarities with the current sun and it being less active during the 70’s.
Also, last Winter in the Southern Hemisphere, July/August 2013, we witnessed the same thing happening. The Polar Vortex is much harder to temporarily displace there (it is much stronger at the South Pole), but intense cold penetrated great distances towards the equator (coffee-growing regions of Brazil saw their coldest weather in decades), with record cold and snow into Northern Argentina and Southern Brazil.
At the same time, the Arctic last Summer was having one of its coldest Summers, and earlier this Summer the Antarctic had widespread cold anomalies sitting on top of it for many consecutive weeks.
Dr Norman Page says:
February 17, 2014 at 8:24 am
For some years I have suggested in various web comments and on my blog that the Hadley Sea Surface Temperature data is the best metric for the following reasons.
I agree. And what does Hadsst3 show? There has been no warming for 13 years and 2 months since December 2000. See:
http://www.woodfortrees.org/plot/hadsst3gl/from:2000.9/plot/hadsst3gl/from:2000.9/trend
Furthermore, the January 2014 anomaly of 0.341 would only rank 2014 at 11th place if it stayed this way.
richardscourtney says:
February 17, 2014 at 8:21 am
“Please read Appendix B of this.”
I have already, thanks. The point I was making was that regardless of what the data is tracking, just observing the changing deltas to those figures and without in any way manipulating or further extending them DOES provide a potentially useful metric.
All on its own and without any necessary link to a wider picture.
These sampling points have changed in this way (assuming that instrument changes, etc. have been correctly dealt with) over this time.
To extend beyond that is to my mind the fatal flaw. To somehow think that we can from those changes derive some understanding about what an almost mythical ‘temperature field’ has done.
Steven Mosher:
At February 17, 2014 at 8:58 am you assert
How? And why?
Richard
Steven Mosher says:
February 17, 2014 at 8:58 am
“Now the CO2 line most definitely does NOT explain those extra, only 50, years. Of non-declining temps as CO2 would require.”
Huh, CO2 sure does work. http://static.berkeleyearth.org/img/annual-with-forcing-small.png”
Interesting. Would you care to give your Nyquist sampling rate before, say, 1900, as to the accuracy of those figures?
And your 1*1 degree grid percentage, as I have repeatedly asked you for (not station count, as that is an almost useless statistic).
“And on your proxies.. you need to recalibrate them.”
Really. They are calibrated to HadCrut4 as per their literature mostly says. What would you suggest I calibrate them to?
Thank you, a very effective summary of the manifold and manifest weaknesses in the warmists’ arguments.
One typo: “affects” should be “effects” in the conclusion: “…the IPCC ignored possible affects of reduced solar activity during the past decade,…”
Don’t forget ‘falsifiable hypothesis’, which was only indirectly referred to in the abstract. It’s not the be-all and end-all, but it is a ‘third leg’ to stand on wrt scientific hypothesis, isn’t it?
Sure there is. Or are you discounting those that say CO2 has no climate effect whatsoever?
richardscourtney says:
February 17, 2014 at 9:05 am
“How? And why?”
Because a simple greater than 15 year low pass ‘Gaussian’ filter makes for very awkward explanations otherwise. Please note he did not ask first HOW I calibrated them, just asserted it was wrong.
Thank you Dr. Morton for a concise summary of the skeptic side of eth mainstream climate debate..
The “mainstream” global warming debate centres on the magnitude of Equilibrium Climate Sensitivity (“ECS”) to atmospheric CO2, which is the primary subject of contention between global warming alarmists (aka “warmists”) and climate skeptics (aka “skeptics”).
Warmists typically say ECS is high, greater than ~3 degrees C [3C/(2xCO2)], and therefore DANGEROUS global warming will result, whereas skeptics say ECS is 1C or less and any resulting global warming will NOT be dangerous.
The scientific evidence to date strongly suggests that if one had to pick a side, the skeptics are more likely to be correct.
However, BOTH sides of this factious debate are in all probability technically WRONG. In January 2008 I demonstrated that CO2 LAGS temperature at all measured time scales*, so the mainstream debate requires that “the future is causing the past”, which I suggest is demonstrably false.
In climate science we do not even agree on what drives what, and it is probable that the majority, who reside on BOTH sides of the ECS mainstream debate, are both technically WRONG.
Hypothesis:
Based on the preponderance of evidence, temperature drives CO2 much more than CO2 drives temperature, so ECS may not exist at all at the “macro” scale, and may be utterly irrelevant to climate science except at the “micro” (and materially insignificant) scale.
There may be other significant sources of CO2 that contribute to its increase in the atmosphere, but increasing CO2 just does not have a significant or measurable impact on global warming (or cooling), which is almost entirely natural in origin.
I therefore suggest that the oft-fractious “mainstream debate” between warmists and skeptics about the magnitude of ECS is materially irrelevant. ECS, if it exists at all, is so small that it just does not matter.
Wait 5 to 10 more years – I suggest that by then most serious climate scientists will accept the above hypothesis. Many will claim they knew it all along… 🙂
________
* If ECS (which assumes CO2 drives temperature) actually exists in the Earth system, it is so small that it is overwhelmed by the reality that temperature drives CO2.
Proof:
In this enormous CO2 equation, the only signal that is apparent is that dCO2/dt varies ~contemporaneously with temperature, and CO2 lags global Lower Troposphere temperatures by about 9 months.
http://icecap.us/index.php/go/joes-blog/carbon_dioxide_in_not_the_primary_cause_of_global_warming_the_future_can_no/
CO2 also lags temperature by about 800 years in the ice core record on a longer time scale.
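A minimal sketch of how such a lead/lag can be estimated: find the shift that maximizes the cross-correlation between two monthly series. The series below are synthetic, built to contain a 9-month lag; they are not the actual Mauna Loa or satellite records, and correlation alone says nothing about causation.

```python
# Minimal sketch: estimate a lead/lag as the shift that maximizes the
# cross-correlation between two monthly series. Synthetic data built to
# contain a 9-month lag; not the actual CO2 or temperature records.
import numpy as np

rng = np.random.default_rng(3)
n = 360                                              # 30 years of months
temp = np.cumsum(rng.normal(0.0, 0.05, n))           # synthetic anomaly series
co2 = np.roll(temp, 9) + rng.normal(0.0, 0.05, n)    # built to lag temp by 9 months

def lag_of_max_correlation(x, y, max_lag=24):
    """Lag (months) at which corr(x[t], y[t + lag]) is largest."""
    best_lag, best_r = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r = np.corrcoef(x[: n - lag], y[lag:])[0, 1]
        else:
            r = np.corrcoef(x[-lag:], y[: n + lag])[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

lag, r = lag_of_max_correlation(temp, co2)
print(f"best lag = {lag} months (positive: second series lags the first), r = {r:.2f}")
```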
To suggest that ECS is larger than 1C is not credible. I suggest that if ECS exists at all, it is much smaller than 1C, so small as to be essentially insignificant.
Regards, Allan
The important sentence below is incomplete.
“Is it time to admit that we do not understand climate well enough to know how to direct it?”
It should read something like:
Is it time to admit that we do not understand climate well enough to know how to direct it and is our ability to do this not highly questionable?
The point is that the CAGW crowd not only claim (in the face of contrary evidence) that the extra CO2 is the main driver of rising temperatures, but also that we can slow the temperature rise down by tinkering with our CO2 production rate. This must surely be the height of unbridled hubris.
John W. Garrett says:
February 17, 2014 at 8:38 am
I don’t believe the alleged temperature data as adjusted by NASA, NOAA & HadCRU. In fact I’m sure they are fictional, & getting more so with each passing month.
The 1930s were warmer than the 1990s not only in the US but globally, just as there were 30-year phases of the Medieval Warm Period hotter than now, more of them in the Roman WP, more yet in the Minoan WP & lots of them during the Holocene Climatic Optimum. Not to mention most of the Eemian Interglacial, without benefit of a Neanderthal Industrial Age.
RichardLH:
Thank you for your reply to me which you provide at February 17, 2014 at 9:02 am.
Clearly, I am failing to understand what you are trying to say to me.
You write
OK. Let me break down what I am not understanding because that may unblock the impasse.
What would be the “potentially useful metric”?
What would it indicate and why would that be “useful”?
What is an “almost mythical ‘temperature field’” and how is it defined?
How does applying the Nyquist Limit do other than reject sample points?
Indeed, what relevance has the Nyquist Limit; e.g. would you reject a region because it only contains one measurement site?
Please note that I am not agreeing and not disagreeing with you. I cannot because I am trying to understand your point which I cannot accept or dispute until I do understand it.
Richard
“Consequently they postulated a positive feedback due to hotter air holding more water vapor, which increased the absorption of radiation and the backwarming.”
————————–
This is a testable hypothesis. If correct, we ought to have some sort of record of increasing total atmospheric water vapor with time, or some sort of positive correlation between global water vapor & global temperature (which might even allow defining a “feedback parameter” more accurately).
Does anyone know if a water vapor data set exists? & if so a link to it online ?
I’m disappointed by the essay. One would expect something much more analytical from an astrophysicist than an intelligent narrative re-hash of points already made many times here on WUWT.
For example, the radiative forcing of CO2 is well-established, as Dr. Morton notes (1): “There is no controversy about the basic physics that adding CO2 to our atmosphere absorbs solar energy…”
But his conclusion (2), “… resulting in a little extra warming on top of the dominant effect of water vapor.” begs the climatological question. Radiative absorption by CO2 puts energy into the atmosphere (step 1), true. But how the climate system partitions that extra kinetic energy is the question at hand.
The IPCC and its minions claim all that extra energy warms the oceans and the atmosphere. However, the terrestrial climate has many more response channels than that. The energy could increase the strength of convective updrafts, for example. It could change the cloud fraction. It could increase tropical rainfall.
Any of these could disperse the energy produced by CO2 absorption so that there is no detectable increase in sensible atmospheric heat at all.
All the predicted baseline warming due to the radiative forcing of CO2 is calculated under conditions of ceteris paribus — everything else remains equal. That is, all climatological conditions remain constant: no change in clouds, convection, or tropical rainfall ≈ 1 C of air temperature change per doubling. But all those things will change. And no one knows how or by how much.
The entire IPCC message depends on neglecting our profound ignorance about how the terrestrial climate operates. It’s all pretend science (I’ll have an article in an up-coming E&E issue about the battening on ignorance of consensus climatology).
And that reminds me. GCMs don’t make scientific predictions, because they do not converge to unique physical solutions. Comparing their outputs to observations is completely pointless. Doing so raises GCMs to a status they absolutely do not deserve.
I’d expect an astrophysicist to know all this, and do a few calculations to make the point that only small adjustments of the climate can obviate any possible effect of doubling CO2 from 300 to 600 ppmv. That would be a far stronger repudiation of the IPCC and its betrayers of science than just the narrative displays we got here.
Dr. Morton says:
“Experiments by Kirkby et al. (2011) and Svensmark et al. (2013) have shown that these cosmic rays can seed the formation of clouds.”
This is not correct. Their experiments showed that new particles in the nanometer range can be created, but left unclear whether they can grow to the 50-nm cloud condensation nuclei (CCN) diameters needed to promote cloud formation. This further step needs to be clarified before the Svensmark hypothesis can be accepted.
richardscourtney says:
February 17, 2014 at 9:18 am
What would be the “potentially useful metric”?
That a sampling point(s) had changed in this way.
What would it indicate and why would that be “useful”?
It would show an evolution over time for that spot(s) (only)
What is an “almost mythical ‘temperature field’” and how is it defined?
Trying to work out what has happened around and between the sampling points based on what happens to them (a futile quest, as Nyquist has hardly been honoured at all).
How does applying the Nyquist Limit do other than reject sample points?
Shows there are insufficient sampling points to provide a true estimate of the field between the points. Especially true the further back in time we go.
Indeed, what relevance has the Nyquist Limit; e.g. would you reject a region because it only contains one measurement site?
See above
Please note that I am not agreeing and not disagreeing with you. I cannot because I am trying to understand your point which I cannot accept or dispute until I do understand it.
My fault in explanations probably.
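The Nyquist point at issue can be made concrete with a one-dimensional toy example: a spatial temperature pattern that varies on scales shorter than twice the station spacing cannot be recovered from the stations, so any interpolated "field" between them is largely invented. The numbers below are synthetic and purely illustrative.

```python
# Toy illustration of spatial undersampling: a temperature pattern varying on
# a 150 km scale sampled by stations 200 km apart (well short of the Nyquist
# requirement of one station per 75 km). Invented numbers throughout.
import numpy as np

x_fine = np.linspace(0.0, 1000.0, 2001)            # km, densely resolved "truth"
wavelength = 150.0                                 # km, true spatial scale
truth = 15.0 + 3.0 * np.sin(2 * np.pi * x_fine / wavelength)

station_spacing = 200.0                            # km, larger than wavelength / 2
x_stations = np.arange(0.0, 1000.1, station_spacing)
sampled = 15.0 + 3.0 * np.sin(2 * np.pi * x_stations / wavelength)

# Linear interpolation between stations stands in for gridding the field.
reconstructed = np.interp(x_fine, x_stations, sampled)
rms_error = np.sqrt(np.mean((reconstructed - truth) ** 2))
print(f"stations every {station_spacing:.0f} km, true scale {wavelength:.0f} km")
print(f"RMS error of interpolated field: {rms_error:.2f} K (signal amplitude 3 K)")
```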
We don’t have a crisis in climatology.
We all know who is spouting nonsense and who is delivering real science.
We have an integrity crisis in politics.
That’s an entirely different ball game.
2kevin says:
February 17, 2014 at 7:40 am
“One thing I’ve never understood since I am not knowledgeable on the subject:
(Like the idea that a pound of lead and two pounds of lead can be the same temperature but have different quantities of heat.) If so, what does it mean?”
——————
What that means is … “pounds” are a measure of the total mass of all the lead atoms in each volume – 1 pound versus 2 pounds.
Temperature is a measure of the “heat induced” excited state of each and every lead atom in its respective volume of lead (for this example, that is).
Thus, it requires twice (2×) the amount of heat energy to excite all the atoms in 2 pounds of lead to a specific temperature as it does to excite all the atoms in 1 pound of lead to the same specific temperature.
Or, it means that, it takes 4 times as much heat to get 1 gallon of water to boil as it does to get 1 quart of water to boil (given they are both at the same temperature at the start).
Thus the temperatures are the same but the quantity of heat in each one is different.
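A small worked version of that point, using the textbook specific heats of lead and water (values not from this thread), confirms both ratios:

```python
# Worked version of the heat-vs-temperature point above: Q = m * c * dT.
# Textbook specific heats; pounds and quarts converted to SI.
C_LEAD = 128.0               # J / (kg K)
C_WATER = 4186.0             # J / (kg K)
KG_PER_LB = 0.4536
KG_PER_QUART_WATER = 0.946   # approx. mass of a US quart of water

def heat_joules(mass_kg, c, dT):
    return mass_kg * c * dT

dT = 10.0  # warm everything by 10 K
print(f"1 lb lead:      {heat_joules(1 * KG_PER_LB, C_LEAD, dT):9.0f} J")
print(f"2 lb lead:      {heat_joules(2 * KG_PER_LB, C_LEAD, dT):9.0f} J  (twice as much)")
print(f"1 quart water:  {heat_joules(1 * KG_PER_QUART_WATER, C_WATER, dT):9.0f} J")
print(f"1 gallon water: {heat_joules(4 * KG_PER_QUART_WATER, C_WATER, dT):9.0f} J  (four times as much)")
```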
Steven Mosher says:
“Now the CO2 line most definitely does NOT explain those extra, only 50, years. Of non-declining temps as CO2 would require.”
Huh, C02 sure does work.
http://static.berkeleyearth.org/img/annual-with-forcing-small.png
+++++++++++++++++++++++++++++
That chart reminds me of this one.
Note that the red line isn’t CO2. It’s the U.S. CPI…
Lance Wallace says:
February 17, 2014 at 9:23 am
Am I mistaken in thinking that Svensmark, Enghoff & Pedersen in Physics Letters A (2013) showed that, where ultraviolet light produces aerosols from trace amounts of ozone, sulfur dioxide & water vapor, clusters of these cloud condensation nuclei precursors produced by gamma ray ionization all grew up to diameters larger than 50 nm?
http://www.sciencedirect.com/science/article/pii/S0375960113006294
What is really frustrating is that May, as a pioneer of chaos theory, must surely appreciate the inherent complexity of non-linear dynamical systems. Yet he is religiously warmist!
dbstealey says:
February 17, 2014 at 9:37 am
“That chart reminds me of this one.
Note that the red line isn’t CO2. It’s the U.S. CPI…”
With error bars as big as they are in BEST you could draw almost anything through it and get it to fit. Remember, it is only the outer edges of the envelope that count, not that over-convenient centre line.
Whoops–I apologize for my hasty comment above that Svensmark had not shown growth to CCN size of 50 nm. He has in fact reported (2013) on a chamber experiment showing such growth in the presence of ions when the concentration of sulfuric acid is kept constant. The remaining problem then is to find a mechanism in the real atmosphere where the sulfuric acid can be replenished. Given Svensmark’s dogged persistence in testing his hypothesis, it would be unwise to bet against him.
RichardLH,
Here is the BEST graph from the same Berkeley group: adjusted & raw data.
You can do lots of things with graphs, once you start ‘adjusting’.