Yet another significant paper finds low climate sensitivity to CO2, suggesting there is no global warming crisis at hand

Hot on the heels of the Lewis and Curry paper, we have this new paper, which looks to be well researched, empirically based, and a potential blockbuster for dimming the alarmism that has been so prevalent over climate sensitivity. With a climate sensitivity of only about 0.6°C, it takes the air out of the alarmism balloon.

The Hockey Schtick writes: A new paper published in the Open Journal of Atmospheric and Climate Change by renowned professor of physics and expert on spectroscopy Dr. Hermann Harde finds that climate sensitivity to a doubling of CO2 levels is only about [0.6°C], about [five] times less than the IPCC claims, but in line with many other published low estimates of climate sensitivity.

The paper further establishes that climate sensitivity to tiny changes in solar activity is comparable to that of CO2 and by no means insignificant as the IPCC prefers to claim.

The following is a Google translation from the German EIKE site with an overview of the main findings of the paper, followed by a link to the full paper [in English].

Assessment of global warming due to CO2 and solar influence

Climate sensitivity (discussed, for example, here) is currently claimed by the IPCC to have a mid-range value of 3.0°C (AR4) as the most probable value, but others have determined much lower values of 1.73°C, 1°C, or even 0.43°C. Prof. Hermann Harde, a renowned physicist and spectroscopist, determines in his new paper a climate sensitivity of [0.6°C].

Figure: Transmission and absorption spectrum of the terrestrial radiation in the atmosphere. Only a few spectral lines of CO2 absorb.
Editor’s note: The quantity called “climate sensitivity” was invented to express quantitatively the presumption that the global mean temperature of the atmosphere can be driven in a particular way by an increase of the carbon dioxide concentration in the air. To this end, forcings are defined (postulated) whose influence, by means of certain physically based and mostly plausible assumptions, is supposed to produce this warming as a departure from equilibrium. Climate sensitivity is one of the factors required: it states by how many K (°C) the temperature rises for a doubling of the CO2 concentration.
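To make that definition concrete, the usual bookkeeping applies the sensitivity S (in K per doubling) logarithmically. A minimal sketch, assuming the conventional 280 → 560 ppm doubling; nothing here is taken from Harde’s model:

```python
# Illustrative only: the standard logarithmic scaling that the
# "climate sensitivity" S is defined against.
import math

C0, C = 280.0, 560.0      # pre-industrial and doubled CO2 concentrations, ppm
for S in (0.6, 3.0):      # Harde's estimate vs the IPCC AR4 mid-value, K per doubling
    dT = S * math.log2(C / C0)
    print(f"S = {S} K/doubling -> warming for {C0:.0f}->{C:.0f} ppm: {dT:.2f} K")
```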

Advanced Two-Layer Climate Model for the Assessment of Global Warming by CO2

Hermann Harde* , Experimental Physics and Materials Science , Helmut-Schmidt-University, Hamburg , Germany

Open Journal of Atmospheric and Climate Change, In Press


182 Comments
Ron C.
October 14, 2014 5:56 am

Your post on Hermann Harde’s work highlights a valuable analysis. He has shown that even if you adopt the Trenberth energy balance model (including the notion of the earth as a flat disk illuminated constantly by the sun at 1/4 power), empirical evidence gives a small fraction of the potential climate sensitivity claimed from increasing CO2.

Doug Allen
October 14, 2014 6:05 am

I think the key inference is the saturation effect, “The line-by-line calculations for solar radiation from 0.1–8 mm (sw radiation) as well as for the terrestrial radiation from 3–100 mm (lw radiation) show, that due to the strong overlap of the CO2 and CH4 spectra with water vapour lines the influence of these gases significantly declines with increasing water vapour pressure, and that with increasing CO2-concentration well noticeable saturation effects are observed limiting substantially the impact of CO2 on global warming.”
As a conservationist and progressive and teacher, I’ve tried, mostly unsuccessfully, to educate my tribe in the science of climate science using, among other things, this elementary essay on climate sensitivity-
http://climatesensitivity.blogspot.com/

george e. smith
Reply to  Doug Allen
October 14, 2014 10:17 am

Well Doug, I notice you excerpted a section that has puzzled me.
“””… “The line-by-line calculations for solar radiation from 0.1–8 mm (sw radiation) as well as for the terrestrial radiation from 3–100 mm (lw radiation)…”””
Did they really integrate solar radiation from 100 nanometers all the way out to 8 millimeters, and IR from 3 microns out to 100 millimeters?
98% of solar radiant energy is contained between 250 nanometers and 4.0 microns wavelength. Any solar radiant energy beyond even 10 microns wouldn’t be worth a tinker’s damn.
And 98% of the mean surface LWIR lies between 5 microns and 80 microns, so once again, radiation beyond say 200 microns would be quite negligible.
Now 0.1 to 8.0 microns, and 3.0 to 100 microns, would both be very reasonable and generous spectrum widths to deal with.
Could this be just a typo; read “micron” for “mm”?
The text as presented above reads impressively; I can’t say I can grasp the detail without pictures.
I’m suspicious of the mention of the word “equilibrium” in there. That’s news to me, and if he uses Trenberth’s energy fluxes, it would seem to imply a non-rotating earth, as would the assertion of equilibrium.
I’ve done a lot of computer modeling, both electronic circuits (SPICE-like, and down to the bare metal (Si)) and also optical (imaging and non-imaging designs), but in every case I have only modeled the real, actual physical elements or objects that were in the real system, or would be in it when manufactured to my design, strictly obeying the laws of physics, sometimes with approximations (geometrical ray optics) and sometimes more physically based (diffraction); and I’ve never ever gotten a result that wasn’t replicated in the final as-made system.
So I don’t quite grasp the concept of doing modeling on a system, that you know a priori is totally different from the actual real system you wish to explain.
But I guess, I will try and wade through this paper. It is a shame if it has typos that totally change the numbers.
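[Editor’s note: the “98%” figures above are easy to check numerically by integrating the Planck law over wavelength. A sketch, assuming a 5772 K Sun and a 288 K surface; the exact percentages depend on those assumed temperatures:]

```python
# Fraction of total blackbody power emitted between two wavelengths,
# computed by a simple numerical integration of the Planck law.
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann (SI)
sigma = 5.670e-8                          # Stefan-Boltzmann constant

def planck(lam, T):
    """Spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

def band_fraction(lam1, lam2, T, n=200_000):
    """Fraction of total blackbody power emitted between lam1 and lam2."""
    lam = np.linspace(lam1, lam2, n)
    band = np.sum(planck(lam, T)) * (lam[1] - lam[0])   # simple Riemann sum
    return band / (sigma * T**4 / np.pi)                # total over all lambda

print(f"Sun   (5772 K), 0.25-4 um: {band_fraction(0.25e-6, 4e-6, 5772):.1%}")
print(f"Earth  (288 K), 5-80 um  : {band_fraction(5e-6, 80e-6, 288):.1%}")
```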

george e. smith
Reply to  Doug Allen
October 14, 2014 3:50 pm

Well Doug,
There may be overlap (what is “strong” overlap?) of the absorption bands of the various GHGs, with water notably having plenty of those, but those bands are hordes of very fine lines, and there is not much likelihood of a lot of overlap of those narrow lines. So they are more likely to be additive.
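[Editor’s note: the overlap question can be explored with a toy calculation. The line positions and widths below are made up for illustration, not taken from HITRAN; the point is only that narrow lines overlap little while pressure-broadened lines overlap a lot:]

```python
# Two "species," each a comb of Lorentzian lines at random positions.
# How much of one species' absorption coincides with the other's?
import numpy as np

def comb(nu, centers, gamma):
    """Sum of Lorentzian lines of half-width gamma at the given centers."""
    return sum((gamma / np.pi) / ((nu - c)**2 + gamma**2) for c in centers)

nu = np.linspace(0.0, 100.0, 50_000)      # arbitrary wavenumber axis
rng = np.random.default_rng(0)
centers_a = rng.uniform(0, 100, 100)      # "CO2-like" line positions (invented)
centers_b = rng.uniform(0, 100, 100)      # "H2O-like" line positions (invented)

for gamma in (0.02, 1.0):                 # narrow vs pressure-broadened lines
    k_a, k_b = comb(nu, centers_a, gamma), comb(nu, centers_b, gamma)
    overlap = np.minimum(k_a, k_b).sum() / k_a.sum()   # shared absorption area
    print(f"half-width {gamma:4.2f}: {overlap:.0%} of A's absorption overlaps B")
```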

Richard M
October 14, 2014 6:13 am

I didn’t see any reference to the influence of ocean cycles. This likely means either the solar or CO2 (or both) values are too high.

Randy in Ridgecrest
October 14, 2014 6:15 am

Would someone weigh in on the graph? i don’t get what this is showing. Isn’t the wavelength way off? If the gray line is a blackbody curve it’s pretty cool, like ATM average temp? That’s certainly not the BB curve for light from the sun which is peaked at around 0.5 um.

MikeB
Reply to  Randy in Ridgecrest
October 14, 2014 7:01 am

The Planck curve in the top graph represents the emission from the surface of the Earth. It looks like it assumes a temperature of 300K, which is normal. It peaks at about 10 microns. The graph is meant to show that radiation from the Earth can escape relatively unhindered through the ‘atmospheric window’, a wavelength band ranging from about 8 to 14 microns. Above 14 microns the radiation is blocked by the CO2 absorption band; below 8 microns it is severely attenuated by water vapour.
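[Editor’s note: MikeB’s reading checks out against Wien’s displacement law:]

```python
# Wien's displacement law: peak wavelength = b / T, with b in micron-kelvins.
b = 2897.8                     # Wien displacement constant, um*K
for T in (300, 288, 5772):     # graph's assumed surface, mean surface, Sun
    print(f"T = {T:4d} K -> Planck curve peaks near {b / T:5.2f} um")
# ~9.7 um for 300 K (the ~10 um peak in the graph); ~0.50 um for the Sun,
# which answers Randy's question about the solar curve.
```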

Randy in Ridgecrest
Reply to  MikeB
October 14, 2014 10:31 am

Thanks Mike! I’m dense at 7AM.
Randy

James M. VanWinkle
October 14, 2014 6:18 am

Does the model used for this paper make the usual prediction of a hot spot in the upper troposphere (found to be missing in reality)? If the model does not predict such a ‘hot spot’, that would certainly indicate a big improvement in modeling.

October 14, 2014 6:20 am

Versions of this work have been around since early 2011: Consider a Spherical Truncated Icosahedron
Another version and discussions from 2013: Hermann Harde: greenhouse effect 30% smaller than IPCC says.
which refers to this publication: Radiation and Heat Transfer in the Atmosphere: A Comprehensive Approach on a Molecular Basis, Hermann Harde, International Journal of Atmospheric Sciences, Volume 2013 (2013), Article ID 503727, 26 pages, http://dx.doi.org/10.1155/2013/503727
In the first two articles above there are links to additional discussions.

October 14, 2014 6:24 am

Yet another paper which justifies abandoning the IPCC model outputs as the basis for even discussing future climate let alone using them as a basis for climate policy.
The modelling approach is inherently of no value for predicting future temperature with any calculable certainty because of the difficulty of specifying the initial conditions of a sufficiently fine grained spatio-temporal grid of a large number of variables with sufficient precision prior to multiple iterations. For a complete discussion of this see Essex: https://www.youtube.com/watch?v=hvhipLNeda4
Models are often tuned by running them backwards against several decades of observation, but this is much too short a period to correlate outputs with observation when the controlling natural quasi-periodicities of most interest are in the centennial and especially the key millennial range. Tuning to these longer periodicities is beyond any computing capacity when using reductionist models with a large number of variables, unless these long-wave natural periodicities are somehow built into the model structure ab initio.
In addition to these general problems of modeling complex systems, the particular IPCC models have glaringly obvious structural deficiencies, as seen in Fig. 2-20 from AR4 WG1 (not very different from Fig. 8-17 in the AR5 WG1 report).
The only natural forcing in both of the IPCC figures is TSI; everything else is classed as anthropogenic. The deficiency of this model structure is immediately obvious. Under natural forcings should come such things as, for example, Milankovitch orbital cycles, lunar-related tidal effects on ocean currents, the earth’s geomagnetic field strength and, most importantly on millennial and centennial time scales, all the solar activity data time series, e.g., solar magnetic field strength, TSI, SSNs, GCRs (effect on aerosols, clouds and albedo), CHs, MCEs, EUV variations, and associated ozone variations.
More and more people are realizing that the GCMs are inherently meaningless.
It is well past time that the climate discussion moved past the consideration of these useless models to evaluating forecasts using a completely different approach based on the natural quasi-periodicities so obviously seen in the temperature and driver record.
Earth’s climate is the result of resonances and beats between various quasi-cyclic processes of varying wavelengths combined with endogenous secular earth processes such as, for example, plate tectonics. It is not possible to forecast the future unless we have a good understanding of the relation of the climate of the present time to the current phases of these different interacting natural quasi-periodicities which fall into two main categories.
a) The orbital long-wave Milankovitch eccentricity, obliquity and precessional cycles, which are modulated by
b) Solar “activity” cycles with possibly multi-millennial, millennial, centennial and decadal time scales.
The convolution of the a and b drivers is mediated through the great oceanic current and atmospheric pressure systems to produce the earth’s climate and weather.
After establishing where we are relative to the long wave periodicities to help forecast decadal and annual changes, we can then look at where earth is in time relative to the periodicities of the PDO, AMO and NAO and ENSO indices and based on past patterns make reasonable forecasts for future decadal periods.
For forecasts of the timing and amount of the probable coming cooling, based on the natural 1000-year and 60-year periodicities in the temperature record and using the 10Be and neutron count data as the best proxy for solar “activity”, go to
http://climatesense-norpag.blogspot.com

latecommer2014
Reply to  Dr Norman Page
October 14, 2014 8:11 am

Perhaps you should consider the “lag”, and through that consider what forces what. Warmer water releases more CO2…. Which is cause and which is effect? Perhaps they are both?

eyesonu
Reply to  Dr Norman Page
October 14, 2014 9:06 am

Dr. Page,
I have been to your site and find it interesting.
With regard to the climate, I certainly don’t know the answers. There are many variables over many varying time scales. There is room for argument about the short-term time frames, and it is obvious that absolutely nothing is settled. One argument that I can’t buy is the one presented that CO2 is a major factor in the earth’s climate, and a negative one at that. It would seem that 600 to 800 ppm could be beneficial overall.

Reply to  Dr Norman Page
October 14, 2014 9:07 am

I can see ECS taking some amount of time to stabilize with constant inputs. But I think TCS should be instantaneous: speed of light, or better, speed of IR. A flashlight in a fog happens immediately; DWLR should respond instantly as the CO2 gets various levels of illumination in LW IR.

“…as well as for the terrestrial radiation from 3–100 mm (lw radiation) show, that due to the strong overlap of the CO2 and CH4 spectra with water vapour lines the influence of these gases significantly declines with increasing water vapour pressure, and that with increasing CO2-concentration well noticeable saturation effects are observed limiting substantially the impact of CO2 on global warming.”

Even with this showing how much water vapor saturates at least the main 15–16 µm CO2 bands, when I measure the zenith (DWIR) with my IR thermometer when it’s cool and dry, I’m measuring -40F to -60F, and in the winter it was colder than the thermometer reliably reads (-80F). I’ve been measuring 80F–100F differences in temps between concrete and zenith. If there isn’t any water vapor, CO2 isn’t going to keep us warm.
On the other hand, I’ve long wondered if CO2 helps boost a planet’s temp from CO2 melting temps up to water melting temps, at which point a water cycle can warm it up even more.
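[Editor’s note: for scale, the reported brightness temperatures translate into fluxes via the Stefan–Boltzmann law. A rough sketch only, since a real IR thermometer senses a limited band and assumes an emissivity:]

```python
# Order-of-magnitude conversion of the comment's readings into fluxes.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def f_to_kelvin(t_f):
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

for label, t_f in [("dry-sky zenith", -60), ("winter zenith", -80), ("concrete", 80)]:
    T = f_to_kelvin(t_f)
    print(f"{label:14s}: {t_f:4d} F = {T:6.1f} K -> sigma*T^4 = {SIGMA * T**4:5.1f} W/m^2")
```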

October 14, 2014 6:31 am

~0.5! I guess that makes me a lukewarmer. I expected we would settle on ~1.0 the way the estimates were moving; if solar variations can be ignored, then ~0.7. I guess the big question has to be: why, with all these physicists in the fray and everyone telling us “it’s the physics”, didn’t they DO the physics?

Reply to  Gary Pearse
October 14, 2014 6:40 am

My guess was zero ±1ish.
At the lowest levels the other factors are so significant as to make determining any effect from CO2… well, it’s impossible to distinguish from the unknowns over those time periods.
Dr Norman Page at October 14, 2014 at 6:24 am, just above, makes the point in detail (but with more certainty than I could muster).

Reply to  M Courtney
October 14, 2014 11:03 am

Reduces CO2 to background noise. I have it, when it was warming, at about 3% of the total warming, so about 0.015 C. Other factors simply overwhelm that warming. I’m not too sure CO2 has any effect at all, other than maybe slowing the rate of fall of real temperatures by 3%. If CO2 actually worked the way the IPCC says it does, then temps are actually falling. Which kind of proves that other mechanisms overwhelm CO2 in climate control.

Richard M
October 14, 2014 6:43 am

Still waiting for Mosher to tell us why this can’t be true ….

Schrodinger's Cat
October 14, 2014 6:47 am

It looks as though band saturation by water vapour was underestimated in earlier studies, and even the warming-induced evaporation (seen by alarmists as positive feedback) would ensure further saturation, thus diminishing the potential of GHG warming by CO2.

Reply to  Schrodinger's Cat
October 14, 2014 6:54 am

Which has been obvious since AR3 squibbed and the Tropical Hotspot didn’t show up.
Not that I’m criticising your comment.
I’m merely asking how anyone was still able to think that warming induced evaporation could be a positive feedback without the Tropical Hotspot being easy to find?

knr
October 14, 2014 6:55 am

No problem, now they can tell us how even such small increases in temperature will lead to ‘climate doom’, therefore we have to act now. It is a very rich and comfortable gravy train with lots of people on board, so it’s going to take much more than this to derail it.

Mervyn
October 14, 2014 7:23 am

Oh oh!!! It’s not really looking too good for the IPCC. We were told its 2007 4th Assessment Report was the “gold standard in climate science”… the “settled science”… incontrovertible… based solely on peer reviewed science!
Well, Mother Nature has rendered that report obsolete. The report was all about rising temperature trends based on various rising CO2 emissions scenarios. The hotter temperatures supposedly would cause all those nasty extreme weather events. Nothing in the report about a flat temperature trend or possibly even slight cooling.
RSS has confirmed 18 years of no discernible warming. No warming. That’s big news… huge!
Since 2007, we’ve seen what… some 17 papers demonstrating climate sensitivity is significantly lower than that used in climate models.
So when is someone going to call this global warming alarmism for what it is … a massive deception… a scientific fraud?

Chip Javert
Reply to  Mervyn
October 14, 2014 10:53 am

Ok; for the (roughly) 2 millionth time here at WUWT:
Global warming alarmism is … a massive deception… a scientific fraud.

October 14, 2014 7:23 am

With 24 years of deductive reasoning under my belt as a chemist, I already knew that Gore’s claim about global warming didn’t make logical sense. http://chrisskates.com/novel-becomes-energy-crystal-ball/

Tilo
October 14, 2014 7:54 am

These numbers agree with some of the studies that were produced years ago by Roy Spencer and Richard Lindzen.
In the meantime, Svalgaard continues to pretend that Svensmark’s research on cosmic rays doesn’t exist.
My take, for the last 8 years has been that the number is between .5C and 1.2C for 2XCO2.

Russ R.
October 14, 2014 7:57 am

“We present an advanced two-layer climate model…”
Sorry, but I thought we were supposed to be skeptical of model results…

Reply to  Russ R.
October 14, 2014 8:29 am

We are sceptical of model results. But this model takes known (measured) inputs from a spectroscopy database, HITRAN08, and plugs them into a simple model. A simple model with a simple aim: to “calculate the influence of an increasing CO2-concentration and a varying solar activity on global warming.”
It isn’t trying to predict the oceans and the weather systems.
From the introduction:

Many climate models, particularly the Atmosphere-Ocean General Circulation Models (AOGCMs) [2] were developed not only to simulate the global scenario, but also to predict local climate variations and this as a function of time. Therefore, they have to solve a dense grid of coupled nonlinear differential equations depending on endless additional parameters, which make these calculations extremely time consuming and even instable. So, smallest variations in the initial constraints or corrections on a multidimensional parameter platform already cause large deviations in the final result and can dissemble good agreement with some observations but with completely wrong conclusions

In other words, they try to do too much and fail to replicate anything – except through fiddling the parameters to tune the past. You’ve seen our scepticism of this in the past.
But they go on to state that their model is meant to look at climate sensitivity only – and it describes how it works so anyone can see the assumptions involved. The assumptions involved in solving for the movement of the oceans and atmosphere are so many and complex that they don’t test anything except the faith of the reader.
But here:

In contrast to the RF-concept and the extremely complex AOGCMs here we present an advanced two layer climate model, especially appropriate to calculate the influence of increasing CO2 concentrations on global warming as well as the impact of solar variations on the climate.
The model describes the atmosphere and the ground as two layers acting simultaneously as absorbers and Planck radiators, and it includes additional heat transfer between these layers due to convection and evaporation. At equilibrium both, the atmosphere as well as the ground, release as much power as they suck up from the sun and the neighbouring layer. An external perturbation, e.g., caused by variations of the solar activity or the GH-gases then forces the system to come to a new equilibrium with new temperature distributions for the Earth and the atmosphere.

So you can see what they are testing.
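[Editor’s note: the “two layers at equilibrium” bookkeeping quoted above can be illustrated with the textbook single-gray-layer balance. A minimal sketch, emphatically not Harde’s model, which adds convection, evaporation and line-by-line spectroscopy; EPS is a tuned assumption chosen so the answer lands near the observed 288 K:]

```python
# Each layer emits exactly what it absorbs at equilibrium.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W/m^2
ALBEDO = 0.30      # planetary albedo
EPS = 0.78         # LW absorptivity/emissivity of the atmospheric layer (assumed)

absorbed = S0 * (1 - ALBEDO) / 4    # mean solar power absorbed at the ground

# Equilibrium conditions:
#   ground:     absorbed + EPS*SIGMA*Ta^4 = SIGMA*Ts^4
#   atmosphere: EPS*SIGMA*Ts^4 = 2*EPS*SIGMA*Ta^4  =>  Ta^4 = Ts^4 / 2
Ts = (absorbed / (SIGMA * (1 - EPS / 2))) ** 0.25
Ta = Ts / 2 ** 0.25
print(f"surface {Ts:.1f} K, atmosphere {Ta:.1f} K")   # about 288 K and 242 K
```

A perturbation (a change in EPS or S0) then shifts both equilibrium temperatures, which is the kind of test the quoted passage describes.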

Reply to  Russ R.
October 14, 2014 9:17 am

“… I thought we were supposed to be skeptical of model results…”
Yes, it would be foolish not to be skeptical of all models, because they all tend to be biased from unavoidable assumptions, approximations and economies of scale.
Having said that, I would like to point out that it is impossible to measure anything without using a model of some kind.
Temperature? Temperature is only an abstract concept defined by physicists. So thermometers can’t measure temperature directly, but must rely on a model based on a variety of proxies, like the thermal expansion of a column of mercury or the electrical output of a thermocouple, all subject to errors of implementation and interpretation. All are severely limited in scale. The thermometer in your weather station would be useless for measuring temperatures inside a furnace (and vice versa).
But many of these so-called ‘thermometers’ are often useful, and sometimes seem to provide acceptably consistent readings over a narrow range of values in selected locations. Assuming, of course, that they are free from manufacturing defects, improper installation or misreading of values.
Yes, be very skeptical of _all_ models.

charles
Reply to  Russ R.
October 14, 2014 6:21 pm

Models are essential to understanding the physical world we live in. The key is that to be useful the model results have to agree with experimental results and/or the physical world. So if the model correctly predicts observation then it is useful. If not, it is junk.

jpatrick
October 14, 2014 9:03 am

Okay. So, we are lowering the estimate for the rate of change of a dependent variable we can’t really measure. Yeah.

October 14, 2014 9:14 am

Letter submitted to Raleigh (NC) News Observer (Dec. 24, 2004) by S Fred Singer:

Prof Wm Schlesinger (12/23/04) concludes that if climate is currently warming at 0.08 degC per decade, then temperatures will “increase between 4 degrees F and 10 degrees F by the end of this century”.– with all sorts of dire consequences. This remarkable demonstration of arithmetic assumes also that all of the current increase is due to human influences rather than natural ones — and ignores the fact that greenhouse theory predicts a less-than-proportional temperature increase with increasing carbon dioxide. My considered estimate for 2100 is at most one degree F — based not on climate models but on the observational evidence.

Fred Singer wrote that 10 years ago. My money has always been on him being about right and the recent slew of papers on climate sensitivity all seem to be heading in one direction – down.

Reply to  ThinkingScientist
October 14, 2014 2:22 pm

And just for those (like me) who prefer SI to Imperial units, 1.0 deg F is approximately 0.6 deg C.
[Is that 0.6 C before or after adjusting for the TOBS adjustments and station movements between the equator, Paris and the north pole since the meter was defined back in the French Revolution? 8<) .mod]

John F. Hultquist
Reply to  ThinkingScientist
October 14, 2014 10:31 pm

I thought I’d check your math so I asked a search box to tell me what 1.0 deg F is when expressed as Celsius.
The answer provided was -17.22222.
Perhaps you meant what 1.0 F degree is expressed as Celsius.

milodonharlani
Reply to  ThinkingScientist
October 15, 2014 9:20 am

John, ThinkingScientist was talking not about the temperature 1.0 degree F, but what percent of a Celsius degree is a Fahrenheit degree. That is easily derived by noting that freezing in F is 32 degrees but 0 degrees C, & that boiling is 212 F but 100 C. Hence, there are 180 F degrees between these states of water, while only 100 C degrees. Thus, there are 1.8 F degrees per C degree.
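[Editor’s note: the distinction in code form, converting a temperature reading versus a temperature interval:]

```python
def f_temp_to_c(t_f):        # a temperature: shift the zero, then rescale
    return (t_f - 32.0) * 5.0 / 9.0

def f_interval_to_c(dt_f):   # a difference: only the rescaling applies
    return dt_f * 5.0 / 9.0

print(f_temp_to_c(1.0))      # -17.2... C, the search box's (literal) answer
print(f_interval_to_c(1.0))  #  0.555... C, what ThinkingScientist meant by ~0.6
```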

Alcheson
October 14, 2014 9:21 am

The 0.6C value is just about what the logical person would have guessed was the expected value. Given that ~1.1C is the non-feedback value, and in stable systems feedbacks are very highly likely to be net negative, an actual value less than 1.1C should have been expected. The surprising result would have been if it was greater than 1.1C and yet our climate system remains stable.
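[Editor’s note: Alcheson’s arithmetic follows from the standard feedback relation ΔT = ΔT0 / (1 − f); the f values below are illustrative assumptions, not measurements:]

```python
# No-feedback doubling response dT0 (~1.1 C), modified by net feedback f.
dT0 = 1.1
for f in (-0.8, -0.5, 0.0, 0.5, 0.63):
    print(f"net feedback f = {f:+.2f} -> sensitivity {dT0 / (1 - f):.2f} C")
# f ~ -0.8 reproduces Harde's ~0.6 C; f ~ +0.63 is needed to reach the IPCC
# mid-value of 3.0 C; any net-negative f keeps the result below 1.1 C.
```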

Rud Istvan
October 14, 2014 9:32 am

I went and read the entire paper. On the plus side, Table 5 inputs are well founded, and using them in the absence of feedbacks produces a sensitivity of either 1.11 or 1.2, depending on assumptions. Those are the canonical values, and they show that the GHG radiative transfer calculations by layer agree with other, much simpler calculations of the zero-feedback sensitivity to CO2 doubling.
But the lower feedback sensitivity results from two rather clear conceptual errors.
With respect to water vapor feedback, an incorrect saturation assumption is made that effectively negates any positive water vapor feedback. The error is twofold. First, even saturated layers still reradiate in all directions, so that some of all the absorbed wavelengths will reach the next higher layer, contrary to the explicit assumption made. Second, the number of saturated layers increases (the optical depth, the ‘top of the fog’, rises). That is the classical no-feedback mistaken physics.
And cloud feedback is taken to be strongly negative. The derivation is wrong, since it is not linear as assumed, and the ISCCP observational value is for Cc = 66%, not Cc = 0 as used in the paper, for example in equation 77.
Lewis and Curry are quite persuasive. As mathematically intense as this model is, it is not persuasive.
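[Editor’s note: Rud’s “top of the fog rises” point can be sketched with a schematic one-band model. The numbers below are made up (fixed lapse rate all the way up, no tropopause, arbitrary optical depths); the point is only the direction of the effect:]

```python
# Even when the column is opaque, more absorber lifts the escape altitude,
# and emission from colder air means less outgoing flux.
import math

SIGMA = 5.670e-8
TS, LAPSE, H = 288.0, 6.5e-3, 8000.0   # surface K, K per m, absorber scale height m

def emission_level(tau0):
    """Height where the overhead optical depth of tau(z) = tau0*exp(-z/H) is 1."""
    return H * math.log(tau0)

for tau0 in (2, 4, 8):                 # each step doubles the absorber amount
    z = emission_level(tau0)
    olr = SIGMA * (TS - LAPSE * z) ** 4   # emit from the local temperature there
    print(f"tau0 = {tau0}: escape level {z/1000:4.1f} km, OLR = {olr:6.1f} W/m^2")
# OLR keeps falling with each doubling even though the column is opaque:
# "saturation" of transmission at the surface does not cap the TOA effect.
```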

milodonharlani
Reply to  Rud Istvan
October 14, 2014 10:25 am

On a demonstrably homeostatic, watery planet, net feedbacks should be assumed negative rather than positive. If net positive feedbacks from water vapor, clouds, etc. are ruled out, then the lab result of ~1.1 degrees C for a doubling from 280 to 560 ppm at equilibrium ought to be considered the upper limit of potential warming.
I’m with Dr. Singer, quoted above, in estimating perhaps one degree F over GASTA in AD 1850 by AD 2100, assuming that earth reaches a concentration of 560 ppm by then. Much of that presumed warming must already have occurred transiently, given the logarithmic nature of the effect.
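[Editor’s note: the “much already occurred” remark can be checked with the logarithmic-forcing rule, using ~400 ppm as the approximate 2014 concentration:]

```python
# Share of the 280 -> 560 ppm doubling's forcing already applied at 400 ppm.
import math

frac = math.log(400 / 280) / math.log(560 / 280)
print(f"share of the doubling's forcing already applied: {frac:.0%}")  # ~51%
```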
IOW, not only nothing to worry about, but indeed a good thing.

Reply to  Rud Istvan
October 15, 2014 6:54 am

In addition, they don’t seem to have considered LW from cloud tops, which seems a significant omission.

Schrodinger's Cat
October 14, 2014 11:28 am

Rud Istvan – If you are up to date with these matters, what assumptions are made about the probability of collision of excited GH gases with atmospheric gas molecules before re-emission has had a chance to occur?
My question is (a) generally in GCMs and (b) in this paper.
Thanks in advance if you can answer.

Labmunkey
October 14, 2014 11:50 am

I was initially concerned about the use of models, and I admit to not being in a position to evaluate their calculations or validity.
However, the fact that they detail all the assumptions and parameters of the model and, further, highlight some of its limitations fills me with confidence.
Have any pro-cAGW papers done similar with their models?

October 14, 2014 12:04 pm

According to this, if you reduce the CO2 level to zero, average temps drop only 2 deg C. That seems far too little. What water partial pressure does he propose when CO2 is at zero? Are there no changes in ice caps and albedo?
Or is it that as the CO2 drops, water vapor drops causing a faster drop in albedo via fewer clouds?

whiten
Reply to  Stephen Rasey
October 15, 2014 11:56 am

Stephen Rasey.
You say:
According to this, if you reduce the CO2 level to zero, average temps drop only 2 deg C. That seems far too little.
————
Having become used to the IPCC’s climate-science big-numbers play, a drop of only 2 deg C seems really far too little indeed. But looking at the climate data, the average temperature drop from the Holocene Optimum (the warmest period of the Holocene) to the middle of the LIA (Little Ice Age) does not even reach a full 1C. Imagine any further 1C drop from that point on: it is not hard to picture the LIA becoming an Ice Age, provided that that drop of 2C lasts long enough.
A drop of around 2.2C to 2.4C would mean a significant cooling in the case of a long-term trend…..
cheers

milodonharlani
Reply to  whiten
October 15, 2014 12:08 pm

“Climate scientists” have tried to get rid of the Holocene Climatic Optimum as well as the Medieval Warm Period. The Optimum was at least a degree C warmer than now globally (probably more), & much more at high latitudes. The LIA was about a degree C colder than now, so the worldwide average range from HCO peak to LIA trough was at least two degrees C.

whiten
Reply to  whiten
October 15, 2014 5:02 pm

Thanks for the reply.
You could be right, yes.
2 deg C still remains a significant cooling even on your reading. Besides, I am a bit confused why you should estimate the cooling from the Holocene Optimum peak to the LIA trough by taking into consideration the warming of the present. Strange: the LIA happened prior, and there is data directly showing the correlation without the need of the present. Our present warming is not mature enough, in terms of data, to be ready to compare with long-term data. In the future, this warming of the present will look different in the long-term data than it does now. Sorry, but it seems like a bit of acrobatics with the data.
But anyway, that does not necessarily mean that you are wrong. As far as I can tell, one of the reasons for the getting rid of the Holocene data is that there is not enough cooling in the long-term cooling trend and there is a lot of warming in the long warming trend prior to the Holocene Optimum.
Your 2C cooling is a good enough one and very pleasing to the “climate scientists” you mention.
Anyway, it is one thing that I would not bet on either way at the moment. :-)
Thanks milodonharlani
cheers

October 14, 2014 12:08 pm

{bold emphasis mine – JW}
OPEN JOURNAL OF ATMOSPHERIC AND CLIMATE CHANGE
‘Advanced Two-Layer Climate Model for the Assessment of Global Warming by CO2’
Hermann Harde
7.0 Conclusion
[. . .]
From our investigations, which are based on actual spectroscopic data and which consider all relevant feedback processes as well as the solar influence, we can conclude, that a CO2 climate sensitivity larger 1 °C seems quite improbable, whereas a value of 0.5 – 0.7 °C – depending on the considered solar anomaly – fits well with all observations of a changing solar constant, the cloud cover and global temperature. A climate sensitivity in agreement with the IPCC specifications (1.5 – 4.5 °C) would only be possible, when any solar influence could completely be excluded, and only CO2 induced thermal cloud feedback would be assumed, then yielding a value of 1.7 °C.
[. . .]

– – – – – – – – – –
The paper makes a reasonable point: to get values within the IPCC-specified climate sensitivity range requires excluding any solar influence, coupled with using an implausibly myopic view of only one aspect of cloud feedback.
John

Reply to  John Whitman
October 15, 2014 9:31 am

Exactly. A fine summary statement, really nails the issue.

Max Totten
October 14, 2014 12:09 pm

Still trying to see if anyone has noted that the EPA reported that there had been no warming in the last 100 years at http://www.epa.gov/climatechange/indicators. This was quite an admission.

Reply to  Max Totten
October 15, 2014 4:51 am

Don’t see them saying what you say. Their page on US temps says +1.4F/century since 1901.

Richard M
October 14, 2014 2:21 pm

I assume this model continues to ignore all the other “stuff” in the atmosphere, which would also mean it is overestimating the warming. Dust, salt, pollen, smoke, bacteria, etc. all absorb radiation as well, which must reduce the amount available for GHGs.
