Forecasting the Earth’s Temperature
by David Whitehouse via Benny Peiser’s CCnet
The recent spate of scientific papers attempting to predict what the earth’s temperature might be in the coming decades, and to explain the current global temperature standstill, is very interesting, both for the methods used to analyse temperature variations and because it illustrates the limitations of our knowledge.
Recall that only one or two annual data points ago many scientists, as well as the most vocal ‘campaigners,’ dismissed the very idea that the world’s average annual temperature had not changed in the past decade. Today it is an observational fact that can no longer be ignored. We should also not forget that nobody anticipated it. Now, post facto, scientists are looking for an explanation, and in doing so we are seeing AGW in a new light.
The main conclusion, and perhaps it’s no surprise, to be drawn about what will happen to global temperatures is that nobody knows.
The other conclusion to be drawn is that, without exception, the papers assume a constantly increasing AGW in line with the increase of CO2. This means that any forecast will ultimately lead to rising temperatures, as AGW is forever upward while natural variations have their limits. But there is another way of looking at the data. Instead of assuming an increasing AGW, why not look for evidence of it in the actual data? In other words, let the data have primacy over the theory.
Lean and Rind try to isolate and analyse the various factors that affect decadal changes in the temperature record: El Nino, volcanic aerosols, solar irradiance and AGW. The formula that links these factors together into a time series is quite simple (indeed there is nothing complicated about any of the papers looking at future temperature trends), though the research paper itself does not contain enough information to follow their calculations through completely.
El Nino typically produces 0.2 deg C warming, volcanic aerosols 0.3 deg C cooling on short timescales, solar irradiance 0.1 deg C (I will come back to this figure in a subsequent post) and the IPCC estimate of AGW is 0.1 deg C per decade.
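The magnitudes above can be combined into a toy additive series to show how such a decomposition behaves. This is only a hedged sketch of the general approach, not L&R’s actual fit: the component shapes, phases and the eruption timing are invented for illustration.

```python
import numpy as np

years = np.arange(2000, 2020)
t = years - years[0]

# Invented component shapes, scaled to the magnitudes quoted above
enso     = 0.2 * np.sin(2 * np.pi * t / 4.0)    # El Nino: ~0.2 C swings
solar    = 0.1 * np.sin(2 * np.pi * t / 11.0)   # ~11-year solar cycle, ~0.1 C
volcanic = np.zeros_like(t, dtype=float)
volcanic[5:8] = [-0.3, -0.2, -0.1]              # a hypothetical eruption, decaying
agw      = 0.01 * t                             # IPCC-style 0.1 C per decade

temperature = enso + solar + volcanic + agw     # the simple additive model
print(temperature.round(2))
```

The point of the sketch: the 0.1 deg C per decade AGW term is smaller than a single ENSO swing, so the sum can easily stand still, or fall, for several years at a time even while the trend term rises.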
It should also be noted that natural forces are able to produce a 0.5 deg C increase, although over a longer period. The 0.5 deg C warming observed between say 1850 and 1940 is not due to AGW.
The temperature increase since 1980, approx 0.4 deg C, is in fact smaller than the rise seen between 1850 and 1940. It took place in less than two decades and was followed by the current standstill. A fact often overlooked is that this recent temperature increase was much greater than that attributable to the postulated AGW effect (0.1 deg C per decade); it must therefore have included natural increases of greater magnitude.
This is curious. If the recent temperature standstill, 2002-2008, is due to natural factors counteracting AGW, and AGW was only a minor component of the 1980-1998 temperature rise, then one could logically take the view that the increase was due to a conspiracy of natural factors forcing the temperature up, rather than natural factors keeping the temperature down post-2002. One cannot have one rule for 2002-2008 and another for 1980-1998!
Lean and Rind estimate that 73% of the temperature variability observed in recent decades is natural. However, looking at the observed range of natural variations, and their uncertainties, one could make a case that the AGW component, which has only possibly shown itself between 1980 and 1998, is not a required part of the dataset. Indeed, if one did not have in the back of one’s mind the rising CO2 concentration and the physics of the greenhouse effect, one could make a good case for reproducing the post-1980 temperature dataset with no AGW at all!
Natural variations dominate any supposed AGW component over timescales of 3-4 decades. If that is so, then how should we regard 18 years of warming and decades of standstill or cooling in an AGW context? At what point do we question the hypothesis of CO2-induced warming?
Lean and Rind (2009) look at the various factors known to cause variability in the earth’s temperature over decadal timescales. They come to the conclusion that between 2009-14 global temperatures will rise quickly by 0.15 deg C – faster than the 0.1 deg C per decade deduced as AGW by the IPCC. Then, in the period 2014-19, there will be only a 0.03 deg C increase. They believe this will be chiefly because of the effect of solar irradiance changes over the solar cycle. Lean and Rind see the 2014-19 period as being similar to the 2002-8 temperature standstill, which they say has been caused by a decline in solar irradiance counteracting AGW.
This should cause some of the more strident commentators to reflect. Many papers have been published dismissing the sun as a significant factor in recent warming. The gist of them is that solar effects dominated up to 1950 but have recently been swamped by AGW. Now, however, we see that the previously dismissed tiny solar effect is able to hold AGW in check for well over a decade – in fact forcing a temperature standstill of duration comparable to the recent warming spell.
At least the predictions from the various papers are testable. Lean and Rind (2009) predict rapid warming. Looking at the other forecasts for near-future temperature changes we have Smith et al (2007) predicting warming, and Keenlyside et al (2008) predicting cooling.
At this point I am reminded that James Hansen ‘raised the alarm’ about global warming in 1988 when he had less than a decade of noisy global warming data on which to base his concern. The amount of warming he observed between 1980 and 1988 was far smaller than known natural variations and far larger than the IPCC would go on to say was due to AGW during that period. So whatever the eventual outcome of the AGW debate, logically Hansen had no scientific case.
There are considerable uncertainties in our understanding of natural factors that affect the earth’s temperature record. Given the IPCC’s estimate of the strength of the postulated AGW warming, it is clear that those uncertainties are larger than the AGW effect that may have been observed.
References:
Lean and Rind (2009), Geophys. Res. Lett. 36, L15708
Smith et al. (2007), Science 317, 796–799
Keenlyside et al. (2008), Nature 453, 84–88
Our web site has a couple of sections which are of relevance. At..
http://www.climatedata.info/Forcing/Forcing/volcanoes.html
.. we show that the only cooling mechanism in GCMs is the effect of volcanoes.
And at…
http://www.climatedata.info/Forcing/Forcing/0scillations.html
.. we illustrate the combined effect of El Nino and volcanoes on short term (few year) temperature variations.
If models (*) are mostly uncovered or even undressed, it is because it’s hot; if they are not, it is going to be cold – but anyway, cool.
(*) fashion models not Hansen’s
And the Optimum Population Trust figures we can solve the whole problem through contraception: http://www.telegraph.co.uk/earth/environment/climatechange/6161742/Contraception-cheapest-way-to-combat-climate-change.html .
As usual.
Well, can we get some consensus here: are we talking about computed anomalies or actual temperatures? I for one don’t believe we can even measure the true earth (surface) mean temperature to anything like the accuracy implied by the published anomaly graphs.
GISStemp and other anomaly measures don’t have any real relationship to true global surface temperatures; the sampling density is simply orders of magnitude less than required for noise-free recovery of the complete surface temperature map, and hence for accurate computation of its average value – even its average value for a single instant, let alone averaged over a complete sun orbit. And even if we could measure it, the number we obtain is as useful as the average telephone number in the Manhattan phone directory; it tells us nothing about the global energy balance.
But I suspect that GISStemp is a reasonable representation of the tiny network of thermometers (owl boxes) that make up the (near) surface network.
It is one thing to apply some AlGorythm to an arbitrary set of data sensors, to compute some defined function of those numbers; but it is a completely different problem to connect that set to the actual planet we live on.
George
PS Yes I believe that it warms and cools; and no I don’t believe we have anything much to do with it.
1. Is there agreement that the ‘way we measure temperature’ is ‘consistent’, ‘reliable’ and ‘settled’?
2. If not is this argument pointless?
3. If so, when will people stop discussing it until point one IS settled?
4. Is that likely to be after Al Gore’s ‘cleantech boom-bust’ cycle has made him a billionaire and a bunch of silly investors bankrupt?
This is very similar to the forecast I have made for the next decade or so. I would, though, on the basis of recent drops in global temperatures being far stronger than any AGW warming component, expect to see even larger drops through this extended cold period, rather than merely reduced levels of warming. Murphy’s winter, here we come again!
http://climaterealists.com/forum/viewtopic.php?f=4&t=208
http://landscheidt.auditblogs.com/2008/06/03/the-sunspot-cycle-and-c24/
For my own assumptions, I like to use a probabilistic approach. The basic question asked is: is it more likely that temperatures are in a generally rising or generally falling trend?
Judging from past responses to things like PDO and ENSO, it seems reasonable that temperatures will generally decline for some period of time going forward, probably for something like 25-ish years beginning in 2006. So I expect temperature declines, in a general trend sense, until sometime around 2030.
Judging from the steady decline we have seen over the past three consecutive years in the continental US, there is empirical evidence that reinforces that conclusion. ( go here, in the Period pull-down, scroll all the way to the bottom, select “latest 12-month period”. In “First year to display”, select 1999, then click “submit” at the bottom of the form.)
There is a good chance we are in the same place in a naturally occurring cycle that we saw from 1950 to 1975 (plug those numbers into the above referenced form for first and last years to display to see the trend).
Now as for the duration and slope of the cooling trend, that takes a crystal ball. But overall, I would say there is enough evidence to state with some degree of confidence that it is more probable that the next several years will be cooler than this year. Nothing tells me how MUCH cooler, just that it will generally be cooler. So if I were a farmer, I would plan on strains of crop that do well with a shorter growing season.
If you go back to that form and click “precipitation” rather than “temperature” you will see a trend of increasing precipitation, in general, across the continental US over that same period. So cooler, wetter conditions are probably in store generally, though they can vary widely locally. Locations that saw prolonged drought might see a return of those conditions. Locations that saw flooding and increases in snowfall might also see a return of those conditions, though perhaps not as great, or perhaps to a greater extent than in that period.
Judging by the slope of the current trend, it appears that the current cooling trend might be greater than the 1950 to 1975 trend but there aren’t enough data yet to tell with any degree of real confidence.
If I were a farmer or a heating oil salesman or in the snow removal business or any business that is impacted by weather, I would look at my local trends between those years in the past and position myself generally to take greatest advantage to the extent possible of a likely repeat of those conditions.
As for making exact predictions from one year to the next, I will leave that up to the psychic down the street because even within that period of general down trend, it was not a monotonic decline, one year might still be warmer than the previous.
The interesting point is that these are presumably peer reviewed papers – and there seems to be a renewed questioning of the consensus view. Opening up the issues of how well the climate is understood is much more interesting than blindly questioning the precision of temperature measurement, which is just a distraction.
I think we can expect more work in this area – enough people are keen enough to have their name on a groundbreaking paper, and this could be their opportunity. There is not so much chance of a breakthrough in supporting the consensus.
The main problem with all models is that they have four independent variables (solar, GHGs, human and volcanic aerosols) and only one dependent: temperature (or two if you include precipitation).
That means that many combinations of the four forcings, each with its own climate sensitivity, will give you the “right” answer: the past temperature trend. That is a necessary, but far from sufficient, condition for a model to be of predictive value.
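A contrived sketch of that non-uniqueness (invented forcing series, not real data): when two forcings trend together over the fit period, very different sensitivity pairs reproduce exactly the same temperature series.

```python
import numpy as np

t = np.arange(50, dtype=float)
solar = 0.5 * t          # two invented forcings that trend together
ghg   = 1.0 * t + 2.0    # (i.e. they are collinear over the fit period)

# Two very different sensitivity pairs...
temp_a = 0.2 * solar + 0.0 * ghg
temp_b = 0.0 * solar + 0.1 * ghg - 0.2

# ...produce exactly the same temperature series:
print(np.allclose(temp_a, temp_b))  # True
```

Either “model” is a perfect hindcast of the same trend, but their forecasts under a changed forcing mix would diverge completely – which is the point being made.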
All current models have similar sensitivities for the same change in forcing. But that is far from certain: solar has its largest effect in the stratosphere (UV/ozone/jet stream position/cloud-rain patterns) and in the light/energy penetration of the ocean’s surface in the tropics, while GHGs have their largest effect spread over the latitudes and the lower troposphere. This was seen in a test of the HadCM3 model by Stott et al.:
http://climate.envsci.rutgers.edu/pdf/StottEtAl.pdf
The main conclusion (within the constraints of the model, such as a fixed influence of human aerosols): solar variation may be underestimated by a factor of 2 (at the cost of the influence of GHGs).
Further, the influence of human aerosols is quite uncertain (even its sign can be disputed!), but it has a huge influence on the (possible) sensitivity (including feedbacks) of GHGs. See:
http://www.ferdinand-engelbeen.be/klimaat/oxford.html
“Instead of assuming an increasing AGW why not look for evidence of it in the actual data. In other words let the data have primacy over the theory.”
Imagine that!
Data over theory.
Sadly, in scientific discipline after scientific discipline, theory trumps data.
The imagination and desires of men have a firm grip on Science.
No surprise there.
But when theory trumps data, one cannot claim the state of Science is “clearly healthy!” with a straight face and retain credibility.
But perhaps after this experience with the inscrutable Sun, men will be humble and more open-minded, and less attached to figments of imagination flickering on the cave wall.
“Lean and Rind (2009) look at the various factors known to cause variability in the earths temperature over decadal timescales. They come to the conclusion that between 2009-14 global temperatures will rise quickly by 0.15 deg C – faster than the 0.1 deg C per decade deduced as AGW by the IPCC. Then, in the period 2014-19, there will be only a 0.03 deg C increase. They believe this will be chiefly because of the effect of solar irradiance changes over the solar cycle. Lean and Rind see the 2014-19 period as being similar to the 2002-8 temperature standstill which they say has been caused by a decline in solar irradiance counteracting AGW”.
This publication is nothing more than another attempt to keep the CO2 AGW hoax alive by wrapping the lack of warming in a semi-scientific smokescreen.
“Solar irradiance counteracting AGW” – how smart a scientist do you have to be to come up with such a BS statement?
The answer to that question is easy.
These guys are collaborators serving a political agenda.
Think Copenhagen December 2009, nothing more nothing less.
You know what they say . . . never make predictions. Especially about the future.
By the way, per the UAH MSU files, the August 2009 anomaly was 0.18 °C cooler than July 2009.
This means that about one half of the abrupt warming between June and July was reversed between July and August.
I don’t know what the future will bring, but if I am allowed to gamble, I say we will continue to cool.
Rhys Jaggar has it right. Until we can get past his point #1, nothing else matters. How can we forecast the temperature when we can’t tell what the temperature is or has been?
This whole piece is a tiresome mixture of half-truths, untruths, misunderstandings and simple mistakes.
There is no “current global temperature standstill”. The trend in temperatures up to the present is not statistically different whether you start it in 1998, 2002, 2005, or whichever date you want to cherry-pick. Yet again, you allow yourself to be fooled by weather. Will this ever stop?
There is no “IPCC estimate of AGW”. The quantification of the expected temperature rise is far more complicated than the single number you wrongly attribute. In fact, in this and the next few decades, the expected warming from the rise in CO2 would be about 0.2°C, not 0.1°C.
“The 0.5 deg C warming observed between say 1850 and 1940 is not due to AGW” – and yet atmospheric CO2 concentrations started rising in about 1750. CO2 did not suddenly become a greenhouse gas at some point after 1940.
The GISS temperature record starts in 1880. By 1988, therefore, there was 108 years of instrumental global temperature data, not “less than a decade of noisy global warming data”.
There is another effect to add: the gravitational energy imparted to the ocean by moving mass inside the Earth. Currently modellers use the GRACE satellite data, I believe, to provide an estimate of ocean circulation. GOCE will provide higher resolution and, as I have read, may tighten the model uncertainties regarding deep ocean currents and motion by an order of magnitude. Finer detail may reveal more information concerning the Earth’s ocean oscillation patterns and heat transfer. This could help explain the temperature variations observed at the surface. One of the questions I have is: what happens if you don’t vary the sun’s output at all? Can long-term (50 years +) temperature variations be described by the dynamics of processes on the Earth alone? Ice ages seem to match orbital effects, for instance.
I’d just like to give UK contributors a heads-up on the UK Foreign Secretary David Miliband’s online blog.
Among other things it’s covering his days leading up to the Copenhagen Climate Summit in December.
In one blog he describes:-
At Paris’ Sciences Po University yesterday I said that if Europe successfully led the way to global climate deal, the EU would come to be recognised as an “Environmental Union”.
Readers comments are welcome, but are as yet very thin indeed.
http://blogs.fco.gov.uk/roller/miliband/
“Lean and Ride(sic! You mean Rind) try to isolate and analyse the various factors that affect decadal changes in the temperature record; El Nino, volcanic aerosols, solar irradiance and AGW.”
“There are considerable uncertainties in our understanding of natural factors that affect the earth’s temperature record.”
I agree that, at first instance, we should look for an explanation without AGW, the more that it is still possible that CO2 is in lagged equilibrium with temperature.
L&R try to account for the recent temperature standstill only through the effect of the given factors on the short term. It is clear that the result is not satisfactory. But there are also changes in the long run, in particular in the oceans. Warming (and cooling) of the ocean occurs on interdecadal and even centennial timescales. Does anybody know what the temperature in the oceans is at a depth of 1 or 2 km, or what the temperature is on the ocean floor? Possibly these temperatures will influence climate in some manner within twenty or more years; I don’t know.
It is quite simple to say that any global warming is caused by AGW. It is not simple at all to look for other factors which still have to be discovered and which work over the longer term.
It seems to me that the authors of this new paper (2009) hold an opinion opposite to that of their 2008 paper. Then they argued: “According to this analysis, solar forcing contributed negligible long-term warming in the past 25 years and 10% of the warming in the past 100 years.”
Now, they state: “But as a result of declining solar activity in the subsequent five years [from 2014 to 2019], average temperature in 2019 is only 0.03±0.01 C warmer than in 2014. This lack of overall warming is analogous to the period from 2002 to 2008 when decreasing solar irradiance also countered much of the anthropogenic warming.”
Have they been overtaken by events??
Murphy’s winter (excellent !) is a part of Murphy’s climate, and it has found a perfect target in the form of the assumptions of Global Warming.
Murphy was last seen heading up Mt. Shasta. 🙂
2 simple & serious problems with the decompositions that are leaned on so heavily:
1) untenable assumption of randomness that goes into their determination.
2) shared variance.
While it isn’t wrong for investigators to do what they can with what is available, statements must be appropriately qualified according to limitations & uncertainties — and to be blunt: we’re generally not seeing that, which suggests either:
a) fundamentally flawed reasoning (such as blind &/or unquestioning acceptance of untenable assumptions), – &/or –
b) deceit.
In summary: Arrogantly unqualified decompositions are more than suspect and will remain so until our understanding of natural climate variations increases by orders of magnitude. There is no way to weasel out of this.
The full paper is available on the NASA GISS website.
http://pubs.giss.nasa.gov/docs/2009/2009_Lean_Rind.pdf is the direct link to the pdf.
The abstract and the link to the pdf are on:
http://pubs.giss.nasa.gov/abstracts/2009/Lean_Rind.html
Excellent article! This is one of many interesting passages:
Those papers are flat wrong. I don’t really care how peer reviewed they are. I’m not judging their worth. Planet Earth is showing us they are wrong. And the planet, unlike people, doesn’t lie. What you see is what you get.
Anyone looking at what has happened over the past decade must admit that any warming due to AGW has been hugely overstated.
When temperatures were temporarily rising concurrently and coincidentally with the rise in CO2, a case could be made that CO2 was a major cause of the warming.
But for most of the past decade temperatures have been flat to falling, while CO2 continues to rise. CO2=AGW is a canard, propped up by endless financial grants, without which it would quickly fade from the public’s consciousness.
It is apparent that CO2 has such an insignificant effect on temperature that it can be completely disregarded for all practical purposes. In fact, the evidence proves that more CO2 is beneficial.
It is hard for people to change their minds, after being told for years that CO2 causes global warming. But given the failure of the climate to warm for most of the last decade as CO2 rises, the conjecture that CO2 causes global warming is unsustainable.
Only one CO2 molecule out of 34 is emitted by human activity. The rest are natural emissions. The AGW conjecture is built on sand. It is preposterous to believe, as the alarmists tell us, that a few percent change in a very minor trace gas will cause the climate to go into runaway global warming. In the geologic past CO2 has ramped up to thousands of ppmv, for hundreds of millions of years at a time. The result? The Earth was teeming with life.
Now that it is clear that CO2 has such an insignificant effect [or possibly no empirically measurable effect; we don’t know for certain], it is time to cut off the enormous flow of wasted taxpayer dollars being funneled into climate studies, and re-direct it to other scientific endeavors that will produce actual benefits.
Sometimes the TV forecasters modify the small on-screen label showing the “current” temperature so that their forecasts don’t fall short. For example, a local TV forecaster predicted a T max of 35 °C yesterday. We could see the T on that channel’s label increase until it reached 35 °C. However, the T max reported by the airport and our own thermometers was 27 °C in rural locations and 31 °C in urban heat islands. Do you see the trick? People will never check Biocab’s thermometers for the temperature, only the small label on their TV screens. 🙂
Carbon offsets were trading at $7 and have fallen to 25 cents. Must be a lot of hand wringing on this.
“How can we forecast the temperature when we can’t tell what the temperature is or has been?
Is anyone aware of any studies comparing hourly temperature profiles:
(a) seasonally (I found one for Athens), and
(b) sensor base (concrete pad vs grass)
George E Smith.
I’m not so sure your complaint about the sampling density is on track.
http://www.met.tamu.edu/class/atmo632/(77)ShenKN94.pdf
In short we may get a better characterization of the global temp by focusing on site quality rather than site quantity. hence the importance of Anthony’s work.
Ron de Haan,
“I don’t know what the future will bring but if I am allowed to gamble, I say we will continue to cool.”
Of course you are allowed to gamble. I have a standing bet with a local “sceptic” which is based precisely on the premise that the world has been cooling since last year. The conditions of the bet are these:
1. This annual bet concerns the movement in measurements of the troposphere temperature, cumulatively averaged for each calendar year from 2009 onwards compared to a baseline of the average for calendar year 2008.
2. The bet is payable each year by the loser to winner when the last month’s data from a calendar year first becomes available and the cumulative multiyear average can be compared to the baseline.
3. In the event of a tie (!) the bet is not paid for that year.
4. The bet for each year will be £100 multiplied by the square root of the number of years of data i.e. £100 for 2009, £141 for the average of 2009 and 2010, etc.
5. The dataset used will be the latest version of the lower tropospheric global time series released with the data for the last month of the year as published by the University of Alabama, Huntsville, known as UAH TLT and currently in file tltglhman_5.2.
6. Either party can withdraw from the bet at any time without any financial penalty.
I’m willing to bet against you on these terms that we are cooling. Are you on (in a currency multiple of your choice)?
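For reference, condition 4’s £100 × √(years) schedule works out as below (rounding to the nearest pound is my assumption; the bet text itself only quotes £100 and £141):

```python
import math

# Stake for cumulative averages over 1..5 years of data
stakes = [round(100 * math.sqrt(n)) for n in range(1, 6)]
print(stakes)  # [100, 141, 173, 200, 224]
```

So the stake grows with the amount of data averaged, which is sensible: the more years in the average, the less weather noise and the more the bet turns on the actual trend.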
Or, from Mr. Potatoe,
“The future will be better tomorrow”.
I’ll go you one better. There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.
Perhaps Anthony can explain the process whereby the average TV meteorologist gets the data to make the forecast. The way it was explained to me, it all comes from model output generated somewhere, but not by the meteorologist locally. Is this the norm?
My sense is that much of today’s climate forecasting is more akin to fortune-telling than science. Here an astrometeorologist, Theodore White, shares his climate forecast:
http://forums.accuweather.com/index.php?showtopic=13603
Some of his forecast may turn out to be right and some will almost certainly turn out to be wrong, but it all comes across to me as guesswork.
I am sensing that there is somebody in your family who has been sick, someone who is older and you are hoping that they will get better. This person will get through what ails them, but hold them close, because soon their time will come. I am also sensing that you have some money woes. Be frugal and be wise, but don’t fret, because I am seeing that you will soon come into some unexpected money. I am also sensing something with the climate. Yes, it feel like it is going to change. It will get colder and then it will get warmer, and there will be some stormy weather…
“”” steven mosher (13:16:50) :
George E Smith.
I’m not so sure your complaint about the sampling density is on track.
http://www.met.tamu.edu/class/atmo632/(77)ShenKN94.pdf
In short we may get a better characterization of the global temp by focusing on site quality rather than site quantity. hence the importance of Anthony’s work. “””
I didn’t see a single lecture there about the Nyquist Sampling Theorem, or sampled-data systems theory either.
Modern communications systems simply would not work if the Nyquist sampling theorem were not valid, and neither will climate sampled-data gathering systems. The other thing Nyquist teaches us is that we need only undersample by a factor of two to reach the point where even the function average is itself corrupted by aliasing noise. And in fact we know that even the daily max/min temperature samples taken by the global station network thermometers don’t meet the Nyquist condition for temporal sampling; so even at a single measurement site the data are not sufficient to obtain noise-free information for that site, and we need not even consider the orders-of-magnitude spatial undersampling over the whole globe.
The aliasing noise injected by failure to observe proper sampling protocol is in-band noise, so it is permanently unremovable without throwing away good signal as well. No amount of mathematical regression, running averaging, or any other statistical trickery can correct for aliasing noise.
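The aliasing effect described here can be illustrated with a toy example (an invented sinusoid, not real station data): a signal at 0.9 cycles per sample, sampled once per interval, is indistinguishable after sampling from one at 0.1 cycles per sample, and no post-processing can tell them apart.

```python
import numpy as np

n = np.arange(200)                      # sample times, one sample per unit interval
true_freq = 0.9                         # cycles per sample: well above Nyquist (0.5)
x = np.sin(2 * np.pi * true_freq * n)   # the undersampled signal

spectrum = np.abs(np.fft.rfft(x))
apparent_freq = np.argmax(spectrum) / len(n)
print(apparent_freq)  # 0.1 -- the alias, not the true 0.9
```

The spectrum peak lands at the alias frequency |0.9 − 1| = 0.1; once sampled, the information distinguishing the two frequencies is gone for good, which is the sense in which aliasing noise is unremovable.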
Why don’t climate “scientists” assert that they are not subject to the law of gravity either? They might as well, if they cavalierly dismiss the Nyquist sampling criterion.
George
(in reference to last decades lack of rise) ‘We should also not forget that nobody anticipated it’
However, in 1983 James Hansen predicted it was possible. He predicted a general warming trend, and noted that it could take as long as 20 years for CO2 warming to overcome natural variation.
So Hansen knew enough to know that the last 10 years of no warming were possible, but no climate scientist knew enough to predict the timing of such events. Now climate science is learning more and making more effort to predict such medium-term variations (i.e. the 5 to 20 year range). Being a relatively new effort, there will probably be plenty of mistakes early on.
One thing I am always hearing from global warming alarmists when you point out to them our inability to predict short term climate trends is that long term climate trends are easier to predict. How can there be a scientific (as in empirically derived) basis for that statement?
“There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.”
Oh, god, not this again… it’s so depressing to see so many people fall for this tragic nonsense. What does “temperature” mean to you? Only if you think the answer is “nothing, ever” is your previous statement logically possible.
RW whined:
“The GISS temperature record starts in 1880. By 1988, therefore, there was 108 years of instrumental global temperature data, not “less than a decade of noisy global warming data”.”
Yup, and they are still ADJUSTING it to get it right for their theory!!
HAHAHAHAHAHAHAHAHAHAHAHA
Your disingenuous propaganda really stands out here!!
“long term climate trends are easier to predict. How can there be a scientific (as in empirically derived) basis for that statement?”
An analogy would be predicting the average temperature four months ahead vs. four days ahead. We know that seasonal temperature trends (climate) will override noise (weather) in the long run, but they won’t in the short run.
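A toy version of this analogy makes the point concrete. The numbers are invented: a 10 °C seasonal cycle plus roughly ±3 °C of day-to-day weather scatter. Four days ahead the seasonal signal is buried in the noise; four months ahead it dominates.

```python
import numpy as np

days = np.arange(365)
season = 10 * np.sin(2 * np.pi * days / 365)   # slow, predictable "climate" signal
noise_sd = 3.0                                 # assumed day-to-day "weather" scatter

change_4_days   = abs(season[4]   - season[0])   # ~0.7 C: smaller than the noise
change_4_months = abs(season[120] - season[0])   # ~8.8 C: far larger than the noise
print(change_4_days < noise_sd < change_4_months)  # True
```

So a four-day forecast is dominated by the noise term, while a four-month forecast is dominated by the deterministic seasonal term, which is why the longer horizon is in this narrow sense easier.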
Number of Days with Temperatures Above 32F at Milwaukee- any Trend?
By Rusty Kapela, WCM, NWS Milwaukee
http://www.crh.noaa.gov/news/display_cmsstory.php?wfo=mkx&storyid=31040&source=0
http://icecap.us/images/uploads/DESMOINESRECORDS.jpg
See full story and graphs at http://www.icecap.us (second column)
An interesting read, but this posting presumes that the data set showing the warming is reliable. What I see in it is simply that the data set is not reliable and the processing done on it is worse. For example, there is a large bias toward warming of large chunks of open area via a simple “airport on an island” effect. Perhaps all that has happened in the last decade is that we have built fewer new airports and are flying less.
http://chiefio.wordpress.com/2009/09/08/gistemp-islands-in-the-sun/
There’s a small problem.
The article says : “Lean and Rind (2009) look at the various factors known to cause variability in the earths temperature over decadal timescales. They come to the conclusion that between 2009-14 global temperatures will rise quickly by 0.15 deg C – faster than the 0.1 deg C per decade deduced as AGW by the IPCC. Then, in the period 2014-19, there will be only a 0.03 deg C increase. They believe this will be chiefly because of the effect of solar irradiance changes over the solar cycle.”
(my emphasis)
The L&R paper says [10] “We assume that future irradiance cycles replicate cycle 23, with cycle 24 commencing at the beginning of 2009“.
Would you not say that this assumption has already been shown to be false?
Tom P – good luck with your bet, but I bet (figuratively) that you won’t collect more than once, because the other party can pull out at any time.
George E. Smith (09:37:42) : But I suspect that GISStemp is a reasonable representation of the tiny network of thermometers (owl boxes) that make up the (near) surface network.
It isn’t.
It uses too many airports for UHI adjustment (treating them as rural). It uses places very unlike an urban area and very far away as proxies for it (they are not). It divides the world into 6 latitude bands (and hides the major movement of thermometers south in those wide bands). And much much more…
http://chiefio.wordpress.com/2009/08/23/gistemp-fixes-uhi-using-airports-as-rural/
http://chiefio.wordpress.com/2009/08/30/gistemp-a-slice-of-pisa/
http://chiefio.wordpress.com/2009/08/17/thermometer-years-by-latitude-warm-globe/
http://chiefio.wordpress.com/gistemp/
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
Ron de Haan
It must feel a little demeaning to be scrabbling around with US city data rather than making a global inference.
As you’re talking but not taking, can I infer you’re not willing to put your money where your mouth is?
RW (11:57:26) : The GISS temperature record starts in 1880. By 1988, therefore, there was 108 years of instrumental global temperature data, not “less than a decade of noisy global warming data”.
Uh huh… And exactly how many thermometers were around in 1880 and how evenly distributed over the globe? Look again at that phrase “global temperature data”. Guess what, we STILL don’t have global temperature data from any land sources. We have guesses, extrapolation, fudged holes, etc. For a look at percentage of thermometers by latitude by year:
http://chiefio.wordpress.com/2009/08/17/thermometer-years-by-latitude-warm-globe/
I make that 7.3% of them in the Southern Hemisphere in 1879 as we enter the start of the GIStemp history. BTW, there were all of 27 in the Southern Warm latitudes. ALL of it. NONE in the Southern Cold band.
Not exactly what I’d call “Global”…
Further to Nick Stokes’s point, AGW was raised significantly earlier than 1988.
For instance, google the Jason report, the Nierenberg report, and the Charney report. By the late 70s and early 80s, enough climate modelling and research into CO2 had been done that the American government was commissioning committee reports to attempt to summarise the science, a kind of mini IPCC report on a national rather than global scale. These reports were certainly not estimating CO2 warming on the basis of past temperature history, as the temperature record for the previous 30 years was at that stage believed to show cooling (the 70s ice age scare).
steven mosher (13:16:50) :
George E Smith.
I’m not so sure your complaint about the sampling density is on track.
http://www.met.tamu.edu/class/atmo632/(77)ShenKN94.pdf
In short we may get a better characterization of the global temp by focusing on site quality rather than site quantity. hence the importance of Anthony’s work.
I think these are orthogonal points.
If you want a true, real, and accurate GLOBAL average, I think George is correct.
But if all you really need (and want) is a valid sample for determining the trend of the temperature series, then a small and high quality sample ought to answer the question “Is there a general warming going on?” (Note, not “Global” but “general” – widespread and of interest, but not spanning a well sampled total global surface.)
So I think you are both right, but in specific and slightly orthogonal domains. (Having staked out the middle ground, all parties can now commence shooting at me in the crossfire 8-}
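The “small but high-quality sample for trend detection” idea can be sketched numerically (my own toy model, with invented numbers): if every station shares one underlying trend, a handful of stations recovers that trend nearly as well as the whole network, even though it says nothing reliable about the absolute global level.

```python
import random

# Hypothetical numbers: 500 stations sharing one warming trend, each with a
# large fixed offset (site climate) and yearly weather noise.
random.seed(0)

TREND = 0.02       # shared trend per year (assumed for illustration)
YEARS = 50
N_STATIONS = 500

def station_series(offset):
    """Annual readings: shared trend + station offset + weather noise."""
    return [TREND * t + offset + random.gauss(0, 0.3) for t in range(YEARS)]

stations = [station_series(random.gauss(0, 5.0)) for _ in range(N_STATIONS)]

def mean_trend(series_list):
    """OLS slope of the average of the given station series."""
    ts = list(range(YEARS))
    avg = [sum(s[t] for s in series_list) / len(series_list) for t in ts]
    mt = sum(ts) / YEARS
    ma = sum(avg) / YEARS
    return (sum((t - mt) * (a - ma) for t, a in zip(ts, avg))
            / sum((t - mt) ** 2 for t in ts))

print(mean_trend(stations))       # full network
print(mean_trend(stations[:10]))  # small sample: a very similar slope
```

The station offsets (standard deviation 5) dwarf the trend, which is the point: the sample is useless for a “true global average” but adequate for “is there a general warming?”.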
RW (15:07:00) :
“There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.”
Oh, god, not this again… it’s so depressing to see so many people fall for this tragic nonsense. What does “temperature” mean to you? Only if you think the answer is “nothing, ever” is your previous statement logically possible.
Wrongo!
All it takes is that you have an intrinsic property. An average of an intrinsic property is meaningless. If you don’t know what an intrinsic property is, or means, please take the time to learn. It really does matter. A Lot.
It is very important to keep it clear in your mind that temperature is an intensive variable. If you glaze over that in the smallest degree and skip past it without an in-depth grasp of it, you will continue to waste time and space on a pointless pursuit of the impossible. Most folks, it seems, do exactly that (given how much bandwidth is wasted on the issue to no avail…)
These folks have a nice short description a couple of paragraphs down:
http://www.tpub.com/content/doe/h1012v1/css/h1012v1_30.htm
What that means, “intensive”, is that one instance of the property for one entity means nothing to another instance of another entity. It is not dependent on the mass of the object, for example.
The taste of MY meal means nothing to the taste of YOUR meal, and averaging them together can be done BUT MEANS NOTHING. Were you averaging in one drop of Tabasco sauce from my meal, or one ounce? It matters to the average, but is not known… What is an average of pepper and cloves?
Another, more pertinent, example might be taking two different pots of water and averaging their temperature. You get two numbers, but know nothing about the THERMAL ENERGY in the two pots. The temperatures become an average, but the average means nothing. It certainly is not representative of the average thermal energy.
Take the two pots of water and mix them: the resultant temperature is NOT the same as the average of the two temperatures. You must know the mass of water in each pot to get that result. And we did not measure the mass. Thermal energy is an extensive property, and its average does have meaning.
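The two-pots example works out numerically like this (temperatures and masses invented for illustration): the mixture temperature is the mass-weighted mean, which coincides with the naive average of the two readings only when the masses happen to be equal.

```python
# Two pots of water at different temperatures. Same specific heat throughout,
# so it cancels out of the energy balance.

def mix_temperature(m1, t1, m2, t2):
    """Final temperature when the two pots of water are mixed."""
    return (m1 * t1 + m2 * t2) / (m1 + m2)

t1, t2 = 20.0, 80.0            # deg C
naive_average = (t1 + t2) / 2  # 50.0, regardless of how much water is in each

print(mix_temperature(1.0, t1, 1.0, t2))  # equal masses: 50.0, matches naive
print(mix_temperature(9.0, t1, 1.0, t2))  # mostly cold water: 26.0, does not
```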
The same thing happens on a colossal scale globally. We measure the temperature over a snow field, and ignore the massive heat needed to melt the snow with no change of temperature and ignore the mass of the snow. We measure the temperature of the surface of the ocean and ignore the shallow and great depths. We measure the surface temperature of a forest, and ignore the TONS of water per acre being evaporated by transpiration.
Then we average those temperature readings together and expect them to tell us something about the heat and energy balance of the planet.
That is lunacy in terms of physics and mathematics.
So take all the above, and firmly fix in your mind the truth:
An average of a bunch of thermometer readings MEANS NOTHING.
Got it?
Mike Jonas,
“Tom P – good luck with your bet, but I bet (figuratively) that you won’t collect more than once, because the other party can pull out at any time.”
Even on these forgiving terms I haven’t had a single taker yet. “All talk and no trousers” might be applicable to any reader on this site who claims the world is cooling.
All it takes is that you have an intrinsic property. An average of an intrinsic property is meaningless. If you don’t know what an intrinsic property is, or means, please take the time to learn. It really does matter. A Lot.
That ought to have said:
All it takes is that you have an intensive property. An average of an intensive property is meaningless. If you don’t know what an intensive property is, or means, please take the time to learn. It really does matter. A Lot.
In keeping with the rest of the posting…
Nick Stokes (16:06:32) :
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
So it has always been based on a fantasy about gasses and models and projections, oh my! Ok, got it…
It’s so much easier to understand when you leave the data out and accept that it has always been confirmation bias and self delusion.
Mike Jonas (16:04:52) :
Game theory says that the first winner will pull out, not the first loser.
Anyway, hubris and nemesis are waiting in the wings to teach a lesson to anyone who bets on the weather. Let us know how it works out, Tom.
Smokey,
Look carefully at the bet – the stakes increase as the square root of time to reflect the transition of weather to climate. Of course initial variability might cause early losses against the trend, but the “big” money is there for those who think they can predict the long-term movement of temperatures.
Are you on, or trouserless?
Tom P, in thinking about your bet on the weather, it seems you threw in a ringer: click. There is a natural global warming trend line going back to the LIA [and the last great Ice Age before that].
Natural global warming is entirely within the parameters of natural climate variability, so to be fair to your betting partner, you need to handicap the trend line.
Keep in mind that skeptics aren’t saying there is no global warming; that’s only how the alarmist crowd tries to frame the argument. Skeptics are skeptical of the AGW claims, and in particular, of the speculative notion that carbon dioxide is the culprit in natural climate change.
CO2 may cause slight warming. But then again, it might not. The verdict isn’t in. But what is becoming very apparent is the fact that CO2 is such a weak player that it is overwhelmed by other factors, and it can be disregarded due to its minuscule effect — if, in fact, there is any effect at all.
[“Are you on, or trouserless?” Hey, Tom, I don’t want to embarrass you!]
Smokey,
The plot that you link to is scientifically embarrassing (and hence unpublished) and has no correlation to the published historical temperature profile:
http://img98.imageshack.us/img98/1994/glaciervsinstrumental.png
But your position is at least clear – the world is warming. Despite your protestations there are plenty of contributors to this site who believe otherwise. That belief, though, doesn’t seem to be strong enough for any of them to bet against me – they, at least, are indeed without trousers.
Looking at that plot above – it’s very difficult to conclude that all we’re seeing is natural variability. I suggest you contemplate it for a while.
Both Nick Stokes and RW have raised exceptions to this quotation….
“At this point I am reminded that James Hansen ‘raised the alarm’ about global warming in 1988 when he had less than a decade of noisy global warming data on which to base his concern. The amount of warming he observed between 1980 and 1988 was far smaller than known natural variations”
Irrespective of how much data he had, or what he may have based his hypothesis on, here is Hansen himself…
In Hansen and Lebedeff, Geophys. Res. Lett., Vol. 15, No. 4, pp. 323–326, they conclude:
“… the 1987 global temperature relative to the 1951-1980 climatology is a warming of between 2 and 3 standard deviations. If a warming of 3 standard deviations is reached it will represent a trend significant at the 99% confidence level. However, a causal connection of the warming with the greenhouse effect requires examination of the expected climate system response to a slowly evolving climate forcing, a subject beyond the scope of this paper.”
Six months later, in his testimony before congress Hansen stated
“… the global warming is now sufficiently large that we can ascribe with a high (99%) degree of confidence a cause and effect relationship to the greenhouse effect.”
This sounds not only like he is depending on observations for substantiation, it also suggests that he knows the true distribution of earth average temperature, which seems to me doubtful.
Kevin Kilty (17:59:07), nice deconstruction.
Tom P (17:49:45),
Hey there, Tom, that’s a swell new Hockey Stick you linked to. It’s almost as good as Michael Mann’s! [Note that Mann’s Hokey Stick was peer reviewed, too.]
And since I don’t want to embarrass you ‘scientifically’ [or any other way; my U.S. shoe size is 14, equivalent to EU size 48½], I’ll give you another chart to contemplate: click [a Bill Illis chart]. See that trend line? If you want to keep your thumb on the scale to win your bet, don’t tell your pal about it.
Smokey,
Good for you to put up another plot unfit for publication. What is your objection to the Oerlemans’ temperature profile, apart from the shape?
I must say I’m a little disappointed. I visited here to bet there is global warming, and I haven’t had a single person willing to bet otherwise, despite their initial statements to the contrary.
[snip ~ modify your tone or post elsewhere ~ ctm]
Nick Stokes (16:06:32) :
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
———————–
I think this ” common fallacy ” is an imaginative creation of yours. Every reader on this site knows of Arrhenius and numerical modeling and we know that your camp offers an endless stream of spurious data as proof of the validity of the hypothesis, not as the premise.
Echoing the exasperation felt by RW ( a poster who ostensibly shares your point of view); “Oh, god, not this again… it’s so depressing to see so many people fall for this tragic nonsense.”
Tom P
I’ll take your bet, but with the caveat that the first year measured against 2008 is 2010 not 2009 to prevent you from having a head start and you begin the cumulative multipliers from 2010 going forward.
ctm,
I’m sorry you felt an edit is necessary. But the fact remains that while there have been plenty of assertions on this site that we are experiencing global cooling, nobody is willing to bet against me on this.
Reply: I said I will take your bet. In fact. I propose you double it. Use 200 pounds as your base. ~ ctm
Ah. Now we’ll see who’s without trousers.
[ctm, you should also factor in the natural warming trend from the LIA. In a wager like this, the deck shouldn’t be stacked.]
Tom: sorry that none of my links are to your satisfaction. OTOH, I personally think your cite is every bit as accurate as Michael Mann’s peer reviewed hokey stick, if that makes you feel any better.
Now, if you want to know if global warming causes more sex [and who doesn’t want the answer to that question]: click. Crank up the volume!
Reply: I believe in plain bets without wiggle rooms, ie asteroids, volcanoes, LIA etc. Will it warm or cool? All that matters is the base year, the start year, and the choice of index. ~ ctm
Charles,
Excellent – you’re on! We run the cumulative annual totals starting from 2010 against the 2008 average. What is your preferred currency multiplier?
Glad to see trousers being worn!
I like dollars as worthless as they are likely to become given our current deficit spending. Although we are sort of holding our own against the pound.
I’ll contact you via email.
Charles,
Just to state the conditions in the public domain. Of course, if your two predictions are correct (the Earth cools and the dollar plummets), you are on to a very nice earner:
1. This annual bet concerns the movement in measurements of the troposphere temperature, cumulatively averaged for each calendar year from 2010 onwards compared to a baseline of the average for calendar year 2008.
2. The bet is payable each year by the loser to winner in sterling equivalent when the last month’s data from a calendar year first becomes available and the cumulative multiyear average can be compared to the baseline.
3. In the event of a tie (!) the bet is not paid for that year.
4. The bet for each year will be £200 multiplied by the square root of the number of years of data i.e. £200 for 2010, £282 for the average of 2010 and 2011, etc.
5. The dataset used will be the latest version of the lower tropospheric global time series released with the data for the last month of the year as published by the University of Alabama, Huntsville, known as UAH TLT and currently in file tltglhman_5.2.
6. Either party can withdraw from the bet at any time without any financial penalty.
Reply: Agreed. ~ charles the moderator
Tom P has responded to my email. We are in contact. The bet is real. We’ll see at the first milestone in 15 months or so.
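For what it’s worth, clause 4’s square-root schedule can be tabulated (a small sketch; truncating to whole pounds reproduces the £282 quoted in the terms, since £200 × √2 ≈ £282.84):

```python
import math

# Year-n stake: £200 × sqrt(n), where n is the number of years of data in
# the cumulative average, truncated to whole pounds.
def stake(base, n_years):
    return math.floor(base * math.sqrt(n_years))

for n in range(1, 6):
    print(n, stake(200, n))  # 200, 282, 346, 400, 447
```

The square-root growth is the “weather becomes climate” handicap: early years are cheap noise, later years carry the real money.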
Carbon Dioxide IS responsible!
There is now no doubt at all. Allow me to explain precisely why. While we cannot reasonably test the CO2 forcing of climate changes yet to come, we can clearly see in the paleoclimate record the absolute effects of CO2 in the most recent abrupt climate change events, the Dansgaard-Oeschger oscillations between the Holocene and the Eemian. Now, you will need to invoke the overriding Theory of Inverse Reality to fully appreciate what you are about to read. And if you want to see the actual data, you will have to plunk down $32 to Science Direct, which will necessarily require you to put your money where your climate change mouth is, something I predict few will do. The following taste is from:
Physics Letters A 366 (2007) 184–189
Classification of Dansgaard–Oeschger climatic cycles by the application of similitude signal processing
Jordi Solé, Antonio Turiel, Josep Enric Llebot
“There are different works that relate the CO2 air concentration with temperature changes, supposing that CO2 may [12] or may not drive this temperature increase [20]. In this work ice-core CO2 time evolution in the period going from 20 to 60 kyr BP [15] has been qualitatively compared to our temperature cycles, according to the class they belong to. It can be observed in Fig. 6 that class A cycles are completely unrelated to changes in CO2 concentration. We have observed some correlation between B and C cycles and CO2 concentration, but of the opposite sign to the one expected: maxima in atmospheric CO2 concentration tend to correspond to the middle part or the end the cooling period. The role of CO2 in the oscillation phenomena seems to be more related to extend the duration of the cooling phase than to trigger warming. This could explain why cycles no coincident in time with maxima of CO2 (A cycles) rapidly decay back to the cold state.

“Using our technique, we have been able to put into correspondence and to compare cycles happening at different locations in the time series. We have so being able to identify three different types of cycles, all of them sharing the first warming phase but differing in the speed at which they relax back to the cold state. Some striking consequence appearing from the mere classification is that Younger/Dryas–Bolling/Allerod (Y/D–B/A) cycle cannot be considered a unique cycle any longer, as it is just a class B cycle similar to the other six we have identified. Due to the importance given in recent scientific literature to Younger–Dryas because of its influence in global climatology [11], we have tried to identify the causes justifying the apparition and type of the observed oscillations. One key point is to explain the observed different cooling phases, what we have done by crossing our evidences with independent data on CO2 atmospheric concentration and testing the results with theoretical reasoning about astronomical cycles. Nor CO2 concentration either the astronomical cycle change the way in which the warming phase takes place. The coincidence in this phase is strong among all the characterised cycles; also, we have been able to recognise the presence of a similar warming phase in the early stages of the transition from glacial to interglacial age. Our analysis of the warming phase seems to indicate a universal triggering mechanism, what has been related with the possible existence of stochastic resonance [1,13,21]. It has also been argued that a possible cause for the repetitive sequence of D/O events could be found in the change in the thermohaline Atlantic circulation [2,8,22,25]. However, a cause for this regular arrangement of cycles, together with a justification on the abruptness of the warming phase, is still absent in the scientific literature.”
You see, it’s really just that simple! Just reverse the polarity of your brain and read backwards, and almost instantaneously you will have an epiphany; almost as quickly, the Theory of Inverse Reality will take over and not only allow you to think in reverse, but mandate that you celebrate your genetic heritage and focus on just a single variable, much like we focused on rocks (or stones) for 2.8 million years (since H. habilis, also known as the Stone Age). It is, after all, what we are genetically best at. Do not deny your heritage! Fight for your right to a continued single-variable future!
Or at least until the next eccentricity maxima…..
“An examination of the fossil record indicates that the key junctures in hominin evolution reported nowadays at 2.6, 1.8 and 1 Ma coincide with 400 kyr eccentricity maxima, which suggests that periods with enhanced speciation and extinction events coincided with periods of maximum climate variability on high moisture levels,”
state Trauth et al. in Quaternary Science Reviews 28 (2009) 399–411.
Smokey,
What are your specific criticisms of the Oerlemans dataset? Saying you don’t like it because it reminds you of something else is really rather lame.
Tom P (19:09:49) :
Smokey,
Good for you to put up another plot unfit for publication. What is your objection to the Oerlemans’ temperature profile, apart from the shape?
I must say I’m a little disappointed. I visited here to bet there is global warming, and I haven’t had a single person willing to bet otherwise, despite their initial statements to the contrary.
[snip ~ modify your tone or post elsewhere ~ ctm]
I don’t like to bet and I won’t bet; I will only say that I have said many times that there is a natural succession of warmhouses and icehouses which has occurred on this planet since its origin. I would bet that the current cooling is a very brief episode which will be followed by a prolonged warming period. I insist that it is not a manmade problem; it’s completely natural:
http://www.biocab.org/Geological_TS_SL_and_CO2.jpg
Carbon dioxide doesn’t heat anything because it is not a primary source of energy. The energy absorbed by a carbon dioxide molecule is simply released again by that same molecule; there is no nuclear fission or fusion, nor any combustion of carbon dioxide, going on in the atmosphere. The Sun is a primary source of energy, so carbon dioxide can only absorb, in accordance with its absorptivity, as much energy as the surface of the Earth is capable of absorbing from the Sun and radiating or transferring later.
Nick Stokes has an excellent comment for many scientifically trained observers of the great climate debates:
Nick Stokes (16:06:32) :
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
So, Hansen’s projections are physics based, eh? Fair enough. Einstein (full disclosure – I ain’t no Einstein) liked his Gedankens, here’s one: I live on a platform suspended in the sky, in an area of the sky that never has wind, & have done enough ball throwing to figure out that v=a*t & d=.5*a*t*t & the gravitational acceleration constant a is 9.8 m/sec². Here’s my physics-based projection: at a height of 1960 meters over the ocean, I drop a tennis ball. The projection is that the tennis ball hits the water (which with my eagle eyes I can see) in 20 seconds and its velocity is 196 meters/sec. Since gravity continues to operate on the tennis ball after it hits the water, in another 10 seconds its speed is 294 m/sec & it has travelled 2450 meters under the ocean surface. The physics in my model is unassailable and I have no reason to doubt my results.
But, of course everyone knows the results of this physics-based model are silly; the time the ball takes to hit the water is much greater than 20 seconds, and the tennis ball will not continue through the water constantly increasing its speed even though it is still subjected to constant gravitational acceleration. There are other forces at work that act on the ball and many will counteract and some will overpower the gravitational force on the ball.
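The Gedanken’s arithmetic does check out exactly, which is the commenter’s point: vacuum kinematics is “unassailable” and still gives an absurd answer once the ball reaches water. A quick check of the quoted numbers:

```python
g = 9.8  # m/s^2, as in the thought experiment

def drop(height_m):
    """Fall time and impact speed under constant acceleration, no drag."""
    t = (2 * height_m / g) ** 0.5
    return t, g * t

t, v = drop(1960)
print(t, v)  # ~20 s and ~196 m/s, matching the comment

# Blindly extending the same drag-free model 10 s past the ocean surface:
t2 = t + 10
print(g * t2)                  # ~294 m/s
print(0.5 * g * t2**2 - 1960)  # ~2450 m "below" the surface
```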
The problem I have with the climate model believers is not that I think they are bad people (many seem quite genuine) or ignorant (many seem quite intelligent), or that the models they support are anti-science (I bet there is some fine climate physics in these models). The problem is that they have grasped the tail of the climate elephant and insist on telling everyone else that the climate is absolutely a long and thin object. Evidence to the contrary from others grabbing a leg or a belly is typically shouted down as heresy, very often with a condescending remark. (Although, to be fair, I have seen beaucoup unwarranted patronizing remarks from the skeptic side as well.)
The climate models seem woefully incomplete and based on thin facts (Temperature proxies in trees? Really, this is your data?). I was involved in the early days of the modeling of plastic flow into injection mold cavities, and I can tell you the early years of those “physics-based models” were spectacularly craptastic. The only correct simulations were when the model was forcibly adjusted after the fact to a known result. Change the cavity geometry, change the material (even subtly), and? Pretty crappy results again. Have these models improved? Absolutely. Do the software models of today resemble the original models? Err, maybe as much as an F-15 resembles a Sopwith Camel. Trial & error & adjust, thousands of times repeated, with many new variables introduced (many of which are discarded later) as experience is gained; that’s how every decent model of physical phenomena that I am aware of has evolved.
These climate guys are young in their efforts and I wish them the best, but I believe they are uncomfortable adjusting their fundamental approach and do not embrace new data & new, potentially previously overlooked physical phenomena as they arise in the climate discussion. They are much more vigorous defending old & stale projections, and no modeling effort ever succeeded following that approach.
In a previous posting, the author looked at the long term temperature trends based on the East English 1659-2008 data. That analysis compared different methods of filtering to bring out more of the long term “climate” trends as opposed to the shorter term “weather” trends. The analysis methods included:
40 year moving average
40 year low-pass Fourier Convolution filter
40 year Chebyshev recursive filter.
The Fourier Convolution filter is preferred in that it covers the most recent end point, which the moving average and recursive filters do not. However, the recursive filter does provide a check on the other two filters, sort of a voting member in case of a difference of opinion.
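For readers who want to try the three smoothers themselves, here is a minimal sketch (my own code, run on synthetic data, not the author’s EXCEL workflow) using the same 0.025 cycles/year cutoff:

```python
import numpy as np
from scipy.signal import cheby1, filtfilt

# Synthetic stand-in for a yearly temperature series: a ~60-year cycle plus
# weather noise. Replace `temps` with real data to reproduce the posting.
years = np.arange(1659, 2009)
rng = np.random.default_rng(1)
temps = 9.0 + 0.3 * np.sin(2 * np.pi * years / 60) + rng.normal(0, 0.5, years.size)

# 1. 40-year moving average: simple, but loses 20 points at each end.
mov = np.convolve(temps, np.ones(40) / 40, mode="valid")

# 2. Fourier low-pass: zero all components above 0.025 cycles/year.
spec = np.fft.rfft(temps)
freqs = np.fft.rfftfreq(temps.size, d=1.0)
spec[freqs > 0.025] = 0
fft_smooth = np.fft.irfft(spec, n=temps.size)

# 3. Chebyshev type-I recursive filter, 4 poles, fc = 0.025 cycles/year.
#    filtfilt runs it forward and backward, cancelling the phase delay that
#    the posting corrects by shifting the output 20 years.
b, a = cheby1(4, 0.5, 0.025, btype="low", fs=1.0)
rec = filtfilt(b, a, temps)

print(mov.size, fft_smooth.size, rec.size)
```

filtfilt’s zero-phase trick is an alternative to the manual 20-year shift; a single forward pass of the recursive filter reproduces the delay the posting describes.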
The data set is a composite using some of the longest European temperature records available. They are available on Rimfrost and include both monthly and yearly averages. For this study, the data was copied into EXCEL and each yearly average column was placed adjacent to the next. The first column was the East English data from 1659 to 2008. The next was the Uppsala data from 1722-2008. Successive columns included:
Berlin 1701-2008
Paris 1757-2008
Geneve 1753-2008
Basel (not used since Geneve was used)
Praha 1775-2005
Stockholm 1755-2005
Budapest 1780-2008 *
Hohenpeissenberg Ger. 1780-2008 *
München 1781-2008
Edinburgh 1785-1993
Warsaw 1792-2008
Bologna 1814-2008
Oslo 1816-2008
The * indicates later records “homogenized” with more recent GISS records.
The average yearly composite was summed across the rows with an EXCEL VB program. Any missing number, or a number flagged with 99, was ignored. Initially only one sample was used from England, but as the years progressed more data became available until all 14 samples per year were included in the composite mean. Not perfect math, but the “barn yard” math my old Fluid Mechanics professor would use. The area was basically western Europe, as that is where the only really long term records are.
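The row-averaging step described above amounts to a missing-value-aware mean. A minimal sketch (with invented values and an assumed 99-style sentinel, since the actual VB code isn’t shown):

```python
def yearly_composite(row):
    """Mean of one year's station values, skipping blanks (None) and
    sentinel missing-value codes (anything with magnitude >= 99)."""
    valid = [v for v in row if v is not None and abs(v) < 99]
    return sum(valid) / len(valid) if valid else None

# Hypothetical year: [East England, Uppsala, Berlin, Paris]
print(yearly_composite([9.1, None, 8.7, -999]))  # averages the two real values
print(yearly_composite([None, -999]))            # no usable data -> None
```

Because each year divides by however many stations actually reported, the early sparse years and the later full 14-station years can sit in one continuous series.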
This was the first step in a “roll your own” global temperature that has some traceability. The 14 means that 14 locations were used, as noted above, and the resultant average plot is shown below:
http://www.imagenerd.com/uploads/ave14-hadcet-raw-uKebB.gif
In addition, the composite temp (Ave14) was compared to the Hadcet global temperature shown in the figure. The basic shapes seem about the same, but the Ave14 data set shows the warm period prior to 1850 that is missed in the Hadcet data. Also, the Hadcet “spread” is less than the Ave14’s, probably due to the smaller number of samples in the Ave14 set.
The next step was to filter in raw data, using 40 year filters. These were:
40 year moving average (Mov)
40 year Fourier (FFT)
40 year Recursive (Rec) Chebyshev 4 pole (fc=0.025 cycles/year)
These are shown below, plotted against the raw data:
http://www.imagenerd.com/uploads/ave14-raw-smoothed-wv2un.gif
Since the Chebyshev filter introduces a phase or time delay, it is adjusted back in time. For a 4 pole filter, the delay is 180 deg., or 20 years, for frequencies at the cutoff. Hence a 40 year cycle would be adjusted back 20 years. The figure below shows the recursive filter shifted back 20 years, tracking the Fourier filter very closely.
http://www.imagenerd.com/uploads/ave14-raw-smoothed-ph-adj-GYsRY.gif
From the raw data, one can see that Europe seems to have had a warm period from 1750 to 1800 that equaled the one in the mid 1900’s. It also appears to have had one in the mid to late 1600’s, but that is based only on the East English location.
The other item is the 50-60 year cycle that shows up. An interesting recent reference is:
Arctic Climate Variability – 60 Year Cycles, by V. Smolyanitsky et al. of the Arctic & Antarctic Research Institute of St. Petersburg, Russia, 2008. This paper shows the ~60 year cyclic conditions in the Arctic.
So in summary, putting together this simple composite of long term European temperatures allows a glimpse of temperatures from the 1700’s to the current time in a direct fashion. It also allows some interesting comparisons to other natural cycles, such as ocean and arctic cycles. A follow up would be to add more locations, to see how close it would come to the Hadcet data.
It seems to me that the IPCC would like us to believe they understand climate; however, not one single model predicted the current cooling. The bottom line is that there is a factor affecting climate that is bigger than ANYTHING anyone at the IPCC knows about. It makes no difference whether the factor is solar, galactic or little fairies: they have no clue about it. Given this, explain to me again why anything else these models predict is of any value, since we’re not talking about them being off by 10% or 15%; we’re talking about them being 100% dead wrong. It’s cooling and they said it should be warming.
Even the folks over at realclimate had to finally admit we’re cooling (confirming the models are all wrong and clueless) but what I found more interesting is they say that “IF [their new] HYPOTHESIS is correct, the era of consistent record-breaking global mean temperatures will not resume until roughly 2020.”
http://www.realclimate.org/index.php/archives/2009/07/warminginterrupted-much-ado-about-natural-variability/
So essentially realclimate admits that in reality they have no clue what is going on with the climate and are now GUESSING as to what MIGHT happen next. They don't explain how they arrived at 2020, but let me see … 2009 + 11 (the average solar cycle length) is 2020. So it seems they know full well that solar cycles drive climate and are simply hoping that the next solar cycle is an active one (unlike right now), and I suppose they'll then tell us in 2020 that their new "hypothesis" is correct because they "predicted 2020 back in 2009".
However, it seems like RC may be in for a surprise, because NASA is predicting that solar cycle 25 will be one of the weakest in CENTURIES:
“Solar Cycle 25 peaking around 2022 could be one of the weakest in centuries.”
http://science.nasa.gov/headlines/y2006/10may_longrange.htm
“Using historical sunspot records, Hathaway has succeeded in clocking the conveyor belt as far back as 1890. The numbers are compelling: For more than a century, ‘the speed of the belt has been a good predictor of future solar activity.’
If the trend holds, Solar Cycle 25 in 2022 could be, like the belt itself, ‘off the bottom of the charts.’ “
It is unclear to me what the author means by "At least the predictions from the various papers are testable." None of the predictions of anyone's climate model are testable in a laboratory sense. Maybe I'm splitting hairs, but it seems to me that this is the great flaw in our treatment of climate models. They are prediction systems, not test systems. Only by letting nature take its course, and comparing the conditions present at the end of the prediction period to those predicted, can we determine the fidelity of the model; and even then we can only say the model's results were valid (or not) for that specific period, not necessarily for all later periods. Thus it would be more correct to say "At least the accuracy of the predictions for the models used in these papers is knowable within the lifetime of most people living today, whereas the accuracy of the predictions of any model for the end of the century is not (barring dramatic increases in life expectancy)."
It is interesting that warmists completely ignore warm and cool oceanic cycles, which explain the repeated warming and cooling also observed in the 20th century. Instead, they patch over the inconvenient 1950–1980 cooling with an ad hoc theory of sulfate aerosols.
My bet is that the ups and downs will continue on top of an underlying trend set by future solar activity. Right now we are heading down, which will be further accelerated by the coming solar minimum.
I see E. M. Smith answered you much better than I could have. Even Hansen can't give an answer on the subject as to why it's meaningful.
Nick Stokes, what you are saying is that AGW is not based on physical observations but only on the weak AGW hypothesis/belief? I knew this was true, but to hear it from a Canutist is brilliant. Model-based hysteria and pseudo-science. The house of cards is falling.
Btw, the biggest laugh of all is the bandying about of global average temperature rises of 0.1 to 0.5C per CENTURY. Average daily temperature can't be measured at a single point to anywhere near that accuracy, and to then apply it worldwide is breathtaking nonsense. The data is basically random number generation which can be made to show anything we want, like Alice Through the Looking Glass. Sorry, the use of proxies such as tree rings to show average temperature is even more laughable.
I have said many times that the general public don't believe in AGW, but they are being bombarded with Goebbels-like propaganda from one side only and eventually they will take it seriously. I work in Syria and spend time in Malaysia; people there find the concept hilarious, and that is how normal people should view it.
Jeff Alberts (14:19:02) :
I’ll go you one better. There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.
In astrophysics, one defines the 'effective temperature' of a body X as the temperature of the blackbody that radiates as much as X. Makes perfect sense. A different question is how well we can measure the effective temperature with the thermometer network we have.
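As an illustration of that definition (my own sketch, not from the comment): the effective temperature follows from inverting the Stefan-Boltzmann law for a sphere's total radiated power. The ~239 W/m² outgoing flux used here is an assumed round number for the Earth:

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temperature(total_power_w, radius_m):
    """Temperature of the blackbody sphere radiating the same total power."""
    area = 4 * math.pi * radius_m ** 2
    return (total_power_w / (SIGMA * area)) ** 0.25

R_EARTH = 6.371e6  # m
P = 239.0 * 4 * math.pi * R_EARTH ** 2  # assumed mean outgoing flux ~239 W/m^2
T_eff = effective_temperature(P, R_EARTH)  # roughly 255 K
```

Note that the definition says nothing about the temperature at any particular place; it only pins down the total emitted power.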
interested spectator (21:16:03) :
NASA is predicting that solar cycle 25 will be one of the weakest in CENTURIES
They also predicted that cycle 24 would be one of the strongest in centuries…
It’s a non-bet. Because no one can pick a CO2 signal out of climate data. If they say they can they’re lying.
If my model is anywhere near the mark, we will see a slight rise of around 0.1C in temperature if solar cycle 24 gets going, followed by a 20–30 year slight fall of around 0.2C – 0.3C as the oceans cool due to generally low solar activity.
Considering UAH already fell around 0.2C from 2005 to now, this is not drastic as far as I can see.
Of course, I am probably completely wrong. 🙂
Tom P (19:21:37) : But the fact remains that while there have been plenty of assertions on this site that we are experiencing global cooling, nobody is willing to bet against me on this.
Reply: I said I will take your bet. In fact, I propose you double it. Use 200 pounds as your base. ~ ctm
Well, I’d be willing to “take the bet” on the same terms as ctm were it not for two things:
1) The gubmint has been cracking down lately on ‘online gambling’ and I’m not willing to take THAT bet.
2) The logistics of dealing with the bet are not worth the potential gains.
(Basically, I make $10,000 sized bets in trades and expect an overhead cost of $15 round trip with a time overhead of about 10 minutes or less and a cycle time of a week to 6 months with no legality risk. Your bet is too inefficient to interest me. Other than that, the odds look favorable to my side.)
So don’t read any great truth into the lack of takers. It’s just not a very interesting proposition.
Reply: Gee thanks, bragging about how much more money you have than I do. Well…nyeah. ~ ctm
Leif Svalgaard (22:46:54) : In astrophysics, one defines the 'effective temperature' of a body X as the temperature of the blackbody that radiates as much as X. Makes perfect sense.
THAT I would agree with. Surely we can park something at geosync that can get a “visual integration” of the X radiated from the earth? (Might need 2, one front and one back…). Why go through all the hokum of measuring thousands of points and averaging if you can get a single data point that has meaning?
Since one is not averaging intensive variables, but instead taking a measurement of one thing (radiated energy in the field of view) one has a valid metric. Basically, you are measuring the emissions for a “single pot of water”.
A different question is how well we can measure the effective temperature with the thermometer network we have.
I fear the answer is “not well”. A further question is how to tease out any historical baseline from the sparse (in some cases non-existent) historical thermometer data… I can say that the GIStemp code is not up to the task.
interested spectator (21:16:03) :
NASA is predicting that solar cycle 25 will be one of the weakest in CENTURIES
They also predicted that cycle 24 would be one of the strongest in centuries…
I’m tempted to predict it will be one of the most mediocre in centuries, just to complete the set 😎
Oh what the heck:
It will be one of the most mediocre in centuries. Indistinguished.
You heard it here first (and you will forget it here first too, I’d wager… 😉
ctm
Your reply to EM Smith.
But you keep telling us you are a virtual moderator comprised merely of pixels, not flesh and bones; what could you possibly want with money?
tonyb
Reply: You're thinking of Evan, but after a year of mental health recovery I am stepping back into entrepreneurship. So we'll see. ~ charles the moderator
Reply: Gee thanks, bragging about how much more money you have than I do. Well…nyeah. ~ ctm
Urk. I um, er. It wasn’t meant to be bragging…
BTW, one can make a $10,000-sized bet using options for about $100 of "real money". (A $100 stock can have options trading for about $1 that are 'interesting' – and a lot of options at 5 to 10 cents that are almost guaranteed to lose – and that means $100 will control 100 shares of a $100 stock, so you are swinging about $10,000 of stock with a $100 option buy.) So you, too, can make $10,000-sized bets on less than you are betting now.
Long-winded way to say you don't need a lot of money to bet big, only to pay off big 8-{ and options let you control the payoff side of the bet…
Reply: I was jerking your chain, but I am kicking myself for not putting down a couple of grand on calls for Ford when it was $1 ~ charles the day late and dollar short moderator (who knew how well the Escape was doing in the marketplace)
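As an aside, the options-leverage arithmetic in the comment above is easy to check; a toy calculation with illustrative numbers only (no real tickers or prices):

```python
def option_leverage(premium_per_share, stock_price, contracts=1, shares_per_contract=100):
    """Cash outlay vs. notional stock value controlled by an option position."""
    cost = premium_per_share * shares_per_contract * contracts
    notional = stock_price * shares_per_contract * contracts
    return cost, notional

# A $1 option on a $100 stock: $100 of premium controls $10,000 of stock
cost, notional = option_leverage(1.00, 100.00)
```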
FORECAST? SEE:
http://sites.google.com/site/earthquakepredictionbyjac/Home/greenhouse-effect
http://sites.google.com/site/earthquakepredictionbyjac/Home/greenhouse-effect/KLIMAT2.eml?attredirects=0
Jacek Dunajewski
RW: “yet atmospheric CO2 concentrations started rising in about 1750. CO2 did not suddenly become a greenhouse gas at some point after 1940.”
I see it coming. Next claim will be that all the warming we have seen since the LIA is due to man.
Charlie: Thanks for the link to Lean and Rind (2009). It makes the same basic error that Lean and Rind (2008) makes. It assumes the relationship between ENSO and global temperature variations is linear. It is not. This was illustrated in my post “Regression Analyses Do Not Capture The Multiyear Aftereffects Of Significant El Nino Events”
http://bobtisdale.blogspot.com/2009/07/regression-analyses-do-not-capture.html
Refer also to my posts:
“Can El Nino Events Explain All of the Global Warming Since 1976? – Part 1”
http://bobtisdale.blogspot.com/2009/01/can-el-nino-events-explain-all-of.html
“Can El Nino Events Explain All of the Global Warming Since 1976? – Part 2”
http://bobtisdale.blogspot.com/2009/01/can-el-nino-events-explain-all-of_11.html
“RSS MSU TLT Time-Latitude Plots…Show Climate Responses That Cannot Be Easily Illustrated With Time-Series Graphs Alone”
http://bobtisdale.blogspot.com/2009/06/rss-msu-tlt-time-latitude-plots.html
One can always, always fit a function to past data and desired future projections. Once the AGW advocates are done retrofitting their models, they will still predict calamity. This is the single point of failure in the current political situation: politicians do not understand that one can always create a model for the past – and that this says nothing about the model’s ability to predict the future.
Regarding testability: the only meaningful test is this: write down specific predictions for the future. Wait till the future gets here. Do the predictions match, yes or no? Unfortunately, when the answer turns out to be “no”, they just weasel-word their way out of it and retrofit their model again…
Nick Stokes (16:06:32) :
“………..This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.”
I’m delighted to learn this because the major argument I was given by a luminary of the British government in reply to a letter from me to the PM was to do with the temperature record.
I, and most here I would guess, have few problems with "greenhouse theory"; what troubles me is the invocation of "climate catastrophe", which in my opinion is a step too far.
E.M. Smith – you have horribly misunderstood some basic physics. Averaging intensive variables is trivial. Your example about taste is meaningless. Taste is not a quantifiable physical property.
Leif
"In astrophysics, one defines the 'effective temperature' of a body X as the temperature of the blackbody that radiates as much as X. Makes perfect sense. A different question is how well we can measure the effective temperature with the thermometer network we have."
It makes about as much sense as saying that, given some constant power P, I choose an arbitrary function f0 from the infinity of functions satisfying
[integral of f dx over some arbitrary space-time domain] = P.
Where it stops making any sense is when people begin to infer that
[integral of f dx over some arbitrary domain] = [integral of g dx over the same domain] implies f = g, which is obviously wrong.
Or in other words, and that is actually the point of E.M. Smith: f0 may satisfy [integral of f0 dx] = P, but it clearly doesn't satisfy any of the differential equations representing the laws of physics, e.g. Navier-Stokes.
Not even remotely, because f0 in our case describes an isothermal and isotropic body while the real body (described by the function g) is anything but isothermal and isotropic.
There is a word for that, and this word is UNPHYSICAL.
So nobody is really interested in the solution f0 for the Earth (the global average temperature), because it is unphysical, i.e. it doesn't represent even approximately the physics of the Earth. Its only property is that its integral gives, by definition, the total emitted power, while getting everything else wrong.
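The distinction can be made concrete numerically. This sketch (my own, with made-up one-dimensional temperature fields) shows that a non-uniform field and the isothermal field with the same mean temperature emit different total power, because emission goes as T^4:

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
x = np.linspace(0.0, 1.0, 100_000)

f0 = np.full_like(x, 255.0)               # isothermal field, 255 K everywhere
g = 255.0 + 30.0 * np.sin(2 * np.pi * x)  # non-uniform field with the same mean

same_mean = np.isclose(f0.mean(), g.mean())  # True: identical "average temperature"
power_f0 = SIGMA * np.mean(f0 ** 4)
power_g = SIGMA * np.mean(g ** 4)            # larger: T^4 weights warm spots more
```

By Jensen's inequality the non-uniform field always radiates more than the isothermal field with the same mean, so one number cannot match both the "average temperature" and the emitted power.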
It is not very sensible to theorize about patterns of global warming or cooling without having regard to two major geophysical processes:
• the Earth’s variable rotation and its relationship with global temperature; and
• the Lunar Nodal Cycle and its relationship with the climate of the Arctic and the Pacific Decadal Oscillation.
In relation to the Earth’s rotation there is considerable evidence that decadal length variations in the rate of the Earth’s rotation result in periods of global cooling or warming.
Lambeck and Cazenave (1976), "Long Term Variations in the Length of Day and Climatic Change", Geophysical Journal of the Royal Astronomical Society, Vol. 26, No. 3, pp. 555–573, reported that there is an established relationship between the Earth's decadal variable rotation and climate dynamics.
As LoD shortens (i.e. the Earth rotates faster), the planet warms; in contrast, as LoD lengthens, the planet cools. There is a time lag, most likely of six years, between the change in the Earth's rotation and global temperature changes.
Their paper is available here: http://rses.anu.edu.au/people/lambeck_k/pdf/37.pdf
Their paper warrants careful study.
Lambeck and Cazenave (1976) found that:
“The long-period (greater than about 10 yr) variations in the length-of-day (LoD) observed since 1820 show a marked similarity with variations observed in various climatic indices; periods of acceleration of the Earth corresponding to years of increasing intensity of the zonal circulation and to global-surface warming: periods of deceleration corresponding to years of decreasing zonal-circulation intensity and to a global decrease in surface temperatures. The long-period atmospheric excitation functions for near-surface geostrophic winds, for changes in the atmospheric mass distribution and for eustatic variations in sea level have been evaluated and correlate well with the observed changes in the LoD.“
Lambeck and Cazenave (1976) argued that the cooling the planet experienced in the 1960s arose from a slowing of the Earth's rotation.
They wrote:
"if the hypothesis [that decadal rotation decrease (increase) results in planetary cooling (warming)] is accepted then the continuing deceleration of [the rotating Earth] for the last 10 yr suggests that the present period of decreasing average global temperature will continue for at least another 5-10 yr."
Lambeck and Cazenave (1976) predicted that the cooling would come to an end by the mid 1970s and be followed by a period of global warming because they had discovered that the planet’s rate of rotation had begun to accelerate from 1972.
They wrote:
“Perhaps a slight comfort in this gloomy trend is that in 1972 the LoD showed a sharp positive acceleration that has persisted until the present, although it is impossible to say if this trend will continue as it did at the turn of the century or whether it is only a small perturbation in the more general decelerating trend.”
How this comes about is a matter of continuing debate and research. It does seem reasonably well established that the proximal causes are changes in the behaviour of the Earth’s inner cores and the way these cores are coupled dynamically and electromagnetically to the rest of the Earth.
However, there is growing evidence that coupling between various forms of solar activity and the inner cores is one of the determinants of the variable behaviour of the cores.
Regardless of this, the relationship established by Lambeck and Cazenave has been corroborated by others and disconfirmed by no one. Keep in mind that Lambeck and Cazenave found the correlation between LoD and average global temperature is a statistically significant 0.91 using time series some 150 years long with good quality data.
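The 0.91 figure is a Pearson correlation coefficient; for readers unfamiliar with it, here is a sketch on synthetic data (a made-up noisy linear relation, not the actual LoD/temperature series):

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    return np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1]

# Synthetic stand-in for ~150 years of paired observations
rng = np.random.default_rng(42)
lod = rng.standard_normal(150)
temp = lod + 0.45 * rng.standard_normal(150)  # linear signal plus noise
r = pearson_r(lod, temp)  # strongly, but not perfectly, correlated
```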
It could be, as Kurt Lambeck argues in his book, The Earth’s Variable Rotation: Geophysical causes and consequences, Cambridge University Press 1980, pps 279 – 282, that both arise from a third cause (about which he did not speculate).
It could be that the same solar activity contributes to the global climate variations in addition to the ways in which rotational change over a decade brings about climate change.
Richard Gross of the JPL at the California Institute of Technology produces a report each year that presents the most recent data about the rotation of the Earth. The title of the report is Combinations of Earth Orientation Measurements: SPACE2007, COMB2007, and POLE2007. See here http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/41279/1/09-18.pdf
Fig. 4(d) is the graph of length of day from 1960 to 2007, with the smaller variations smoothed out.
{Looks like the lovely graph didn't copy into the Leave a comment box, but you can find it in the pdf}
The graph shows that LoD has been shortening (rotation speeding up) since 1970, except that the planet's rotation slowed just a little between 1988 and 1994. It then began to speed up again, decelerating slightly in 2006. There is just a hint of speeding up again in 2007.
The key factor to concentrate on is the decadal rotational changes.
The table below summarises LoD variations over the last fifty years, the predicted climate consequence and the period in which that consequence would occur, given a lag time of five years, other things being equal. As the determinants of climate dynamics are multivariate, non-linear and non-stationary and include elements of randomness, it is not realistic to say that the predicted climate consequence necessarily follows.
Rotation period    LoD        Rotation  Climate   Climate period (5-yr lag)
Pre-1960 – 1972    lengthens  slower    cooling   1960 – 1977
1972 – 1987        shortens   faster    warming   1977 – 1992
1987 – 1994        lengthens  slower    cooling   1992 – 1999
1994 – 2002        shortens   faster    warming   1999 – 2007
2002 – present     lengthens  slower    cooling   2007 – ?
{looks like the formatting vanished when I copied a word document into the “Leave a comment” box}.
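The table's logic is simple enough to encode. A sketch (episode boundaries approximated from the table; the 5-year lag and the use of 2009 for "present" are assumptions):

```python
LAG_YEARS = 5  # assumed lag between rotation change and climate response

# (start, end, LoD trend); "lengthen" means the Earth rotates more slowly
lod_episodes = [
    (1955, 1972, "lengthen"),
    (1972, 1987, "shorten"),
    (1987, 1994, "lengthen"),
    (1994, 2002, "shorten"),
    (2002, 2009, "lengthen"),  # 2009 stands in for "present"
]

def predicted_climate(episodes, lag=LAG_YEARS):
    """Map each LoD episode to its lagged climate prediction."""
    out = []
    for start, end, trend in episodes:
        climate = "cooling" if trend == "lengthen" else "warming"
        out.append((start + lag, end + lag, climate))
    return out
```

As the comment notes, this is only what would follow other things being equal; the real climate system is multivariate and non-linear.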
L&C noted that the significance of the lag "suggests that the LOD observations can be used as an indicator of future climatic trends, in particular of the surface warmings."
Prediction of the LoD time series is an area of specialized research conducted by a relatively small number of scientists. Gambis and Bizouard (2003) (see slide 18 of http://www.ien.it/luc/cesio/itu/gambis.pdf) predict that LoD will lengthen between 2000 and 2010, resulting in global cooling from around 2006 to 2016 according to the relationship established by Lambeck and Cazenave.
As has been shown in previous posts there is a well established relationship between the Lunar Nodal Cycle, the climate of the Arctic and the Pacific Decadal Oscillations. The LNC peak of 2006 brought on the negative PDO which is now cooling North America.
Here is another recent paper about the LNC by Renato Ramos da Silva and Roni Avissar “The impacts of the Luni-Solar oscillation on the Arctic oscillation”, Geophys. Res. Lett., 32, L22703, doi:10.1029/2005GL023418; here is the Abstract:
“The Arctic Oscillation (AO) has a major impact on climate variability in the extra tropical regions of the Northern Hemisphere. In this study, we show that specific alignments between the Sun, the Moon and the Earth known as the Luni-Solar Oscillation (LSO) that occur at frequencies of nearly 9 and 18 years are unambiguously correlated with the AO since the mid-1960’s. The occurrence of the LSO peaks is predictable and, as a result, it improves climate predictability. Furthermore, we hypothesize that the recent observed increase in the AO amplitude maybe due to ice melting in the Arctic and SST increases at northern latitudes combined with the recent LSO peaks.”
The authors conclude with this interesting observation:
“Finally, we note that the current generation of global climate models that are broadly used to produce various climate change scenarios do not account for long-term tidal dynamic effects. This may be a significant flaw worth investigating carefully.”
TomVonk (03:38:42) :
So nobody is really interested in a solution F0 for the Earth (the global average temperature) because it is unphysical e.g doesn’t represent even approximately the physics of the Earth.
Of course it does. Just like it makes a lot of sense to say that the average family has 2 1/3 children. Nothing unphysical about it. The problem comes when you try to make rhetoric out of it.
So RW…
The "expected rise" over the last decade was also 0.2 C. How has the reality on the ground compared to that? Since it was wrong in both sign and magnitude, is it OK if I don't assume that everything in the computer fantasy games called models is the absolute truth?
What do the trends look like if we remove the “corrections” that are obviously incorrect? What would we have if we made some realistic attempt at correcting for UHI and land use changes?
GeoS
How delightfully naive you are to believe that any part of the AGW hypothesis is based on actual facts and figures. The following is an exchange I had with Joel Shore over on another thread a couple of days ago. This isn't a dig at Joel, whom I always find a pleasant, well-informed promoter of AGW; it is the system that underpins it that is wrong, as the exchange articulates.
“From TonyB
Hi Joel
Good to see you over here again. You’ll be setting WUWT as your home page before you know it 🙂
You said;
“Of course, our understanding of AGW is based on a lot more than just current temperature trends. It is, for example, based on the temperature difference and estimated difference in forcings between the last glacial maximum and now. So, any new competing hypotheses will (at least eventually) also have to explain this empirical data in addition to the current temperature trends.”
I appreciate you were being simplistic, but to broaden it out you obviously know that our understanding of AGW is based on a lot of hypothetical data.
When you parse temperatures, sea level rises etc. that go back some 150 years or more to hundredths of a degree or of a millimetre, this must surely be in the hope (rather than the certainty) that we had accurate data back then that can be parsed and sliced so minutely, and that therefore has some scientific validity.
Do not certain things worry you about much of this data;
* The notion that 20 stations comprise our 1850 global temperature data and they have changed in numbers and locations ever since? (don’t get me started on whether a GT has any meaning)
* That our global sea level (AArgghh!) is based on highly extrapolated (i.e. non existent) data back to 1700 that relies on three northern European tide gauges?
* That SST were measured in a very haphazard way over a tiny portion of the globes water surface and only highly specific local ones should be given any credence?
* That ice melts and re-freezes with monotonous regularity in the arctic and that current events are not out of the ordinary?
Is there not a scintilla of doubt in your mind that relying on -at best- often highly dubious if not actually meaningless information, is not a sensible way to run a railroad?
Ps That 8 out of 10 letters reference was the first time I have seen you make a joke, keep it up!
PPs Don’t tell Flanagan, but I quite like him posting here as well 🙂
Best regards
tonyb
From Joel Shore (15:31:33) :
TonyB: What you have identified are the kinds of issues that one deals with in science all of the time. Data is seldom available without some (often severe) data quality issues. I don't know what to tell you except that the way one deals with it is the way that climate scientists have been dealing with it: by having different independent analyses done on the data sets, by making estimates of the errors introduced due to various known data issues, by looking at different measures of a similar thing (e.g., in addition to direct air thermometer measurements, there are ocean temperature measurements, measurements of the advance and retreat of glaciers, borehole temperature measurements, and various temperature proxy measurements).
Work at the forefront of science is seldom as easy and straightforward as presented in science textbooks. This is probably one of the main reasons why scientists and “lay people” tend to reach rather different conclusions regarding the strength of the evidence in a field such as climate science where lay people are motivated to investigate the science.
If other fields of science were subjected to the similar sort of study by lay people that climate science has been, I think that these people would find similar deficiencies…and they would probably be left believing hardly any of the theories of modern science. And yet, I think these theories have been very important and successful in advancing our knowledge, and the reason I think that is the case is that, while each individual piece of data or each experiment may have its problems, the whole set of data and experiments taken together generally have a high degree of redundancy that means that the overall picture that emerges is more likely to be correct than one would expect by looking at each experiment in isolation.
To be honest, I have spent almost my entire career as a computational physicist being continually surprised at how well the models that I use agree with the experimental data in spite of the fact that I can almost always identify many concerns that I have with either the data or the model (e.g., many things that the model is ignoring).
From TonyB (16:37:11) to Joel
Thanks for your candour. However it is explained away, much of the theoretical information presented is contradicted by observations from the real world. It seems to me that climate science has standards not as rigorous as those applied to other sciences and the burden of proof falls far short of what should be expected.
This is probably the first science born in the computer age and many believe the models to be more accurate than we can in reality currently achieve.
According to the IPCC, Climate Change 2007: The Physical Science Basis “The set of available models may share fundamental inadequacies, the effects of which cannot be quantified.”
best wishes”
As I say, this is not a dig at Joel, but I do think it demonstrates the uphill task we have when we point to actual facts and figures to demonstrate our case. These apparently do not count as much as we thought they did.
tonyb
E.M.Smith (16:50:59) :
Nick Stokes (16:06:32) :
This is completely wrong, and reverts to the common fallacy that AGW is based on an examination of the temperature record. It isn’t. It has always been based on an analysis of the greenhouse effect, and the accumulation of GHGs. Hansen’s paper was based on that too. That’s why he gave his physics-based projections citing varied GHG emission scenarios.
“So it has always been based on a fantasy about gasses and models and projections, oh my! Ok, got it…
It’s so much easier to understand when you leave the data out and accept that it has always been confirmation bias and self delusion.”
Well then:
“The data don’t matter. We’re not basing our recommendations [for reductions in carbon dioxide emissions] upon the data. We’re basing them upon the climate models”
Chris Folland
UK Meteorological Office
“The “expected rise” over the last decade was also 0.2 C. How has the reality on the ground compared to that one? ”
Using GISS data, the mean global temperature anomaly from 1999–2008 is 0.49°C; the mean anomaly from 1989–1998 is 0.31°C. So the last decade has been 0.18°C warmer than the one preceding it. If you prefer the creationist dataset, UAH gives anomalies of 0.04 and 0.20, an increase of 0.16°C.
“Since it was wrong both in sign and magnitude…”
Try again and give us a more accurate description of how +0.16 and +0.18 compare to +0.2
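RW's comparison is just a difference of ten-year means. A sketch, using flat synthetic anomalies equal to the quoted decade means rather than the actual GISS series:

```python
def decadal_warming(anoms, start1, start2):
    """Mean anomaly of the decade starting at start2 minus that of start1."""
    m1 = sum(anoms[y] for y in range(start1, start1 + 10)) / 10.0
    m2 = sum(anoms[y] for y in range(start2, start2 + 10)) / 10.0
    return m2 - m1

# Flat stand-in series reproducing the quoted decade means (0.31 and 0.49 C)
anoms = {y: 0.31 for y in range(1989, 1999)}
anoms.update({y: 0.49 for y in range(1999, 2009)})
delta = decadal_warming(anoms, 1989, 1999)  # ~0.18 C
```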
“Ron Mexico:
So, Hansen’s projections are physics based, eh? Fair enough. Einstein (full disclosure – I ain’t no Einstein) liked his Gedankens, here’s one: I live on a platform suspended in the sky, an area of the sky that never has wind, & have done enough ball throwing to figure out that v=a*t & d=.5*a*t*t & the gravitational acceleration constant a is 9.8 m/sec.”
I am sure that Einstein had many ideas, but you do not have to make a “square plural” of der Gedanke (sing.)/ die Gedanken (pl.). But you should square the sec in g. Otherwise it is a pleasure to see somebody calculating in “metric units”.
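For completeness, the thought experiment's kinematics, with g in m/s² as the correction notes (my own toy sketch):

```python
G = 9.8  # gravitational acceleration, m/s^2 (note the squared seconds)

def free_fall(t_seconds):
    """Velocity (m/s) and distance fallen (m) after t seconds, from rest."""
    v = G * t_seconds
    d = 0.5 * G * t_seconds ** 2
    return v, d

v, d = free_fall(2.0)  # after 2 s: v = 19.6 m/s, d = 19.6 m
```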
James F. Evans (10:15:36) :
“Instead of assuming an increasing AGW why not look for evidence of it in the actual data. In other words let the data have primacy over the theory.”
Imagine that!
Data over theory.
Sadly, in scientific discipline after scientific discipline, theory trumps data.
The imagination and desires of men have a firm grip on Science.
No surprise there.
But when theory trumps data, one cannot claim the state of Science is "clearly healthy!" with a straight face and retain credibility.
But perhaps after this experience with the inscrutable Sun, men will be humble, more open-minded, and less attached to figments of imagination flickering on the cave wall.
Kudos to you. The notion that we can have science absent good data is paranormal science. The facts are there: the data points have been inconsistent and moved in many instances, while in others the local geology has changed dramatically, from rural to urban, thus artifactually changing the database. The database derived from our satellites is juvenile, flawed by external influences and marred by poor design; temperature buoys that float freely, give me a break. The more I study and read, the less I am convinced that we know about this, and I suspect we understand even less than that. When I superimpose these thoughts on the history of remarkable climate change on this earth, I am forced to the conclusion that the only hot air involved here is from the proponents of AGW, in support of their theory and their funding.
What I want to know is: will the lack of warming, or any of the science that has questioned AGW, have an influence on the next IPCC assessment?
RW (11:57:26) :
This whole piece is a tiresome mixture of half-truths, untruths, misunderstandings and simple mistakes.
What is becoming very clear is that the whole business of even measuring the global temperature has become "a tiresome mixture of half-truths, untruths, misunderstandings and simple mistakes", as you put it.
There is no “current global temperature standstill”. The trend in temperatures from x-present is not statistically different from the trend from x-1998, or x-2002, or x-2005, or whichever date you want to cherry pick. Yet again, you allow yourself to be fooled by weather. Will this ever stop?
What do you call it when the temperature stops rising? You can massage the data if you want to, but with a decade of little or no rise you still get a flat part of the curve; in fact the temperatures mostly seem to be going down (hockey sticks notwithstanding).
There is no “IPCC estimate of AGW”. The quantification of the expected temperature rise is far more complicated than the single number you wrongly attribute. In fact, in this and the next few decades, the expected warming from the rise in CO2 would be about 0.2°C, not 0.1°C.
The IPCC has definitely predicted the effects of the presumed AGW. It is repeated to us, with alarming increases, on a regular basis by the media. True, it is far more complicated, but the media portray it as single figures. And they are wrong, I agree. That means we should challenge these assertions; I am sure you can agree.
“The 0.5 deg C warming observed between say 1850 and 1940 is not due to AGW” – and yet atmospheric CO2 concentrations started rising in about 1750. CO2 did not suddenly become a greenhouse gas at some point after 1940.
1750? From what, exactly? We were not burning fossil fuels back then, so I assume you assert the CO2 increase is not caused by us? I’m not sure I understand the logic of this.
Rik Gheysens (12:25:51) :
It seems to me that the authors of this new paper (2009) hold an opinion opposite to their 2008 one. Then they argued: “According to this analysis, solar forcing contributed negligible long-term warming in the past 25 years and 10% of the warming in the past 100 years.”
Now, they state:”But as a result of declining solar activity in the subsequent five years [from 2014 to 2019], average temperature in 2019 is only 0.03±0.01 C warmer than in 2014. This lack of overall warming is analogous to the period from 2002 to 2008 when decreasing solar irradiance also countered much of the anthropogenic warming.”
Have they been overtaken by events??
No, no, no, you just don’t understand. If it’s warming, it’s AGW; if it’s cooling, it’s the Sun. Or it’s weather. Or it’s aerosols. Or it’s bad data. Or it’s something else we may not have thought of…
Tom P (13:19:29) :
I’m willing to bet against you on these terms that we are cooling. Are you on (in a currency multiple of your choice)?
I would love to take you or anyone up on that, apart from the massive problem that I cannot trust the data. When it is analysed by amateurs, and when they can even get a hold of it, we get clear signals denying any warming, whereas when it is analysed by ‘professionals’ it is massaged to ‘prove’ a foregone conclusion.
Shame about that.
RW (15:07:00) :
“There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.”
Oh, god, not this again… it’s so depressing to see so many people fall for this tragic nonsense. What does “temperature” mean to you? Only if you think the answer is “nothing, ever” is your previous statement logically possible.
I’m sorry, but you are completely wrong, and I rarely say that.
The idea of a global mean temperature is so deeply flawed with hidden misunderstandings that it is effectively meaningless. When you are trying to capture a trend of 0.01C a year, it is laughable.
My random thoughts on it are:
Quite frankly, there are a huge number of problems with trying to measure the temperature of the world. I mean, you can’t just stick a thermometer into a convenient orifice and wait.
There is land and sea and air.
Which part do we measure? Presumably we should measure all of them. Should we weight the results according to the thermal capacity of each of these mediums? Should we consider the relative volume of each? When we talk of the land, presumably that is just the surface.
Land is all at different levels.
Do we measure the surface wherever it is? In theory we should. There will be considerable differences depending on the make up of the land. Clay, sand and chalk, for example, will have different heat retention properties.
The air is at different temperatures all the way through, and constantly moving. Where should we measure? How should we weight the various measurements we do take? Do we have to take humidity into account (wet air has a higher thermal capacity)? How should we compare measurements at high humidity to measurements at low humidity?
The seas are deep, and at varying depths. Where should we measure? The top few hundred meters? What about after storms, when the lower, colder waters may be (and most often are, in my experience) brought to the surface?
Assume we manage to find compromises for all of these variables. I assume someone has, however good or poor those compromises may be, because temperatures are published and we are told they are valid.
OK, then, how far apart are the measurements taken? In a uniform grid? Obviously not. If most are concentrated in one place (e.g. the US), how do we weight these against the others?
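One standard answer to the weighting question, sketched here with entirely invented grid-box numbers, is to average stations into latitude/longitude boxes and then weight each box by its surface area, which shrinks with the cosine of latitude:

```python
import math

# Hypothetical grid-box mean temperatures (deg C) at various latitudes.
# A box at 60N covers half the area of an equatorial box, so a simple
# unweighted mean over-counts densely sampled high-latitude regions.
boxes = [(0.0, 26.0), (30.0, 18.0), (60.0, 5.0), (85.0, -20.0)]  # (lat, temp)

weights = [math.cos(math.radians(lat)) for lat, _ in boxes]
temps = [t for _, t in boxes]

naive_mean = sum(temps) / len(temps)
area_weighted = sum(w * t for w, t in zip(weights, temps)) / sum(weights)

print(round(naive_mean, 2))     # 7.25
print(round(area_weighted, 2))  # 17.26 -- the weighting changes the answer a lot
```

This only addresses geometric area; it says nothing about the station density, UHI and instrument problems raised above.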
How do we take the spreading urban sprawl and its associated warming of the environment into account? How do we filter the data to remove this bias?
The number of active stations seems to vary constantly. A huge number of stations in the former USSR went offline when the USSR collapsed in 1990. Interestingly, it seems that a whole lot more went offline in China, Turkey and Australia at about the same time. I have no firm idea why.
How can we relate the measurements after 1990 to those before?
OK, so we manage, somehow, to achieve that. Since graphs are shown with no display of this fracture in the continuous record, we have to assume someone is managing that and adjusting the data to compensate.
So what we probably need is the temperature taken at a constant height in the air, and at a constant depth in the sea, every hour of every day, at regularly spaced relatively small distances. Given this, we could probably be fairly sure of temperature changes and trends over time.
What we have is measurements taken at inconsistent spacings, with a random geographic distribution (probably weighted toward the US), at varying heights.
But some argue that the trend is important, not the absolute temperature. This is a good point.
The problem then becomes when you measure. Let’s look at when the measurements are taken. There is a strong ‘time of day’ bias in some records, especially older ones. The temperature recorded depended on when someone decided to wander out and read the thermometer. That does not sound very accurate.
So then we have Max/Min thermometers. That should be better, shouldn’t it? Well, perhaps not. Imagine the temperature stays at 10C all night, and most of the day. In the afternoon the clouds part, the sun comes out, and the temperature reaches 20C for a couple of hours before it rains, dropping it sharply back to 10C.
The record shows 10C – 20C. Is that an average temperature of 15C? I think you can see it is not.
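That worked example is easy to check numerically. A minimal sketch, using hypothetical hourly readings for the day described:

```python
# 10C all night and most of the day, a two-hour afternoon spike to 20C.
temps = [10.0] * 24
temps[14] = temps[15] = 20.0

true_mean = sum(temps) / len(temps)          # time-weighted daily average
maxmin_mid = (max(temps) + min(temps)) / 2   # what a max/min record implies

print(round(true_mean, 2))  # 10.83
print(maxmin_mid)           # 15.0 -- biased warm by over 4C for this day
```

The bias goes the other way on a day with a brief cold snap, which is exactly why the shape of the daily cycle matters and not just its extremes.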
There is pretty clearly a huge amount of ‘manipulation’ of this data required before meaningful results are available. What is not clear is what is done to the raw data, and why. That makes many mistrustful of any ‘global’ or ‘average’ figures, and therefore trends. They do have a good point.
^^^
and BTW, if you do want an orifice for the mythical global thermometer you can pick the UK. That might be because the UK is like the ‘mouthpiece’ of the world, or OTOH it might not. I’ll leave the alternative as an exercise for the reader….
E. M. Smith,
“Well, I’d be willing to “take the bet” on the same terms as ctm were it not for two things:
1) The gubmint has been cracking down lately on ‘online gambling’ and I’m not willing to take THAT bet.
2) The logistics of dealing with the bet are not worth the potential gains.”
Both pretty poor excuses. The law does not make it illegal to place bets online, just to run a business based on online betting. Hence your first excuse is worthless.
Given that you like the odds, your logistical argument is pretty lame as well. You would make £200 (or more than $300) just by typing a two-word response, taking perhaps 10 seconds. I’ll pay by PayPal, so all you need do is send me the account name, maybe another minute of work. That’s a rate of $15,000/hour!
If you really have a problem with making such easy money, it really does look like you’re all talk and no trousers!
Richard Mackey (04:16:38) :
It is not very sensible to theorize about patterns of global warming or cooling without having regard to two major geophysical processes:
• the Earth’s variable rotation and its relationship with global temperature; and
• the Lunar Nodal Cycle and its relationship with the climate of the Arctic and the Pacific Decadal Oscillation.
In relation to the Earth’s rotation there is considerable evidence that decadal length variations in the rate of the Earth’s rotation result in periods of global cooling or warming.
Is this for real, or a Troll / Chain Yank / humour I don’t get?
I think the physics I learnt at age 12 proves that to be complete 00’s
To Tom P .
I’m not a gambling man – never have been and never will be.
Here’s one for you though, and all you other AGW guys out there. And also to Barack Obama, Gordon Brown, Angela Merkel, et al. Why don’t we have a voluntary tax that you pay if you believe AGW is real and you don’t pay if you believe that AGW is a total crock? Seems simple enough to me…
Alexej Buergin (05:52:07) :
I am sure that Einstein had many ideas, but you do not have to make a “square plural” of der Gedanke (sing.)/ die Gedanken (pl.). But you should square the sec in g. Otherwise it is a pleasure to see somebody calculating in “metric units”.
Hey Alexej,
Thanks for the corrections…..my missive was written late at night (for a lightweight like me) so I will blame the hour on my turning the Earth’s gravitational acceleration constant into a velocity…..
Regarding my misuse of the German plural and singular….I can make that mistake any time of the day (but hey I already told you I was no Einstein)…..
Score one for the (sometimes) corrective capabilities of blogs! Cheers!
Chris Schoneveld (01:03:04) :
“. . . I see it coming. Next claim will be that all the warming we have seen since the LIA is due to man.”
—————————–
Actually, I do not think it is the “next claim.” It has already been claimed. See chart SPM.2 from the AR4. According to the IPCC, since 1750, radiative forcings from anthropogenic sources are over 13 times natural radiative forcings.
Whatever the merits of the Global Mean Temperature concept, I think it is clear from glacier retreat that the earth has warmed since 1750. Although AGW concerns often focus on the anthropogenic influence of the last 50 years, an “RW” poster on another blog has been very clear in his understanding that the warming since 1750 has been due to greenhouse gases.
“”” E.M.Smith (16:06:26) :
George E. Smith (09:37:42) : But I suspect that GISStemp is a reasonable representation of the tiny network of thermometers (owl boxes) that make up the (near) surface network.
It isn’t.
It uses too many airports for UHI adjustment (treating them as rural). It uses places very unlike an urban area and very far away as proxies for it (they are not). It divides the world into 6 latitude bands (and hides the major movement of thermometers south in those wide bands). And much much more… “””
E.M. you are missing my point. If I go to my local supply house; say Van Waters & Rogers, and I purchase 12 calibrated Mercury-in-glass thermometers having a -20 deg C to +200 deg C range that are accurate to 0.5 deg C when used according to the manufacturer’s specifications (maybe immersion depths and such like); I can now place these thermometers around my house in various places; one in the oven, one in the refrigerator, one in the freezer, one on top of the TV, etc.
So each day I go and read each thermometer and note the reading; maybe I do it twice a day; and being a working stiff, I would read them all before I leave for the office, and 12 hours later when I am home again.
I can gather this data for 10, 20, 150 years or so, and statistically process all those numbers to compute an average daily temperature for that specific set of 12 thermometers; and I can plot lots of nice graphs, and do regressions, and trend lines and all such statistical trickery.
And what I am doing is a perfectly valid protocol for me to use for my set of 12 thermometers; for whatever reason I am spending this time and effort.
Where I get into trouble is when I assert that the data and the computed results of whatever statistical mathematical prestidigitation I have done; in any way shape or form, represents a “global average” temperature for my house.
Same thing for GISStemp; whatever Hansen does with the numbers received from his collection of thermometers is perfectly reasonable for his aims on what he wants to get from that specific set of thermometers. That includes all the ones that are in Urban Heat Islands; particularly that Climatologically Engineered one at the University of Arizona.
But just as my set of 12 thermometers is not properly sampling the total temperature map of my house, so too Hansen’s set is not properly sampling the total global surface temperature map. It is perfectly valid to take a temperature sample inside a UHI, or any number of UHIs. What is not valid is to apply those UHI readings as representing the temperature in some tropical rain forest 1200 km away from the UHI. That is where the Nyquist Theorem comes into play, and dictates that you must sample the continuous function at a rate not less than twice the highest frequency contained in that (band-limited) continuous function; and in this case we have a continuous function in space and time, and have to satisfy the criterion as to both variables.
Which is why I say that GISStemp graphs are probably good representations of GISStemp; which I define as simply whatever AlGorythm Hansen applies (consistently) to his limited thermometer set. But it is when he extrapolates, and asserts that his graphs relate to this planet’s behavior, that is when I yell foul; they are no more a valid representation of planet earth’s mean surface temperature than my 12 thermometers are representative of my house. And if my wife is baking a cake when I read the oven thermometer, I’m likely to get a strange reading; which I will process like any other, as Hansen does; but I will find that I have not properly evaluated the frequency spectrum of my house temperature signals, and I have failed to comply with Nyquist’s Sampling Criterion.
George
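George’s Nyquist point can be demonstrated in a few lines. In this deliberately toy sketch (the field, wavelengths and station spacing are all invented), a temperature variation with a 100 km wavelength is sampled by stations 80 km apart, which is coarser than the Nyquist spacing of 50 km; the readings then become indistinguishable from a field with a 400 km wavelength:

```python
import math

def field(x_km, wavelength_km):
    """A toy spatial temperature field: 15C plus a 5C sinusoidal variation."""
    return 15.0 + 5.0 * math.cos(2 * math.pi * x_km / wavelength_km)

stations = [i * 80.0 for i in range(6)]            # stations every 80 km

short_wave = [field(x, 100.0) for x in stations]   # true 100 km variation
long_wave = [field(x, 400.0) for x in stations]    # aliased 400 km variation

# The two fields give identical station readings: the sampling cannot
# tell them apart, so any "map" interpolated between stations is wrong.
assert all(abs(a - b) < 1e-9 for a, b in zip(short_wave, long_wave))
```

This is classical aliasing; whether real surface temperature fields contain significant variance at scales below the station spacing is the empirical question the comment raises.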
Betting! Julian Simon would enjoy that if he were alive today. Paul Ehrlich lost.
I’m confused, I thought that CO2 followed temperature not the other way round?
I’m not either. But the bigger problem is that neither side can prove their case (actually AGW needs to prove their case, since natural variability is the default position)
Ah, that old favorite, the “the cooling is masking the warming” handwave. Usually reserved for the Antarctic non-warming. The trouble is that when you change your story all your other hand-waves come back to haunt you. The team (Mann, Trenberth et al) made great play about the solar variations not being able to explain temperature beyond 1980 (Lockwood+Frohlich used 1985). Implicitly thereby they admitted that it correlated very well for hundreds of years up to that 1980 cutoff, as can be readily seen on the NASA site or Solanki’s site.
They really weren’t expecting this 12 year stall. So now that very weak and declining natural variability is apparently overcoming the very strong GHG effect and they all scuttle back to the 100 year trend for support. But some of us still remember the IPCC 1950 cutoff for AGW and we remember too the 1980 cutoff of the solar debunkers. If you want to change your story now get it in the next IPCC papers. Or better still be honest and admit you don’t have a clue.
The sensible ones are obviously backing down gradually in order to save face while the fanatics are still concocting new hockey-sticks and getting shriller – before it’s tooo laaaate.
Got to laugh at RW’s 10 year weather event. The “weather noise” handwave again! Only in climate science would you assume the declining end of a time series was noise. Here’s a scoop – It’s not noise it’s a signal.
Jeff Alberts,
“I’m not a gambling man…”
That might be why you’re missing the point. This bet will not prove anything but is rather predicated on my assessment that I have a greater understanding of what might be the drivers of temperature change. Of course Charles believes the same, so he is willing to enter the bet. Natural oscillations will indeed mask any longer term trend, hence the square-root dependence of the stake with integrated time. As time increases anything other than a flat trend will ensure one of us ends up winning until the other concedes.
For instance, if in 1900 Arrhenius, after he first published the theory that CO2 was warming the atmosphere, had made an identical bet with one of Charles’ ancestors, he would have lost in the first four years but ended up more than $25,000 ahead by the time of his death.
Reply: I’m just doing it because it’s fun. In my worldview or “understanding”, it is simply a coin flip. I have no idea if I will win or lose. ~ charles the moderator
masonmart (22:42:02) : “Nick Stokes, what you are saying is that AGW is not based on physical observations but only on the weak AGW hypothesis/belief? I knew that this was true but to hear it from a Canutist is brilliant. Model based hysteria and pseudo science. The house of cards is falling.”
You are traducing the memory of an intelligent man. King Canute would certainly have been a sceptic. The story is that Canute’s obsequious courtiers believed that he was so powerful that he could even stop the tide coming in. To show these fools that he was not all-powerful, Canute ordered his throne to be put on the beach and sat on it as the tide came in and ordered the tide to stop. The tide came in anyway and soaked him. He proved his point.
We need a political leader of Canute’s intelligence today!
“Here’s a scoop – It’s not noise it’s a signal”
Got to laugh at JamesG’s hilariously unwarranted certainty. Talk about faith!
Jeff Alberts (14:19:02): I’ll go you one better. There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.
Yes. Please read chapter 1-1 in this excellent book we had in school:
http://books.google.co.uk/books?id=dcCpdY_Y2CEC&lpg=PR11&ots=7yJSXXYXIu&dq=treatise%20on%20irreversible%20and%20statistical%20thermodynamics%20wolfgang&lr=&pg=PA4#v=onepage&q=&f=false
Temperature is strictly speaking only defined at a point in equilibrium. On the other hand, I think it is perfectly clear that our earth was much colder during the last ice age. If we approach a new ice age due to low solar activity, we cannot sit on the fence and say there is no global mean temperature.
Possibly global temperature may be a more meaningful concept if we relate it to the global energy; however, there are so many subtle energy forms that the concept would be very hard to pin down. Still, in order to calculate some sort of bulk average temperature, we should at least take into account the thermal mass of the air and the oceans, since there is no law for the conservation of temperature; it is the energy that is conserved.
“an “RW” poster on another blog has been very clear in his understanding that the warming since 1750 has been due to Greenhouse gases.”
You’ve probably misunderstood what this “RW” character has said. You, and other posters, seem to be under the impression that climate change over any period can be ascribed to one single variable. Let me assure you that this is never, ever the case. If you see anyone promoting a document entitled, for example, “Nature, not human activity, rules the climate”, you should tell them this.
Charles,
“I’m just doing it because it’s fun.”
You’re a sport!
I’m surprised that of all the contributors to this site who have in the past vehemently stated that the world is cooling not a single one has taken me up on my bet.
Instead of grumbling about unwarranted carbon taxes, this should be a perfect opportunity to make some money as a hedge. Why the loss of nerve?
Tom P
Few people are willing to jump off the cliff into the cold water because it’s fun. But it’s ok for the rest of us to do it.
An Inquirer
I think you’ll find quite a few people bet on red or black at the roulette wheel, just not those who are gambling averse.
“”” Invariant (13:23:19) :
Jeff Alberts (14:19:02): I’ll go you one better. There is no “Global Mean Temperature”. It’s an artificial construct that is completely meaningless.
Yes. Please read chapter 1-1 in this excellent book we had in school:
http://books.google.co.uk/books?id=dcCpdY_Y2CEC&lpg=PR11&ots=7yJSXXYXIu&dq=treatise%20on%20irreversible%20and%20statistical%20thermodynamics%20wolfgang&lr=&pg=PA4#v=onepage&q=&f=false
Temperature is strictly speaking only defined at a point in equilibrium. On the other hand, I think it is perfectly clear that our earth was much colder during the last ice age. If we approach a new ice age due to low solar activity, we cannot sit on the fence and say there is no global mean temperature. “””
Well your thesis is not correct, and moreover your cited reference does not even say that “”” Temperature is strictly speaking only defined at a point in equilibrium. “””
What it does say; is that things (and thermometers) that are in thermal equilibrium with each other will all register the same temperature.
But today temperature is defined in terms of the kinetic energy of particles, even a single atom has a temperature, and that temperature could be changing rapidly as that atom collides with other energetic atoms or molecules.
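The kinetic definition mentioned here is a standard textbook relation, not anything specific to the temperature records under discussion: for an ideal particle the mean translational kinetic energy is (3/2)k_B·T, so temperature follows from energy as T = 2⟨KE⟩/(3k_B). A minimal sketch:

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def kinetic_temperature(mean_ke_joules):
    # Mean translational KE of an ideal particle is (3/2) k_B T,
    # so T = 2 <KE> / (3 k_B).
    return 2.0 * mean_ke_joules / (3.0 * k_B)

# A particle with mean translational KE of about 6.21e-21 J sits near
# room temperature (~300 K):
print(round(kinetic_temperature(6.21e-21)))  # 300
```

This is why one can speak of the temperature of even a single atom in the kinetic sense, while the thermodynamic (equilibrium) definition debated above is a separate matter.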
As for the “Global Temperature”; or shall we restrict it to say the “global surface temperature”; it most certainly has a value at any instant in time, and it certainly has an average value taken over any time frame such as a complete sun orbit for example.
The problem is we have no method of measuring the complete global surface temperature map to the required resolution to compute even the single-instant-of-time global surface mean temperature; even satellites can’t do it, because no satellite has instantaneous 4pi coverage of the earth’s surface to take even a snapshot of the complete temperature map.
I don’t know what the orbital periods of surface-sensing satellites are, but potentially they can scan enough of the surface to register several data points each day for any point on the surface; but they aren’t simultaneous, so that complicates the data reduction.
But the more important point is that the global mean surface temperature even if we could measure it, tells us nothing about the energy flows into and out of the earth’s surface; and that is what the big question is; are we gaining or losing net energy on this planet ? And in that sense “global mean temperature” is meaningless; besides being unmeasurable.
George
Tom P:
Seldom will I bet when the odds are 50/50 that I will lose. Economists believe that most people are risk averse and will not bet on a 50/50 proposition (except for entertainment), and I certainly am one of them.
To me, it is a 50/50 chance that the GMT will be cooler or warmer next year and beyond. In fact, regardless of AGW, there may be a warming trend due to recovery from the LIA. I do not know when the recovery will be complete, and I do not see any sure science that explains what is driving the recovery or how long it will continue before the next set of variables induces a trend in the other direction.
I do have bet offers for which I have no takers:
1) Al Gore’s warning of ice-free Arctic by 2013.
2) $100,000 on an ice-free Arctic by 2030. (I would like to double my granddaughter’s inheritance.)
3) The status of the polar bears in 2050.
Excellent post, thank you.
An Inquirer,
If you think the world is warming you would indeed be misguided to take my bet.
But recovery from the LIA? Here’s the historical temperature time series combining the global temperature derived from glacier history and the instrumental record:
http://img9.imageshack.us/img9/1994/glaciervsinstrumental.png
It certainly doesn’t look like the slow recovery over many centuries that would be expected according to your hypothesis.
JamesG (10:12:14) : “The team (Mann, Trenberth et al) made made great play about the solar variations not being able to explain temperature beyond 1980 (Lockwood+Frohlich used 1985). Implicitly thereby they admitted that it correlated very well for hundreds of years up to that 1980 cutoff as can be readily seen on the NASA site or Solanki’s site.”
You can add a few more years to that. From the IPCC Report AR4 2.7.1.3: “In particular, the cosmic ray time series does not correspond to global total cloud cover after 1991 or to global low-level cloud cover after 1994 (Kristjánsson and Kristiansen, 2000; Sun and Bradley, 2002) without unproven de-trending (Usoskin et al., 2004).”
Incidentally, they then go on to describe ways in which it might or might not correspond over different periods, before dismissing it and cutting the amount allowed for solar variation. They then say “The level of scientific understanding is elevated to low relative to TAR for solar forcing due to direct irradiance change, while declared as very low for cosmic ray influences (Section 2.9, Table 2.11).“.
It beggars belief that the MSM and others treat their “findings” as gospel.
George E. Smith (14:09:26) Well your thesis is not correct, and moreover your cited reference does not even say that “”” Temperature is strictly speaking only defined at a point in equilibrium. “
Citing from the book:
“To find the temperature at a point in a system undergoing a change, let us suddenly isolate a small element of space surrounding the point and allow the matter to reach equilibrium; the temperature then measured in the usual manner defines the temperature at the point. […] From a macroscopic point of view, on the other hand, we shall require the element to be small.”
It is a long way to go from the kinetic energy of a molecule to the bulk temperature of a large body, but the thermal simulations that are of vital importance in many critical engineering installations leave little room for philosophical definitions of temperature. Nearly all equations in the book I cited are differential equations valid only for a single point in space at local thermodynamic equilibrium. In practice the kind of local thermodynamic equilibrium we talk about here is reached very quickly in both time and space; however, all the equations we use in thermodynamics are strictly speaking only valid at local thermodynamic equilibrium. Outside local thermodynamic equilibrium, temperature is not defined, at least not in the classical way that leads to useful and observable temperatures that can be used to simulate thermal installations. See this link:
http://en.wikipedia.org/wiki/Thermodynamic_equilibrium#Local_thermodynamic_equilibrium
“Global thermodynamic equilibrium (GTE) means that those intensive parameters are homogeneous throughout the whole system, while local thermodynamic equilibrium (LTE) means that those intensive parameters are varying in space and time, but are varying so slowly that for any point, one can assume thermodynamic equilibrium in some neighbourhood about that point.”
Here the main point is “varying in space and time, but are varying so slowly”. So in a sense most dynamical systems vary so slowly in space and time that we can assume local thermodynamic equilibrium. This is important, since it means we can use all the wonderful equations in the above mentioned book.
Global temperature on the other hand is a philosophical concept, but I argue that we should at least adjust for the fact that the thermal mass in the oceans is huge compared to the thermal mass in the air. In this way the global temperature will be a better measure of the state of our earth and whether we are approaching a new ice age or not.
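The thermal-mass point can be sketched with deliberately rough, illustrative numbers. The roughly 1000:1 ocean-to-atmosphere heat capacity ratio is a ballpark figure, and the temperature changes below are hypothetical, chosen only to show how much the weighting matters:

```python
# Relative heat capacities (ballpark): the oceans hold on the order of
# 1000 times the heat of the atmosphere.
C_air, C_ocean = 1.0, 1000.0

# Hypothetical decadal temperature changes (deg C) of each reservoir.
dT_air, dT_ocean = 0.5, 0.01

simple_average = (dT_air + dT_ocean) / 2
heat_weighted = (C_air * dT_air + C_ocean * dT_ocean) / (C_air + C_ocean)

print(simple_average)           # 0.255
print(round(heat_weighted, 4))  # 0.0105 -- dominated by the ocean term
```

Weighting by heat capacity turns a temperature index into something closer to an energy index, which is the quantity that is actually conserved.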
And since no one knows the EU is turning up the heat to get climate regulations to go global:
http://www.ecnmag.com/news-EU-Global-Climate-Pact-091009.aspx?menuid=0
“……….and yet atmospheric CO2 concentrations started rising in about 1750. CO2 did not suddenly become a greenhouse gas at some point after 1940……’ RW (11:57:26).
But the temperature rise prior to 1940 cannot be attributed to anthropogenic emissions, i.e. the burning of fossil fuels.
http://photos.mongabay.com/09/0323co2emissions_global.jpg
1940-1980, the temperature was in stasis.
2000-2009, the temperature has been in stasis.
The entire temperature trend 1940-2009 occurred 1980-2000.
http://www.woodfortrees.org/plot/hadcrut3vgl/from:1940/to:2010/mean:13/plot/hadcrut3vgl/from:1940/to:1980/trend/plot/hadcrut3vgl/from:1980/to:2000/trend/plot/hadcrut3vgl/from:2000/to:2010/trend
“At what point do we question the hypothesis of CO2 induced warming?”
There are many who have always questioned the hypothesis. After all, the hypothesis was and is extremely weak as scientific theories go. It only described about 60 years of the billions of years of climate history, and not even that if you question the groundless assumption of aerosol cooling. Simply put, the theory is almost baseless and has always been that way.
I am just an operational meteorologist who has questioned the hypothesis since the early 1990s. I find it interesting that researchers are just now beginning to say the same things people like me have been saying for nearly 2 decades, and acting like it is a revelation. It would be funny if the whole thing didn’t cost so much.
Tom P,
Here’s my wager: No one will be able to falsify the theory that global warming/cooling will go outside of its natural historical parameters. We can make the wager for however long you like, up to, say, ten years [I’m in my 60’s and can’t make the wager as long as I’d like].
I’m willing to put up $10,000 cash. Right now.
Anthony can hold the money; winner’s charity gets the loser’s $$$$$ +interest. Winner gets to claim the charitable tax deduction.
What say you, trouserboy?
Agreed?
Tom P (14:45:16) : “If you think the world is warming . . .”
I believe you have subtly shifted what I said. It is not that I think the world is warming, but that I believe the world has warmed in the past 200+ years, and I do not know if that trend has run its course or will continue.
Some could quibble with the word “recovered,” but to the extent that the concept of GMT has meaning, it is always in some sort of trend. I have not read the Science article that you referenced, so although I have several questions related to your attached graph, I will forego those until I have a handle on that data splice.
———————–
RW (13:48:22) : “You’ve probably misunderstood what this “RW” character has said. You, and other posters, seem to be under the impression that climate change over any period can be ascribed to one single variable.”
There has been no suggestion or evidence that I misunderstood this “RW” character; on the contrary, I am the one in those conversations who has emphasized the multitude of variables. And I believe that most posters on this blog have an appreciation for the possible role of many variables. It is the IPCC, in the AR4 report (and in other places), who have said that anthropogenic sources have had the overwhelming influence, tossing out natural variables. As I have said before, I do not know whether the set of variables that caused the recovery from the LIA has run its course or whether other variables have emerged to cause a trend in the opposite direction. As a scientist, I have studied more than a dozen proposed key variables behind past trends, and CO2 emissions rank toward the bottom of the variables that apparently have had a major influence.
Tom P (16:11:41) :
Ron de Haan
It must feel a little demeaning to be scrabbling around with US city data rather than making a global inference.
As you’re talking but not taking, can I infer you’re not willing to put your money where your mouth is?
Tom,
I posted my opinion about Lean and Rind (2009) here: Ron de Haan (10:22:58) :
The link to the icecap.us publication was extra, but interesting.
If the global data were handled the way they handled the local data, there would be no AGW Hoax. Read it and you will understand why.
Smokey,
“No one will be able to falsify the theory that global warming/cooling will go outside of its natural historical parameters.”
Are you sure you mean this? This would appear to state that no one will be able to falsify future AGW! In any event, without some quantification of “natural historical parameters” there is no way to determine who might have won a bet based on this statement.
Please try again.
An Inquirer,
“I have not read the Science article that you referenced, so although I have several questions related to your attached graph, I will forego those until I have a handle on that data splice.”
You’ll get a better handle if you first understand that the plot is an overlay of two different independently determined temperature records, not a “data splice”.
Here’s the abstract of the Science article:
Extracting a Climate Signal from 169 Glacier Records
J. Oerlemans
I constructed a temperature history for different parts of the world from 169 glacier length records. Using a first-order theory of glacier dynamics, I related changes in glacier length to changes in temperature. The derived temperature histories are fully independent of proxy and instrumental data used in earlier reconstructions. Moderate global warming started in the middle of the 19th century. The reconstructed warming in the first half of the 20th century is 0.5 kelvin. This warming was notably coherent over the globe. The warming signals from glaciers at low and high elevations appear to be very similar.
[snip]
Ron de Haan,
You certainly appeared to make your intentions clear earlier:
“I don’t know what the future will bring but if I am aloud to gamble, I say we will continue to cool.”
Why the loss of nerve? At least you’re not alone – not one person who has claimed here that the world is cooling has so far been willing to bet on it.
You claim “If the global data is handled like they did with the local data, there would be no AGW Hoax” with reference to:
http://www.crh.noaa.gov/news/display_cmsstory.php?wfo=mkx&storyid=31040&source=0
No, such data would be meaningless. How do you calculate globally “Days with minimum temperatures at or above 70F”? Such a parameter is a local determination and indeed even as such is a useless metric for a good proportion of the globe which never experiences such a high minimum temperature.
I bet. Stop crowing that no one would.
You bet because you believe the odds are in your favor, that there is an unnatural forcing driving up temperatures.
I believe no such thing. But I’m not a Sun worshiper either.
I’d say we have a constrained random walk with a side of Milankovitch. But that is simply my opinion, not some massive logical deduction.
I bet because I think it’s a coin flip, but don’t mind the risk.
So while I haven’t been screaming “It’s cooling”, I clearly disagree with your model and am willing to bet on it.
Leif
“So nobody is really interested in a solution F0 for the Earth (the global average temperature) because it is unphysical e.g doesn’t represent even approximately the physics of the Earth.
Of course it does. Just like it makes a lot of sense to say that the average family has 2 1/3 children. Nothing unphysical about it. The problem comes when you try to make rhetoric out of it.”
.
You still don’t get it, do you?
Or don’t you read what people write?
In both cases it is a wrong attitude.
So AGAIN:
.
1) There is an infinity of partitions of a surface having the same spatial average temperature.
2) But there is only one partition that solves for the laws of physics.
3) A constant (the average) does NOT solve the laws of physics.
4) A function that violates the laws of physics is unphysical by definition. If you persist in calling it physical then it is you who is trying meaningless rhetoric.
5) The only property of the constant is that its integral gives the power. But that is again only a useless tautology because it has been DEFINED that way.
6) The Earth is neither isothermal nor isotropic. A global spatial average can’t have any relevant property for its dynamics, even approximately.
See 3). It has nothing to do with statistics. Have a look at Navier-Stokes and live with it.
Charles,
“I bet. Stop crowing that no one would.”
You bet because you are a good sport – full credit to you! But what I actually said was that nobody who had clearly stated they believe the world’s cooling is willing to put their money where their mouth is.
Has there been a sudden conversion to Puritan values by some regular posters to this site?
TomVonk (02:26:22): It has nothing to do with statistics. Have a look at Navier-Stokes and live with it.
We know Navier-Stokes very well. A mathematician would probably state that, strictly speaking, there is no such thing as the average temperature of the oceans. Still, if you associate swimming with feeling cold in wintertime, the reason is that the average temperature is lower in wintertime. It has to do with the integration of the partial differential equations over volumes, which we physicists call bulk volumes. We then get bulk temperatures:
http://en.wikipedia.org/wiki/Bulk_temperature
Bulk temperatures are real even if the spatial and temporal temperature is not the same all over the bulk. Such bulk temperatures are used in commercial software applications to calculate the transient transport of petroleum fluids in pipelines that are more than 100 kilometers long. The deviation between the simulated and the measured transient fluid temperature after transport through the line is often less than 0.5 degrees Celsius.
While mathematicians cannot live with imprecise definitions like bulk and ocean temperature, these are most useful for physicists and in particular engineers. I suspect that the origin of the problem is that mathematicians are not as trained as physicists and engineers to make valid approximations based on experimental experience and intuition.
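The bulk-temperature concept above can be made concrete with a few lines of numerical integration. This is only an illustrative sketch: the parabolic velocity profile and the quadratic temperature profile below are assumed textbook forms, not output from any pipeline simulator.

```python
import numpy as np

# Bulk (mixing-cup) temperature of pipe flow: the velocity-weighted
# average T_b = (integral of u*T dA) / (integral of u dA) over the
# cross-section, with dA = 2*pi*r dr. Profiles are assumed, not measured.
R = 0.1                              # pipe radius in metres (assumed)
r = np.linspace(0.0, R, 1000)        # radial coordinate
dr = r[1] - r[0]
u = 2.0 * (1.0 - (r / R) ** 2)       # laminar (parabolic) velocity, m/s
T = 60.0 - 20.0 * (r / R) ** 2       # temperature profile, deg C (hot core)

T_bulk = np.sum(u * T * r) * dr / (np.sum(u * r) * dr)
T_area = np.sum(T * r) * dr / (np.sum(r) * dr)   # plain area average

# The bulk value weights the hot, fast-moving core more heavily,
# so it sits above the simple area average.
print(round(T_bulk, 2), round(T_area, 2))
```

The bulk value is the one an integrating measurement downstream would see, which is why it is the quantity pipeline codes track even though the fluid is nowhere isothermal.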
Jimmy Haigh (07:45:47) :
“Here’s one for you though, and all you other AGW guys out there. And also to Barrack Obama, Gordon Brown, Angela Merkel, et al.. Why don’t we have a voluntary tax that you pay if you believe AGW is real and you don’t pay if you believe that AGW is a total crock? Seems simple enough to me…”
There already is such a voluntary tax. Some airlines offer the opportunity to buy carbon credits when buying airline tickets. I know of only one person who has ever done this. I understand on good authority that the take up of carbon credits in one airline is tiny. But of course this sample only covers people who do fly and excludes believers such as Mr Gore who presumably don’t……………
TomVonk (02:26:22) :
2) But there is only one partition that solves for the laws of physics.
I don’t know what this statement means. Please explain [and that one only].
Tom P (10:27:13) :
If in 1900 Arrhenius, after he first published the theory that CO2 was warming the atmosphere, had made an identical bet with one of Charles’ ancestors, he would have lost in the first four years but ended up more than $25,000 ahead by the time of his death.
He’d still have been wrong about co2 though. 😉
I already have a $1000 bet running, otherwise I’d be tempted. Good luck to you and Charles, but I hope Charles’ luck is better, both for the sake of his bet and mine.
tallbloke,
I’m curious as to the basis of your existing bet.
It’s a pity you’re unwilling to increase your exposure on this one – the thinking behind my bet is that random fluctuations are damped out when integrated over time, and the stakes increase correspondingly. Hence luck will play a decreasing role as the bet proceeds.
If you’re right the world is cooling there’s every chance to make a sum comparable to what Arrhenius might have won.
Are you sure you’re not tempted?
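Tom P’s remark that random fluctuations are damped out when integrated over time is just the familiar 1/sqrt(n) behaviour of an average. A toy check with synthetic white noise (the 0.1 C noise level is an assumption for illustration, not a property of any real temperature index):

```python
import numpy as np

rng = np.random.default_rng(3)
# 10,000 trials of ten independent "annual" anomalies with 0.1 C noise
noise = rng.normal(0.0, 0.1, (10000, 10))

single_year = noise[:, 0]            # spread of a one-year anomaly
ten_year_mean = noise.mean(axis=1)   # spread of a ten-year average

# Averaging ten years shrinks the scatter by about sqrt(10),
# so a bet settled on a decadal mean is less exposed to luck.
print(np.std(single_year), np.std(ten_year_mean))
```

This is why a wager settled on an average over the whole period is more a bet on the trend, and less a bet on which way the last year's noise happens to fall.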
Tom P: “…nobody who had clearly stated they believe the world’s cooling is willing to put their money where their mouth is.”
It’s interesting that Tom P has re-framed the AGW conjecture to buttress his wager. The actual debate centers on the alarmist contention that runaway global warming will result from added CO2. Global cooling only became an issue when the planet didn’t cooperate with the alarmists’ predictions. With the climate flat to cooling over most of the past decade, the goal posts have been moved again and the new mantra is “global cooling is caused by global warming.” The real response [which they can not face] is: “We were wrong.”
As I’ve pointed out before, TP has set up a stacked deck in his favor by not handicapping the natural global warming trend line. [He also tried to stack the deck by trying to cherry-pick the starting year.] The wager as it stands is close to a 50/50 proposition in the first year. But as time goes on, the wager leans more and more heavily in TP’s favor. He says that anyone declining to play with his stacked deck has no trousers. Could TP be any more insufferable?
I’d be willing to bet that within the next ten years we will never reach the UN/IPCC’s AR-4 projections. But Tom won’t take that bet, because he can’t stack the deck. I’d also be willing to bet the planet will not be 3° C warmer in the next ten years, but of course Tom will come up with various reasons why he won’t fade me. The real reason is the same: he can’t stack the deck.
Trouserboy can offer his wagers and turn down similar wagers from others, there’s no harm in that. But constantly disparaging and ridiculing anyone who doesn’t take his bet, when he has his thumb on the scale all the while, is at best juvenile and it typifies the corruption at the heart of the climate alarmists’ failed conjecture. They have made it personal. And they run and hide from any honest, neutral public debate for the same reason: they can’t stack the deck in their favor.
Alarmists are all pretty much like Tom P, and vice versa. If they can’t win fair and square, they move the goal posts until the odds are in their favor. As the old saying goes, by hook or by crook. Anything to win, right?
Tom P (05:56:43) :
tallbloke,
I’m curious as to the basis of your existing bet.
If you’re right the world is cooling there’s every chance to make a sum comparable to what Arrhenius might have won.
Are you sure you’re not tempted?
It’s a very simple bet. It is that the average temperature for 2015 will be lower than for 2005 as measured by the average of all four major indices.
Looking pretty good at the moment
http://www.woodfortrees.org/plot/wti/from:2005/trend
Temperature is around 0.2C down from 2005 at the moment. 😉
I don’t think that by taking you up on your bet I would gain as much as Fred Bloggs might have in your hypothetical scenario, because I think the game will be up for the AGW hypothesis very soon and you would pull out.
If the negative phases of natural factors can overcome the co2 effect so easily, how much of the late C20th warming was due to the positive phases of those same factors?
tallbloke, looking at the last 40-year cycle in my posting above, you have an excellent chance of collecting.
Smokey,
There’s no cherry picking of the year. I initially selected 2008 as the starting point early this year simply because it’s the last year we have data for. Coincidentally, the UAH temperature anomaly for 2008, at 0.048 C, is extremely close to the long-term anomaly baseline. Hence, whether you think there is a short term or long term trend in temperatures my starting point is quite reasonable.
Of course the bet is not going to be attractive to anyone who thinks the world is indeed warming. If you have never said the world is cooling, there’s no question of my saying “put up or shut up”. But others certainly have made such a claim. Ron de Haan earlier in this thread indicated he was willing to bet on it. All I did was take such a challenge at face value. The goal posts were well planted before I strode on to the pitch.
As for the other bets subsequently offered, there’s only one that has come up with a clear definition, your 3 degrees increase in the next ten years. We won’t see warming on anything like that scale. Offering a joke bet so as you can claim I’m ducking out reflects rather more poorly on you than me.
I’m all for open and honest debate. But it’s a little difficult to keep this up when as we were last discussing the science you turned rather quiet. Just to remind you what we were looking at:
http://img9.imageshack.us/img9/1994/glaciervsinstrumental.png
A couple of days ago I wrote:
“Hence, unless there is some reason to think the analysis has been fudged – and I have come across no suggestions to that effect – temperature is indeed the main driver for the movement of glaciers. The conclusion is therefore that the overall retreat seen over the last 150 years is due to the global increase in temperature. If you choose to dispute this, please tell me your grounds. If not we can move on.”
I’ve heard nothing back from you. As you have chosen not to dispute this, let’s indeed move on. Maybe you could suggest an explanation for the plot’s shape?
tallbloke,
Here’s the full dataset of temperatures, with the total trend and the trend for 2005 marked out to indicate the starting point:
http://www.woodfortrees.org/plot/wti/from:1978/plot/wti/from:2005/to:2006/trend/plot/wti/from:1978/trend
If I knew nothing about the source of the data, I’d say it was white noise on a rising trend, apart from a couple of excursions. The first of these is an extended below-trend period from 1993 to 1996. The second is a big peak in 1998 to 1999.
In fact we know that the first excursion is the global cooling due to the ash from Mount Pinatubo and the second is the end-of-the-millennium Super El Niño [why do contributors rant on about how global temperature averages are meaningless and don’t reflect the physics of the planet when these features are quite clearly associated with known physical processes?].
Apart from that I can’t really see any other natural factors appearing above the noise, and indeed we are currently above trend.
2005 was a good choice for a starting point – for you! Unlike Smokey I’m not going to fling around baseless accusations of cherrypicking – I’m sure the starting point was chosen as simply as mine. It would have been quite close to the linear trend at the time.
If, again, I was blind to the data source, my best estimate for 2015 would simply be a linear extrapolation – there’s too much noise on this time series to justify a more sophisticated approach. On that basis 2015 should turn out warmer than 2005, though it’s quite close and less than one standard deviation in the noise. Perhaps 60/40 against you.
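The blind linear extrapolation described above can be sketched in a few lines. The slope and noise level below are rough assumptions loosely inspired by the satellite-era record, not fitted values from any of the indices under discussion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy version of the argument: a linear trend plus white noise, with an
# assumed slope (~0.013 C/yr) and annual noise (~0.1 C). Both numbers
# are illustrative assumptions, not fits to real data.
years = np.arange(1979, 2009)
true_slope, noise_sd = 0.013, 0.10
anoms = true_slope * (years - years[0]) + rng.normal(0, noise_sd, years.size)

# Fit a straight line to the synthetic record and extrapolate
slope, intercept = np.polyfit(years, anoms, 1)
pred_2005 = slope * 2005 + intercept
pred_2015 = slope * 2015 + intercept

# The predicted ten-year rise is ~0.13 C against ~0.1 C of annual
# noise: barely more than one standard deviation from a coin flip.
print(pred_2015 - pred_2005, noise_sd)
```

With the trend difference not much larger than the year-to-year scatter, odds in the region of 60/40 are about what this back-of-envelope model gives.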
Obviously I’m no poker player, tipping my hand before making the bet. But if you’re nevertheless willing to increase your exposure I’ll bet against you. As my wager proposal is more impervious to noise than yours, I already have some hedge against random variations in the data.
And Smokey, you owe me quite an apology, as well as a response to my earlier post.
TomP and Smokey
BBC radio 4 at 1.30pm today, Vicky Pope of the Met office reluctantly admits the climate has been cooling against their expectations and models
http://news.bbc.co.uk/1/hi/programmes/more_or_less/8248922.stm#email
tonyb
TonyB,
Just listened to it. She admits nothing like what you said. She states quite clearly that natural variability overlies the long term warming trend that the Met Office model predicts. As a case in point she cites the 1998 El Niño: 1998 was indeed warmer than 2008.
Your hearing was obviously impaired. Were you listening to the programme in the car?
And do you think the world is actually cooling?
TomP
I do not know if this is a transcript of the interview with her, which was a vox pop that fitted into the rest of the programme. She definitely 100% said, very hesitantly in response to the interviewer’s questioning, that the world had cooled.
You ask me: is the world cooling? Temperatures have been at best flatlining for some years but it’s much too early to think of it as a trend. I do think ‘global’ temperatures are a nonsense anyway and do not believe we have a reliable measure. This year was the 91st warmest in the CET record. It makes you think…
Personally, if history repeats itself, I would have expected a good few more years of natural warming. I certainly wouldn’t bet on it though, and think that sceptics are making themselves hostages to fortune by relying too much on a few years’ data.
By the way the BBC often repeat their programmes (sorry, they give us ‘another chance to hear’) so I will listen out for it.
tonyb
Tom P (10:10:42) :
tallbloke,
Here’s the full dataset of temperatures, with the total trend and the trend for 2005 marked out to indicate the starting point:
2005 was a good choice for a starting point – for you! Unlike Smokey I’m not going to fling around baseless accusations of cherrypicking – I’m sure the starting point was chosen as simply as mine. It would have been quite close to the linear trend at the time.
If, again, I was blind to the data source, my best estimate for 2015 would simply be a linear extrapolation – there’s too much noise on this time series to justify a more sophisticated approach. On that basis 2015 should turn out warmer than 2005, though it’s quite close and less than one standard deviation in the noise. Perhaps 60/40 against you.
Obviously I’m no poker player, tipping my hand before making the bet. But if you’re nevertheless willing to increase your exposure I’ll bet against you. As my wager proposal is more impervious to noise than yours, I already have some hedge against random variations in the data.
Tom, I revisited the terms of my bet; my memory is atrocious at times (big bike crash 3 years ago). It’s that the trend from 2005-2015 will be down not up. You’d probably say that lowers my chances of winning, but I’m still pretty confident. The guy I bet against has been pretty quiet anyway, which was part of my aim in accepting his bet! 😉
I don’t really want to take on another bet at the moment though, so I’ll decline, sorry.
It’s all good fun though, and by the time the bet matures, the stake will probably buy me and my opponent a rice beer each in some Chinese bar in Europe. All us high rollers should have a get together and a laugh over the whole shenanigans further down the line.
TomP
Sorry Tom, perhaps it was you listening in a car. I just listened again and in response to the interviewer’s prompting about the possibility of future ten-year cooling periods she says:
‘we’ve just had one’ (a 10 year cooling)
‘yes.’
It is right towards the end of the first item about climate change. Perhaps you fell asleep-the first guy’s voice was very monotonous.
I thought it was a good intelligent programme. I remain to be convinced that the current cooling will turn into a real trend (the MWP lasted 400 years after all) but we have just experienced an extended cooling period which has caused the modellers to come up with new models.
tonyb
TonyB,
Quite correct that there was an admission of a drop. But not that it was against models and expectations – rather that natural variability can occasionally cause such dips.
I agree, though, there’s little cooling trend to be discerned from the data – hence my offer of a bet with tallbloke. But I can’t agree global temperatures are a nonsense – how does the record pick up Pinatubo and El Niño so clearly?
tallbloke,
That’s just as I understood the bet. Shame you’re unwilling to extend, and apologies accepted. I’m just a little miffed though by others who are adamant the world is cooling but are now keeping their mouths shut while their trousers are still round their ankles.
I’ll be on for a beer in 2015 – should I say a warm one?
Smokey,
You’re also keeping very quiet. And it’s you who wrote “they run and hide from any honest, neutral public debate…”
Tom P,
Sorry [<— my apology] I haven't responded. I haven't visited this page since I last posted. I've been having some fun over on the Svensmark thread, and a couple others.
I’ve offered you several wagers, all of which you refuse to accept. So don’t rag on me for not taking the bet you put out there. My wagers are no different than what you’re trying to do. I’ve explained exactly why upthread.
Now, back to Svensmark.
Here is a more detailed repeat of the link I mentioned above to TomP and Smokey
+++++++++++
BBC radio 4 at 1.30pm today, Vicky Pope of the Met office reluctantly admits the climate has been cooling against their expectations and models
http://news.bbc.co.uk/1/hi/programmes/more_or_less/8248922.stm#email
This is the BBC’s Tim Harford item (the link is found at the bottom of the box to the right of the item “Blowing cold, then hot”).
transcript
“Tim: If the cooling that the Leibniz Institute predicts actually takes place, are you worried that’s going to take the wind out of some of the sails of scientists who are warning about the threat of global warming?
Vicky: It’s very important to realise that there will be ten-year periods where the temperatures don’t increase or they even decrease as the Leibniz study is suggesting –
Tim: We’ve just had one.
Vicky: Yes, in fact we have, but that doesn’t mean that global warming has stopped, it’s simply a question of natural variability, giving a temporary decrease in temperature overlaid on top of a long-term warming trend, and in fact I believe that’s what the results of that study suggest –
Tim: Sorry to interrupt but you say that we’re going to have ten-year periods of cooling. How can we be sure that the rapid warming we saw in the 1980s and 1990s wasn’t the exceptional period?
Vicky: This is the point really, is that 1998 was exceptionally warm because there was an El Nino, because there was a natural variation overlaid on top of climate change. So what you can see very clearly is a long-term trend and then these periods of rapid warming and less rapid warming or even cooling overlaid on top of that because of natural variations.”
This should also be seen in the context of the New Scientist interview. All I am saying is that the models did not predict the (officially) admitted cooling and they are having to take ‘natural variability’ into greater account. I make no predictions as to whether this is the start of a longer cooling trend.
tonyb
Tom P (18:57:32) :
TonyB,
Quite correct that there was an admission of a drop. But not that it was against models and expectations – rather that natural variability can occasionally cause such dips.
Tom, do you accept that such natural variables as can cause occasional decadal dips by overcoming the almighty power of co2 can also therefore cause occasional decadal rises? Logic would seem to demand it, because if they are strong enough to overcome co2, they must rise as often as fall, or temperature would have diminished over the C20th not risen.
If you accept the inexorable logic of this, you also accept that some of the temperature rise attributed to co2 was in fact due to the positive phases of these other natural factors.
This has a considerable effect on the calculation of the co2 radiative forcing.
What say you?
Smokey,
“I’ve offered you several wagers, all of which you refuse to accept.”
These were:
1.” No one will be able to falsify the theory that global warming/cooling will go outside of its natural historical parameters.”
This statement is nonsense, though it probably means the opposite of what you intended.
2. “The planet will not be 3° C warmer in the next ten years.”
Certainly way beyond any projections. This would need a heat input way beyond anything humans could produce. This is a joke bet that only demonstrates a lack of good faith.
3. “That within the next ten years we will never reach the UN/IPCC’s AR-4 projections.”
This is better, but as I said before, this needs some numbers to be a meaningful bet. The IPCC projections can be found in AR4 here in figure 10.5:
http://www.ipcc.ch/pdf/assessment-report/ar4/wg1/ar4-wg1-chapter10.pdf
I take it you are therefore willing to bet that the monthly UAH lower troposphere temperature anomaly for the next ten years will never reach the equivalent ensemble average time series anomaly for any of the three scenarios outlined in IPCC AR4 figure 10.5.
You pay up if and when any of the projections is reached. I pay up if after ten years the projections have never been reached.
How much do you want to bet? We can start next month.
Christopher Hanley – “But the temperature rise prior to 1940 cannot be attributed to anthropogenic emissions, i.e. the burning of fossil fuels.”
CO2 did not become a greenhouse gas in 1940. Your statement makes no sense.
Jim Clarke – “[the hypothesis of CO2 induced warming] only described about 60 years of the billions of years of climate history”
Another greenhouse effect denier! Amazing! The climate record, as a whole, could not be accounted for if there was no greenhouse effect. Your statement makes no sense.
An Inquirer – you’ve misunderstood the IPCC as well. Even if one factor is dominant, it does not follow that all other factors are negligible. You cannot account for the climate of the last 150 years without accounting for the greenhouse effect of CO2. You also cannot account for the climate of the last 150 years without accounting for the influence of the Sun and volcanoes. I think you’ve got some revision to do.
“As a scientist, I have studied more than a dozen proposed key variables in what has caused trends in the past, and CO2 emissions rank toward the bottom of variables that apparently have had a major influence.”
Where is your work published? Please give the journal reference.
What price El Nino?
From
http://weather.unisys.com/archive/sst/sst_anom_loop.gif
ending September 6, to:
http://weather.unisys.com/surface/sst_anom.html
it seems to be getting weaker and weaker.
I remember La Niña biting last year in the same plots.
tallbloke,
“Tom, do you accept that such natural variables as can cause occasional decadal dips by overcoming the almighty power of co2 can also therefore cause occasional decadal rises? Logic would seem to demand it, because if they are strong enough to overcome co2, they must rise as often as fall, or temperature would have diminished over the C20th not risen.
If you accept the inexorable logic of this, you also accept that some of the temperature rise attributed to co2 was in fact due to the positive phases of these other natural factors.
This has a considerable effect on the calculation of the co2 radiative forcing.
What say you?”
What we see in the satellite record are episodic rather than periodic inputs which cause a relatively brief excursion of a few years or so. I don’t see how they can have long-term effect on climate. All the rest looks like noise that would average to zero as well.
Longer term there may be some periodic variations. They would certainly modulate any background trends. There is some recent work by Huang in isolating such variations using empirical mode decomposition of the HadCRUT dataset:
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1986583
This paper shows that the background trend is by far the largest part of the world’s temperature history, with perhaps a little more than 0.1C out of today’s 0.8 C rise attributable to long term oscillations. Given the other uncertainties in the CO2 forcing factor, a natural contribution to the temperature profile on this scale does not have an appreciable effect.
So, no doubt there are natural oscillatory contributions to the warming we see, but their effect is swamped by a longer term rise over the last 150 years.
Has anyone seen Smokey? He owes me both an attempt to explain this temperature history, as well as getting back to me on the bet he offered and which I’m ready to accept. If anyone else would like to respond in his place, please go ahead.
Ron Mexico (21:13:03) :
“These climate guys are young in their efforts…”
So true. Someone elsewhere suggested they should get these guys working on the Unified Field Theory, as they could probably knock it out in just a few years.
George E. Smith (14:41:48) :
“The aliasing noise injected by failure to observe proper sampling protocol, is in band noise so it is permanently unremovable without throwing away good signal as well. ”
Just because a signal has aliased components does not mean it is not useful. The eyes with which you are reading this only sample at something like 30 Hz. The question is, what signals are there beyond the Nyquist frequency, are they significant, and to what band are they aliasing? Often, we are interested only in low frequency information. This is why many sensors are of the integrating variety, because we can turn the delta-integrated data into an average, and averaging naturally attenuates signals that would alias to the low frequency spectrum.
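The point about integrating sensors can be demonstrated directly: a tone above the Nyquist frequency aliases into band when point-sampled, but averaging over each sample interval (a boxcar integrator) attenuates it by a sinc factor. All frequencies here are made-up illustrative values:

```python
import numpy as np

# A 9 Hz tone sampled at 10 Hz (Nyquist = 5 Hz) aliases to |9 - 10| = 1 Hz.
fs = 10.0
f_hi = 9.0
t_fine = np.arange(0, 10, 1e-4)           # finely resolved "analog" signal
x_fine = np.sin(2 * np.pi * f_hi * t_fine)

# Point sampling: pick every 1000th fine-grid value (dt = 0.1 s)
point_samples = x_fine[::1000]

# Integrating sensor: average the signal over each 0.1 s interval
box_samples = x_fine.reshape(-1, 1000).mean(axis=1)

# The aliased 1 Hz component survives point sampling at full amplitude
# but is strongly attenuated (|sinc(9 * 0.1)| ~ 0.11) by the average.
print(np.std(point_samples), np.std(box_samples))
```

The averaging does not remove the aliased component entirely, but it pushes it an order of magnitude down, which is often enough when only the low-frequency band is of interest.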
E.M.Smith (16:43:14) :
“An average of a bunch of thermometer readings MEANS NOTHING.”
You have a point. We do get a number that represents something, but what? It may be usefully compared with other samples, to the degree that they are similarly obtained, and we can thereby establish historical patterns, but is there really any basis for IPCC proclamations to the effect of “a 0.3 degree rise in global temperature will be catastrophic”? Maybe not so much. Thanks for bringing up the food for thought.
J. Bob (21:14:44) :
“The Fourier Convolution filter is preferred in that it covers the most recent end point, which the MOV and Recursive does not. ”
No, it is just interpolating the end points implicitly. What you should really be doing is designing a filter for a specific bandwidth so you know precisely what the frequency content of your output is. You should do a PSD to get an idea of precisely what the frequency content is. You can use a filtering algorithm such as this one to get a zero-phase result which provides the endpoints via a consistent approach.
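A minimal sketch of the zero-phase approach suggested above, using SciPy's forward-backward filter. The cutoff, noise level, and signal period are arbitrary illustrative choices, not values tied to any temperature series discussed here:

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(1)
t = np.arange(500)
signal = np.sin(2 * np.pi * t / 200)          # slow underlying cycle
noisy = signal + rng.normal(0, 0.3, t.size)   # add broadband noise

# 3rd-order Butterworth low-pass, cutoff at 0.02 of Nyquist (assumed)
b, a = butter(3, 0.02)
smooth = filtfilt(b, a, noisy)  # forward + backward pass -> zero phase lag

# Forward-backward filtering cancels the phase shift, so peaks in
# `smooth` line up with peaks in the underlying signal, including
# a consistent treatment of the endpoints via padding.
print(np.std(smooth - signal), np.std(noisy - signal))
```

Because the filter's bandwidth is designed explicitly, the frequency content of the output is known exactly, rather than being an implicit consequence of how a convolution handles the endpoints.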
Interesting discussions, all. Thanks.
I did look at the whole PSD plot. Unfortunately I gave my Matlab and all the toolboxes away over a decade ago, so I’m using EXCEL & VB. However, running the data both directions is one thing I’ve tried on the FFT part. The “filtfilt” part I don’t remember in the Sig. Tool Box I had at the time. We did use “Padde” methods to correct for phase shift, but I think Sig. Cond. offers some interesting insights. Will take a look at “filtfilt” and see what shows up. What did you think of averaging the Rimfrost data?
Apologies J Bob – I guess I got caught up in the technicalities and forgot to comment on the actual substance.
If I took the pro-AGW side, I guess I would look at the last upswing and view it as proof that things are getting hotter. On the other side, I would note that the slope is nowhere out of the ordinary. Moreover, you could argue that the trend is tapering off at the end, as your FFT filter seems to indicate. But, it can then be argued that this is an artifact of how the FFT filter handles the endpoints via circular convolution.
What I would do with this data is use the PSD to determine the major frequency components, then create a model composed of the sum of sinusoids at those frequencies with amplitude and phase parameters and use least squares estimation techniques to derive those parameters. Actually, now that I think of it, the easiest way is as follows. Say I have frequency spikes at just omega1 and omega2. My model is
M = [a1 b1 a2 b2] * [cos(omega1*t) sin(omega1*t) cos(omega2*t) sin(omega2*t)]’
This is a linear model, and the coefficients a1, b1, a2, and b2 can easily be estimated.
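The two-frequency linear model above can be solved directly with ordinary least squares. The synthetic "data" below use assumed amplitudes and the roughly 60-year and 400-year periods mentioned in this discussion, purely for illustration:

```python
import numpy as np

# Model from the discussion:
#   M = a1*cos(w1 t) + b1*sin(w1 t) + a2*cos(w2 t) + b2*sin(w2 t)
t = np.arange(0, 600.0)            # years of synthetic record
w1 = 2 * np.pi / 60                # ~60-year cycle
w2 = 2 * np.pi / 400               # ~400-year cycle

# Made-up "truth": a1 = 0.2, b2 = 0.1, plus noise (all assumptions)
rng = np.random.default_rng(2)
data = 0.2 * np.cos(w1 * t) + 0.1 * np.sin(w2 * t) \
       + rng.normal(0, 0.05, t.size)

# Design matrix: one column per basis function; solve by least squares
A = np.column_stack([np.cos(w1 * t), np.sin(w1 * t),
                     np.cos(w2 * t), np.sin(w2 * t)])
coeffs, *_ = np.linalg.lstsq(A, data, rcond=None)
a1, b1, a2, b2 = coeffs
print(np.round(coeffs, 3))
```

Since the model is linear in the coefficients, no iterative tweaking is needed: the normal equations recover the assumed amplitudes to within the noise level.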
The thing is, with this data, observability of a periodic component longer than 400 years is very low. For all we know, if we could accurately extend the data series farther back, we might see a repeat of the constructive interference which may be occurring near the end of the data record, in which a roughly 400-year cycle adds to the 60-year cycle, causing a local maximum independent of anthropogenic forcing. You might hypothesize such a component and try it out in the estimation process above, and see if you can’t replicate the behavior.
You probably can. That’s really the whole nub of the problem with the AGW models. They also can be tweaked to fit the data over any finite interval. But, that does not mean the models are “truth”.
Bart, what I will do is post a more detailed portion of the FFT part of the analysis on this thread. One of the 1st things we did in a spectral analysis is to “echo”, or compare an unfiltered output to the original input. That saved a lot of arguments later on. If the unfiltered output matches the input over the range, can one say there are “losses” at the end points?
Bart,
“That’s really the whole nub of the problem with the AGW models. They also can be tweaked to fit the data over any finite interval. But, that does not mean the models are “truth”.”
There’s no need to tweak any parameters – if you use the right analytical tools. Fourier transform techniques are not the best way to isolate periodic signals and trends, as they can’t deal well with longer-term trends across the time series. This is, of course, unsurprising, as a Fourier transform can only deconstruct a signal into a series of sinusoids. Windowing only compounds the problem by throwing away information about any trend in order to force artificial periodicity onto the system.
Empirical mode decomposition is a more promising approach, and requires no prior assumptions concerning the periodicity or otherwise of the data – there’s just nothing to tweak.
Have a look at the article I cited above:
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1986583
As I mentioned, the most prominent signal that comes out of the data is a long-term warming trend overlaid with a much smaller 65-year cyclic component. Empirical mode decomposition is an unbiased way to isolate any warming trend from the data, and shows quite clearly that the world has indeed been warming for the last 150 years.
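For readers unfamiliar with the method, here is a deliberately simplified sketch of the sifting operation at the heart of EMD, on synthetic data. A real implementation uses cubic-spline envelopes and an iterated stopping criterion; linear envelopes are used here purely for brevity, and the signal is invented for illustration.

```python
import numpy as np

# Simplified sketch of one EMD sifting pass (canonical EMD uses
# cubic-spline envelopes; linear interpolation keeps this short).
def sift_once(x):
    t = np.arange(x.size)
    # interior local maxima and minima
    mx = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    mn = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    upper = np.interp(t, mx, x[mx])   # envelope through the maxima
    lower = np.interp(t, mn, x[mn])   # envelope through the minima
    mean_env = 0.5 * (upper + lower)  # local mean, the slow part
    return x - mean_env, mean_env     # (fast oscillation, remainder)

t = np.linspace(0.0, 10.0, 1000)
signal = np.sin(2 * np.pi * t) + 0.3 * t   # fast cycle riding on a trend

imf, residual = signal, np.zeros_like(signal)
for _ in range(5):                         # a few sifting passes
    imf, env = sift_once(imf)
    residual += env

# Away from the end points, imf tracks the oscillation and residual the trend.
core = slice(100, 900)
```

Note that no frequency, trend shape, or stationarity assumption went in: the separation comes entirely from the local extrema of the data itself.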
Tom – no matter how you slice it, with a finite data set, your confidence in estimation of any periodic process with a period approaching or exceeding the timeline of the data set diminishes rapidly. Likewise, separating an actual trend from a long term periodicity becomes increasingly problematical.
There exists no magical analytical tool which can surmount this difficulty. It is fundamental, like the uncertainty principle in quantum mechanics, or the Cramer-Rao lower bound (in fact, it is a manifestation of the Cramer-Rao lower bound).
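The point can be illustrated numerically: the least-squares problem for a sinusoid whose period dwarfs the record length is nearly degenerate, because a small arc of cos/sin is almost indistinguishable from a constant plus a ramp. (A sketch with synthetic numbers, not any particular temperature series.)

```python
import numpy as np

# Illustration: fitting a sinusoid whose period is short relative to the
# record is well conditioned; fitting one whose period far exceeds the
# record is nearly degenerate, because cos/sin over a small fraction of a
# cycle are almost collinear with a constant and a ramp.
n = 300                              # e.g. 300 years of data
t = np.arange(n, dtype=float)

def design(period):
    w = 2 * np.pi / period
    return np.column_stack([np.ones(n), np.cos(w * t), np.sin(w * t)])

cond_short = np.linalg.cond(design(60.0))    # 5 full cycles in the record
cond_long = np.linalg.cond(design(2000.0))   # 0.15 of a cycle in the record
# cond_long is orders of magnitude larger: coefficient estimates for the
# long-period component are correspondingly fragile.
```

No choice of analysis method changes this; it is a property of the data span itself.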
We use Fourier analysis because every periodic signal can be represented as a Fourier series. Every single one. This is one of the most basic results of functional analysis.
You have become enamored of an approach for which you do not understand the implicit assumptions. You cannot rule out a long term periodicity in the data. If you insist to me that you can, it will merely confirm for me that you do not know what you are talking about.
J Bob – what happens at the end points is necessarily a function of how you process the data. Information regarding what is happening beyond the end points is simply unavailable, and that information, unfortunately, is necessary to determine what is happening there with strong confidence.
Every path leads to assumptions, either explicit or implicit. For instance, in the approach I recommended of fitting periodic sinusoids, I am making the assumption that the model is periodic, and that the periodicity extends beyond the boundaries. This is generally a well-founded assumption, based on the behavior of countless observed processes in general circumstances.
Perhaps we are not dealing with a periodic signal? Perhaps, as Tom suggests, there is an actual linear, quadratic, cubic, or whatever polynomial trend? Maybe, maybe not. But, fundamentally, we cannot know from this finite chunk of data alone. Tom and the Warmists have made the assumption that there is such a trend, based on their intuition (or prejudice) regarding a significant human impact on the environment. Others, such as I, tend to intuit (or prejudge) a less significant impact.
Can we say objectively which one of us is right? No, we honestly cannot with the available information. But, we can objectively state that the Warmist agenda will bring immediate and severe hardship whether they are right or wrong, whereas the other direction only potentially brings hardship in the long term, before which we likely will have advanced far enough that we can deal with whatever occurs. That is what tips the balance for me.
“You have become enamored of an approach for which you do not understand the implicit assumptions.”
Perhaps I should have said “You have become enamored of an approach for which you do not, perhaps willfully, appreciate the implicit assumptions.”
A little less pejorative. I have wasted a lot of time in blog conversations which degrade into escalating meaningless barbs regarding the disputants’ native intelligences. I will take it as a given that Tom is a smart guy, and hope the courtesy is reciprocated, and I merely wish to open his eyes to considerations which he may not have… considered.
Tom P (13:52:22) :
So, no doubt there are natural oscillatory contributions to the warming we see, but their effect is swamped by a longer term rise over the last 150 years.
But given that the co2 effect is not generally thought to have become important until after WWII, the previous 2/3 of this long term rise has to be something else. If you are pleading some sort of special case that whatever previously caused the long term rise suddenly stopped, and co2 took over, you will have to be more specific about what that other factor was, or your case looks very untenable. Whatever it was, it couldn’t be random stuff that cancels out to zero after a few years, that’s for sure.
Tom – I have just reviewed your link. It makes me want to groan and put my head in my hands and sob.
He gets an arbitrary set of “IMF”s from cubic spline interpolation, which have no physical basis, and do not themselves generally form a basis, and subtracts them out of the data until he has no more local extrema. It’s gobbledy-goop.
There is a reason we choose trig functions and polynomials as bases. It is because that is the form a vast array of physical processes assume, because of natural integrations and projections, and the regularity of time as we define it. If you unmoor yourself from these functions, you have no basis (literally) to extrapolate beyond the boundaries.
There is no magic silver bullet here that would allow you to divine information which simply is not available. Moreover, there is no basis (again, literally) to interpret the result. There is no linkage to naturally occurring processes which would tend to support one’s interpretation, whatever that may be. There is only a process for manipulating data.
There is only a process for manipulating data until you end up with a monotonic function of undetermined form.
Bart,
“He gets an arbitrary set of “IMF”s from cubic spline interpolation, which have no physical basis, and do not themselves generally form a basis, and subtracts them out of the data until he has no more local extrema. It’s gobbledy-goop.”
You have rather misunderstood empirical mode decomposition. Firstly it is a pure signal analysis technique – no assumptions need to be made about any physical basis to extract the components. This is part of its power, not a criticism.
As you said earlier “I am making the assumption that the model is periodic…” There is no need to make such an assumption. And there are plenty of physical processes that are non-periodic – a random walk for instance.
EMD does indeed produce an orthogonal basis set – otherwise it would be a meaningless technique.
And EMD most certainly does produce useful results, as its adoption in seismic and medical data processing demonstrates.
You are quite right, EMD decomposes the signal until a monotonic function remains. But this could be part of a longer period cyclic function. If this was indeed the case, the upwards trend seen over the last 150 years could be interpreted as part of a 600-year sinusoid with a peak-to-peak amplitude of 1.2C, a minimum at 1850 and a predicted maximum at 2150. There is, however, no physical basis for such a long-term periodicity and more importantly no sign in the earlier record of such a signal, for instance:
http://img9.imageshack.us/img9/1994/glaciervsinstrumental.png
tallbloke,
“the co2 effect is not generally thought to have become important until after WWII”
Where did you get that from? The general view is that CO2 warming became discernible from the middle of the last century as CO2 concentrations started to rise above background levels:
http://cdiac.ornl.gov/trends/co2/graphics/lawdome.gif
There in fact appears to be a close relationship between this plot and the glacier-derived temperatures above, or would you argue otherwise?
I admit to being lost about all the deconstruction and reconstruction of temperatures to provide trends or no trends, and isolate or not isolate what part of the trends or no trends are due to human influence, primarily from CO2, or to natural causes. What concerns me is the implicit assumption that CO2 has been monotonically increasing during all the periods being reviewed, so that this is the one constant in all the debate and analysis. My point is that if CO2 has not increased monotonically, what relevance do the mathematical techniques concerning temperature have for the central question of whether CO2 has caused warming, or at least caused it to the magnitude assumed by the IPCC, and therefore whether it can have played a part in temperature increases?
We should not allow the IPCC to choose the CO2 readings should we? What if the long term trend of CO2 in the atmosphere was a one quarter of the assumed increase? How then can one attribute increased/decreased temperature to CO2?
What evidence do I have for the proposition that CO2 has not been increasing? Please look here http://www.21stcenturysciencetech.com/Articles%202007/20_1-2_CO2_Scandal.pdf
I could reproduce excerpts from this if anyone desires, but if you have an open mind you would read all of it for yourself. Jaworowski has a huge reputation and had to be marginalised by the AGW believers to make their case. Another example of AGW cherry picking. Let’s look at the evidence, not just the maths techniques.
Tom P (18:08:15) :
tallbloke,
“the co2 effect is not generally thought to have become important until after WWII”
Where did you get that from? The general view is that CO2 warming became discernible from the middle of the last century as CO2 concentrations started to rise above background levels:
http://cdiac.ornl.gov/trends/co2/graphics/lawdome.gif
There in fact appears to be a close relationship between this plot and the glacier-derived temperatures above, or would you argue otherwise?
Of course I would argue otherwise, that’s what keeps these debates interesting. 😉
If you want to argue that the co2 effect kicked in earlier, you need to get specific about what caused the cooling in the late 1800’s and the 1940-1970 period. Which brings us back to the oceanic oscillations and solar variation the AGW hypothesis has to dismiss as random small scale noise to survive.
Then there is the complete lack of correlation between co2 and the Medieval warm period. Of course, Law Dome is a long way from the location of the historical records which show the MWP was warmer than now in many parts of the world, but that’s a deficiency of the data you are trying to argue from, not evidence that the MWP didn’t happen.
Then of course there are the co2 measurements made by C19th scientists which don’t fit the theory, and have been quietly dropped…
Speaking of dropping, this thread is about to drop off the bottom of the list, so cheers, and see you on the next thread which tickles both our interests.
Alan Sutherland,
“Jaworowski has a huge reputation, and had to be marginalised by the AGW believers to make their case.”
He certainly has a reputation for his past work in radiation effects. But the article you cite (and his other writings on climate change) are found in the non-refereed “21st Century Science and Technology”, a magazine published by the Lyndon LaRouche movement, perhaps the most bizarre political grouping in the US. Jaworowski has marginalised himself here!
tallbloke,
“If you want to argue that the co2 effect kicked in earlier, you need to get specific about what caused the cooling in the late 1800’s and the 1940-1970 period.”
No, first you have to explain the dominant warming trend in the signal. Of course the warming has not always been increasing, and there are weaker cooling periods to explain, some of which I’m sure are natural variability. But to dismiss an effect from CO2 on such grounds is to ignore the elephant in the room.
“There is, however, no physical basis for such a long-term periodicity and more importantly no sign in the earlier record of such a signal.”
There are all kinds of periodicities. Here, for example, is a 400 year cycle. I’m not promoting or otherwise affirming the paper – I just pulled it up randomly on google. I’m just saying, there is a lot more going on than the rising CO2 narrative which we are being spoon fed by the promoters of AGW.
I’m not saying there is a malevolent conspiracy afoot, at least a conscious one, among AGW adherents. But, I do believe they, like the Queen of Hearts, made their verdict first, and held the trial afterward. And, when you are searching only for evidence which supports your preconceived bias, mirabile dictu, that is what you tend to find. I believe they, in their hearts, believe they are doing the right thing, but it all flows from the initial conviction: we are doing something to the planet, let’s find out what.
The first principle is that you must not fool yourself – and you are the easiest person to fool.
Richard Feynman, Caltech commencement address, 1974
Bart, here is a more detailed description of my Fourier analysis.
The following is an expansion on the use of Fourier convolution methods in signal conditioning, or attempting to get information out of “noisy” signals. The methods used were those recommended by Blackman & Tukey’s book “Measurement of Power Spectra”. These methods were later refined by Cooley & Tukey’s presentations on the Fast Fourier Transform, which they developed.
The signal in question was an average of 14 very long term temperature records, starting in a period from 1659 (Central England) to about 1800. The data came from the Rimfrost site, http://www.rimfrost.no/ . The primary purpose was to look at direct temperature measurement records, and evaluate how current temperatures compare to those of 150 to 300 years ago.
One of the tools used, along with moving averages and recursive filtering, was Fourier convolution, or filtering. In Fourier filtering, an input signal is converted to the frequency domain. There, the frequency content is evaluated, and certain frequencies are kept or removed, depending on the user. In this case, frequencies greater than 0.025 cycles/year (40 year period) were removed. The result was then transformed back to the time domain for analysis.
The top insert in the figure below shows the averaged temperature (Ave14), along with a de-trend line. This line intersects the end points of the data set. This “de-trending” is done to avoid “leakage” problems with Fourier convolution. The lower figure shows the difference, or error, from the de-trend line, noting that both end points are equal to zero.
http://www.imagenerd.com/uploads/ave14-de-trend-CkLob.gif
The actual convolution is performed on this “error” from the “de-trend” line, and is shown below. In this case, the “error”, or difference, is inserted into a sample frame of 512 samples (due to FFT requirements). The non-Ave14 portion is padded with zeros so no discontinuity is present at the sample end points. This is shown in the top insert in the figure below.
http://www.imagenerd.com/uploads/ave14-raw-fft-echo1-2Tgav.gif
The top insert is two plots superimposed: the input, and the output of the convoluted, or filtered, input. In this case, no filtering is applied, the purpose being to check the computational procedures. Notice that the output is virtually on top of the input, indicating a good re-construction of the input signal. The lower insert is the power spectral density plot (actually ½ of it) that shows the energy contained in the various frequencies. Note that amplitudes get smaller at higher frequencies, while more energy is concentrated in the lower ones, starting at about 0.14 cycles per year, or 8 year periods. This tapering off of energy at the higher frequencies also indicates “pre-whitening” is not needed.
The next figure shows the same data shifted to the center of the sample period. As expected, the PSD is the same even though the signal has been shifted to the center of the sample period. This would indicate that “windowing”, using a Hamming, Hanning or other “window”, would not be needed.
http://www.imagenerd.com/uploads/ave14-raw-fft-echo2-p9mBY.gif
Again the reconstructed signal is virtually identical to the input, indicating how well the Fourier filter can reconstruct a signal.
The figure below shows the effect of a “mask” that removes frequencies above 0.025 cycles/year. Basically a low pass filter.
http://www.imagenerd.com/uploads/ave14-40_yr_filter-sct4B.gif
The top insert shows the input and output signals, while the lower one shows the PSD plot after the “mask” is applied. The resultant signal is a smoothed line that shows the lower frequency content of the input, uncluttered by the higher frequencies.
The last step is to use the “de-trend” line and the filtered signal to re-construct the filtered average (Ave14), shown below.
http://www.imagenerd.com/uploads/ave14-de-trend_40_yr_filter-esCIc.gif
Here one can see that the filtered signal does a good job of following the data. The question is what happens at the beginning and end points, especially the end one. While all “tools” have their strengths and weaknesses, a variety of methods are generally used to evaluate data, and this is but one. However, it would appear that the Fourier method does work a little better at the end points, especially if there are real cyclical elements embedded in the signal. This seemed to be confirmed by looking at the Chebyshev filter results in a previous posting. This is in spite of transients caused by the sudden addition of more stations.
Looking at the end portion, one could make the case that we are entering into a downward cycle in the climate. However, in the PSD there are other frequencies that have a considerable amount of “energy”. There is a group in the 0.025 to 0.07 range, as well as the 0.1 to 0.4 range, that warrants evaluation. The one thing that does stand out is that this crude analysis indicates considerable temperature variation in western Europe. It would indicate that there have been fairly warm periods in the past, and that 1850 seemed to be a relative low point in the climate temperature cycle. Anyway there is still a lot to do (solar & north Atlantic variation for starters).
An attempt also was made to keep the analysis simple, so that one who has some programming knowledge of VB and EXCEL can do a fair amount of analysis on their own.
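For anyone following along without VB or Excel, the steps above — de-trend through the end points, pad into a 512-sample frame, mask frequencies above 0.025 cycles/year, invert, and add the trend back — can be sketched as follows (synthetic stand-in data, not the actual Ave14 series):

```python
import numpy as np

# Synthetic stand-in for an annual temperature series (not the real Ave14).
rng = np.random.default_rng(1)
years = np.arange(1659, 2009)                       # 350 annual values
x = (0.002 * (years - 1659)                         # slow drift
     + 0.3 * np.sin(2 * np.pi * years / 70.0)       # a multidecadal cycle
     + 0.2 * rng.standard_normal(years.size))       # "weather" noise

# 1. De-trend with the line through the two end points, so both ends are
#    zero and the padded frame has no discontinuity ("leakage").
slope = (x[-1] - x[0]) / (years[-1] - years[0])
trend = x[0] + slope * (years - years[0])
err = x - trend

# 2. Insert into a 512-sample frame, padding with zeros (FFT length reqt.).
frame = np.zeros(512)
frame[: err.size] = err

# 3. FFT, mask frequencies above 0.025 cycles/year (40-year low-pass), IFFT.
F = np.fft.rfft(frame)
freq = np.fft.rfftfreq(512, d=1.0)                  # cycles per year
F[freq > 0.025] = 0.0
smooth = np.fft.irfft(F, 512)[: err.size]

# 4. Re-construct the filtered series by adding the de-trend line back.
filtered = smooth + trend
```

With the endpoints pinned to zero before padding, the circular FFT sees a continuous signal, which is exactly why the de-trend step comes first.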
Bart, my MATLAB is more than a little rusty, so if you could formulate your model in a more standard notation, I would appreciate it.
“And EMD most certainly does produce useful results, as its adoption in extracting information from seismic and medical data processing can demonstrate.”
Maybe. My hasty evaluation necessarily may not have taken in all the implications. But, it appears, prima facie, very ad hoc. I have seen dozens of analytical techniques become faddish and recede over my career (OK, that’s hyperbole, maybe 5 or 6 in my specific milieu), but the ones which are fundamental have more staying power. As I said, we tend to decompose time series according to those functions which are typically seen in nature, and for which solid reasons for expecting them to appear exist.
As I stated previously, there is a reason we choose trig functions and polynomials as bases. It is because that is the form a vast array of physical processes assume, because of natural integrations and projections, and the regularity of time as we define it.
Bart,
Most real-world time series are non-stationary – their statistics depend on time. Fourier techniques find such signals very difficult to deal with.
The Queen of Hearts is a very apt analogy. It is indeed execution followed by trial if you first force the time series to be stationary before taking the frequency transform. Empirical mode decomposition, in contrast, does not follow such a retrograde method for reaching a verdict, and clearly extracts the dominant long-term rising temperature trend.
“Here, for example, is a 400 year cycle.”
I’m afraid the link doesn’t appear to work.
“I’m just saying, there is a lot more going on than the rising CO2 narrative which we are being spoon fed by the promoters of AGW.”
Quantitatively, not a lot more. About 80% of the content of the temperature signal is a rising trend well correlated with the CO2 concentration. I’m happy to discuss the other 20%, but like tallbloke you appear to be ignoring the elephant in the room.
Tom – I would argue the opposite. Most real world random processes are stationary, which is why Fourier analysis has proven so successful over time. And, the most common non-stationary signals are martingales (see Donsker’s theorem). Deterministic signals are neither stationary nor non-stationary, as this is a probabilistic concept.
You can deal with non-stationarity in Fourier analysis simply by deriving the expectation of the operation. In this way, for example, we can find that the expected PSD of a Wiener process with autocorrelation E{x(t1)x(t2)} = K*min(t1,t2) is approximately 2*K/omega^2 at non-zero frequencies below the Nyquist rate (if you look up the literature, you may find that many people miss the factor of two by making a false analogy with an Ornstein-Uhlenbeck process with infinite time constant). There is a qualitative difference in that the frequency samples are highly correlated, whereas with a stationary process, they are essentially independent. This makes it difficult to get very good quantitative estimates from the PSD for that process, but it is still an excellent tool for qualitative analysis.
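A quick numerical sanity check of the 1/omega^2 behavior, using a discrete random walk with unit-variance steps as a stand-in for the Wiener process (a sketch, not a derivation, and the exact scale constant depends on the PSD convention used):

```python
import numpy as np

# Averaged periodogram of a random walk: the spectrum should fall off
# roughly as 1/omega^2 at low-to-mid frequencies.
rng = np.random.default_rng(2)
n, runs = 1024, 400
psd = np.zeros(n // 2 + 1)
for _ in range(runs):
    w = np.cumsum(rng.standard_normal(n))      # random walk, K = 1 per step
    psd += np.abs(np.fft.rfft(w)) ** 2 / n     # periodogram
psd /= runs

f = np.fft.rfftfreq(n)                         # cycles per sample
band = (f > 0.005) & (f < 0.1)                 # avoid DC and the Nyquist end
slope = np.polyfit(np.log(f[band]), np.log(psd[band]), 1)[0]
# slope comes out close to -2, i.e. PSD proportional to 1/omega^2
```

The averaging over many realizations is essential here: a single periodogram of a non-stationary process is too noisy, which is the "highly correlated frequency samples" caveat above.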
“About 80% of the content of the temperature signal is a rising trend well correlated with the CO2 concentration.”
And, you have no information to confirm whether it is a rising trend, or simply another cyclical component. Here is another attempt at that link, but if it doesn’t go through, try some googling of your own on long term periodic climate influences. I think you will be surprised what you find.
“Empirical mode decomposition in contrast does not follow such a retrograde method for reaching a verdict and clearly extracts the dominant long-term rising temperature trend.”
This is absurd. You are merely decomposing the signal another way, according to (I’ll take your word for it) another (arbitrary) orthogonal basis which gives you little insight into the actual physical processes. Let me try one more time to make the point: we generally assume trigonometric and polynomial bases because that is the form which most real world processes assume. Go and perform your EMD, then produce for me your bases, and tell me how such forms might arise physically. Generally, I expect you would not be able to, because you have become unmoored from physicality.
By pulling out a monotonic trend, you are making the assumption that such a trend exists in the large (beyond the boundaries of your data set). But, that may not be the case, and you do not have enough information to determine whether the assumption is true.
J. Bob – I am not ignoring you, it is just that a full response would take rather longer than responding to Tom’s questions, and unfortunately, I have a job. Briefly, let me say that, no matter how you do the analysis, your end points are going to in some way be an extrapolation, and therefore uncertain. In my suggested method of fitting periodic terms through least squares, I am not saying I can divine the “truth”. It would be more along the lines of demonstrating another plausible interpretation of where the data are going.
Would it be possible to post an ascii printout of the data you are analyzing to a page like you are doing for your plots so that I could perhaps generate some plots for you?
Tom –
I think I have said more or less all I can say, and at some point we will have to agree to disagree. But, let me take note of this plot:
http://img9.imageshack.us/img9/1994/glaciervsinstrumental.png
Direct temperature records, of course, only go back to the late 19th century, and even those are questionable given the sparsity of coverage, the sensor siting, etc., which have been covered in this blog better than just about anywhere.
But, let’s assume the data are reliable and valid. If we as a species perceived time in reverse in 1850, we might well believe the world was rushing headlong into a global freeze. Year before year, we are pumping less and less CO2 into the atmosphere, and the temperature is plummeting. Remember, we have no perception of what has come before, and it will only be revealed to us in retrograde time. Clearly, time is running in, and we must take action!
You may not see the symmetry in the arguments, but they are wholly the same. Right now, there are those who believe the natural carbon sinks cannot handle the marginal increase in CO2 we are adding to the atmosphere each year. But, in the year 1850 going backward in time, they could be as easily concerned that the sinks are too aggressive, and that by failing to pump more carbon into the atmosphere, we are ensuring it will all be sucked out. If your reaction is skeptical, it is only because you have foreknowledge of what unhappened in this event, whereas you have no such insight into our future.
Bart,
“Most real world random processes are stationary, which is why Fourier analysis has proven so successful over time.”
There’s no doubt about its success and I use it all the time. But most real-world processes involve elements of drift and random walk and so are not stationary. Often drift can be treated as a spurious signal and removed by detrending for Fourier analysis, especially in engineering applications, but if this is applied to real-world data you may well be removing a major component of the signal.
“Deterministic signals are neither stationary nor non-stationary, as this is a probabilistic concept.”
Correct, but I’m sure we agree that what drives temperatures on Earth isn’t a purely deterministic process. Hence there will be a stochastic component to any temperature signal and the statistics of the signal are important. Indeed you have been implicitly treating the signal as stochastic by using filtering to remove the random noise in your analysis above.
Hence temperature history can be considered as either stationary or non-stationary, and all the evidence is that it is the latter. For instance, the mean temperature of the Earth has varied on huge time scales, and when the literal end of the world comes as the sun blows up, it will rise monotonically. It is problematic to extract quantitative data using Fourier techniques from such data, as you have acknowledged.
Yes, the link now works, but says something a little different than maybe you intended: “global-scale temperature only shows a minor response” to the solar forcing discussed in their work. Of course there are long-term periodicities associated with Milankovitch cycles, but I still await any evidence of a 400 year forcing of temperatures, and even that cycle would be inconsistent with the glacier record.
“Go and perform your EMD, then produce for me your bases, and tell me how such forms might arise physically.”
I’ve already done this! – and the plots of the decomposition of past temperature and CO2 history are above for you to see. I’d prefer you criticised rather than ignored them! You might want to read Arrhenius for a first order explanation.
“You may not see the symmetry in the arguments, but they are wholly the same…. in the year 1850 going backward in time, they could be as easily concerned that the sinks are too aggressive, and that by failing to pump more carbon into the atmosphere, we are ensuring it will all be sucked out.”
There are some severe physical and societal contradictions in assuming such a symmetry. But even ignoring these your analysis breaks down. By 1850 such a temperature change in reverse would have decreased steadily over the years and by then the changes would be minor compared to the dramatic drops experienced a century and a half earlier. I might imagine any panic would have subsided. I’m at somewhat of a loss to understand what you are trying to demonstrate here.
Bart, here is a site for my composite temperature.
http://www.4shared.com/file/132702881/dc4fb18c/Ave14.html
The columns are: Year, Hadcrut, Ave14 & the number of sites for that year.
On the Fourier filtering, I would guess that if the computational method could re-create the original to the accuracy that it did (echo1 & 2), the end points should be pretty close to being right. Anyway have fun.
I would like to add my pennysworth to the discussion of Empirical Mode Decomposition (EMD) and the matter of trends and randomness.
I suggest EMD is a statistically valid methodology. It is worth careful study, as are the papers by its originator Dr Huang, especially those about why it is superior to Fourier methods.
The time series of almost all processes relevant to climate dynamics are non-stationary, in that the measures within each are interrelated, and non-linear, and contain elements of randomness. EMD, unlike most statistical methodologies for analysing time series, makes no assumptions about the linearity or stationarity of a time series. EMD lets the data speak more directly, revealing its intrinsic functional structure more clearly. It does not have the restrictive assumptions of linearity and stationarity that the familiar Fourier-based techniques have, because it uses Hilbert, not Fourier, transforms.
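The Hilbert step is what yields instantaneous, rather than fixed global, frequencies. A minimal sketch on a synthetic stand-in IMF (a real analysis would apply this to each IMF produced by sifting; the analytic signal is built directly with the FFT here to keep it self-contained):

```python
import numpy as np

# Sketch of the Hilbert step: the analytic signal of an IMF gives an
# amplitude and an *instantaneous* frequency at every sample, instead of
# the fixed global frequencies of a Fourier decomposition.
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
imf = np.cos(2 * np.pi * 5.0 * t)        # stand-in IMF: a 5 Hz oscillation

# analytic signal: zero the negative frequencies, double the positive ones
n = t.size
F = np.fft.fft(imf)
h = np.zeros(n)
h[0] = 1.0
h[1 : n // 2] = 2.0
h[n // 2] = 1.0
analytic = np.fft.ifft(F * h)

phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # Hz, sample by sample
core = inst_freq[100:-100]                      # ignore end effects
```

For a pure tone the instantaneous frequency is constant, but for a real IMF it can drift over time, which is exactly the non-stationary behavior Fourier analysis cannot express.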
I suggest that the EMD analysis of GMSTA data warrants close study as it is telling us what is happening in the GMSTA time series, and hopefully in the world in which we live (I think it does). The paper is full of interesting insights, such as (page 14892):
“Finally, it is noted that the global temperature anomalies with respect to the sum of the overall EMD trend and the multidecadal variability appear to be quite stationary in the whole data span, indicating that the higher frequency part of the record in recent years is not more variable than that in the 1800s. The extreme temperature records in the 1990s stand out mainly because the general global warming trend over the whole data length coincides with the warming phase of the 65-year cycle.”
Murray C Peel, Senior Research Fellow, Department of Civil and Environmental Engineering, Centre for Environmental Applied Hydrology, University of Melbourne, is an expert on EMD.
(see http://www.civenv.unimelb.edu.au/~mpeel ). He has, with his colleagues, written several papers about EMD and reporting applications of EMD.
One of the most interesting is Peel, M and McMahon, T. A., 2006. Recent frequency component changes in interannual climate variability, Geophysical Research Letters, Vol.33, L16810, doi:10.1029/2006GL025670.
Peel and McMahon demonstrated that randomness in the climate system has been on the rise since the 1950s. The authors used the EMD time series analysis technique to quantify the proportion of variation in the annual temperature and rainfall time series that resulted from fluctuations at different time scales. They applied EMD to annual data for 1,524 temperature and 2,814 rainfall stations from the Global Historical Climatology Network.
Peel and McMahon found that the proportion of variance due to inter-decadal fluctuations has been decreasing since the 1950s for rainfall and since the 1970s for temperature. They argue that this means the long-term memory of the climate system is shortening, thereby increasing the degree of randomness in the system.
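The metric in question is the fraction of a station's total variance carried by its slow, inter-decadal component. As a rough illustration only: Peel and McMahon obtain that split from EMD itself, whereas the sketch below uses a simple centred moving average as a crude stand-in for the slow component, and all names and the synthetic series are invented for the example.

```python
import numpy as np

def interdecadal_fraction(x, smooth=11):
    """Fraction of total variance in the slow (roughly decadal-plus)
    component of an annual series x. A centred moving average stands
    in for the EMD-derived slow modes used in the actual paper."""
    kernel = np.ones(smooth) / smooth
    slow = np.convolve(x, kernel, mode="same")
    slow = slow - slow.mean()
    total = x - x.mean()
    return np.var(slow) / np.var(total)

# Two synthetic 100-year "annual" series: one dominated by a slow
# ~65-year cycle, one dominated by year-to-year noise. The slow-cycle
# series puts a much larger share of its variance at inter-decadal scales.
rng = np.random.default_rng(0)
years = np.arange(100)
slow_series = np.sin(2 * np.pi * years / 65) + 0.1 * rng.standard_normal(100)
noisy_series = 0.1 * np.sin(2 * np.pi * years / 65) + rng.standard_normal(100)
```

A decline in this fraction over successive windows of record is what Peel and McMahon read as shortening memory, i.e. growing randomness, in the system.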
Regardless of EMD, I suggest that the global authority on trend analysis, and on the analysis of non-linear, non-stationary time series containing elements of randomness, is Demetris Koutsoyiannis: Professor of Hydrology and Analysis of Hydrosystems at the National Technical University of Athens, Professor of Hydraulics in the Hellenic Army's Postgraduate School of Technical Education of Officer Engineers, and Editor of the Hydrological Sciences Journal. Here is his home page: http://www.itia.ntua.gr/dk . He argues for a different approach, based on the Hurst phenomenon; he would not use EMD, nor advocate its use.
There are many papers on his website that keen students of the time-series analysis of non-linear, non-stationary, trendy data with elements of randomness could fruitfully spend months studying. Cohn and Lins have, of course, shown that Nature is naturally trendy (Cohn, T. A. and Lins, H. F., 2005. Nature's style: Naturally trendy. Geophysical Research Letters, 32, L23402).
Bart, I forgot to put in the links to echo1 & 2:
http://www.imagenerd.com/uploads/ave14-raw-fft-echo1-2Tgav.gif
http://www.imagenerd.com/uploads/ave14-raw-fft-echo2-p9mBY.gif
Let me know if you got the text file.
Richard Mackey,
Thanks for the summary on EMD.
What is your explanation for the general warming trend, the most dominant mode of the temperature record as decomposed by EMD?