From Wikipedia:
Polar amplification is the greater temperature increase in the Arctic compared with the Earth as a whole, a result of feedbacks and other processes. It is not observed in the Antarctic, largely because the Southern Ocean acts as a heat sink and because Antarctica lacks seasonal snow cover. It is common to see it stated that “Climate models generally predict amplified warming in polar regions”, e.g. Doran et al.
Now with this paper, blowing the surface data out for AGW effects, what are they going to do?
Via the Hockey Schtick:
New paper finds only 1 weather station in the Arctic with warming that can’t be explained by natural variation
A paper published today in Geophysical Research Letters examines surface air temperature trends in the Eurasian Arctic region and finds “only 17 out of the 109 considered stations have trends which cannot be explained as arising from intrinsic [natural] climate fluctuations” and that “Out of those 17, only one station exhibits a warming trend which is significant against all three null models [models of natural climate change without human forcing].” Climate alarmists claim that the Arctic is “the canary in the coal mine” and should show the strongest evidence of a human fingerprint on climate change, yet these observations in the Arctic show that only 1 out of 109 weather stations showed a warming trend that was not explained by the natural variations in the 3 null climate models.
Note a “null model” assumes the “null hypothesis” that climate change is natural and not forced by man-made CO2 or other alleged human influences.
GEOPHYSICAL RESEARCH LETTERS, VOL. 39, L23705, 5 PP., 2012
doi:10.1029/2012GL054244
On the statistical significance of surface air temperature trends in the Eurasian Arctic region
Key Points
- I am using a novel method to test the significance of temperature trends
- In the Eurasian Arctic region only 17 stations show a significant trend
- I find that in Siberia the trend signal has not yet emerged
C. Franzke
British Antarctic Survey, Natural Environment Research Council, Cambridge, UK
This study investigates the statistical significance of the trends of station temperature time series from the European Climate Assessment & Data archive poleward of 60°N. The trends are identified by different methods and their significance is assessed by three different null models of climate noise. All stations show a warming trend but only 17 out of the 109 considered stations have trends which cannot be explained as arising from intrinsic [natural] climate fluctuations when tested against any of the three null models. Out of those 17, only one station exhibits a warming trend which is significant against all three null models. The stations with significant warming trends are located mainly in Scandinavia and Iceland.
Introduction
[2] The Arctic has experienced some of the most dramatic environmental changes of the last few decades, which include the decline of land and sea ice and the thawing of permafrost soil. These effects are thought to be caused by global warming and have potentially global implications. For instance, the thawing of permafrost soil represents a potential tipping point in the Earth system and could lead to the sudden release of methane, which would accelerate greenhouse gas emissions and thus global warming.
[3] Whilst the changes in the Arctic must be a concern, it is important to place them in context because the Arctic exhibits large natural climate variability on many time scales [Polyakov et al., 2003] which can potentially be misinterpreted as apparent climate trends. For instance, natural fluctuations on a daily time scale associated with weather systems can cause fluctuations on much longer time scales [Feldstein, 2000; Czaja et al., 2003; Franzke, 2009]. This effect is called climate noise. Even very simple stationary stochastic processes can create apparent trends over rather long periods of time; so-called stochastic trends [Cryer and Chan, 2008; Cowpertwait and Metcalfe, 2009; Barbosa, 2011; Fatichi et al., 2009; Franzke, 2010, 2012]. On the other hand, a so-called deterministic trend arises from external factors like greenhouse gas emissions.
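The "stochastic trend" idea above is easy to demonstrate numerically: a perfectly stationary AR(1) process, fitted over a few decades, routinely shows a nonzero linear trend. A minimal sketch (the AR coefficient and record length are arbitrary illustrative choices, not values from the paper):

```python
import numpy as np

def ar1_series(n, phi, sigma=1.0, seed=0):
    """Simulate a stationary AR(1) process x[t] = phi*x[t-1] + noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

# 60 "years" of annual values with strong persistence.
x = ar1_series(60, phi=0.8)

# Ordinary least-squares trend of a process that, by construction, has none.
years = np.arange(60)
slope = np.polyfit(years, x, 1)[0]
print(f"apparent trend of a trendless AR(1) series: {slope:.3f} units/year")
```

Re-running with different seeds shows the fitted slope wandering on both sides of zero, sometimes by amounts comparable to observed decadal temperature trends.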
[4] Specifically, here I will ask whether the observed temperature trends in the Eurasian Arctic region are outside of the expected range of stochastic trends generated with three different null models of the natural climate background variability. Choosing the appropriate null model is crucial for the statistical testing of trends in order not to wrongly accept a trend as deterministic when it is actually a stochastic trend [Franzke, 2010, 2012].
[5] There are two paradigmatic null models for representing climate variability: short-range dependent (SRD) and long-range dependent (LRD) models [Robinson, 2003; Franzke, 2010, 2012; Franzke et al., 2012]. In short, SRD models are the most used models in climate research and represent the initial decay of the autocorrelation function very well. For instance, a first order autoregressive process (AR(1)) has an exponential decay of the autocorrelation function. LRD models represent the low-frequency spectrum very well, have a pole at zero frequency and a hyperbolic decay of the autocorrelation function. One definition of a LRD process is that the integral over its autocorrelation function is infinite while a SRD process has always an integrable autocorrelation function [Robinson, 2003; Franzke et al., 2012]. In general, both stochastic processes can generate stochastic trends but stochastic trends of LRD models can last for much longer than stochastic trends of SRD models. This shows that the rate of decay of the autocorrelation function has a strong impact on the length of stochastic trends. In addition to these two paradigmatic models we will also use a non-parametric method to generate surrogates which exactly conserve the autocorrelation function of the observed time series. Figure 1 displays the autocorrelation function for one of the used stations and the corresponding autocorrelation functions of the above three models. It has to be noted that there are a myriad of nonlinear stochastic models which can potentially be used to represent the background climate variability and the significance estimates will depend on the used null model. However, I have chosen the three above models because two of them represent paradigmatic models for representing the correlation structure and one conserves exactly the empirical correlation structure.
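The SRD/LRD distinction drawn above can be made concrete by comparing the two autocorrelation shapes directly: exponential decay for an AR(1) process versus hyperbolic decay for an LRD process. A small illustration (the AR coefficient phi and the Hurst exponent H are arbitrary example values):

```python
import numpy as np

lags = np.arange(1, 51)

# SRD: AR(1) autocorrelation decays exponentially, rho(k) = phi**k.
phi = 0.7
acf_srd = phi ** lags

# LRD: hyperbolic decay, rho(k) ~ k**(2H - 2) for Hurst exponent 1/2 < H < 1.
H = 0.9
acf_lrd = lags ** (2 * H - 2)

# The sum over the ACF stays bounded for SRD but keeps growing for LRD as
# more lags are added -- the defining (non-)integrability property quoted above.
print(f"sum of SRD ACF over 50 lags: {acf_srd.sum():.2f}")
print(f"sum of LRD ACF over 50 lags: {acf_lrd.sum():.2f}")
```

The heavier LRD tail is what lets its stochastic trends persist for much longer than those of an SRD process.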
Figure 2. Map of stations: Magnitude of the observed trend in °C per decade.
Results
[17] Figure 2 displays the location of all stations and the colour coding indicates the magnitude and sign of the temperature trends. The first thing to note is that all stations experience a warming trend over their respective observational periods. The largest trends (more than 0.4°C per decade) are in central Scandinavia and Svalbard. Most of Siberia experienced warming trends of about 0.2–0.3°C per decade.
[18] After finding evidence for warming trends we have now to assess their statistical significance; do the magnitudes of the observed trends lie already outside of the expected range of natural climate variability? The above three significance tests reveal that 17 of the 109 stations are significant against an AR(1) null model (Figure 3a), 3 stations are significant against a ARFIMA null model (Figure 3b), and 8 stations are significant against a climate noise null hypothesis using phase scrambling surrogates (Figure 3c). All these trends are significant at the 97.5% confidence level. This shows that while the Eurasian Arctic region shows a widespread warming trend, only about 15% of the stations are significant against any of the three significance tests.
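The third of the tests above, the phase-scrambling surrogate test, can be sketched in a few lines: build surrogates that preserve the observed power spectrum (and hence the autocorrelation function) while randomizing the Fourier phases, then ask how often pure climate noise produces a trend as large as the observed one. This is a minimal illustration on synthetic data, not the paper's actual code; the hypothetical station series, its AR coefficient, and the imposed warming are all made-up values:

```python
import numpy as np

def phase_scramble(x, rng):
    """Surrogate with the same power spectrum (hence the same
    autocorrelation) as x, but with randomized Fourier phases."""
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0    # keep the mean component real
    phases[-1] = 0.0   # keep the Nyquist component real (even-length series)
    surr = np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))
    return surr + x.mean()

def trend(x):
    """OLS slope per time step."""
    return np.polyfit(np.arange(len(x)), x, 1)[0]

def trend_p_value(x, n_surr=499, seed=0):
    """Fraction of climate-noise surrogates whose trend is at least
    as large as the observed one (one-sided, assuming a warming trend)."""
    rng = np.random.default_rng(seed)
    obs = trend(x)
    hits = sum(trend(phase_scramble(x, rng)) >= obs for _ in range(n_surr))
    return (hits + 1) / (n_surr + 1)

# Hypothetical "station": persistent noise plus a small imposed warming.
rng = np.random.default_rng(1)
t = np.arange(80)
noise = np.zeros(80)
for i in range(1, 80):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0.0, 0.2)
x = 0.02 * t + noise
p = trend_p_value(x)
print(f"surrogate p-value for the observed trend: {p:.3f}")
```

A station's trend counts as significant at the 97.5% level when p falls below 0.025, which is the criterion the paper applies against each of its three null models.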
Figure 3. Stations with a statistically significant trend against (a) AR(1), (b) ARFIMA, (c) phase scrambling null model and (d) stations with a significant trend: blue: weak evidence, green: moderate evidence and red: strong evidence.
Hello Garrett (Garrett says: December 11, 2012 at 8:21 am),
Many people have contested your brief tome, but all I’m hearing back from you is crickets.
rgbatduke says:December 11, 2012 at 10:47 am
To put it another way, if warming required the coincidence of three factors each with only two distinct values, we would only expect to be in warming phase 1/9th of the time. Increasing the number of categories or factors makes this worse.
Interesting coincidence that is roughly the non-glaciated to glaciated ratio.
rgbatduke
Thanks for your detailed response. After examining tens of thousands of contemporary references from the 7th century onwards (I was at the Met Office library today looking at weather observations in Southern England from 1205 to 1350) I really don’t see much difference between today and various periods in the past. Certainly no sign of catastrophic warming in the modern era, but an interesting sawtooth warming can be observed that has been going on for hundreds of years, from around 1700. Judging by glacier information, this seems to have happened also from around 750 AD to around 1215 AD and, to a lesser extent, from 1350 to 1550.
I think the only thing that surprises me about climate science is how much we THINK we know about it and how little we really do, as demonstrated by a constant stream of new papers which make us all adjust our perspective.
tonyb
I did find a trend in the Arctic that was greater than the variation of other stations. But the reason was a shore-ice effect. In other words, the earlier retreat of shore ice had a profound effect on the Arctic stations that were right next to that shore. And most of them were. One can actually see how profound the effect is by looking at the ice coverage maps at Cryosphere Today, deducing when a shore station would have its nearby sea area become ice free, and then observing the drastic temperature anomaly that was recorded for that station and that resulted from that change. Of course the change in temperature at the station is real. However, the warming that is local to the station and is caused by warm water circulating across the surface is not real warming when it is extrapolated from the shore stations to a thousand kilometers across the still-existing ice and inland to areas still covered by snow and ice. Also note that the stations in the maps above that do show a significant trend are almost all shore stations.
So while there may be a temperature fluctuation in 1998, there has in general been no trend in Daily Temp Abnormality over the entire period of good data (after WWII). This graph: http://www.science20.com/files/images/1950-2010%20D100_0.jpg shows the Abnormality for north of 23° latitude on a daily basis for 1950 to 2010; weather variability far exceeds any trend in the data.
That is actually a damn interesting thing to look at, and one I’ve suggested be examined in other threads a number of times. In principle, the mean thermal variation from sundown to sunup, when averaged over enough samples and controlled for variations in e.g. humidity or baseline weather, should be a direct measure of GHG surface warming.
The problem is this: there are few places on the Earth’s surface where one can get sufficiently precise data, controlled for the natural fluctuations that would otherwise confound (IMO) any attempt to resolve the CO_2-based gain, in order to do the job.
My primary suggestion would be literally mid-desert: places where the annual variation of temperature and humidity year to year is very small, so December 11th this year is likely to have almost exactly the same max/min/mean temperature and same (near zero) humidity in any given year. In Durham, NC, where I live, this is pointless; in the years I’ve lived here 12/11 can be anything from dry and very, very cold, to snowing, to cool and raining, to warm and raining, to just plain warm and dry. Today, for example, it is pretty warm but fairly humid (cloudy). In two days it will be clear and rather cold. This is NOT a good place to make a series intended to reveal this.
In the middle of Death Valley, or the Sahara, or the Negev, or the Australian Outback, if you eliminate the rare weather event that is clearly distinct from the norm, you might get enough “good” data from enough “acceptably normal, dry” days over enough years to be able to resolve the second-order variation in the delta T that represents a GHG-based signal. Bear in mind that this will be extraordinarily difficult, because it will take years’ worth of data to establish a mean $\Delta T$ baseline, and more years’ worth of data (some years later) to establish a second mean $\Delta T$, and the two means have to be separated by at least 2 sigma to talk about statistically resolvable changes in the cooling rate.
If you could do enough places and years to obtain this, however, it would be enormously valuable! And it isn’t completely infeasible. In the Sahara, for example, it isn’t uncommon to have 50C variations in peak daytime to pre-dawn temperature, and it is likely that some 40C or more of that happens from sundown to sunup. If one assumes that one is trying to measure changes in $\Delta T = 40$ K on a scale of roughly 300 K absolute mean temperature, and that eliminating humidity altogether reduces the natural variance of the 40 K year to year to (say) 1 degree, then one might be able to resolve the mean variance of $\langle \Delta T \rangle$ to within 0.1 degree over a decade or two of daily measurements. One then MIGHT be able to wait a decade, take another decade’s worth of measurements, and obtain $\langle \Delta T \rangle$ a second time, this time with much higher CO_2 levels (on average), also to within 0.1 K. This would give you a prayer of being able to resolve them, e.g. get $\langle \Delta T_1 \rangle$ for the first decade and $\langle \Delta T_2 \rangle$ for the second. Whereupon one would actually have direct evidence of a warming process with sufficient controls that one could at least hope that it is independent of e.g. humidity, decadal oscillation phase, thermohaline circulation, solar state. Indeed, one almost wouldn’t care about any of these things because they might shift T itself a bit year to year, but variation in nighttime radiative cooling in still, dry air would vary only with the fourth power of the absolute temperature, and for that matter one could measure the actual absolute temperature and accurately subtract out this effect, leaving you with a non-confounded $\Delta T$ anyway.
So good job, good idea, but don’t bother applying it everywhere, as it is pointless where there is humidity and substantial natural variation. Apply the idea only where Nature provides you with a minimally confounded environment, where “only” the non-water-vapor-driven GHE limits the rate at which desert sands cool at night (or as close as you can manage), controlled for humidity as best you can manage.
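The precision requirement in the comment above (roughly 1 K of year-to-year scatter resolved to a 0.1 K mean, with the two epoch means separated by at least 2 sigma) implies a straightforward sample-size estimate via the standard error of the mean. A back-of-envelope sketch, assuming the nightly measurements are independent:

```python
import math

sigma_night = 1.0   # assumed year-to-year scatter of the nightly Delta-T, in K
precision = 0.1     # target standard error of the epoch mean, in K

# Standard error of the mean: sem = sigma / sqrt(n)  ->  n = (sigma / sem)^2
n_nights = math.ceil((sigma_night / precision) ** 2)
print(f"'good' desert nights needed per epoch: {n_nights}")

# To call two epoch means different at ~2 sigma, their difference must exceed
# 2 * sqrt(2) * sem of a single mean (the two standard errors add in quadrature).
min_detectable = 2.0 * math.sqrt(2.0) * precision
print(f"smallest resolvable change in mean Delta-T: {min_detectable:.2f} K")
```

A hundred usable nights per epoch is only a few years of data at a good desert site, which is consistent with the "decade or two" horizon the comment sketches once rejected (windy, humid, anomalous) nights are discarded.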
rgb
I think this approach makes a lot of sense. It is more than obvious that most natural processes that have variation are stochastic in nature. Will it change anyone’s mind? I have my doubts. The extremes on both sides of this and most other political-economic-social questions are only interested in out-shouting each other. Rational thought and any science are unfortunately lost in the shuffle.
Remember the response of Environment Canada in 2010 to the question put via an access-to-information request in 2008: what is the state of EC weather-station data acquisition?
Errors and more errors, calibration not done. “Yes, we know, but we can’t afford the man-hours to fix it.”
But we spent 4 billion on running computer simulations.
No surprise to see Eureka as the hot spot in the Canadian Arctic.
Let’s see: an electronic sensor installed in summer and calibrated in 1990 to a 0–20 mA range = 0–40 C (midrange 20 C)?
Last winter, these same style of sensors in Alaska at −40: we are told “please ignore”, because they are not accurate at and below this range.
Possibly the reason we are not measuring/recording any record low temperatures is the equipment.
Interesting concept of Null Hypothesis Models, I’ll wait and see. Who is running tests on these models and what proximity to past reality do they accomplish?
The distribution of the “significant” trends seems to be sampling the influx of Gulf Stream waters into the Arctic. Is there not a comparative data set from the 1930s where anecdotal info, actual temps, newspapers, and Arctic ship traverses (probably largely from Norway) gave cause for alarm about the plight of the seals and the melting ice? This would be a reasonable “baseline” for natural variability within a century. Also, I recall the existence of a report (read on WUWT a couple of years ago) by (I believe) a British admiral to the Royal Society in 1817 reporting on the widespread melting of ice and the thought that navigation through the Arctic seemed feasible.
That this was essentially still within the LIA gives an extreme example of rgbatduke’s idea of chaotic climatic components coming together at times to give us warmth or cold. In any case, with the known comparatively recent histories of Arctic warming, the statistical null models used here seem superfluous. Let’s take it as a given (at least within the context of what is even grossly knowable) that natural variation can deliver notable freeze-ups and thaw-downs of the Arctic on multi-century scales, and there is no dispute whatsoever that we have had multimillennial massive freeze-ups taking 50 million cubic km out of the sea and depositing it as ice on half the NH land masses. How does 100% certainty stack up!
rgbatduke said (among MANY great thoughts):
“I just don’t think people appreciate how profound our ignorance here really is, how overstated our knowledge of the remote past (or even the relatively recent past) is.”
+1
Interesting coincidence that is roughly the non-glaciated to glaciated ratio.
Or, not a coincidence at all, given that current Milankovitch theory involves the coincidence of three distinct harmonic functions, one for eccentricity, one for precession, and one for (if I recall correctly) where perihelion lies relative to the above, facing e.g. northern or southern hemisphere or in between. There are other factors: one bobbing up and down in the solar system ecliptic, maybe more.
The problem with that is it has only been the ratio for the last five or six cycles. Over the previous two million years, the period varied from 26K to 40K to the current 100K years, while the length of interglacials remained roughly constant at around 10Ky. That is, there are problems with the Milankovitch theory that make it a plausible set of factors but unlikely to be a complete or sufficient explanation, and that make it absolutely not a predictive explanation. We have correlation, we have plausible causality, we cannot incorporate the two into a convincing predictive model that explains or predicts the end of the Wisconsin glaciation, the beginning of the Holocene, the Younger Dryas return to glaciation, the roaring back into the Holocene warm phase, the Holocene optimum (and why, during that optimum, the proposed positive feedback from humidity didn’t permanently lock the planet back into warm phase if climate sensitivity is as high as has been suggested) and all of the greater and lesser bumps along the way.
So as I said, five or six (at least) important factors plus the various orbital factors might do it. Call it a ten dimensional non-separable model. Tough stuff…
rgb
rgbatduke:
I agree – indeed, applaud – all you have said in this thread. I now write to comment on part of one of your posts because it mentions me and I wish to clarify my view for others.
In your post at December 11, 2012 at 10:47 am you write
Your views in the paragraphs which I cite mirror my views so closely that I could have written them myself.
I strongly agree that the Bern Model is bunkum.
Yes, you are right that our (i.e. Rorsch, Courtney & Thoenes) work only considered contemporary data. And I would appreciate your informing me of the results of the work which you promise soon to conduct: perhaps you could publish it for comment on WUWT if that does not harm your citation index? In the unlikely event that you want any assistance from me, then please let me know.
For the benefit of others, I again copy a summary of my views to here so they know to what you have referred.
I do not know the cause of the recent rise in atmospheric CO2 concentration, but I want to know. I tend to think it is probably a result of temperature rise from the LIA but it may be entirely a result of the anthropogenic emission. I explain this as follows.
The annual anthropogenic emission of CO2 should relate to the annual increase of CO2 in the atmosphere if one is causal of the other according to a simple mass balance, but these two parameters do not correlate unless 5-year smoothing is applied to the data.
(There are reasons why smoothing of the data of up to 3 years can be justified, but e.g. the Bern model uses 5-year smoothing to obtain agreement between the emissions and the rise because less smoothing fails to obtain it.)
Importantly, the dynamics of the system indicate that ALL the anthropogenic emission and the natural can easily be sequestered by the system.
At present the yearly increase of the anthropogenic emissions is approximately 0.1 GtC/year. The natural fluctuation of the excess consumption is at least 6 ppmv (which corresponds to 12 GtC) in 4 months. This is more than 100 times the yearly increase of human production, which strongly suggests that the dynamics of the natural sequestration processes can cope easily with the human production of CO2. A serious disruption of the system may be expected when the rate of increase of the anthropogenic emissions becomes larger than the natural variations of CO2, but the data in this paragraph indicates this is not possible.
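The comment's equivalence of 6 ppmv with 12 GtC can be checked against the commonly used conversion factor of about 2.13 GtC of carbon per ppmv of atmospheric CO2 (a standard figure, not stated in the comment itself):

```python
GTC_PER_PPMV = 2.13   # widely used conversion: GtC of carbon per ppmv of CO2

fluctuation_ppmv = 6.0
fluctuation_gtc = fluctuation_ppmv * GTC_PER_PPMV
print(f"6 ppmv ~= {fluctuation_gtc:.1f} GtC")   # close to the quoted 12 GtC

yearly_emissions_increase = 0.1   # GtC/year, as stated in the comment
ratio = fluctuation_gtc / yearly_emissions_increase
print(f"natural 4-month swing is ~{ratio:.0f}x the yearly emissions increase")
```

The arithmetic supports the "more than 100 times" claim, though whether that ratio carries the interpretive weight the comment places on it is a separate question.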
The failure of correlation denies the ‘mass balance’ argument. And, on face value, it seems to deny an anthropogenic cause of the rise, but it does not. An explanation of this is provided by the failure of the sequestration process to sequester all the annual emission (both natural and anthropogenic) when the system dynamics indicate the system could be expected to sequester it all.
Clearly, the system of the carbon cycle is constantly seeking an equilibrium which it never achieves. Some processes of the system are very slow, with rate constants of years and decades. Hence, the system takes decades to fully adjust to a new equilibrium. And the observed rise is probably that adjustment. Thus, the system not sequestering all the emissions (when its dynamics indicate it can) is an indication of adjustment towards an altered equilibrium.
Therefore
(a)
The temperature rise since the LIA must induce some of the rise and could be the cause of all of it by creation of a new equilibrium state.
(b)
But the anthropogenic emission could be the cause of such a new equilibrium state and so be responsible for almost all the rise.
In either case, the correlations observed by Bart are indicative of the changed rate constants with fluctuating temperatures during adjustment to the new equilibrium state. And the ‘mass balance argument’ is irrelevant because it assumes the system is not changing its state.
One of our 2005 papers assessed these possibilities.
(ref. Rorsch A, Courtney RS & Thoenes D, ‘The Interaction of Climate Change and the Carbon Dioxide Cycle’ E&E v16no2 (2005))
The paper reports attribution studies we conducted which used three different basic models to emulate the causes of the rise of CO2 concentration in the atmosphere in the twentieth century. These numerical exercises are a caution to estimates of future changes to the atmospheric CO2 concentration. The three basic models used in these exercises each emulate different physical processes and each agrees with the observed recent rise of atmospheric CO2 concentration.
The models each demonstrate that the observed recent rise of atmospheric CO2 concentration may be solely a consequence of the anthropogenic emission or may be solely a result of, for example, desorption from the oceans induced by the temperature rise that preceded it (applying these two assumptions provided a total of six models from the three basic models). Furthermore, extrapolation using these models gives very different predictions of future atmospheric CO2 concentration whatever the cause of the recent rise in atmospheric CO2 concentration.
Each of the models in the paper matches the available empirical data without use of any ‘fiddle-factor’ such as the ‘5-year smoothing’ the IPCC uses to get the Bern model to agree with the empirical data.
So, if one of the six models of our paper is adopted then there is a 5:1 probability that the choice is wrong. And other models are probably also possible. And the six models each give a different indication of future atmospheric CO2 concentration for the same future anthropogenic emission of carbon dioxide.
Data that fits all the possible causes is not evidence for the true cause. Data that only fits the true cause would be evidence of the true cause. But the above findings demonstrate that there is no data that only fits either an anthropogenic or a natural cause of the recent rise in atmospheric CO2 concentration. Hence, the only factual statements that can be made on the true cause of the recent rise in atmospheric CO2 concentration are
(i)
the recent rise in atmospheric CO2 concentration may have an anthropogenic cause, or a natural cause, or some combination of anthropogenic and natural causes,
but
(ii)
there is no evidence that the recent rise in atmospheric CO2 concentration has a mostly anthropogenic cause or a mostly natural cause.
Richard
Gary Pearse
I suspect you are referring to my article carried here called ‘ historic variations in arctic ice part one’
I am currently writing part two covering the arctic melting from 1918 to 1949
It has meant reading several hundred arctic papers, a visit to the Scott polar institute in Cambridge and correspondence with a number of scientists with a special interest in arctic ice.
I hope to have it ready some time in January
Tonyb
This is easy stuff: [as] AGW is a self-evident truth, it’s impossible for any valid research to challenge it. Therefore any that does is not actually valid in the first place, so it can safely be ignored and its authors smeared as being in the pay of ‘evil fossil fuel companies’.
Facts, data, reality mean nothing; all that matters is whether it supports ‘the cause’ or not. When you’re on a mission to save the planet, there is no time to waste on such ‘details’.
Scarface says:
December 11, 2012 at 11:39 am
Is it me, or did I just hear the fat lady singing?
*
She’s been warming up for a while now – if you’ll pardon the expression. 🙂
People keep saying the Arctic is ‘warming’ so how come the Length of the Arctic Melting Season is getting SHORTER?
Here is the likely cause of the warming at the Reykjavik station:
Iceland: “Although peat has traditionally been used as a fuel in Iceland, present-day consumption is reported as zero.”
http://www.worldenergy.org/documents/peat_country_notes.pdf
I’ve experience of peat fires for heat, and peat is a very smoky fuel, much more so than coal.
Although climate science is interpreting this as a volcanic signal.
Iceland is a strong localized source of non-eruptive volcanic warming and cooling. Temperature trend maps show that this phenomenon is localized to an area of about twice that of Iceland. With altitude the area remains constant but the phenomenon weakens and changes sign upon passing through the tropopause. The effect’s magnitude implies a large positive feedback, according to a conventional climate forcing estimate. This phenomenon is unique in that it is not observed for the other major volcanic islands.
http://www.agu.org/pubs/crossref/2005/2004GL021816.shtml
Note in the graphic the warming plume downwind from Iceland AND from the UK, which has no volcanism, but does have a similar reduction in aerosol production from domestic coal burning.
rgbatduke says:
December 11, 2012 at 12:12 pm
“That is actually a damn interesting thing to look at”
Thanks, I’ve found little interest though.
So, while I agree on the site selection you outlined (in fact, when I started, that was my idea), I felt any conclusions reached would be discounted as cherry-picking.
And I found that even without that kind of filtering it’s really consistent, and if you look at the jpg, seasonal variation is very prominent.
If you haven’t taken a couple of minutes, you should follow the link in my name to my results and see what it shows around the world. I have also created a Google map of the stations in each chart.
Gail Combs says:
December 11, 2012 at 1:32 pm
People keep saying the Arctic is ‘warming’ so how come the Length of the Arctic Melting Season is getting SHORTER?
Hi Gail. A quick experiment…Take a 1kg piece of ice out of the freezer and put it in a room that is say 18C. Then take a smaller piece of ice out and put it in a room slightly warmer than 18C. Time how long it takes for both pieces of ice to melt. That will give you your answer.
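Deco79’s thought experiment can be put in rough numbers with a crude lumped heat-transfer model: melt time scales as m·L_f/(h·A·ΔT), so a smaller block in a slightly warmer room melts sooner. The surface areas and the convection coefficient h below are made-up illustrative values, not measurements:

```python
L_F = 334e3      # latent heat of fusion of ice, J/kg

def melt_time_hours(mass_kg, area_m2, room_temp_c, h=10.0):
    """Crude lumped estimate: all heat enters by convection at the
    ice surface (held at 0 C), with an assumed coefficient h in W/(m^2*K)."""
    power = h * area_m2 * (room_temp_c - 0.0)   # heat flow into the ice, W
    return mass_kg * L_F / power / 3600.0

big = melt_time_hours(1.0, 0.06, 18.0)      # 1 kg block, ~0.06 m^2 surface
small = melt_time_hours(0.5, 0.04, 19.0)    # smaller block, warmer room
print(f"big block:   {big:.1f} h")
print(f"small block: {small:.1f} h")
```

The point of the analogy, less ice plus a warmer environment yielding a shorter melt season, survives the arithmetic, though a real ice pack is vastly more complicated than a lumped block.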
rgbatduke says:
December 11, 2012 at 7:59 am
About 20,000 years ago the Earth’s climate system received a large whack as the glaciation ended. We aren’t sure why, although the Milankovitch cycles seem to play a part. In any case, it does not seem surprising that introducing a large disturbance into a complex, inter-related, non-linear system should result in oscillatory behaviour of at least some observed variables. Attribution of such observations may, however, be futile.
Later you said “So good job, good idea, but don’t bother applying it everywhere as it is pointless where there is humidity and substantial natural variation. Apply the idea only where Nature provides you with a minimally confounded environment, where “only” the non-water vapor driven GHE limits the rate at which desert sands cool at night (or as close as you can manage), controlled for humidity as best you can manage.”
You are looking for extremely small changes introduced by changes in the concentration of non-water-vapor GHGs, and you agree that measuring this is difficult even under the best of a very limited set of conditions, so my question is (apart from scientific curiosity): “why do we care?”.
Deco79 says:
December 11, 2012 at 2:30 pm
Oh my. And if the melting stops before the ice is fully melted, and more ice starts forming from the water (melted ice) puddled around the as-yet-unmelted bit of ice, what might one infer about the temperature of the room?
climatereason says:
December 11, 2012 at 1:05 pm
Gary Pearse
I suspect you are referring to my article carried here called ‘ historic variations in arctic ice part one’
Hi Tonyb, I am looking forward to seeing your next installment. With all the real information that is available (some of it qualitative but substantive) it seems to me that the reliance on models has become excessive – almost as if one needs a model to do battle with other models. I would say that the models used could be employed wrongly as “evidence” that an Ice Age is impossible and never happened. Probably at the 95% confidence level!!
“Thanks, Rocky, but I write in latex without even thinking about it, and in latex a subscript is underscore. Technically I should be writing CO_2 — (latex CO_2) replace () with $ signs — but I’m too lazy and without a previewer and edit capability it is too easy to screw up and lose a whole paragraph.”
Heh. FWIW, if you are on Linux, and have bound the compose key to something handy (like the otherwise useless capslock) then COMPOSE, _, 2 will work fine. Like this => CO₂
Ok, admittedly doesn’t work so well for more complex stuff that requires latex/mathml 🙂
You can also add your own convenience sequences to ~/.XCompose – just make sure the first line is:
include “%L”
to retain the globals.
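As an example of such a convenience sequence (the key choices here are arbitrary, any otherwise unused sequence works), a hypothetical ~/.XCompose might read:

```
include "%L"

# COMPOSE c o 2  ->  the string "CO₂" with a subscript-two glyph
<Multi_key> <c> <o> <2> : "CO₂"
```

After editing the file, restart the application (or the X session) so the new sequences are picked up.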
So, while I agree on the site selection you outlined (in fact, when I started, that was my idea), I felt any conclusions reached would be discounted as cherry-picking.
And I found that even without that kind of filtering it’s really consistent, and if you look at the jpg, seasonal variation is very prominent.
If you haven’t taken a couple of minutes, you should follow the link in my name to my results and see what it shows around the world. I have also created a Google map of the stations in each chart.
I have dutifully followed the link and read your article. It does not change my suggestion or conclusions. Additional comments: plotting the daily rise to max vs. the daily fall to min as a reflection of daily energy balance is not going to work, because it is not a reflection of daily energy balance. You might as well just look at daily min to daily min, since a long-term energy imbalance is revealed by how much the minimum temperature increases (assuming global average temperature relates to global average enthalpy in some control volume, which is not strictly true).
However, the primary problem is that seriously, if you want to directly detect the change in the warming produced by the non-H2O portion of the GHE, you have to control for humidity which pretty much means using data from deserts. You also have to control for absolute temperature, which means that you have to correct for the fourth power dependence on temperature of the radiation rate — it actually cools faster on hotter days. You want to remove heat input altogether, hence do only nighttime radiative loss, not the daytime balance between radiation in and radiation out, which is too complicated. The desert is the closest thing we’ve got to a lunar surface, with atmosphere (but little or no water) overhead, and to the extent that one can further correct for wind driven temperature change (only use evening-to-morning variation on nearly windless nights) one can get quite close.
This isn’t “cherrypicking” — this is sensibly selecting a portion of the Earth’s geography that controls for confounding variable so you can more or less directly observe the Big Kahuna itself, CO_2 + Ozone driven greenhouse heat trapping with absolutely minimal (uncontrollable and highly variable) contributions from water vapor or clouds. Even the albedo is ideal — no vegetation to speak of, so no seasonal variation of albedo or emissivity per location.
If you wanted to do even better (and had actual money to work with) you’d use a black painted tarmac insulated from below with embedded thermometers facing straight up in a windless or nearly windless valley. Measure as perfectly as possible the pure blackbody cooling of a purely radiant emissive surface. A few decades of measurements and you’re golden.
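The correction rgb describes, normalizing nighttime radiative loss for its fourth-power dependence on absolute temperature, can be sketched with the Stefan-Boltzmann law (the emissivity and temperatures here are illustrative, not measured values):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiative_loss(temp_k, emissivity=0.95):
    """Radiant flux from a surface at temp_k (emissivity is an assumption)."""
    return emissivity * SIGMA * temp_k ** 4

# A desert surface at 310 K radiates noticeably more than one at 300 K...
hot, cool = radiative_loss(310.0), radiative_loss(300.0)
print(f"310 K: {hot:.0f} W/m^2, 300 K: {cool:.0f} W/m^2")

# ...so before comparing nighttime cooling across years, divide each night's
# radiative loss by T^4 to remove this purely temperature-driven effect.
normalized = (hot / 310.0 ** 4, cool / 300.0 ** 4)
print(f"normalized rates agree: {normalized[0]:.3e} vs {normalized[1]:.3e}")
```

After the T^4 normalization, any residual year-over-year change in the nighttime cooling rate is a candidate GHG signal rather than an artifact of the absolute temperature itself.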
rgb
Deco79 says:
Hi Gail. A quick experiment…Take a 1kg piece of ice out of the freezer and put it in a room that is say 18C. Then take a smaller piece of ice out and put it in a room slightly warmer than 18C. Time how long it takes for both pieces of ice to melt. That will give you your answer.
Uh … no.
The experiment you describe does not replicate the Arctic ice-melt season, nor does it measure an analogous timeframe.
Another failure of warmist cognition. Pardon the contradiction in terms.
Deco79 says:
December 11, 2012 at 2:30 pm
Gail Combs says:
December 11, 2012 at 1:32 pm
People keep saying the Arctic is ‘warming’ so how come the Length of the Arctic Melting Season is getting SHORTER?
Hi Gail. A quick experiment…Take a 1kg piece of ice out of the freezer and put it in a room that is say 18C. Then take a smaller piece of ice out and put it in a room slightly warmer than 18C. Time how long it takes for both pieces of ice to melt. That will give you your answer.
________________________________
OH, so you are saying the ENTIRE Arctic is now Ice FREE in the summers so that is why the melt season is shorter?