The new poster child for 'correlation is not causation': Industrial Revolution ended 1,800 years of volcanic-induced cooling

via EurekAlert today

The study emphasises that this trend came to an end with the beginning of the Industrial Revolution and the resulting global warming caused by human activity.

From the UNIVERSITAT AUTONOMA DE BARCELONA

The Industrial Revolution put an end to 1,800 years of ocean cooling

The high frequency and magnitude of volcanic eruptions could have been the cause of the progressive cooling of ocean surfaces over a period of 1,800 years. This is made apparent in an international study published recently in the journal Nature Geoscience, involving researcher P. Graham Mortyn of the Institute for Environmental Science and Technology (ICTA-UAB) and the UAB Department of Geography.

The study emphasises that this trend came to an end with the beginning of the Industrial Revolution and the resulting global warming caused by human activity. It further shows that the lowest temperatures in the first 1,800 years of the Common Era were recorded between the 16th and the 18th centuries, a period known as the “Little Ice Age”.

Earlier research had already shown that volcanic explosions cause the atmosphere to cool. The present study demonstrates that the oceans can absorb and capture more heat than the atmosphere over longer periods of time, thus attenuating global temperature changes in the short term. These alterations in temperature can be prolonged when the volcanic eruptions are concentrated into a short space of time.

These findings bring a new perspective to the study of regional and global variations in ocean-surface temperature over the centuries, before the appearance of anthropogenic (human activity-induced) climate change.

The researchers combined, for the first time, 57 previous works on sea-surface temperature changes, which are calculated from fossil remains extracted from oceans all over the world, from the Poles to the Tropics. The results were compared with data from terrestrial indicators, such as tree rings or ice cores. These revealed a similar cooling trend.

To investigate the causes of this cooling – more robust between the years 800 and 1800 – they used climate models to examine how the ocean surface reacted to factors such as solar activity, changes in the Earth’s orbital patterns, land use, greenhouse gases and volcanic activity. The last of these proved to be important. In order to analyse long-term trends, the team grouped the data into periods of 200 years.
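The 200-year grouping described above can be sketched in a few lines. This is only an illustrative reconstruction of the binning step; the function name and the toy cooling series below are made up for the example, not the study's actual data or method.

```python
from statistics import median

def bin_medians(years, temps, bin_width=200, start=1):
    """Group (year, temperature) pairs into fixed-width bins keyed by
    the bin's first year; summarize each bin by its median value."""
    bins = {}
    for y, t in zip(years, temps):
        b = start + ((y - start) // bin_width) * bin_width
        bins.setdefault(b, []).append(t)
    return {b: median(v) for b, v in sorted(bins.items())}

# Toy proxy series: a slow linear cooling sampled every 50 years, 1-1751 CE
years = list(range(1, 1801, 50))
temps = [20.0 - 0.0005 * y for y in years]
binned = bin_medians(years, temps)   # 9 bins starting at 1, 201, ..., 1601
```

Binning by median rather than mean makes each 200-year value less sensitive to a single outlying proxy reading, which is presumably part of the appeal of this kind of summary.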

P. Graham Mortyn is a member of the Ocean2k group within PAGES (Past Global Changes), which has over 75 members connected to a network of close on 600 highly-experienced scientists. Mortyn lays emphasis on how this group worked with data from previous publications available to the public and archived databases that meet extremely strict criteria, and how they made regular use of Skype conferencing, shared Google documents and other innovative technological resources.

The researchers point out that understanding how these factors caused changes in ocean temperature in the past could give us an insight into future changes in climate.

###

As is typical with shonky alarmist press releases in climate science these days, they don’t bother to name the study or present a DOI. So I looked it up. Laughably, there’s that favorite climate buzzword again, “robust”. It turns out this was published on August 17th, and P. Graham Mortyn is the seventh-named author. Given that, and the lateness of the press release, this looks like a ploy for attention. And the press release reads quite a bit more alarmist than the abstract, which does not mention the Industrial Revolution.

Robust global ocean cooling trend for the pre-industrial Common Era

Helen V. McGregor, Michael N. Evans, Hugues Goosse, Guillaume Leduc, Belen Martrat, Jason A. Addison, P. Graham Mortyn, Delia W. Oppo, Marit-Solveig Seidenkrantz, Marie-Alexandrine Sicre, Steven J. Phipps, Kandasamy Selvaraj, Kaustubh Thirumalai, Helena L. Filipsson & Vasile Ersek

Nature Geoscience 8, 671–677 (2015) doi:10.1038/ngeo2510 Received 24 October 2014 Accepted 17 July 2015 Published online 17 August 2015

Abstract

The oceans mediate the response of global climate to natural and anthropogenic forcings. Yet for the past 2,000 years — a key interval for understanding the present and future climate response to these forcings — global sea surface temperature changes and the underlying driving mechanisms are poorly constrained. Here we present a global synthesis of sea surface temperatures for the Common Era (CE) derived from 57 individual marine reconstructions that meet strict quality control criteria. We observe a cooling trend from 1 to 1800 CE that is robust against explicit tests for potential biases in the reconstructions. Between 801 and 1800 CE, the surface cooling trend is qualitatively consistent with an independent synthesis of terrestrial temperature reconstructions, and with a sea surface temperature composite derived from an ensemble of climate model simulations using best estimates of past external radiative forcings. Climate simulations using single and cumulative forcings suggest that the ocean surface cooling trend from 801 to 1800 CE is not primarily a response to orbital forcing but arises from a high frequency of explosive volcanism. Our results show that repeated clusters of volcanic eruptions can induce a net negative radiative forcing that results in a centennial and global scale cooling trend via a decline in mixed-layer oceanic heat content.

UPDATE: Steve McIntyre looks at the oceans2K data, and surmises it’s a bit hyped up. (h/t CTM)

The long-awaited (and long overdue) PAGES2K synthesis of 57 high-resolution ocean sediment series (OCEAN2K) was published a couple of weeks ago (see here and here). Co-author Michael Evans’ announcement made the results sound like the latest and perhaps most dramatic Hockey Stick yet:

“Today, the Earth is warming about 20 times faster than it cooled during the past 1,800 years,” said Michael Evans, second author of the study and an associate professor in the University of Maryland’s Department of Geology and Earth System Science Interdisciplinary Center (ESSIC). “This study truly highlights the profound effects we are having on our climate today.”

A couple of news outlets announced its release with headlines like “1,800 years of global ocean cooling halted by global warming”, but the event passed unnoticed at realclimate and the newest “Hockey Stick” was somehow omitted from David Appell’s list of bladed objects.

The OCEAN2K Reconstruction

One of the reasons for the strange lack of interest in this newest proxy “Hockey Stick” was that the proxy data didn’t actually show “the climate was warming about 20 times faster than it cooled during the past 1,800 years”.  The OCEAN2K reconstruction (see Figure 1 below) had a shape that even David Appell would be hard-pressed to describe as a “Hockey Stick”.  It showed a small decrease over the past two millennia with the most recent value having a tiny uptick from its predecessor, but, whatever images one might choose to describe its shape, “Hockey Stick” is not one of them.

[Figure: OCEAN2K reconstruction (ocean2k_recon)]

FAQ Figure 1: Results of the global sea surface temperature compilation from Ocean2k: a cooling over the past two millennia was reversed only in the most recent two centuries. Fifty-seven previously published and publicly available marine sea surface temperature reconstructions were combined and compiled into 200-year brackets, represented by the boxes. The thin horizontal line dividing each box is the median of the values in that box. The thick blue line is the median of these values weighted for differences in the region of the global ocean in which they were found. (More in Figure 2a in the paper and Supplementary Table S13.) Link

Read it all here: http://climateaudit.org/2015/09/04/the-ocean2k-hockey-stick/#more-21349


91 thoughts on “The new poster child for 'correlation is not causation': Industrial Revolution ended 1,800 years of volcanic-induced cooling”

    • Clearly Mann’s graph didn’t go far enough; temperature wasn’t flat, it was cooling! If the paper wasn’t paywalled it would be worth reading for a laugh.

    • calculated from fossil remains extracted from oceans all over the world
      ==============
      they forgot to include the single tree that served as the global thermometer for 1000 years.

      • It’s a side view of a marshmallow stick, the kind to which you affix a marshmallow and hold it over the campfire until it ignites (or gets golden brown), at which point you burn the inside of your mouth with gushing num-num. I actually like marshmallows that way, and always had an interesting hunt for just the right stick (the metallic ones you could buy got hot too fast; on the other hand, the wooden ones could catch fire, thus reducing the number of marshmallows one could eat; possibly a defense mechanism).

    • But the more they cool the past to “prove” there isn’t really a hiatus in warming, the greater the likelihood that the LIA will need to exist as more than a regional effect. Or will they then re-warm everywhere?

    • I think that may be the difference between sea and surface temps. I find it interesting that the sea temps started to rise in the 1700s while human production of CO2 didn’t reach significant amounts until the early 1900s.

  1. The Industrial Revolution with its sulphur-laden coal burning produced more global cooling too, you dumb alarmists. Really! What is not very robust is the thinking processes of today’s climate scientists.

    • “What is not very robust is the thinking processes of today’s climate scientists.”
      They think?
      Who knew?

    • Congratulations on a fact free posting.
      The Industrial Revolution started with water and wind power. The cotton and woollen mills were sited in West Yorkshire and Lancashire to make best use of the rivers and streams from the Pennines. The goods they produced were shipped on canal boats pulled by horses.
      The use of coal didn’t begin in earnest until the end of the 18th century, when the iron masters in Coalbrookdale discovered how to make iron using coke and steam engines expanded beyond their niche market in the collieries.
      In 1700 the total amount of coal mined was less than 4 million tons per year. Most of that was high-quality, low-sulphur anthracite for domestic use, which was prized for its low smoke and ash content. It was 1830 before coal extraction reached 30 million tons per year.
      Today world production is in excess of 7000 million tons per annum. The poster boy in Europe is Germany which has EXPANDED its production of coal to 194 MT per year. So much for the German switch to renewable energy. Most of that coal is high sulphur dirty brown coal to fuel the power plants that need to run to back up the intermittent water and wind power.
      The really big players are however China and India which between them produce more than 50% of global coal and they are EXPANDING their use and plan to do so for the next 25 years at least.

  2. Maybe I am missing something here, having gone too quickly through this one… but is the claim here basically that the Industrial Revolution started before the LIA ended?!
    Did the Industrial Revolution actually start by 1660?!
    Are these guys claiming that their models are better than GCMs, when GCMs actually show that there is no global warming due to CO2 emissions, either natural or human, prior to 1960?
    Can someone be kind enough to explain the actual point of this study, if I have got it wrong in principle?!
    thanks in advance.
    cheers

    • The Industrial Revolution started around 1730, when the process of cloth production started to be mechanized using water-powered machines such as power looms. Cromford Mill near Matlock in Derbyshire is generally reckoned to be the first modern factory, and it was built in 1773.
      Before the Watt engine, steam engines were so inefficient, and transport for the coal they needed so bad, that steam power was pretty much restricted to collieries, where they were used to drain the mines. It was the beginning of the 19th century before they saw widespread use. This was made possible by the canal system with its horse-drawn boats. The first railways were built to carry coal to a point where it could be loaded onto ships. The transport of people came later.

  3. Two comments to note: 1) The increase between 1700 and 1900 appears more or less equivalent to the increase from 900 to 1100. If the first was natural forcing, why wouldn’t the null hypothesis be that the second is also naturally occurring, requiring their study to disprove the null hypothesis? Which I am willing to wager their study’s data cannot do. 2) Am I the only one who noticed that their error bars get larger the more recent the data? That is counterintuitive, as the more recent and more modern the data, the more reliable that data probably is. With the massive error range for 1700–1900, one would think you cannot draw any firm conclusion about the direction, the magnitude or the confidence of any conclusion.

    • That’s what “robust” means: it is obviously tenuous, so use a politician’s trick and describe it as robust, a word that suggests a lot but is vague enough to be robust against attempts to prove the statement wrong.
      You notice that they only claim “qualitative agreement”, ie it kinda goes up and down at about the same time …. mostly.

  4. The new poster child for ‘correlation is not causation’: Industrial Revolution ended 1,800 years of volcanic-induced cooling

    I don’t get the implied criticism here. I did not read anything in the press release or the abstract that suggested Ind. Revolution affected volcanoes.
    I think there is much to be criticised in what is suggested from reading the abstract, but the bottom line seems to me to be that the cooling to 1800, and in particular the LIA, was mainly due to volcanism, a reduction in which allowed for recent warming.
    This seems to argue that AGW is a lesser factor than usually suggested.

    • What I read in it was that the “cooling” (into the LIA) was presumably caused by volcanism over an 1800-year period, while the warming of the last 200 years is assumed to be driven by industrial CO2 production.
      The problem I see is that the level of industrial CO2 production back in 1700 is so minuscule compared with today’s that it shouldn’t have a greater effect on temperature than current levels of CO2 production do.

      • Agreed, there simply HAS to be a correlation of some kind between solar acne and the LIA. Which in turn suggests that there may be a stronger correlation than scientists think between general solar activity (solar chocolate binges?) and temperature.
        In this case the correlation could hardly be terrestrial temperature influencing sunspots, so it can only be that the sunspots themselves, or else the solar conditions that favour their appearance, cause terrestrial warming.

      • I thought Willis had shown how the climate regulates itself very quickly, with clouds and thunderstorms evening out the effects of even large eruptions, leaving the sun’s output as the main driver of long-term changes.
        James Bull

  5. cooling over the past two millennia was reversed only in the most recent two centuries.
    ===============
    FALSE. Fig. 1 shows the reversal started around 1700, three centuries ago, which is BEFORE the Industrial Revolution.
    Cause and effect. The effect cannot come before the cause. Therefore, whatever caused the cooling to reverse, it cannot be the Industrial Revolution. However, we could say that the reversal of the cooling CAUSED the Industrial Revolution.
    For all anyone knows, had the Little Ice Age continued, the Industrial Revolution may never have happened. Instead the famine leading to the French Revolution could have sparked revolting peasants across the world, leading to another Dark Ages.

    • As they say, the data are binned in 200-yr lots. “1900” means anything from 1800-2000, and so does include some modern warming.
      But the resolution is far too low to look for anything reliable about the last few decades. This is a proxy based paper about the cooling stage. Information about the modern warming stage should be instrument based.

      • There have been coolings in between warming stages in the past. The modern warming is no different from the Medieval, Roman or Minoan warm periods, except it hasn’t been as warm as any of those. The LIA might however have been colder than the previous cool periods.
        The long-term (3000 to 5000 years) trend in temperature is still down.

    • The LIA did continue after c. AD 1700. It just bottomed around that time.
      The LIA helped cause the Industrial Revolution because deteriorating climate led to forests being cut down and their range shrinking. The British turned to mining coal, which led to the development of the steam engine, originally designed to drain mines. The first patent for a steam engine was issued by Spain to Jerónimo de Ayanz y Beaumont in 1606. In 1698, Thomas Savery patented a steam pump using steam in direct contact with the water to be drained. Thomas Newcomen’s “atmospheric engine” of 1712 was the first true commercial steam engine, ie using a piston to pump a mine. In 1781, James Watt patented an improved steam engine producing continuous rotary motion.

      • Well, to be fair, the main reason for the shortage of timber in the UK, and its increasing need for iron, was the series of wars against the French from 1700 to 1815. These may have been influenced by the cooler climate, but the main driver was geopolitical rivalry.
        The conflicts were in the form of periods of open warfare interspersed with the sort of cold war we were familiar with between the USA and the USSR. Battles were fought on almost every continent, with major campaigns fought in North America, Europe and India and skirmishes in Australasia and Africa.
        This required the felling of most of the forests in England to build ships and produce the charcoal to smelt iron. Coal had been mined on a small scale since at least the time of Queen Elizabeth I. It was shipped to London for domestic use. James Cook learned his trade on just such a vessel, and the ships he used for his explorations were converted coal carriers. They were built with a wide beam and flat bottom so they could load and discharge coal from the beach without needing a port.

  6. Notice that Fig 1 ends at 1900. Thus the cheat that “was reversed only in the most recent two centuries”. It appears to be the most recent two centuries only because the graph ends 100+ years ago, around the time the Industrial Revolution was only getting started.

  7. “they used climate models and they examined how the ocean surface reacted to factors like solar activity, changes in the Earth’s orbital patterns, land use, greenhouse gases and volcanic activity.”
    This says everything that anybody needs to know. Climate models don’t work out how these correctly respond to each variable; that is down to the modelers. It is a completely made-up construction based on the views of a few climate modelers. There are no observations directly used in the construction. Can they explain how, in the past, the planet was even slightly warmer than today while the oceans were cooling? Cooling oceans around Antarctica, using proper observations (satellite), show that the air temperatures nearby have cooled with them.

    • “There are no observations directly used in the construction.” Just to clarify, because I’m sure some will pick up on the 200-year blocks: reconstructions give a rough idea, not true observations.

      • Steve McIntyre has promised to post a redo using 20 year bins. He already posted his R code that reproduced the 200 year bin paper result. His result will be similar to SI figure 10. See comment downthread.

  8. Seeing the graph in the update, one thing is clear: the MWP was notably warmer than the last 200-y box.
    One is led to ask why they did not use at least 100-y resolution. They must have more than ten data points in each of the 57 proxies.
    Is there something about the last 100-y segment of the proxies that would cause embarrassment?

  9. So they looked at climate models which have been tuned to be sensitive to carbon dioxide and volcanic haze, and found that climate models are sensitive to carbon dioxide and volcanic haze. Congratulations.

  10. Also note that their models were sensitive enough to greenhouse gases to warm the planet due to the small amount of CO2 before the big 1950 increase. The models should go crazy after 1950.

    • The models should go crazy after 1950.

      You are so optimistic. They don’t go crazy because they added aerosols to counteract the overemphasis on CO2.
      Peter

  11. How on earth could they have ‘sea surface temperatures’ several hundred years ago????
    And yes, the error bars for the present are gigantic, much bigger than for the distant past, when no one was keeping records of ‘the world’s oceans’. Sea surface temperature varies greatly even today, and little of the actual surface area is ever really measured; it is assumed to be at various temperatures.

    • Obviously they used a computer model to ‘find’ the surface temperature.
      One of those machines that go ‘ping!’, a la Monty Python, no doubt.

  12. 1800 yrs of volcano induced cooling?
    Maybe, but what caused the changes from RWP to Dark Ages to MWP to LIA?
    Clearly there are many more factors involved. To try and link climatic changes to one factor is naive in the extreme.
    Still, I guess we ought to be eternally grateful that our forefathers ignored the 19thC equivalent of Greenpeace and carried on with the Industrial Revolution, otherwise we really would be up the proverbial creek without a paddle!
    Instead of the West paying “climate reparations”, perhaps we should ask for a reward!

  13. “To investigate the causes of this cooling – more robust between the years 800 and 1800 – they used climate models and they examined how the ocean surface reacted to factors like solar activity,”
    Aaaaahhh…climate models, is there anything they “can’t” prove?

  14. The main issue, of course, is how there can be 1800 years of volcanic activity large enough to change global temperatures. It only needs to stop for 2 or 3 years and the global temperature will recover to normal. This has been observed for real on 2 recent occasions, not in some fantasy model. I’m currently calling this paper a load of modeled nonsense. There are no historical accounts of almost continuous large volcanic eruptions affecting the planet. Observations show the only way to cool the planet long term is for large volcanic activity to keep going on and on.

  15. There is more. The data in the posted figure above was stuffed into 200 year bins, which McIntyre showed caused some serious problems.
    But, the SI figure 10 has 25-year bins from 1850 to 2000. Not paywalled. Figure 10 also parses the proxies several ways: tropics/extratropics, alkenones/MgCa, and so on. Guess what. No 20th century hockey stick in any of the parsings. Some show potentially meaningful cooling in the 20th century. Only one parsing shows meaningful warming, only in the 1975-2000 bin. Translation: the ocean proxies maybe aren’t so good. Or the pre-Argo ocean surface temp records maybe aren’t so good. Or both.
    When this study was announced in 2011, it was intended for inclusion in AR5. That obviously did not happen since the paper just appeared. SI figure 10 explains why. No statistical proxy evidence of ocean SST warming in the 20th century is a very inconvenient truth.

  16. P. Graham Mortyn is a member of the Ocean2k group within PAGES (Past Global Changes), which has over 75 members connected to a network of close on 600 highly-experienced scientists.

    They don’t know exactly how many members are in their group, but they expect us to believe that they know what the temperature was a thousand years ago.

  17. Based on the long Siple Dome Antarctic core, volcanism doesn’t appear to have changed much over the past more than 3000 years:
    JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 111, D12307, doi:10.1029/2005JD006072, 2006
    https://geoinfo.nmt.edu/staff/dunbar/…/kurbatov_ziel_dunbar_siple.pdf
    See Figure 2.

  18. It’s interesting how they claim the volcanoes were the cause of the cooling… yet they show a warming right around the time of Tambora, the largest eruption of the past 2000 years, in 1815… I suppose the effects of Tambora weren’t so pronounced in their study, while we know the rest of the world was freezing about that time…

  19. ROBUST is the new political buzzword; politicians often use it when they are panicking about the weakness of their position.
    I just googled ROBUST in Google News – these are the sort of statements you get:
    “The German finance minister Wolfgang Schaeuble was on the wires today stating the German economy is robust despite the global risks.” In other words, they are talking the German economy up because they are worried!
    Here is another:
    “It is also positive to see the average price received remaining robust… Investec however suggests the auction indicates that the emerald market was not completely insulated from the pains the diamond industry has been going through.” In other words, they are worried!

  20. I doubt any climate model can accurately simulate the effects of short and long term changes in stratospheric aerosols. El-Chichon and Pinatubo appear to have a net warming effect on the climate, not cooling.

    • “El-Chichon and Pinatubo appear to have a net warming effect on the climate, not cooling.”
      So why did global temperatures cool immediately after significant changes in stratospheric aerosols for both occasions before recovering again?

      • Cool immediately after? Compared to what? The other noise and oscillation clearly visible in the global anomaly? And what do you mean by “immediately”?
        You need to play “hunt the volcano”. All it takes is a good friend who goes here:
        http://www.woodfortrees.org/plot/hadcrut4gl/from:1850/to:2015
        Have him or her print the page, cut off the time scale and temperature scale, cut the temperature series into vertical (overlapping) strips 20 years wide. Then you get to play. In this time interval there have been a number of large — VEI 5 or 6 — volcanic eruptions. No fair looking them up, or even counting them. I promise, though, that there are more than 10. So use a vertical ruler and mark 10 lines on this graph that you think are evidence of volcanic cooling from major eruptions. In order to “win”, your line has to occur no more than two years after the eruption, no fair counting misses that precede the cause!
        Good luck!
        rgb
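rgb's "hunt the volcano" game can also be phrased as a simple programmatic check: for each candidate eruption year, ask whether the anomaly series reaches a lower value within the following two years. This is only a sketch of that idea; the eruption years and the toy anomaly series below are invented for illustration, not real HadCRUT4 values.

```python
def dip_within(anoms, start_year, eruption_year, window=2):
    """True if the series reaches a value below the eruption-year value
    within `window` years after the eruption (a post-eruption dip)."""
    i = eruption_year - start_year
    after = anoms[i + 1 : i + 1 + window]
    return bool(after) and min(after) < anoms[i]

# Invented anomaly series for 1880-1899, with a dip after a "VEI 6" in 1883
anoms = [0.00, 0.01, 0.02, 0.03, -0.10, -0.15, -0.05, 0.00, 0.02, 0.01,
         0.00, 0.03, 0.02, 0.01, 0.00, 0.02, 0.03, 0.01, 0.02, 0.00]
hit  = dip_within(anoms, 1880, 1883)   # a dip follows -> True
miss = dip_within(anoms, 1880, 1890)   # no dip follows -> False
```

The two-year window mirrors the rule in the game ("no more than two years after the eruption"); the `miss` case illustrates rgb's point that many eruptions are followed by no discernible cooling at all.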

      • El-Chichon cooled the climate compared to a strong El Nino; global temperatures would otherwise have been about 0.2 C higher. “Immediately” refers to when stratospheric aerosols significantly increased, as shown below.
        https://imagizer.imageshack.us/v2/696x330q90/r/836/had3vso2vsaot.png
        Pinatubo also occurred during an El Nino, so global temperatures should have risen, not cooled. There were warm NINO 3.4 SSTs during both these periods, so both periods should have warmed more than they did.

      • You need to play “hunt the volcano”. All it takes is a good friend who goes here:

        That’s a pretty amusing way of doing it.
        A more formal method would be to compare the signal to that of the appropriate feedback level of auto-correlated noise and see if you can find the signal using wavelets. The Null Hypothesis is that your signal is just due to random autocorrelated noise, and this is entirely testable using Monte Carlo simulations.
        This is a pretty good paper that does this. In the paper, when it comes to SST, the ENSO signal is the only thing that crosses the 95% confidence interval. Nothing else does. No volcano signal in SST. Certainly not CO2…
        http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.28.1738
        Unfortunately they took down the cached copy in the last week and the links are broken. I saved off a copy. It’s at:
        https://www.dropbox.com/s/lw1kzdfjw0ifcdo/10.1.1.28.1738.pdf?dl=0
        The general writeup of the method on the author’s website is here:
        http://paos.colorado.edu/research/wavelets/
        Peter
        PS: If there’s interest I’m going to attempt to apply the same analysis to surface temperatures. Alas, the wavelet tools in Octave are poor; I’ll have to write my own.
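The null-hypothesis test described in this exchange can be sketched without wavelets, using plain AR(1) "red noise" surrogates: generate many noise series matched to an assumed lag-1 autocorrelation and ask how often pure noise produces a dip as deep as the one attributed to an eruption. All the parameters here (the autocorrelation, noise scale, and the "observed" dip) are illustrative assumptions, not values from the cited paper.

```python
import random

def ar1_series(n, phi, sigma, rng):
    """AR(1) 'red noise' surrogate: x[t] = phi * x[t-1] + N(0, sigma)."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        out.append(x)
    return out

rng = random.Random(42)
observed_dip = -0.45           # hypothetical post-eruption cooling, in C
n_trials, n_years = 1000, 30
# How often does pure red noise produce a dip at least this deep?
exceed = sum(min(ar1_series(n_years, 0.6, 0.1, rng)) <= observed_dip
             for _ in range(n_trials))
p_value = exceed / n_trials
```

A small `p_value` would mean a dip that deep is unlikely to be noise; a large one means the "volcanic signal" is indistinguishable from the series' own autocorrelated wander, which is the point being argued above.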

      • A more formal method would be to compare the signal to that of the appropriate feedback level of auto-correlated noise and see if you can find the signal using wavelets. The Null Hypothesis is that your signal is just due to random autocorrelated noise, and this is entirely testable using Monte Carlo simulation.

        I already did most of the former a different way — simply trying to fit HadCRUT4 to a model for a temperature response that is a function of VEI with an exponential tail. Note well, fit all of it, back to 1850, using all VEI 4 or higher events. One doesn’t expect this to be perfect, as VEI is related to total ejecta mass but doesn’t include things like the chemistry of the eruption — Mt. St. Helens was VEI 5 in 1980 and produced nary a blip in spite of blowing out half a mountain with the equivalent of a 5 MT nuclear explosion; El Chichon was VEI 5 and produced a barely visible (oddly lagged) response probably because it was a very high sulphur emitter. In spite of being an order of magnitude weaker than the Pinatubo/Mt. Hudson double whammy in 1991, it produced almost as large a response in TOA total insolation as measured at the top of Mauna Loa.
        The result was NOT perfect. It was far from perfect. It barely improved a fit made without this extra information, in spite of substantially increasing the number of parameters. The problem was, and remains, twofold. First, the best fit underpredicted just about all events. This is barely understandable — as noted above, some eruptions have the chemistry, and project their material up to the right heights, to have a (barely) noticeable effect; others do not. One would probably need a five- or six-parameter model and more information than will ever be available to fit just this one historical record, and even then there would still be a lot of misses. Second, it is a simple fact that many eruptions, including large eruptions, produce absolutely no effect. By no effect, I mean that the temperature rises all the way through a VEI 5 or 6 eruption, not just that it sometimes remains flat. The improvement from the fit was very small, because even the modest improvement in the fit relative to smooth in the case of a few eruptions, e.g. Pinatubo, was reduced by the fact that the best fit went the wrong way a substantial number of times.
        Here is the best fit:
        http://www.phy.duke.edu/~rgb/cCO2-to-T-volcano.jpg
        You can clearly see the best fit VEI 6 events and can just barely discern the smaller VEI 5 events or clusters of 4s. Of the 4 VEI 6’s, 3 produced a crudely synchronous dip — often with a strange lag — while 1 had no discernible effect. The 4’s and 5’s have basically no correlation with dips. Sometimes the temperature goes up after a 4 or 5, sometimes it remains nearly constant, sometimes it plunges as much or more than it does for a 6 and stays down far longer (note the peculiar cluster in the early 1900s). It is a bit of a stretch — literally — to associate the small dip after El Chichon with the volcano, as it is both lagged by a full year or more and because that dip is even larger than the supposed Pinatubo dip. Finding correlation here is more about wishful thinking than reality.
        My own conclusion from this best fit is this: VEI 6 and up volcanoes, it can be argued, have a very small effect on global average temperature. This effect has an average (best fit!) amplitude of less than 0.1 C (yes, sometimes it is as much as 0.2 C, but then, sometimes it doesn’t affect temperature at all, or is cancelled by other things going on, and no doubt sometimes it is added to by other things going on as well). This graph shows nothing if it doesn’t show lots and lots of stuff going on — natural cycles with amplitudes as high as 0.3 C bumping global average temperature or causing it to plummet in as little as one or two years, and not just a few of them. The climate fairly clearly has a somewhat chaotic set of natural oscillations with amplitudes on the order of 0.1 C and periods on the order of 3 to 7 years that heterodyne and shift, so that it is the exception rather than the rule that the temperature persists within 0.1 C of any value for decades. The “pause” we are now in is just such an exception — there is exceptionally little variance in the record after the super El Nino in 1997 with its associated 0.3 C bump. Nearly everywhere else, the temperature shifts by as much as 0.2 to 0.3 C up or down within 5 years of any starting point.
        This is why I state that quantitatively there is a serious signal-to-noise problem even for statistical algorithms trying to detect a volcanic response. IMO one simply cannot detect any response whatsoever from VEI 5’s on down, although one MIGHT if one added another dimension or three (but then one hits the dread “fitting an elephant and making him wave his trunk” problem, especially with such a short period to be fit). Any signal has an amplitude of a few hundredths of a degree and a lifetime of a couple of years, and is utterly swamped by the ~0.2 C natural oscillations that (since we cannot predict them) constitute “noise” on top of a straightforward log concentration model. VEI 6 volcanoes probably do have an effect, but seeing it is as difficult as “seeing” the pattern inside a random dot stereogram — you have to kind of cross your eyes and squint a bit. And would you ever see it — or even suspect it is there — looking only at the data? I mean, who looks at a picture of a bunch of paint splatters on a canvas and crosses their eyes and squints? Seriously!
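The signal-to-noise claim is easy to sanity-check numerically. Below is a minimal, hypothetical sketch (not the actual fit discussed above; every parameter value is an assumption chosen only for illustration): a synthetic 165-year anomaly with roughly 0.15 C of natural "noise" and a single 0.1 C exponentially decaying volcanic dip. A naive before/after estimator gets the sign of the dip wrong a substantial fraction of the time.

```python
import numpy as np

def make_series(n_years=165, dip_amp=0.1, dip_year=120, tau=1.5,
                noise_sd=0.15, seed=0):
    """Toy annual anomaly: gentle secular trend + natural 'noise' +
    one exponentially decaying volcanic dip (all values assumed)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years)
    trend = 0.005 * t                      # ~0.8 C of warming over the record
    noise = rng.normal(0.0, noise_sd, n_years)
    dip = np.where(t >= dip_year,
                   -dip_amp * np.exp(-(t - dip_year) / tau), 0.0)
    return t, trend + noise + dip

def dip_estimate(t, y, dip_year=120, window=3):
    """Naive estimator: mean anomaly in the `window` years after the
    eruption minus the mean in the `window` years before it."""
    before = y[(t >= dip_year - window) & (t < dip_year)].mean()
    after = y[(t >= dip_year) & (t < dip_year + window)].mean()
    return after - before

# Across many noise realizations, the estimated "dip" has the wrong sign
# (apparent warming through the eruption) a substantial fraction of the time,
# even though the true dip is there by construction.
estimates = np.array([dip_estimate(*make_series(seed=s)) for s in range(500)])
wrong_sign = (estimates > 0).mean()
```

On average the estimator does recover a cooling, but any single realization is unreliable, which is the "needle in a haystack" point made above.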
        In the meantime, the point of hunt-the-volcano (and my own personal best fit to a volcanic model) is that while people and papers — like the top article — are free and easy with their assertions that volcanism in particular or aerosols in general have a major negative impact on all-things-equal global average temperatures, this simply is not true. VEI 6 and up volcanoes are rare, and below this any effect that they might have, collectively or singly, is simply invisible against the predominant secular effect plus natural variation — trebly so since we may or may not know the full spectrum of natural variation or have all of the physics right that drives it, and certainly cannot extract any components on the order of the length of the thermometric record. And even VEI 6 volcanoes produce a dip that is all but gone in less than 5 years.
        Good Hunting!
        rgb

      • The volcano signal is not there in the overall record because only 2 eruptions since the 1980’s have been strong enough to significantly affect SAOT levels. Both lasted no longer than about 3 years, so a change to global temperatures no bigger than ENSO’s shows up like a needle in a haystack. Cooling is observed with these 2 volcanoes only once the ENSO signal is removed, which deepens the apparent dips. It is the ENSO-removed record that establishes that these two volcanoes cooled the atmosphere.

      • rgbatduke:

        It is a bit of a stretch — literally —

        Your wiggles in places where there are no eruptions are larger than the wiggles where you have eruptions. So yeah, quite a stretch.
        BTW, what makes you think the trend is due to ln(CO2) rather than noise? You have no evidence to support that. That also could be noise. Remember autocorrelated noise has energy proportional to 1/frequency. Also remember that you can’t see multi-hundred-year-period energy unless you actually collect the data…
        If you look at Figure 3 in the paper I cited you’ll note that the 95% confidence interval is much wider (i.e. less constraining) the lower the frequency of the signal. The trend is at the lowest possible frequency, and therefore the most likely to be consistent with the Null Hypothesis “it’s just noise”.
        Peter

      • BTW, what makes you think the trend is due to ln(CO2) rather than noise? You have no evidence to support that. That also could be noise. Remember autocorrelated noise has energy proportional to 1/frequency. Also remember that you can’t see multi-hundred-year-period energy unless you actually collect the data…

        My friend, I do not think “have no evidence to support that” means what you think it means, not when I present a picture with 165 years of temperature anomaly with a ln(cCO2) + T_0 curve drawn literally right down the middle of the data. That is substantial evidence to support it. No evidence would be historical data that did not scatter neatly around the log curve.
        Is the data “sufficient” to prove the hypothesis? Of course not. For one thing, as you note, it is only a fairly short period of time, as these things go. For a second (which I didn’t show on this graph but do on others presenting the same fit), the acknowledged error bars of HadCRUT4 range from 0.1 C on the modern side to around 0.3 C in the 19th century (where I seriously doubt the honesty of the latter estimate). For a third, HadCRUT4 (and all of the other anomaly estimates) suffers from having been repeatedly “adjusted” in ways that almost always cool the past and/or warm the present, so that a substantial amount of the warming presented is due to data adjustment that may or may not be free from the biases that plague even the best-intentioned of data adjusters with a hypothesis to prove or an axe to grind. For a fourth, HadCRUT4 doesn’t adjust for UHI at all (IIRC) and IMO UHI adjustments would obviously oppose and at least partly cancel the many “warming” adjustments made. For a fifth, there are many alternative hypotheses that might also explain the data, including the cumulation of many kinds of noise or the inclusion of neglected physics or features of nonlinear dynamics and deterministic chaos (as opposed to “noise”). For a sixth, the log CO2 model probably won’t work over the last 2000 years, maybe not even over the last 1000 years, although the global anomaly from before the invention of thermometers is known only by means of (contentious and inconsistent) proxies of various sorts and has even more uncertainty than the already substantial and probably underestimated uncertainty of thermometric estimates in the 19th century. At some point this uncertainty is so great that we literally don’t even know what we are supposed to be fitting.
        None of which means that the graph above is not evidence in favor of the hypothesis that CO2 warms the climate according to the theoretically predicted and easily understood log concentration rule. Seeing this curve should — if you are unbiased and/or have managed to avoid making up your mind about it on non-objective grounds — take whatever degree of belief you had on the issue before you saw it and increase your degree of belief in the hypothesis “CO2 has caused some or all of the warming over the last 165 years” and decrease your degree of belief in the alternative hypotheses, including whatever you wish to assert is the null, not the other way around. Honestly, the hypothesis fits the data pretty damn well, and objections from my long list above and more besides notwithstanding, that fact alone makes the hypothesis more likely to be true, not neutral, not less likely. How much more likely is difficult to quantify, but given that the hypothesis I implement is the no-feedback no-lag pure mean field theory applied to the Earth as an open thermal system, I’d say that it is very substantial evidence that at least part of the warming observed (which is itself, note well, highly uncertain given the acknowledged error bars — it could literally be half of the median claim) is due to the increase in CO2 as a greenhouse gas.
        Note well that I am far from alone in my personal belief that this is the case. Virtually all of the physicists including the most skeptical of them who speak or write on the issue agree on it, because one can walk through a derivation of the prediction and it makes sense. It also has considerable uncertainty because parts of the computation cannot be done precisely and the approximations used introduce uncertainty, as well as the fact that it is one component of a strongly coupled complex nonlinear chaotic double Navier-Stokes open system and so mean field results in general are certainly doubtable or could be overwhelmed or augmented by nonlinearities and feedbacks within the complex system.
        The point is that the sensitivity implied by this damn good empirical fit is 1.8 C per doubling of CO2. This is a far cry from the median IPCC estimate, and is IMO more likely to be an overestimate rather than an underestimate because of the high probability of warming bias corrupting the anomalies (in a most non-scientific estimation process). But I don’t think that all of the warming in any of the major anomalies is due to adjustments or bias, and think that the case for anthropogenic CO2 contributing to the observed warming is a strong one, just as I think (at this point) that the argument for strong cooling due to anthropogenic or natural aerosols is a weak one. With this one correction — weak to neutral cooling due to aerosols — GCMs predict much more modest (and IMO more likely accurate) climate sensitivities, in the 1 to 2 C range. This is not overtly inconsistent with skeptic Lindzen and Choi’s estimate of around 1 C.
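The arithmetic connecting the fitted slope to the quoted 1.8 C per doubling is just β·ln 2, since T(2C) − T(C) = β·ln 2 for any starting concentration C. A hypothetical sketch of the fit, using synthetic data rather than HadCRUT4 (the β value is chosen here to reproduce that sensitivity, and the linear CO2 ramp is a crude stand-in for the real record):

```python
import numpy as np

# Synthetic stand-in for 165 years of CO2 and anomaly data (assumed values).
rng = np.random.default_rng(0)
years = np.arange(1850, 2015)
co2 = np.linspace(285.0, 400.0, years.size)      # ppm, crude 1850-2014 ramp
beta_true = 1.8 / np.log(2.0)                    # chosen so sensitivity = 1.8 C
anomaly = beta_true * np.log(co2 / co2[0]) + rng.normal(0.0, 0.1, years.size)

# The one-parameter-plus-offset fit: T = beta * ln(cCO2) + T0.
beta_fit, t0_fit = np.polyfit(np.log(co2), anomaly, 1)

# Per-doubling sensitivity follows directly from the log rule:
# T(2C) - T(C) = beta * ln(2), independent of the starting concentration.
sensitivity = beta_fit * np.log(2.0)
```

With ~0.1 C of noise on 165 points, the least-squares fit recovers β (and hence the sensitivity) to within a few percent, which is why a log fit drawn "down the middle" of the data pins the implied sensitivity fairly tightly.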
        Don’t get me wrong. I am very much aware of the work of Koutsoyiannis on Hurst-Kolmogorov statistics in non-stationary climate processes, and yes, it is by no means certain that some, or all, of the observed 165 year warming is not due to the cumulation of either random or biased Hurst-Kolmogorov shifts. One can certainly run models to show that this hypothesis is capable of explaining the data, but alas those same models show that there is no more than a modest probability that the particular pattern observed would be produced, with alternatives showing less robust cumulation of temperature rather more likely. The statistical models alone also are not easily made consistent with causal models or dynamical models (even simple ones) for the open system — the connection to the fluctuation-dissipation theorem, for example, is very difficult to imagine. The best one can say is that chaos is causing fluctuations that behave like colored noise or the like, which is descriptive but not very useful or testable.
        rgb

      • PS: If there’s interest I’m going to attempt to apply the same analysis to surface temperatures. Alas, the wavelet tools in Octave are poor; I’ll have to write my own.

        Actually, the matlab sources on his website will probably port into Octave with at most a few tweaks. Octave is usually at least 96 or 97% compatible with matlab, as long as the matlab code doesn’t call/use proprietary toolboxes. At a glance, he is distributing raw .m files and not a toolbox.
        I’m not convinced that wavelet transforms are going to yield a lot more information than a straight up FT, BTW. They may suppress certain kinds of artifacts, and they are arguably better for picking out oscillatory autocorrelation that otherwise randomly shifts phase, but the wavelet spectrum he presents for ENSO in region 3 is strongly reminiscent of what Vukcevic posted last week as the FFT for HadCRUT. There is a lot of power in the 3 to 6 year range, with probable peaks at 3 and 5 years but a lot of other stuff going on that is likely “noise” associated with nearly random short run phase shifts. It was interesting to see that the wavelets picked up what appeared to be an 11 year peak that is plausibly associated with the solar cycle. I strongly suspect that these features are persistent in an analysis of the global anomalies and while they are insufficient to predict future short-term oscillations, they do give one something to explain and connect to physics via fluctuation-dissipation.
        However, I fail to see the connection between the paper you put into dropbox and the question of whether or not CO2 is an excellent fit to the GASTA (global average surface temperature anomaly). It clearly is relevant only to the noise. Neither FFTs nor wavelets can squeeze blood from a stone, or fourier components with period equal to the length of a timeseries from that timeseries — all you get is an artifact. Indeed, you cannot take seriously components HALF, or A QUARTER, the length of the timeseries — there simply aren’t enough samples and the probability of artifacts or accidents is high. That’s why one cannot take the easily fit 67 year sinusoid in the GASTA superimposed on top of CO2 warming seriously — there are only a couple and change oscillations, not nearly enough for it to be anything but a possible accident in a fourier analysis of any sort.
        The 11 year peak, that is believable. The 3 and 5 year peaks are also strongly characteristic across the periods of strong ENSO, which are also periods of strong warming with the usual long period correlation to named multidecadal oscillations, e.g. the SOI. But one is still a good step or three away from explaining global warming as being caused by ENSO, as it is just as easy to say that ENSO has been enhanced by global warming and merely self-organizes it into bursts. Lacking a quantitative, plausible physical model for either one, these are nothing more than attractive sets of words that cannot easily be tested.
        rgb

      • rgbatduke responded to my comment:

        You have no evidence to support that.

        I apologize, that was far too strongly worded. I should have said “you have insufficient evidence”. I’m usually more careful in my phrasing, sorry about that.
        You presented all my skepticism in a very well written reply, which I mostly agree with. I learned a lot from your reply and really appreciate it. I have some more reading to do now…

        Koutsoyiannis on Hurst-Kolmogorov statistics in non-stationary climate processes

        I would love to get some references on that. Please post some, I’m still learning a lot in this area.
        best regards,
        Peter

      • rgbatduke wrote:

        I’m not convinced that wavelet transforms are going to yield a lot more information than a straight up FT, BTW.

        Sorry, I should be clearer. I meant the part of the paper where they test the hypothesis that there is a signal against the Null Hypothesis of “it’s noise”. As you point out, you need at least 4 periods to ascertain anything about low frequency signals. A long term trend like CO2 is as low frequency as it gets…
        I think what could be done, for which I have first-draft code, is to find out what the 95% confidence intervals are for trends in autocorrelated processes. What I’ve done is create a Monte Carlo simulation of autocorrelated noise that is 8x the data length of the surface records, and then look at trends in the middle of that noise that are the length of the surface station records (e.g. Hadcrut4, GISS). Then I look at the distribution of trends in that Monte Carlo simulation, and compare to the trends in the reported surface station data. I can calculate confidence intervals that say “95% of autocorrelated noise signals have trends between these two values”. This is somewhat similar to the idea in that paper, but applied to trends instead of frequency or wavelet analysis.
        My early draft results are that GISS is slightly above the 95% confidence level, and Hadcrut4 just below, both on the positive side. This suggests to me that there is some signal that exceeds that of random noise, but not by much. The signal is probably CO2, given the lack of other obvious trends (here the solar enthusiasts will probably chime in…). However it’s not an extremely strong signal. Part of the signal is also likely dry-labbing on the part of Hadcrut and GISS as well…
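The Monte Carlo procedure described above can be sketched in a few lines. This is a simplification of what the commenter describes (AR(1) noise rather than whatever model was actually used, record-length segments rather than 8x-length series, and all parameter values assumed):

```python
import numpy as np

def ar1_noise(n, phi, sd, rng):
    """AR(1) process x[t] = phi*x[t-1] + eps[t]; phi and sd are assumptions."""
    x = np.zeros(n)
    eps = rng.normal(0.0, sd, n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

def trend_interval(n_years=165, n_sims=2000, phi=0.9, sd=0.1, level=95.0):
    """Monte Carlo interval for least-squares trends (units per year) in
    pure autocorrelated noise of the surface record's length."""
    rng = np.random.default_rng(42)
    t = np.arange(n_years)
    trends = [np.polyfit(t, ar1_noise(n_years, phi, sd, rng), 1)[0]
              for _ in range(n_sims)]
    lo, hi = np.percentile(trends, [(100 - level) / 2,
                                    100 - (100 - level) / 2])
    return lo, hi

lo, hi = trend_interval()
# An observed trend outside [lo, hi] is unlikely (at the 95% level) to be a
# fluke of this particular noise model; that is the comparison being made
# against the GISS and Hadcrut4 trends.
```

The interval is roughly symmetric about zero, and it widens as phi grows, which is the "autocorrelated noise can masquerade as a trend" point in numerical form.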
        I clearly need to do more reading on how to deal with this “it could be noise” null hypothesis beyond the paper I posted. I’d love to get some references from you on how to pursue this idea. I’m using this forum to further my education on signal processing and statistics, it’s a fun way to do it.
        BTW, I have a strong philosophical guard against “well it must be something”. I believe that “it’s not knowable” is a perfectly valid result in science. Hypotheses in science can exist in 5 states: “proven correct”, “proven wrong”, “unconfirmed”, “not provable given current data”, and “it’s not even wrong”. The CO2 influence for me is sitting on the borderline between “not provable with current data” and “proven correct”.
        You can guess which category I think most of the catastrophe predictions lie…
        best regards,
        Paul

      • Sorry, I should be clearer. I meant the part of the paper where they test the hypothesis that there is a signal against the Null Hypothesis of “it’s noise”. As you point out, you need at least 4 periods to ascertain anything about low frequency signals. A long term trend like CO2 is as low frequency as it gets…

        A monotonic trend has no (meaningful) frequency. Hence using Fourier analysis to extract its components is largely a waste of time on any timescale. Indeed, it contributes a whole frequency spectrum of components only because harmonic waves form a basis for the vector space of continuous functions (including monotonic ones), not because it has a “frequency”.
        On the other hand, the fit to log CO2 above most assuredly is not noise on the timescale presented. Rather, there is clear evidence of noise around this physically motivated signal. Fourier analysis is numerical data in search of a cause and even with the FT or wavelet analysis in hand, one has to guess what MIGHT be producing what appears to be a signal at any given frequency unless one has a theory that can be used to reduce the dimensionality of the fit.
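The point about a monotonic trend smearing across the entire Fourier basis is easy to demonstrate. A generic illustration, not tied to any particular dataset:

```python
import numpy as np

n = 256
t = np.arange(n)
ramp = 0.01 * t                  # a pure monotonic "trend", no oscillation

# FFT of the demeaned ramp: power appears at *every* frequency bin,
# concentrated in the lowest ones, purely because sinusoids form a basis
# for such functions, not because the ramp "has" those frequencies.
power = np.abs(np.fft.rfft(ramp - ramp.mean())) ** 2
```

Every nonzero bin carries power (falling off roughly as 1/k^2 for a ramp), so reading any single spectral peak of a trend as a physical "frequency" is exactly the artifact described above.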
        This is why wavelets and fourier analysis are relevant to the short period stuff — if nothing else, they characterize the short time/transient response rates of the climate, which is actually shockingly important because of the Fluctuation-Dissipation theorem, and which is shockingly neglected in climate science and modelling, although there is some very recent work by Majda et al. that IMO shows considerable promise, with the sole complaint that it is focusing on low-frequency response. The primary dissipative modes are clearly high frequency stuff like the peaks observed in both wavelet and Fourier transforms, and should really be studied with Laplace transforms and autocorrelation methods, or with truly complex and somewhat ad hoc methods that attempt to identify and project out specific physics and observed physically motivated frequencies — arguably, for example, the 11 year cycle, the log CO2 response, and possibly the VEI 6+ volcanoes — and focus on the short period fluctuations that remain.
        As I pointed out above, the fit I present is a no-lag, no-feedback fit to log CO2. It works more or less perfectly (perfectly within clearly visible, consistent short period noise). This is a serious problem for models that claim that there is a lot of “uncommitted warming” that is lagged by 30 to 100 years, or models that argue that there is or will be a twofold or better amplification of the warming due to water vapor feedback. Response could be lagged, no doubt, but this makes things more complicated and is thus not the default assumption as long as an unlagged model works. Similarly, there could be water vapor feedback (both ways!) but if so, it seems to have no impact on the overall log character of the response, which empirically must include all feedbacks. So 1.8 C/doubling could be interpreted as the climate sensitivity including all feedbacks — perhaps CO2 alone would have only produced 0.8C or 0.9 C — and if there is any uncommitted warming in some sort of invisible “pipeline”, it is cumulating and releasing in precisely the right way for it to be invisible, buried inside a nearly perfect log CO2 total response.
        Wavelet analysis is relevant to ENSO, the SOI, perhaps (marginally!) to the PDO, to the NAO, AO, etc. Aside from ENSO, the multidecadal oscillations are at the edge of the fourier resolution from a 165 year sample, almost 2/3 of which predate truly global observation with reliable instrumentation. We are half a century to a century away from having enough, reliable enough, data to be able to resolve many of the theoretical hypotheses out of which the enormously complex model of climate must be built. In the meantime, we are stuck with unconfirmed guesses, beautiful theories, political arguments, and a whole lot of confirmation bias as people who want the world to wean itself from coal and oil use the specter of climate catastrophe to bolster their otherwise hard-to-sell arguments, and people who want to use coal and oil no matter what argue that increasing CO2 has no effect whatsoever and besides, we aren’t responsible for the increase.
        The evidence strongly supports the assertion that we are largely responsible for the increase in CO2. The evidence strongly, but not conclusively, supports the assertion that the increase in CO2 has contributed to the general rise in global average temperature over the last 165 years. The evidence is absolutely clear that over this period, the net effect of increased CO2 both directly and as a side effect of producing cheap energy on human culture and society has been so overwhelmingly positive that it would not be far short of the mark to assert that the entire development of human civilization and its wealth and prosperity and contributions to the general weal has depended on it. Coal based energy lights the night and fuels our factories, and the CO2 released has only improved the Earth’s climate — so far — and has almost certainly been a tremendous blessing for the CO2-starved biosphere — so far. The evidence so far does not support the assertion of any sort of looming “climate catastrophe”, although neither does it rule such a thing out.
        It is difficult to choose a prudent course, given the evidence. But it is absolutely impossible to choose a prudent course when all of the evidence and reason applied to the problem is horribly distorted before being presented to the public (by both sides) to lead to a foregone conclusion (in both directions). One side ignores the tremendous benefits of coal based power, plus the inconvenient truth that we can live without it at the moment only at the expense of the greatest depression the world has ever known times ten. The other side ignores the fact that increasing CO2 could, at some level, create serious problems for the Earth’s biosphere and its human component. Prudence suggests — perhaps — that it would indeed be desirable to gradually shift global energy production into long term sustainable forms, simply because coal and oil are limited resources and we are in some sense squandering them, if not because of the risks associated with altering ocean chemistry and/or introducing unpredictable shifts into the chaotic maelstrom of the climate. Prudence demands — for certain — that this transition should be slow and should proceed at a rate consistent with the economics of poverty and the inevitable advances of technology.
        Should we be burning coal at an increasing rate to make electricity, especially for the world’s poorest people, as fast as we need to, ten to twenty years from now? Almost certainly. Should we still be doing so in 2050, or 2100? Almost certainly not. Should we continue to invest heavily in the development of alternative/sustainable energy resources? Of course, a no-brainer. Should we push the subsidized installation of those alternatives when they are clearly not economical, when the technology needed to support them is immature, when by doing so we divert money from the many other burning issues that confront us (like world peace, world education, world poverty, and the need to support research that might MAKE the alternatives the LESS expensive option in the long run)?
        I think not. Alternative energy sources are great, to the precise extent that they make economic sense. When something is economically optimal, one doesn’t have to force people to adopt it. The evidence of long term future costs is weak and at least partly trumped up, and analyses of cost-benefit of using coal in particular as an energy source that rely on failed computer models to predict expensive catastrophes and that ignore the obvious net present benefits are not convincing. On the other hand e.g. solar technologies are very close to maturing and will very likely become the least expensive alternative over the next decade or two at the most. Technological breakthroughs in energy storage would change everything. Development of thermonuclear fusion or LFTR would change everything. One cannot make enormously expensive decisions to implement immature technologies or the future will laugh at you even as the con artists who will line up to take your money thank you.
        rgb

      • On the other hand, the fit to log CO2 above most assuredly is not noise on the timescale presented.

        We are mostly in agreement about wavelet and Fourier analysis. I’ve abandoned that approach for the reasons you stated.
        What about the idea that the trend could be due to auto-correlated noise? Or some other process, such as whatever ended the LIA? I’m pretty sure I can create autocorrelated noise that, over some period of the time series, replicates a ln() function. After all, I’ve already done that for the GISS temperature record…
        Peter

      • What about the idea that the trend could be due to auto-correlated noise? Or some other process, such as whatever ended the LIA? I’m pretty sure I can create autocorrelated noise that, over some period of the time series, replicates a ln() function. After all, I’ve already done that for the GISS temperature record…

        Of course this is possible. But is it probable? That’s a very, very, difficult question!
        For one thing, to get the particular pattern observed at all, one has to make a small mountain of assumptions about the character of the autocorrelation time(s). Then one has to run many sequences (for an assumed pattern that works at all), because unbiased noise will drive temperatures down according to some smooth function (for a sufficiently short segment of the data) as likely as up, and most of the time will take three steps up and two down, pursuing a drunkard’s-walk \sqrt{N} deviation from any starting point, as likely up as down, across a long enough timescale.
        Some fraction of the sequences will be close enough to your desired pattern that you can count them as “evidence” that the autocorrelated noise you’ve selected could be the cause of the warming observed. Yes, it requires the moral equivalent of flipping heads 25 out of 30 times or the like and hence is not “likely” but it is possible. If you choose just the right noise, you might get it down to within the p = 0.05 cutoff that “cannot be rejected” in a standard hypothesis test (although it is interpreted in many other ways, sadly).
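Both intuitions above, how improbable 25 heads in 30 flips actually is and the \sqrt{N} drift of an unbiased walk, take only a few lines to check (standard library only; the values are illustrative):

```python
import math
import random

# Exact probability of 25 or more heads in 30 fair flips: "possible",
# but a very bad bet.
p_25_of_30 = sum(math.comb(30, k) for k in range(25, 31)) / 2 ** 30

def rms_displacement(n_steps, n_walks=2000, seed=7):
    """RMS endpoint of an unbiased +/-1 drunkard's walk; grows ~ sqrt(N)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        pos = sum(rng.choice((-1, 1)) for _ in range(n_steps))
        total += pos * pos
    return (total / n_walks) ** 0.5

# Quadrupling the number of steps roughly doubles the RMS displacement,
# i.e. displacement scales as sqrt(N), not N.
ratio = rms_displacement(400) / rms_displacement(100)
```

The tail probability comes out near 1.6e-4, which is the quantitative content of "the moral equivalent of flipping heads 25 out of 30 times".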
        Of course, this is precisely the argument used by the purveyors of the GCMs, which are de facto doing precisely this. They are running a strongly autocorrelated Markov process, one that does not conserve energy in the first place with constant forcing, let alone reflect actual energy imbalance in the open system, so it would drift in just this way if not renormalized to some sort of guess at energy conservation that also leaves the supposed forcing residual. The result is that any given GCM produces an enormous (shockingly so, if you’ve ever seen it) spread in possible futures for a tiny random perturbation of initial conditions and parameters for a fixed forcing schedule. Sometimes it warms a lot. Sometimes it doesn’t really warm at all. Sometimes it cools. Then they average all of these futures together and make some sort of claim for this being predictive, even though per model the observed trajectory of the planet is out there on the wrong side of the p = 0.05 cutoff for the ensemble of trajectories for a majority of the models in CMIP5. Then they (worse still) average all of the CMIP5 trajectories together and show the collective envelope of the perturbed parameter ensemble average trajectories to fool the eye into thinking that the observed trajectory is within their acceptable predictive range.
        And here’s the real tragedy. They do this without the slightest shred of support from the combined theories of mathematics and statistics! There are no mathematical theorems I’m aware of that prove that the envelope of trajectories from a highly approximated, multivariate nonlinear chaotic model solved at an absurdly large integration stepsize that doesn’t even preserve detailed balance in a stationary system has some necessary relation with the actual trajectory of a real system that is evolving consistently and with perfect balance all the way down to the subnuclear scale. And in any event, averaging multiple failed models does not produce a successful model in any theory of statistics of which I am aware. It certainly could work out by chance, just as you can show that random chance can produce 25/30 heads and hence could be an explanation of why you are losing your shirt to a stranger who talked you into betting in a train station.
        The question is — what are the (true, unbiased) odds? Which is globally more likely, 25/30 heads or meeting a stranger in a train station who likes to wager on coin flips who just happens to possess a coin or coins that produce a surplus of heads when flipped?
        This is why one should stick with Ockham’s Razor. The model I present above (leaving out the volcanic bit) is effectively a one parameter model. Yes, it has to match scales, with the arbitrary zero of the model lined up with the arbitrary zero of the anomaly, but this is a trivial shift up or down. The curvature is all \beta \log(CO_2). You expect this behavior on purely physical grounds, you look, and there it is. This isn’t betting with a stranger in a train station. This isn’t playing a much more elaborate game of chance with the GCM modelers. This is betting that our understanding of quantum mechanics, fluid dynamics, and radiation theory is not egregiously wrong as a simple mean field theory! Sure, there is noise, but the fit above is really, really good, and it requires nothing but the short-period quasi-periodic variations of the climate (the reliable part of the wavelet or fourier transforms) to model the observations acceptably. It is literally a lot more likely to be correct than competing explanations, especially ones that rely heavily on random chance to obtain the observed (comparatively unlikely) pattern of warming in the first place.
        With all that said, you should google up Koutsoyiannis’ papers (readily available on the web, for the most part). He works through the statistics of all of this a lot better, and shows that a lot of the climate’s variation can be explained — in the sense that it cannot be rejected — as a cumulation of delta-correlated punctuated equilibrium shifts, with or without an underlying bias. There is some evidence for this in the actual climate — the current “pause” being part of it. There was a “sudden” shift up from 1985 to 1997-1998, followed by a near plateau of temperatures. There was a similar shift pattern in 1930 to 1945, followed again by a near plateau of temperatures. There are similar, but both smaller and longer, shifts and plateaus visible in some of the VERY long time hydrological records Koutsoyiannis originally studied. One can see the same thing in Bob Tisdale’s SST curves, where it is very pronounced and where the shifts are often strongly correlated with ENSO activity either way. In other words, there is a possible causal basis underlying the statistical analysis — one isn’t just mixing up a particular recipe of autocorrelated noise that could explain the climate, one is proposing that e.g. multidecadal oscillations cause step shifts in a locally stable equilibrium set point, and that even if these shifts are random they can easily cumulate to produce a near-monotonic shift up or down over a century or two, however unlikely it is that this will persist indefinitely without help in the form of a (e.g. CO2 linked) bias. Where “easily” is understood to be strictly decreasing in probability the more times in a row heads are flipped without the bias.
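The cumulation of punctuated-equilibrium shifts can be sketched with a toy model. All parameters here (shift probability, step size, record length, the 0.8 C threshold) are assumptions chosen only to illustrate the point, not taken from Koutsoyiannis' papers:

```python
import numpy as np

def step_series(n_years=165, p_shift=0.08, step_sd=0.15, bias=0.0, seed=None):
    """Punctuated-equilibrium toy: each year the 'set point' jumps with
    probability p_shift by a Gaussian step (optionally biased, e.g. by CO2)."""
    rng = np.random.default_rng(seed)
    jumps = rng.random(n_years) < p_shift
    steps = rng.normal(bias, step_sd, n_years) * jumps
    return np.cumsum(steps)

def prob_net_warming(threshold=0.8, n_sims=5000, **kwargs):
    """Fraction of realizations ending at least `threshold` above the start."""
    ends = np.array([step_series(seed=s, **kwargs)[-1] for s in range(n_sims)])
    return (ends >= threshold).mean()

# Unbiased shifts *can* cumulate to ~0.8 C of net "warming", but rarely;
# a small per-shift bias makes the same outcome far more likely.
p_unbiased = prob_net_warming()
p_biased = prob_net_warming(bias=0.05)
```

This mirrors the "cannot be rejected, but improbable without a bias" reading above: the unbiased walk occasionally produces the observed pattern, while a modest bias changes the odds dramatically.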
        Reality is, IMO, very likely to be a mix of these two things. The “local equilibria” being punctuated can be interpreted generically as a field of (locally stable) strange attractors of the complex highly multivariate nonlinear chaotic dynamics. In such a dynamical system, one expects the system to shift “randomly” (chaotically) between an orbit around any given attractor (an orbit producing, BTW, the short period spectra we started by discussing) to an orbit around a neighboring attractor after some number of cycles. However, climate is doubly non-stationary — the field of attractors itself is non-stationary. CO2 concentration variation (and, quite possibly, other things) may well shift the field of attractors, or “tip” the underlying contour of the stable points to make it somewhat more likely to shift to a warmer orbit than to a cooler one. The bitch is then to try to separate the inseparable — to infer the underlying dynamical structure of the inaccessible attractors from the chaotic progression of a single trajectory through an abstract space of almost inconceivable complexity, where there are almost certainly orbital fields of attractors distributed around superattractors that correspond, for example, to glacial/interglacial states, that are getting slowly pushed up or down by some truly long time scale dynamics.
        Imagine shooting a pinball onto a vast smooth sheet. Underneath the sheet there is a grid of pins attached to the sheet that can more or less smoothly push it up here, pull it down there. The pins themselves are mounted on a second sheet with larger pins that can push up whole fields of smaller pins. Those pins are also mounted on a sheet with larger pins still, all the way down through an unknown number of steps to a sheet with only two very large pins that very, very, slowly move up or down relative to one another.
        The trajectory of that pinball is, quite probably, our climate with only orbital, non-anthropogenic variation, or would be if the actual climate wasn’t strongly non-Markovian on top of everything else. Anthropogenic CO2 is slowly adjusting the legs of the pinball table itself, tipping it slowly over towards the warm-phase side of the table. But we have no good way to determine the rate of the tip, because we do not know how to predict the motion of one, single, layer of the underlying pins that are constantly shifting the topology of the top sheet on which the climate actually runs, swirling around a dimple a half dozen times before popping up over the lip and into a neighboring dimple, with curvature that moves it round super-dimples and super-duper-dimples ad nauseam.
        To understand this, we have 50 years or so of halfway decent data — some would argue only 35, some might argue for even less, only 15 or 20 — plus another century-plus of increasingly corrupt and incomplete instrumental data, and before that only low-resolution, high-error proxy-based guesses about the trajectory of the pinball in the more remote past. Frankly, we don’t stand a chance. And we won’t stand a chance to make progress until somebody acknowledges the incredible difficulty of the task, difficulty that may require more patience than the three-year grant cycle, tenure-based-on-non-null-results dominated world of scientific research is capable of permitting.
        Nobody wants to hear this, but I think it is very plausible that it will take fifty to a hundred more years of accurate measurements made with instrumentation we haven’t even thought of yet to get a handle on the data needed to understand the climate at a meaningful resolution. And even then it will be impossible as long as political and confirmation bias runs rampant in the field and distorts judgement with its religious save-the-world zeal.
        rgb

  21. Any explanation at all is touted loudly except the actual one, which doesn’t support the ideological agenda.

  22. Ha. They fart in our general direction.*
    57 —> Heinz —> Beans —> Gas
    * Carefully** recycled from the original Monty Python.
    ** Well as careful as “climate scientists” are.

  23. I made a point here some years ago that I felt that it was at least as likely that the rise in temperature actually caused the Industrial Revolution. Someone wrote an eloquent reply agreeing with me.

  24. Reading between the lines of the actual abstract, a better way to restate their conclusions might be this:

    Climate simulations are tuned so that there are only two important parameters: Greenhouse gas warming and aerosol cooling. Cooling or warming due to orbital variation is nearly irrelevant across time scales of 1000-2000 years. We fixed greenhouse warming and orbital variation was too slow to explain the cooling. We couldn’t credibly reduce GHG concentrations as there is data on that in ice cores. We instead turned up the aerosol cooling until that worked. Then we needed a source for the aerosols, so we hypothesized a high frequency of volcanic eruptions as being responsible. But then the pesky 1800’s rolled around, starting out with a VEI 7 bang in the form of Tambora. Once again, we couldn’t explain the warming just by turning down aerosols, so we started to turn up GHG forcing and again, that worked!

    Sadly, if the authors had actually read e.g.
    http://ftp.pages-igbp.org/download/docs/books/Paleoclimate_global_change_and_the_future/d-chapter2.pdf
    in particular the section near the end concerning atmospheric aerosols, they would have realized that we have direct evidence of aerosol concentrations due to e.g. vulcanism over the last 1000 years and they are flat.
    Oops. Oh, and not only are they flat across that interval, they then go up sharply starting in roughly 1900, from a baseline of roughly 50 ppm to a peak well over 150 ppm. That is, they are somewhere between two and three times their value across the interval they claim was cooled by excess vulcanism. NO_2 is also nearly flat, although for technical reasons it is a spottier record and it starts going up almost parallel with CO_2 in roughly 1800.
    Finally, they missed the recent paper that recomputed the probable climate sensitivity to aerosols and concluded that aerosols have almost no effect on climate. Climate models in general have overestimated the cooling effect of aerosols because they only have two knobs on any relevant time scale within the late Holocene. GHGs warm, aerosols cool. In order to make this so, they had to make aerosols a major cooling factor in order to reproduce the temperature decreasing-to-doldrums observed in the 1940 to 1975 stretch, because CO2 was steadily increasing across that interval but the temperature was not! Fortunately, pollution controls instituted in the 1970s started to decrease aerosols, which apparently kicked off superfast warming by the mid-80’s in this two-knob model of the climate. It wasn’t until the doldrums returned after only 15 years around 2000 that it became increasingly clear that high matched sensitivity to CO2 and aerosols (in opposite directions) just wouldn’t fly, because again CO2 marched up, faster than before, and temperature has peskily refused to cooperate in spite of a general and continuing reduction in aerosols, both man-made and volcanic. This is pretty direct evidence that there is a lot more going on than two-knob models are capable of resolving.
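    The two-knob degeneracy is easy to show with a toy model. All of the series and coefficients below are invented for illustration only; the point is that a high-sensitivity/strong-aerosol model and a low-sensitivity/weak-aerosol model can be tuned to match over a short reference window, then diverge wildly once aerosols fall while CO2 keeps rising:

```python
import numpy as np

# Toy "two-knob" model: anomaly = S*log(CO2/C0) - A*aerosol, where S is
# the GHG warming knob and A the aerosol cooling knob. All series invented.
years = np.arange(1945, 1976)
t = years - 1945.0
co2 = 310.0 + 0.9 * t                   # synthetic CO2 ramp, ppm
aer = 0.50 + 0.012 * t                  # synthetic aerosol index

def model(S, A, co2, aer):
    return S * np.log(co2 / 310.0) - A * aer   # offset is arbitrary

# Choose the strong model's aerosol knob so the window trends match.
log_slope = np.polyfit(t, np.log(co2 / 310.0), 1)[0]
A_strong = 0.10 + (4.5 - 1.5) * log_slope / 0.012

weak = model(1.5, 0.10, co2, aer)
strong = model(4.5, A_strong, co2, aer)
# After removing the arbitrary offsets, the two are nearly identical here.
gap_ref = np.abs((weak - weak.mean()) - (strong - strong.mean())).max()

# Project forward with rising CO2 and falling aerosols: the fits diverge.
f_years = np.arange(1976, 2006)
f_co2 = 310.0 + 0.9 * (f_years - 1945) + 0.02 * (f_years - 1976) ** 2
f_aer = aer[-1] - 0.010 * (f_years - 1975)
gap_future = np.abs(
    (model(1.5, 0.10, f_co2, f_aer) - weak.mean())
    - (model(4.5, A_strong, f_co2, f_aer) - strong.mean())
).max()
print(gap_ref < 0.02, gap_future > 0.2)
```

    The reference window alone cannot tell the two knob settings apart, which is exactly why cancellation of opposing terms lets sensitivity be overestimated.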
    Current estimates of aerosol cooling indicate that it was overestimated by as much as a factor of three in the GCMs in CMIP5. In order to reproduce the temperature increase across the reference period (which happened to be the one stretch of strong warming visible in the last 75 years!) by means of cancellation of opposing terms, they had to overestimate climate sensitivity by about the same amount. Again, there is no real question about this — we have very good data on aerosols over the last century, and we have the direct observations of the variation of atmospheric transparency at Mauna Loa as well, which show that even major (VEI 5 and up) volcanic events have almost no effect on top-of-troposphere insolation, and what little effect they have is confined to 1 to 3 years after the cessation of eruption and is often hemispheric and not global. The temperature record is poorly correlated with volcanic activity, where again major eruptions can have a small effect (comparable to the rate of natural fluctuation and hence almost indistinguishable from it) for a couple or three years (which is also comparable to the timescale of natural fluctuation and hence is almost indistinguishable from it). The indistinguishability is easily proven by playing Willis’ game “Hunt the volcano” — trying to guess when major eruptions occurred by looking at the global temperature anomaly without any date scale and within panels of time that are deliberately scrambled. IMO it is impossible to beat random chance, but play for yourself if you doubt me.
    So this would be a great paper if its primary conclusion were not already disproven by the copious data already available! Which makes it a less than great paper, and an even worse press release.
    Accepting that the paper does demonstrate a general slow cooling of the oceans over 1000 years or so, it fails to explain either the MWP or the LIA, both (now) generally accepted as being true global events. The latter in particular cannot be explained by vulcanism, as a glance at sulphates as a function of time in Greenland ice cores would demonstrate, if they bothered to look. It is also the case that land temperatures can countervary or move independently of SSTs, as SSTs have been remarkably flat outside of jumps associated with ENSO events while the global anomalies have more or less continuously increased.
    What all of this suggests — to me — is that global climate models are pitifully inadequate to explain the past climate record. There are more than two knobs, and it isn’t clear that they even have the sign of the feedbacks and direct effects right in all cases. There are almost certainly missing knobs. Finally, some of those missing knobs are very likely to be linked to pure chaos and self-oscillation — the spontaneous appearance of dissipative oscillators in a nonlinear driven open system. Given a number of these oscillators operating at the same time, there are bound to be periods where they heterodyne and others when they cancel each other out, periods of multiple decades to centuries. Since there is little reason to think that the modes themselves are stable or even persistent on century to millennium time scales, the quasi-periods of these oscillations can abruptly shift. IMO this is by far the best explanation for e.g. D-O or Heinrich events, and even explains why they are not strictly periodic but subject to abrupt changes in period or phase or disappearing altogether for a few cycles.
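    The heterodyne-and-cancel behavior of a pair of internal oscillators is simple to see in a toy signal. The periods and amplitudes here are purely illustrative, not fitted to any real oscillation:

```python
import numpy as np

# Two internal oscillations with nearby periods (60 and 70 years, purely
# illustrative) alternately reinforce and cancel. Their sum shows
# multidecadal stretches of apparent warming, plateau, and cooling with no
# external forcing at all; the envelope waxes and wanes on the beat period.
t = np.arange(0.0, 840.0)                       # years
mode_a = 0.2 * np.sin(2 * np.pi * t / 60.0)
mode_b = 0.2 * np.sin(2 * np.pi * t / 70.0)
signal = mode_a + mode_b

# Beat period of the pair: 1 / (1/60 - 1/70) = 420 years.
beat_period = 1.0 / (1.0 / 60.0 - 1.0 / 70.0)
print(round(beat_period))   # 420
```

    Two modest decadal modes thus generate century-scale structure for free; if the modes themselves drift or die, that structure shifts abruptly, as with D-O or Heinrich events.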
    Note well that this doesn’t rule out still other missing physics. One interesting feature of self-oscillation is that it is often “greedy” (or sensitive to signals on its input) and will tend to slave to and amplify even small fourier variations in their input if they are close to the natural frequencies they are prone to self-generate for physical reasons. If you like, they can be subject to selective “resonance” and amplify signals that are by themselves too small to take seriously into comparatively large variation in state. Transistors and tubes and photodiodes are good examples of this being set up deliberately (and are easy to turn into oscillators; with even a little feedback, they often do). The Earth has a huge energy budget flowing through it on a daily basis, on average taking almost all of a TOA insolation that varies from 1325 or so watts/m^2 up to 1415 watts/m^2 as the Earth orbits the sun and reflecting, absorbing, and reradiating it away to outer space. It has obvious strong fourier signals superimposed on this forcing, and has a complex surface structure of sinks and sources. The energy transport that results is rich with self-oscillation, from winds that blow, then pause, then blow, then pause, to waves that peak every seventh wave as they march towards the shore, to the spectrum of slow and fast gyrations in thermohaline oceanic transport, to the named atmospheric multidecadal oscillations.
    Climate models can accurately capture almost none of this. What they never seem to show, in these papers that assert that “climate models prove this or that”, is that the actual output of most climate models is a wide bundle of completely disparate “future possible climates” produced from a tiny spread in initial conditions or parameter values, which in any event are not precisely knowable, let alone known. These future climates are then averaged, and it is this average that is supposedly a valid predictor or hindcaster of our actual climate. But it isn’t! There is no mathematical or empirical support for this implicit assertion. The average of many chaotic trajectories produced by a chaotic system is not a chaotic trajectory and will in all probability not correctly represent the self-oscillation modes that dominate actual true trajectories.
    That is, a given GCM might — with luck — spontaneously produce one or more multidecadal oscillations. They probably will not correspond in frequency or amplitude or location to the actual PDO, ENSO, AMO, AO, NAO, monsoon, etc, but there will be something. Any realistic climate will have features like these. But the average of many climates with features like these will not have these required features! And the features matter, because they are directly related to the nonlinear efficiency of energy transport and hence to the non-stationary “equilibrium” temperatures maintained in the open system. As is often noted, even a small shift in oceanic or atmospheric circulation patterns can dramatically affect energy dissipation or retention in the system and result in warming or cooling much greater than the supposed and miscomputed effect of GHG/aerosol forcing within the system. By averaging this away, one arrives at a “climate trajectory” that is truly meaningless, doubly so given the many other absurdities in the GCMs associated with the mismatched spatiotemporal scales of the dynamics, the non-uniformity of the grid usually used, and the lack of knowledge of the physics and forcings, which is papered over with many assumptions that frankly cannot be proven or even meaningfully tested.
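    The averaging point can be demonstrated with any chaotic map; here is a minimal sketch using the logistic map as a stand-in for a chaotic climate trajectory (all parameters are illustrative):

```python
import numpy as np

# The chaotic logistic map x -> r*x*(1-x) at r = 4. An ensemble launched
# from a tiny spread of initial conditions decorrelates within a few dozen
# steps; the ensemble mean then settles into a smooth curve that no
# individual trajectory ever follows.
r, n_steps, n_ensemble = 4.0, 60, 500
x = 0.3 + 1e-9 * np.arange(n_ensemble)   # tiny spread in initial conditions

traj = np.empty((n_steps, n_ensemble))
for step in range(n_steps):
    x = r * x * (1.0 - x)
    traj[step] = x

mean_path = traj.mean(axis=1)
# Late in the run the ensemble mean barely moves, while any single member
# still swings across most of [0, 1].
print(round(float(mean_path[-20:].std()), 3), round(float(traj[-20:, 0].std()), 3))
```

    The ensemble mean is smooth and nearly constant, but it is not a trajectory of the map at all; no realization behaves like it, which is the objection to treating multi-model means as climate predictions.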
    The final conclusion can only be that politics has once again intruded on science. The paper could have left off all of its assertions of cause based on GCMs and presented only the data, and it would have been good observational science. Instead, it presents a conclusion that contradicts the actual data on vulcanism, and hence is probably wrong before it is even in print; it throws in some good old alarmism right before a major meeting intended to spend a substantial fraction of the world’s income, disposable or not, for the indefinite future and to the definite enrichment of many of the most “interested” parties; and it floats conclusions in the public arena that are absent even from the paper itself, safely insulated from scientific criticism. It is one more example of Science, Done Wrong.
    It is just sad.
    rgb

    • Excellent.
      Also, the location of a volcano influences its effect on the atmosphere. In general, however, even when there is cooling, it’s usually short-term, i.e. years at most. The longer-term effect is generally warming.
      There is so much wrong with this paper that it’s hard to know where to start.

    • Nice. The paper now reminds me of the scientists that have been predicting increasing trends in extreme weather due to climate change, evidently without realizing that NOAA keeps very good records of those hurricanes and tornadoes they keep on predicting, and they don’t show an increasing trend.

    • I am late in replying, but thanks for this nuanced analysis.
      For those wondering if your “two knob” take is correct… I’ve been looking at, and thinking about, models and simulations for a while. I had already noticed that Hansen seemed to think the only significant influences on global temperature are stratospheric aerosols and atmospheric trace gases, that is GHGs, as far back as 1988 (or as recently as 1988, depending on one’s frame of reference).
      http://pubs.giss.nasa.gov/abs/ha02700w.html
      If he added any other knobs in his follow-up 1998 paper, I can’t tell. ISTM that he was mostly interested in exercising his confirmation bias here.
      http://pubs.giss.nasa.gov/abs/ha01100t.html

  25. Sorry, I meant “they are flat over the interval from 1000 to 1900 CE”, relevant to the paper’s assertion that they caused the cooling from 800 to 1800.
    rgb

  26. Looking at the graph, the rate of warming looks pretty regular and steady from 1700 to 1900 to me.
    A few points.
    1. How was there enough extra man-released CO2 in the atmosphere to have any effect back then?
    2. Why has the warming rate been pretty much the same since 1700?
    3. Could this just be a small natural rebound from the Little Ice Age cooling?
    4. If this is CO2-driven ocean warming, would it not kick in later, when the increase from 280 ppm was greater? Was it even enough in 1900 to have much influence at all?
    Looks more like an argument for natural variation in temperature to me.

  27. No, article, you are wrong, and what I have written below is correct; it applies to the last 7000 years or so, including the last 1800 years.
    If one goes back to the Holocene Optimum the question is how fast is the earth cooling?
    Since the Holocene Optimum 8000 years ago, the earth has been in a gradual overall cooling trend which has continued up to today, punctuated by spikes of warmth such as the Roman, Medieval and Modern warm periods.
    The main drivers of this are Milankovitch Cycles, which were more favorable for warmer conditions 8000 years ago in contrast to today, with prolonged periods of active and minimum solar activity superimposed upon this slow gradual cooling trend, giving the spikes of warmth I referred to above and also periods of cold such as the Little Ice Age.
    Further refinement to the climate comes from ENSO, volcanic activity, and the phase of the PDO/AMO, but these are temporary, earth-intrinsic climatic factors superimposed upon the general broader climatic trend.
    All the warming the article refers to which has happened since the end of the Little Ice Age, is just a spike of relative warmth within the still overall cooling trend due to the big pick up in solar activity from the period 1840-2005 versus the period 1275-1840.
    Post 2005, solar activity has returned to minimum conditions, and I suspect the overall cooling trend in global temperature which has been in progress for the past 8000 years will exert itself once again.
    We will be finding this out in the near future, due to the prolonged minimum in solar activity that has been in progress since 2005.

  28. Something I mentioned over at Climate Audit, which was promptly and appropriately squelched by Steve, is that we had better hope that the uptick of ocean temperature is predominantly caused by nature, for if man is responsible, we have chosen a feeble and inadequate method to sustain the trend. Only if nature has bounced us off the dire coldth do we have much hope of longer avoiding the chilling cliff at the end of the Holocene. The Little Ice Age was the coldest depths of the Holocene, and we are at half precession.
    ============================

    • But I’ll take any little effort which wisps the Coldisch Moth away from its Great Glacial Attractor.
      =========

  29. Great. It was warmer 2000 years ago and mankind did not die. History tells us that the Roman age was quite prosperous.
    Pity that it is difficult to refer to figure 1 in the original source without showing the political speculation. The picture alone is quite pale. I would like to use the original source in local discussion forums.

  30. The rapid 3–5 year warming recovery following large volcanic events conclusively shows that atmospheric temperatures are very insensitive, which blows the CAGW hypothesis to pieces, as that hypothesis assumes hypersensitivity involving runaway feedback loops from very tiny changes in IR flux.
    The CAGW alarmists are desperately hoping for a large volcanic event, which they can blame for almost 20 years of flat global temperature trends; i.e. “Our models worked perfectly but those cursed volcanoes screwed everything up.”
    Not so much..

  31. Those have to be the widest error bars I’ve ever seen in a published paper.
    Well, still it’s important to publish papers that fail the Null Hypothesis test. There’s too much bias in favor of high confidence papers in general. Note the trouble the medical world and psychology world got into with this.
    Peter.

  32. Even if their climate models are correct, shouldn’t we be grateful? Today we are living longer and healthier, with more food than ever before. Cold is not necessarily better… these folks have a warm phobia.

    Abstract – October 1998
    Kenneth J. Hsu
    Sun, climate, hunger, and mass migration
    …Northern Europe was wetter while the middle- and low-latitude lands were more arid during colder epochs. Both sets of cold climatical conditions were unfavorable for agricultural production. Historical records show that large demographic movements in history took place because of crop failures and mass starvation, rather than escaping from war zones. The “wandering” of the Germanic tribes during the first two or three centuries of the Christian Era is one example. Whereas the accelerated release of carbon dioxide from the burning of fossil fuels is ultimately to cause global warming, historical evidence indicates, however, that global warming has been on the whole a blessing to mankind. Global cooling, on the other hand, has curtailed agricultural production and has led to famines and mass migrations of people….
    Doi: 10.1007/BF02877737
    http://link.springer.com/article/10.1007/BF02877737
    ————————
    Abstract – 1984
    Hubert H. Lamb
    Some Studies of the Little Ice Age of Recent Centuries and its Great Storms
    …And so the series gives us our most reliable estimate of the magnitude of the temperature depression in England and neighbouring countries. In northern Scotland, southern Norway and Iceland there are indications of a significantly greater depression of the prevailing temperatures…..The enhanced thermal gradient between latitudes about 50° and 60–65°N in this part of the world is thought to have provided a basis for the development of some greater wind storms in these latitudes than have occurred in most of the last 100 years…
    [Climatic Changes on a Yearly to Millennial Basis 1984, pp 309-329]
    doi: 10.1007/978-94-015-7692-5_34
    ————————
    Abstract – 1999
    Wolfgang Behringer
    Climatic Change and Witch-Hunting: The Impact of the Little Ice Age on Mentalities
    …..During the late 14th and 15th centuries the traditional conception of witchcraft was transformed into the idea of a great conspiracy of witches, to explain “unnatural” climatic phenomena……extended witch-hunts took place at the various peaks of the Little Ice Age because a part of society held the witches directly responsibile for the high frequency of climatic anomalies and the impacts thereof. The enormous tensions created in society as a result of the persecution of witches demonstrate how dangerous it is to discuss climatic change under the aspects of morality.
    Doi: 10.1007/978-94-015-9259-8_13
    http://link.springer.com/chapter/10.1007/978-94-015-9259-8_13
    ————————
    Abstract – 2004
    Oster, Emily
    Witchcraft, Weather and Economic Growth in Renaissance Europe
    ….The most active period of the witchcraft trials coincides with a period of lower than average temperature known to climatologists as the “little ice age.” The colder temperatures increased the frequency of crop failure, and colder seas prevented cod and other fish from migrating as far north, eliminating this vital food source for some northern areas of Europe (Fagan, 2000).
    DOI: dx.doi.org/10.1257/089533004773563502
    ————————
    Abstract – 2000
    Reiter P.
    From Shakespeare to Defoe: Malaria in England in the Little Ice Age
    …Until the second half of the 20th century, malaria was endemic and widespread in many temperate regions, with major epidemics as far north as the Arctic Circle. From 1564 to the 1730s—the coldest period of the Little Ice Age—malaria was an important cause of illness and death in several parts of England. Transmission began to decline only in the 19th century,…
    *Centers for Disease Control and Prevention – Volume 6, Number 1—February 2000 – Perspective
    ————————
    Abstract – 2002
    Otto S. Knottnerus
    Malaria Around the North Sea: A Survey
    …Malaria may have been introduced into the North Sea Basin in late Antiquity. It has been endemic at least since the 7th century, but its high-days were the Little Ice Age. After 1750 the disease steadily declined until it disappeared in the 1950s. ….
    doi: 10.1007/978-3-662-04965-5_21
    ————————
    Abstract – 1980
    AB Appleby
    Epidemics and famine in the little ice age
    …The frequent crises were caused by famine, epidemic disease, and war, sometimes working in combination, sometimes not….France was especially subject to famine in the seventeenth and early eighteenth centuries, with terrible crises falling in 1630-1631, 1649-1652, 1661-1662, 1693-1694, and 1709-1710….
    [transcribed by me]
    http://www.jstor.org/discover/10.2307/203063?uid=2&uid=4&sid=21103592744971
    ————————
    Abstract – 2013
    S. Engler et al
    The Irish famine of 1740–1741: famine vulnerability and “climate migration
    The “Great Frost” of 1740 was one of the coldest winters of the eighteenth century and impacted many countries all over Europe. The years 1740–1741 have long been known as a period of general crisis caused by harvest failures, high prices for staple foods, and excess mortality……We regard migration as a form of adaptation and argue that Irish migration in 1740–1741 should be considered as a case of climate-induced migration.
    doi: 10.5194/cp-9-1161-2013, 2013
    ————————
    Abstract – 1998
    M.D. Flannigan et al
    Future wildfire in circumboreal forests in relation to global warming
    Despite increasing temperatures since the end of the Little Ice Age (ca. 1850), wildfire frequency has decreased as shown in many field studies from North America and Europe. We believe that global warming since 1850 may have triggered decreases in fire frequency in some regions and future warming may even lead to further decreases in fire frequency….
    DOI: 10.2307/3237261
    ————————
    Abstract – 1993
    Yves Bergeron, Sylvain Archambault
    Decreasing frequency of forest fires in the southern boreal zone of Québec and its relation to global warming since the end of the ‘Little Ice Age’
    doi: 10.1177/095968369300300307
    ————————
    Abstract – 2006
    J.M. Russell, T.C. Johnson
    Little Ice Age drought in equatorial Africa
    …….A high ratio of Mg to Ca (%Mg) indicates strong droughts in central Africa during the Little Ice Age (A.D. 1400–1750), in contrast to records from Lake Naivasha, Kenya, which suggest a wet Little Ice Age. This spatial pattern in Africa likely arose due to coupled changes in the high latitudes, the position of the Intertropical Convergence Zone, and the El Niño–Southern Oscillation (ENSO) system. Our results further suggest that the patterns and variability of twentieth-century rainfall in central Africa have been unusually conducive to human welfare in the context of the past 1400 yr.
    doi: 10.1130/G23125A.1
    ————————
    Abstract – 2005
    Climate change, social unrest and dynastic transition in ancient China
    Dian Zhang et al
    …this study adopted a scientific approach to compare the paleoclimatic records with the historical data of wars, social unrests, and dynastic transitions in China spanned from the late Tang to Qing Dynasties. Results showed that war frequency in cold phases was much higher than that in mild phases. Besides, 70%–80% of war peaks and most of the dynastic transitions and nationwide social unrests in China took place in cold phases. …
    [Note: Qing dynasty 1644 to 1912]
    doi: 10.1007/BF02897517
    ————————
    Abstract
    Glynn, Peter W. et al
    A dead Central American coral reef tract: Possible link with the Little Ice Age
    …These analyses showed that live coral reefs in the Gulf of Papagayo, Costa Rica, were severely depleted in number, size and variety of species, compared to reefs in the major upwelling zone of the Gulf of Panama. Coral growth in the Gulf of Papagayo consisted mainly of dead reefs that died from 150–300 years B.P….
    doi: dx.doi.org/10.1357/002224083788519740
    ————————
    Abstract – 1979
    Great Historical Events That Were Significantly Affected by the Weather: 4, The Great Famines in Finland and Estonia, 1695–97
    …It is estimated that in Finland about 25–33% of the population perished (Jutikkala, 1955; Muroma, 1972), and in Estonia-Livonia about 20% (Liiv, 1938)….Records indicate that in the absence of an appropriate diet, the population consumed unwholesome and partly or fully indigestible ‘foods’ which led to widespread diseases and epidemics (diarrhea of sorts, including lientery, dysentery, etc.). There were even some cases of cannibalism,…
    doi: dx.doi.org/10.1175/1520-0477(1979)0602.0.CO;2
    ————————
    Abstract – 2007
    James M. Russell et al
    Spatial complexity of ‘Little Ice Age’ climate in East Africa:
    sedimentary records from two crater lake basins in western Uganda
    …Variations in sedimentation and salt mineralogy of hypersaline Lake Kitagata, and a succession of fine-grained lake sediments and peat in the freshwater Lake Kibengo, suggest century-scale droughts centred on AD 0, ~1100, ~1550 and 1750. These results broadly support data from nearby Lake Edward on the timing of drought in western Uganda, but contrast with lake sediment records from eastern equatorial Africa….
    doi: 10.1177/0959683607075832
    ————————
    Letter To Nature – 1993
    Large increases in flood magnitude in response to modest changes in climate
    James C. Knox
    …Here I present a 7,000-year geological record of overbank floods for upper Mississippi river tributaries in mid-continent North America,……..After ~3,300 years ago, when the climate became cooler and wetter, an abrupt shift in flood behaviour occurred, with frequent floods of a size that now recurs only once every 500 years or more. Still larger floods occurred between about AD 1250 and 1450, during the transition from the medieval warm interval to the cooler Little Ice Age….
    doi: 10.1038/361430a0
    ————————
    Abstract – 1983
    Jean M. Grove et al
    Tax records from western Norway, as an index of Little Ice Age environmental and economic deterioration
    Data from general tax commissions held in Sunnfjord Fogderi, Norway, reveal a substantial decline in rural prosperity between 1667 and 1723. Late seventeenth and eighteenth century incidence of serious physical damage to farmlands is documented in tax relief proceedings. Environmental deterioration characterised the early years of the Little Ice Age in western Norway.
    Doi: 10.1007/BF02423522
    ————————
    Abstract – 2004
    Richard H. Steckel
    New Light on the “Dark Ages” The Remarkably Tall Stature of Northern European Men during the Medieval Era
    …..It is plausible to link the decline in average height to climate deterioration; growing inequality; urbanization and the expansion of trade and commerce, which facilitated the spread of diseases; fluctuations in population size that impinged on nutritional status; the global spread of diseases associated with European expansion and colonization; and conflicts or wars…..
    doi: 10.1215/01455532-28-2-211
    ————————
    Abstract – 2005
    David A. Hodell et al
    Climate change on the Yucatan Peninsula during the Little Ice Age
    …Climate change in the 15th century is also supported by historical accounts of cold and famine described in Maya and Aztec chronicles. We conclude that climate became drier on the Yucatan Peninsula in the 15th century A.D. near the onset of the Little Ice Age (LIA). Comparison of results from the Yucatan Peninsula with other circum-Caribbean paleoclimate records indicates a coherent climate response for this region at the beginning of the LIA. At that time, sea surface temperatures cooled and aridity in the circum-Caribbean region increased.
    doi: 10.1016/j.yqres.2004.11.004
    ————————
    Abstract – 2011
    David D. Zhang et al
    The causality analysis of climate change and large-scale human crisis
    …Results show that cooling from A.D. 1560–1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis….Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere.
    doi: 10.1073/pnas.1104268108
    ————————
    Abstract – 1997
    L.K. Barlow et al
    Interdisciplinary investigations of the end of the Norse Western Settlement in Greenland
    …Historical climate records, mainly from Iceland, contain evidence for lowered temperatures and severe weather in the north Atlantic region around the mid-fourteenth century. Archaeological, palaeoecological and historical data specifically concerning the Western Settlement suggest that Norse living conditions left little buffer for unseasonable climate, and provide evidence for a sudden and catastrophic end around the mid-fourteenth century….
    doi: 10.1177/095968369700700411
    ————————
    Paper – 2008
    Phil Jones
    Historical climatology – a state of the art review
    River Thames freeze-overs (and sometimes frost fairs) only occurred 23 times between 1408 and 1814 (Lamb, 1977) when the old London Bridge constricted flow through its multiple piers and restricted the tide with a weir. Figure 1 shows the character of Old London Bridge with its many arches and obstructions to flow….
    Weather, Special Issue: Historical Climatology, Volume 63, Issue 7, pages 181–186, July 2008
    DOI: 10.1002/wea.245

  33. The oceans mediate the response of global climate to natural and anthropogenic forcings.
    The usual atmosphere-centric BS. Oceans do not “mediate”; they drive the climate fluctuations that are always happening naturally. The 97 percent value that folks are so fond of actually applies to the climate heat budget – 97% in the ocean, 3% in the atmosphere. All that ocean heat is not passive and static but agitated in chaotic-nonlinear mixing processes. Thus the circulating global oceans serve up continuous climate change.
    It is wrong to imagine that climate is driven by the 3% rather than the 97%, i.e. by the atmosphere via processes like volcanoes and anthropogenic emissions. Volcanoes are a fart in a thunderstorm – posts by Willis and others have shown how grossly exaggerated the supposed climate effect of volcanoes is. Stories about a “year without a summer” don’t stand up to scrutiny. Their effects are minor and short-lived unless they reach the scale of supervolcanoes or a flood basalt.
    The ocean does not “mediate” climate; it drives it, and the atmosphere mediates ocean oscillations.
    And finally, please stop using that “F” word in regard to climate. Climate is not forced from outside; it forces itself by its own internal dynamics (although there can be weak periodic forcing from astrophysical cycles).
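The 97%/3% heat-budget split claimed in the comment above can be sanity-checked with a back-of-envelope heat-capacity calculation. The constants below are rough textbook values (my assumptions, not figures from the post); on raw heat capacity alone, the split comes out even more lopsided than 97/3:

```python
# Back-of-envelope: ocean vs. atmosphere share of the climate system's
# heat capacity. All constants are approximate textbook values.

OCEAN_MASS_KG = 1.4e21        # approx. total mass of the oceans
OCEAN_SPECIFIC_HEAT = 3990.0  # J/(kg*K), seawater, approx.

ATMOS_MASS_KG = 5.1e18        # approx. total mass of the atmosphere
ATMOS_SPECIFIC_HEAT = 1005.0  # J/(kg*K), dry air at constant pressure

ocean_capacity = OCEAN_MASS_KG * OCEAN_SPECIFIC_HEAT
atmos_capacity = ATMOS_MASS_KG * ATMOS_SPECIFIC_HEAT

ocean_share = ocean_capacity / (ocean_capacity + atmos_capacity)
print(f"Ocean share of combined heat capacity: {ocean_share:.1%}")
print(f"Ocean/atmosphere capacity ratio: {ocean_capacity / atmos_capacity:.0f}x")
```

The widely quoted ~90–97% figures typically refer to where accumulated heat has gone, not this raw capacity ratio, but either way the ocean dominates by orders of magnitude.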
