BBC – Real risk of a Maunder minimum 'Little Ice Age'

The sun right now – showing increased activity over the last couple of weeks
From BBC’s Paul Hudson

It’s known by climatologists as the ‘Little Ice Age’, a period in the 1600s when winters across the UK and Europe were frequently severe.

The severe cold went hand in hand with an exceptionally inactive sun, a period known as the Maunder solar minimum.

Now a leading scientist from Reading University has told me that the current rate of decline in solar activity is such that there’s a real risk of seeing a return of such conditions.

I’ve been to see Professor Mike Lockwood to take a look at the work he has been conducting into the possible link between solar activity and climate patterns.

According to Professor Lockwood, the late 20th century was a period when the sun was unusually active, and a so-called ‘grand maximum’ occurred around 1985.

Since then the sun has been getting quieter. 

By looking back at cosmogenic isotopes in ice cores, he has been able to determine how active the sun has been over thousands of years.

Following analysis of the data, Professor Lockwood believes solar activity is now falling more rapidly than at any time in the last 10,000 years.

He found 24 different occasions in the last 10,000 years when the sun was in exactly the same state as it is now – and the present decline is faster than any of those 24.

Based on his findings he’s raised the risk of a new Maunder minimum from less than 10% just a few years ago to 25-30%.

And a repeat of the Dalton solar minimum of the early 1800s, a period which also had its fair share of cold winters and poor summers, is, according to him, ‘more likely than not’ to happen.

He believes that we are already beginning to see a change in our climate – witness the colder winters and poor summers of recent years – and that over the next few decades there could be a slide to a new Maunder minimum.

It’s worth stressing that not every winter would be severe; nor would every summer be poor. But harsh winters and unsettled summers would become more frequent.

Professor Lockwood doesn’t hold back in his description of the potential impacts such a scenario would have in the UK.

He says such a change to our climate could have profound implications for energy policy and our transport infrastructure.

Although the biggest impact of such solar-driven change would be regional, as here in the UK and across Europe, there would be global implications too.

According to research conducted in 2001 by Michael Mann, a vociferous advocate of man-made global warming, the Maunder minimum of the 1600s was estimated to have shaved 0.3C to 0.4C from global temperatures.

It is worth stressing that most scientists believe long-term global warming hasn’t gone away. Any global cooling caused by this natural phenomenon would ultimately be temporary, and if projections are correct, the long-term warming caused by carbon dioxide and other greenhouse gases would eventually swamp this solar-driven cooling.

But should north-western Europe be heading for a new “little ice age”, there could be far-reaching political implications – not least because global temperatures may fall enough, albeit temporarily, to eliminate much of the warming which has occurred since the 1950s.

You can see more on Inside Out on Monday 28th October on BBC1, at 7.30pm.

###

From http://www.bbc.co.uk/blogs/paulhudson/posts/Real-risk-of-a-Maunder-minimum-Little-Ice-Age-says-leading-scientist

==============================================================

Back in 2011, Lockwood said something quite different:

“The Little Ice Age wasn’t really an ice age of any kind – the idea that Europe had a relentless sequence of cold winters is frankly barking” – Dr Mike Lockwood, Reading University

From: http://wattsupwiththat.com/2011/10/10/bbc-the-little-ice-age-was-all-about-solar-uv-variability-wasnt-an-ice-age-at-all/

I have a follow-on article on UV observations coming up in a couple of hours – don’t miss it.

Meanwhile, the sun has become more active in the last couple of weeks, indicating that a possible second peak in the current solar cycle is upon us; see details on the WUWT Solar reference page – Anthony


188 Comments
Tim OBrien
October 28, 2013 11:58 am

Don’t worry, Al Gore can belch enough CO2 to offset the cooling…

mwhite
October 28, 2013 12:01 pm

“Inside Out Yorkshire and Lincolnshire”
http://www.bbc.co.uk/programmes/b03flj49

October 28, 2013 12:06 pm

An interesting (and very different) approach to modelling:
http://www.newclimatemodel.com/new-climate-model/
##########
It’s so bad it’s not even wrong.
“The general approach is currently to describe the climate system from ‘the bottom up’ by accumulating vast amounts of data, observing how the data has changed over time, attributing a weighting to each piece or class of data and extrapolating forward. When the real world outturn then differs from what was expected then adjustments are made to bring the models back into line with reality. This method is known as ‘hindcasting’.”
1. The current approach is to start with the known laws of physics, not with data. In fact, climate models were built before the collection of much of the data that is used to check them.
2. There is no “weighting” of pieces or classes of data. When the ModelE code was released to the public I went through every frickin line of code. No weights, no extrapolation. Wilde is making shit up. He even claims to be a statistician, although he has never produced a single calculation or estimate with uncertainties.
3. He does not even know what hindcasting is.

October 28, 2013 12:09 pm

Johanus says:
October 28, 2013 at 11:50 am
Why do we associate high “solar activity” with high sunspot count?
Because sunspots are a proxy for high magnetic fields in and around them. It is that magnetic field that is the ‘real’ solar activity. The additional radiation from the surrounding magnetic field wins over the blocking effect of the darker sunspots.

Kitefreak
October 28, 2013 12:16 pm

R Taylor says:
“… the possible link between solar activity and climate patterns.” Could the BBC pour fluid out of a boot if the instructions were written on the sole?
———————-
I liked “and if projections are correct” from Paul Hudson. Excuse me Paul, we have already seen with the passage of time that the IPCC’s “projections” are NOT correct, so…..
Honestly, the BBC can’t help but put the old propaganda lines at the end of each article about the environment; in this case the “don’t worry about the cold winters, the scary CO2 monster is coming back later unless we do something about it” BS (actually a relatively new propaganda line, since nature stopped cooperating).
I read on another excellent blog a commenter said: “The job of the mainstream news media is not to report the news, their job is to shape public opinion. People need to repeat that over to themselves, until it sinks in”.

milodonharlani
October 28, 2013 12:24 pm

SAMURAI says:
October 28, 2013 at 11:53 am
CACA never was a viable hypothesis. Its advocates never falsified the null hypothesis that observed climatic fluctuations since the 1970s or 1940s were nothing out of the ordinary.

Latitude
October 28, 2013 12:31 pm

in fact climate models were built before the collection of much of the data that is used to check them.
===
does that include the data they make up…..in order to make the models work?

October 28, 2013 12:37 pm

So… now they’re going to agree that a less active sun results in a cooler climate, and make that the excuse they need for why CO2 isn’t doing it, while at the same time arguing that the earlier active sun had nothing whatsoever to do with warming of the 80s and that our evil human-produced CO2 will soon swamp the sun’s current cooling of the climate, restoring mankind’s evil dominance that must be expunged at all costs… to save the world… and our children’s children…
Right… right… just checking. Hopeful bunch, aren’t they? They are desperate for the warming “catastrophe” to return, but at all costs it must be seen as human induced… so that we mere mortals can be clamorous to be led to… Oh, I give up! Somebody inform these Bozos that it’s school in the morning with some hard lessons coming, so they had better call it a night.

Phineas Fahrquar
October 28, 2013 12:38 pm

Reblogged this on Public Secrets and commented:
Though I think this is much more likely than catastrophic man-caused warming, it should still be taken with skepticism: first, the BBC wholly bought into the “warming mania,” so it’s always possible they’ll fall for the next climate hysteria. (Remember global cooling in the 70s?) Second, the professor cited in the article is on record as saying, a few years ago, almost the opposite of what he’s saying now. Everyone can change their opinions, of course, as new data comes in, but it’s something to keep in mind.

Roy
October 28, 2013 12:46 pm

Latitude says:
October 28, 2013 at 11:38 am
the ‘Little Ice Age’, a period in the 1600s when harsh winters across the UK and Europe were often severe……
truth is, no one would probably notice the difference……these guys play it up like it was freezing all the time.

Isn’t it strange that some people don’t think that we would notice the change if we returned to the conditions of the Little Ice Age, yet they think that a similar increase in temperature would be catastrophic?

milodonharlani
October 28, 2013 12:49 pm

Roy says:
October 28, 2013 at 12:46 pm
People would notice the Thames & Rhine freezing over & Swiss glaciers advancing again to threaten mountain villages.

ggoodknight
October 28, 2013 12:55 pm

“Because sunspots are a proxy for high magnetic fields in and around them. It is that magnetic field that is the ‘real’ solar activity.” -lsvalgaard
Correctomundo, though I don’t quite agree with the ‘real’. Luminosity would be of greater interest if it varied very much, but it doesn’t. If it varied much we’d be very concerned.
One of the better ‘general science’ descriptions I’ve heard was from some PBS primetime science program where a solar physicist made a comparison to cooking spaghetti in a large pot. A low boil and you can see the pasta moving under the surface; get a roiling boil going, and the strands break the surface here and there. The higher the boil, the more the surface is broken. Sunspots and magnetic field strength are nicely analogous to the strands breaking the surface while boiling pasta and the heat under the pot.
The “Team” has been putting a lid on the link of GCR and sunspots to climate for 20 years. It’s time to see a change. Thank you, Mike Lockwood.

Forrest
October 28, 2013 12:58 pm

The real issue with trends is they only are trends until something changes…

October 28, 2013 1:00 pm

ggoodknight says:
October 28, 2013 at 12:55 pm
The higher the boil, the more the surface is broken. Sunspots and magnetic field strength are nicely analogous to the strands breaking the surface while boiling pasta and the heat under the pot.
Dumbed-down analogies always fail somewhere. The solar photosphere is ‘boiling’ and roiling all the time all over the sun, sunspots or no sunspots. Actually, the sunspots suppress that boiling a bit, thus interfering with the transport of heat to the surface.

ggoodknight
October 28, 2013 1:34 pm

ls, while I completely agree that dumbed-down analogies always break down somewhere, I’m sorry I presented the analogy in a way that led you to think I, or the unnamed solar physicist, thought heat within the sun was involved in the analogy.
There was no intention for the analogy to link such magnetic activity to heat within the sun. Just magnetic activity and sunspots being analogous to the relationship of the strength of a boil and the pasta breaking the surface.

October 28, 2013 1:35 pm

J Martin on October 28, 2013 at 11:54 am
One in the eye for the participants of the secret BBC conference that declared the science settled and the need for balanced reporting over. A foolish mistake that may shorten many of their careers, hopefully.

– – – – – – –
J Martin,
Ahhh yes!
One of the behind-the-public’s-back advocates of unbalanced climate reporting was the BBC’s Roger Harrabin, the slinky (referring to the toy) of catastrophic climate change journalists. A Slinky only goes one direction: downhill.
Roger Harrabin is to climate journalism coverage policy as Maximilien Robespierre was to public safety policy.
John

October 28, 2013 1:50 pm

ggoodknight says:
October 28, 2013 at 1:34 pm
Just magnetic activity and sunspots being analogous to the relationship of the strength of a boil and the pasta breaking the surface.
But that is precisely where the analogy is wrong. The strength of the boil [on the sun] is not related to the eruption of sunspots.

October 28, 2013 1:59 pm

Latitude says:
October 28, 2013 at 12:31 pm
in fact climate models were built before the collection of much of the data that is used to check them.
===
does that include the data they make up…..in order to make the models work?
You don’t make up data to make the models fit. In every tuning experiment I’ve looked at, that is not the case. A typical tuning parameter might be aerosols.
Let me give you an example. Once we had to build a flight control model for the Singapore air force. The plane in question had an accelerometer in the sensor path. The accuracy of this accelerometer was unknown. So, how do we fix the model?
Simple. We build the model where the accuracy of the accelerometer is a knob.
Step 1, determine the knob limits: we do this by looking at various boundary cases for devices and device theory. That gives us an upper bound and a lower bound.
Then we run the model at various settings of the knob until the model results match the observations. This gives us a value for the setting of the knob.
In a climate model the value for aerosols is bounded: let’s call it X plus or minus 0.5 watts.
You run the model and fiddle that knob. Since the value is unknown but bounded, it’s about the only thing you can do, since you can’t go back in time and measure the aerosols.
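
To make the ‘knob’ idea concrete, here is a minimal sketch in Python of a bounded-parameter sweep of the kind described above. Everything in it – the toy_model function, the bounds and the ‘observed’ value – is made up for illustration; it is not code from any real GCM or flight-control system.

# Minimal sketch of the bounded-knob tuning procedure described above.
# The model, the observation and the bounds are all illustrative.
import numpy as np

def toy_model(aerosol_forcing_wm2):
    """Toy stand-in for a model run: returns a simulated warming (deg C)
    for a given setting of the tunable aerosol-forcing 'knob'."""
    ghg_forcing = 2.0       # assumed known forcing, W/m^2 (illustrative)
    sensitivity = 0.5       # assumed response, deg C per W/m^2 (illustrative)
    return sensitivity * (ghg_forcing + aerosol_forcing_wm2)

observed_warming = 0.7      # illustrative 'observed' value, deg C

# Step 1: bound the knob (here, aerosol forcing between -1.5 and 0.0 W/m^2).
knob_settings = np.linspace(-1.5, 0.0, 151)

# Step 2: run the model at each setting and keep the one that best matches
# the observation.
errors = [abs(toy_model(k) - observed_warming) for k in knob_settings]
best = knob_settings[int(np.argmin(errors))]
print(f"best-fit aerosol knob: {best:+.2f} W/m^2")

Run as written, the sweep settles on roughly -0.60 W/m^2, simply because that is the setting at which this particular toy model reproduces the chosen observation.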

October 28, 2013 2:04 pm

lsvalgaard ggoodknight
Why bother with analogy when you can observe the ‘real’ thing

ggoodknight
October 28, 2013 2:05 pm

“But that is precisely where the analogy is wrong. The strength of the boil [on the sun] is not related to the eruption of sunspots.”
Sigh.
There is no “boil [on the sun]” in the analogy. We seem to be talking past each other.

October 28, 2013 2:17 pm

ggoodknight says:
October 28, 2013 at 2:05 pm
There is no “boil [on the sun]” in the analogy. We seem to be talking past each other.
ggoodknight says:
October 28, 2013 at 12:55 pm
“The higher the boil, the more the surface is broken. Sunspots and magnetic field strength are nicely analogous to the strands breaking the surface while boiling pasta and the heat under the pot.”
Then I don’t ‘get’ the analogy.

ggoodknight
October 28, 2013 2:23 pm

“Then I don’t ‘get’ the analogy.”
I got that from your first response.

October 28, 2013 2:23 pm

here we are:

October 28, 2013 2:26 pm

ggoodknight says:
October 28, 2013 at 2:23 pm
“Then I don’t ‘get’ the analogy.”
I got that from your first response.

Then it must have been a very bad analogy…
Analogies are supposed to further the understanding, not stand in the way of it.

richardscourtney
October 28, 2013 2:26 pm

Steven Mosher:
Your post at October 28, 2013 at 1:59 pm says

You don’t make up data to make the models fit. In every tuning experiment I’ve looked at, that is not the case. A typical tuning parameter might be aerosols.
Let me give you an example. Once we had to build a flight control model for the Singapore air force. The plane in question had an accelerometer in the sensor path. The accuracy of this accelerometer was unknown. So, how do we fix the model?
Simple. We build the model where the accuracy of the accelerometer is a knob.
Step 1, determine the knob limits: we do this by looking at various boundary cases for devices and device theory. That gives us an upper bound and a lower bound.
Then we run the model at various settings of the knob until the model results match the observations. This gives us a value for the setting of the knob.
In a climate model the value for aerosols is bounded: let’s call it X plus or minus 0.5 watts.
You run the model and fiddle that knob. Since the value is unknown but bounded, it’s about the only thing you can do, since you can’t go back in time and measure the aerosols.

Bollocks!
The aerosol fudges in climate models are completely “made up” and each model uses a unique aerosol fudge.

It seems I need to post the following yet again.
None of the models – not one of them – could match the change in mean global temperature over the past century if it did not utilise a unique value of assumed cooling from aerosols. So, inputting actual values of the cooling effect (such as the determination by Penner et al.
http://www.pnas.org/content/early/2011/07/25/1018526108.full.pdf?with-ds=yes )
would give every climate model a mismatch between the global warming it hindcasts and the observed global warming for the twentieth century.
This mismatch would occur because all the global climate models and energy balance models are known to provide indications which are based on:
1. the assumed degree of forcings resulting from human activity that produce warming, and
2. the assumed degree of anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature.
More than a decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.
The input of assumed anthropogenic aerosol cooling is needed because the model ‘ran hot’; i.e. it showed an amount and a rate of global warming which was greater than was observed over the twentieth century. This failure of the model was compensated by the input of assumed anthropogenic aerosol cooling.
And my paper demonstrated that the assumption of aerosol effects being responsible for the model’s failure was incorrect.
(ref. Courtney RS, ‘An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre’, Energy & Environment, Volume 10, Number 5, pp. 491–502, September 1999).
More recently, in 2007, Kiehl published a paper that assessed 9 GCMs and two energy balance models.
(ref. Kiehl JT, Twentieth century climate model response and climate sensitivity. GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).
Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model. This is because they all ‘run hot’ but they each ‘run hot’ to a different degree.
He says in his paper:

One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.
The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy.
Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change–too rosy a picture?, available at http://www.nature.com/reports/climatechange, 2007) recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the ‘‘widely circulated analysis’’ referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.

And, importantly, Kiehl’s paper says:

These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.

And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.
Thanks to Bill Illis, Kiehl’s Figure 2 can be seen at
http://img36.imageshack.us/img36/8167/kiehl2007figure2.png
Please note that the Figure is for 9 GCMs and 2 energy balance models, and its title is:

Figure 2. Total anthropogenic forcing (W m^-2) versus aerosol forcing (W m^-2) from nine fully coupled climate models and two energy balance models used to simulate the 20th century.

It shows that
(a) each model uses a different value for “Total anthropogenic forcing” that is in the range 0.80 W/m^2 to 2.02 W/m^2
but
(b) each model is forced to agree with the rate of past warming by using a different value for “Aerosol forcing” that is in the range -1.42 W/m^2 to -0.60 W/m^2.
In other words the models use values of “Total anthropogenic forcing” that differ by a factor of more than 2.5 and they are ‘adjusted’ by using values of assumed “Aerosol forcing” that differ by a factor of 2.4.
So, each climate model emulates a different climate system. Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth. And the fact that they each ‘run hot’ unless fiddled by use of a completely arbitrary ‘aerosol cooling’ strongly suggests that none of them emulates the climate system of the real Earth.
Richard
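
As a footnote to the thread above, the compensation Kiehl (2007) describes can be illustrated with a toy zero-dimensional calculation. This is only a sketch under the simplifying assumption that hindcast warming scales as sensitivity times total forcing; every number in it is illustrative and none is taken from Kiehl’s paper or from any actual model.

# Toy illustration of the compensation effect discussed above: two models with
# very different sensitivities hindcast the same observed warming because each
# pairs its sensitivity with a different assumed aerosol forcing.
# All values are illustrative only.

observed_warming = 0.7   # deg C over the 20th century (illustrative)
ghg_forcing = 2.4        # assumed greenhouse-gas forcing, W/m^2 (illustrative)

models = {
    "low-sensitivity model":  0.4,   # deg C per W/m^2 (illustrative)
    "high-sensitivity model": 0.9,
}

for name, sensitivity in models.items():
    # Solve dT = sensitivity * (F_ghg + F_aerosol) for the aerosol forcing
    # each model needs in order to reproduce the same observed warming.
    aerosol_forcing = observed_warming / sensitivity - ghg_forcing
    total_forcing = ghg_forcing + aerosol_forcing
    print(f"{name}: aerosol forcing {aerosol_forcing:+.2f} W/m^2, "
          f"total forcing {total_forcing:+.2f} W/m^2, "
          f"hindcast warming {sensitivity * total_forcing:.2f} deg C")

The higher-sensitivity toy model ends up needing the more strongly negative aerosol forcing (about -1.6 W/m^2 versus about -0.65 W/m^2 here) to hindcast the same warming, which is the qualitative pattern described above for Kiehl’s Figure 2.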
