I have some major pressing life issues that I have to deal with at this time, so all I can offer is an Open Thread. Feel free to discuss topics within our normal range. Anyone who wants to submit a guest post will be welcomed, provided it is factual and on topic. Use the “submit story” from the pulldown menu if you wish to contribute. If it is a technical essay with embedded graphs, please see the instructions on the submit story page. Thank you for your consideration.
Anthony Watts
[more krap from Doug Cotton -mod]
Hypocrisy alert. The gentleman featured in a previous WUWT post lecturing on a carbon-free lifestyle had just purchased a little seaside cottage in Malibu, California.
WUWT link:
http://wattsupwiththat.com/2014/09/24/another-green-calls-for-deniers-to-be-jailed/
Hypocrisy here:
Malibu link:
http://www.trulia.com/luxe/2014/09/22/cheryl-hines-and-robert-f-kennedy-buy-malibu/
Newlyweds Cheryl Hines and Robert F. Kennedy Jr. are putting down roots on the West Coast!
The just-married duo just purchased a gorgeous East Coast-style compound in the Point Dume area of Malibu, CA. Property records show that the home became the property of Mr. and Mrs. Kennedy on September 9, 2014, and they paid $4,995,000 for the 4-bedroom, 3-bathroom, 1-acre estate.
Melding her West Coast life with Kennedy’s East Coast roots, the Connecticut barn style main house has all new wood floors and wood elements throughout. The romantic guest house has a fireplace and full bath. The pool house has a laddered sleeping loft, full kitchen, huge shower, and a professional, soundproof recording studio!
In addition to the guest and pool house, there is a refrigerated wine shed, a two-story tree house, fire-pit, heated pool, Gunite spa, full built-in BBQ area, barn storage shed and digital phone and intercom system throughout the compound!
As she prepped for life as Mrs. Kennedy, Cheryl listed her former home in Bel Air, which just sold for $3,105,000.
[end quote]
Note the closing date of September 9th was before the People's Climate March on September 19-21, so there is nested hypocrisy – lifestyle and sea level rise.
Golly. I just put earnest money down on a house in my little town on 3/4 of an acre. I think I will call it an estate. Call me unimpressed with their little piece being called a “compound”. With a price like that, I am guessing the house covers most of the 1 acre it sits on.
I don’t know. A year or so ago I read that in some parts of CA a mobile home can go for $1,000,000. Location, location, location (plus a bit of inflation).
They’re welcome to their “compound” as long as they don’t mess with mine.
Not a single solar panel or windmill to be seen.
Canadians say, booey-heh
It’s worth a look at the Russian Arctic and Antarctic Research Institute’s (AARI) sea ice maps, which rely on actual observations (from ships and aircraft) and are available from 1950 onward here: http://www.aari.nw.ru/gdsidb/sea_ice/arctic/scripts/aari_n.html
And it’s worth comparing them with those of the most widely used sea ice dataset, Chapman & Walsh (http://polar.ncep.noaa.gov/seaice/climatology/months.shtml and http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/seasonal.extent.1900-2010.png).
End of August 1952 (Chapman&Walsh) vs. September 1st 1952 (AARI): http://images.meteociel.fr/im/8192/image006_mgb6.png
The reason we’re having trouble detecting a signal from CO2 in annual temperature trends is that it’s very small and insignificant.
This morning at ~8:45 it was ~42F air temp, clear sky and sunny; sidewalk and grass temp 47F; the zenith sky read ~-57F. At 12:30 it got cloudy: air temp ~50F, 65% humidity, sidewalk 57F, cloud bottoms 23.4F. At 6:30 pm: 50F, 63% humidity, sidewalk 51F, but clear skies at -59F.
CO2 adds 3-4F at ~-50F, and 1-2F at 23F.
So that is roughly 70F between clouds and no clouds, versus 4F between CO2 and no CO2.
These are the temperatures the surface radiates to. You wonder why they are running as fast as they can from basing AGW on surface temps? Between the IR temps, and the difference between yesterday's rising temps and last night's falling temps showing that surface temps are not changing globally but mostly regionally, surface temps are a real loser.
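For anyone who wants to check that arithmetic, here is a minimal Python sketch using the Stefan-Boltzmann law. Only the constant and the unit conversion are standard physics; the temperatures are just the ones reported in this comment.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def f_to_k(t_f):
    """Convert degrees Fahrenheit to kelvin."""
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

# Effective sky temperatures reported in the comment above (deg F)
readings = [("clear sky zenith", -59.0), ("cloud bottoms", 23.4)]

for label, t_f in readings:
    t_k = f_to_k(t_f)
    flux = SIGMA * t_k ** 4  # blackbody flux at that temperature
    print(f"{label:16s} {t_f:6.1f} F -> {flux:5.1f} W/m^2")

At those numbers, the cloud base returns roughly twice the downwelling IR of the clear sky, which is the commenter's point about clouds swamping the few degrees attributed to CO2.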
Julia Roberts as Mother Nature: ‘I Don’t Really Need People’
Should the human race suffer a massive die-off or even extinction, Mother Nature, to hear Julia Roberts tell it, won’t much care.
http://www.businessweek.com/articles/2014-10-06/julia-roberts-as-mother-nature-i-don-t-really-need-people#r=shared
I wish Greens would make up their minds. Which is it: can humans destroy nature, or can we not?
Only humans that aren’t green can destroy nature. So I guess only the little green men from Mars are OK?
(Some of the greens probably think that’s how we all got here to begin with.)
These papers are freely accessible online until January 2015 at http://www.annualreviews.org/toc/statistics/1/1:
Statistics and Climate,
Peter Guttorp, Department of Statistics, University of Washington, Seattle, Washington 98195
Abstract
For a statistician, climate is the distribution of weather and other variables that are part of the climate system. This distribution changes over time. This review considers some aspects of climate data, climate model assessment, and uncertainty estimation pertinent to climate issues, focusing mainly on temperatures. Some interesting methodological needs that arise from these issues are also considered.
First paragraph of Introduction:
1. INTRODUCTION
This review contains a statistician’s take on some issues in climate research. The point of view is that of a statistician versed in multidisciplinary research; the review itself is not multidisciplinary. In other words, this review could not reasonably be expected to be publishable in a climate journal. Instead, it contains a point of view on research problems dealing with some climate issues, problems amenable to sophisticated statistical methods and ways of thinking. Often such methods are not current practice in climate science, so great opportunities exist for interested statisticians.
Climate Simulators and Climate Projections,
Jonathan Rougier and Michael Goldstein
Department of Mathematics, University of Bristol, Bristol, BS8 1TW, United Kingdom;
Department of Mathematical Sciences, University of Durham, Durham, DH1 3LE
Abstract
We provide a statistical interpretation of current practice in climate modeling. In this review, we define weather and climate, clarify the relationship between simulator output and simulator climate, distinguish between a climate simulator and a statistical climate model, provide a statistical interpretation of the ubiquitous practice of anomaly correction along with a substantial generalization (the best-parameter approach), and interpret simulator/data comparisons as posterior predictive checking, including a simple adjustment to allow for double counting. We also discuss statistical approaches to simulator tuning, assessing parametric uncertainty, and responding to unrealistic outputs. We finish with a more general discussion of larger themes.
1. INTRODUCTION
Our purpose in this review is to interpret current practice in climate modeling in the light of statistical inferences about past and future weather. In this way, we hope to emphasize the common ground between our two communities and to clarify climate modeling practices that may not, at first sight, seem particularly statistical. From this starting point, we can then suggest some relatively simple enhancements and identify some larger issues. Naturally, we have had to simplify many practices in climate modeling, but not—we hope—to the extent of making them unrecognizable.
Climate: your distribution of weather, represented as a multivariate spatiotemporal process (inherently subjective)
Weather: measurable aspects of the ambient atmosphere, notably temperature, precipitation, and wind speed.
On Computer Programs, Computer Models, the Scientific Method, and Global Warming
Anyone who has been following current global warming science knows that there is a relationship among all of the issues in this article’s title. I will attempt to make those relationships clearer.
First, my credentials. I am a 1985 graduate of West Point with a degree in computer science. I have been working in the computer field since I left the Service, which gives me more than 20 years of experience in IT. Currently I am working as an Oracle DBA, as I have been since 1999.
Throughout our history, as science developed and grew more complex, the need for complex equations also developed. As equations got more complex, the chance of error and the time required to calculate their answers grew. Particularly in such applications as projectile motion, the need for rapid, correct answers to complex equations kept increasing. It was this need for quick, correct answers to difficult equations that led to the development of computers and computer programs. A computer program will always produce the same answer to the same question. Note that the above description may not be the history of computers that you have read, but it contains the basics of computer development in a nutshell.
This leads us to computer models. A computer model is simply an equation or series of equations used to simulate a real world condition in the form of a computer program, sometimes mistakenly called a software program. Computer models are used because a given set of data will always result in the same answer, and it will be the correct answer if the program is correctly written. Models are used extensively today in projectile motion, orbital mechanics, and other scientific and engineering fields.
The most important single item to understand about computer programs is this: they are used to calculate an answer based on the theory the scientist or engineer is applying. The programmer writes the program so that it complies with that theory. For example, the basic formula for calculating the distance an object will travel in a vacuum is a quadratic equation. This same formula can be used on Earth, even though Earth is not a vacuum, so the equation isn’t quite right, but it will still get close to the correct answer. So the initial programs were written just to solve the quadratic equation. As more and more variables are resolved, the equation can be modified and refined, and the answer gets closer and closer to reality. For example, atmospheric pressure and air temperature are significant factors that are figured into ballistics computers these days, but they are not part of that initial simple quadratic equation. Even the initial equation isn’t quite right, because the Earth’s gravity field isn’t consistent everywhere: a planet and its moon are not perfect spheres. But the key point is that the scientist has a theory on how something works, and uses a program to determine what the answer would be if his theory is correct. In ballistics, the theory was a quadratic equation, which was close, but there were other items to factor in that reflected the real world. Thus a computer model is actually the mathematical version of the scientific theory, and as such it cannot prove a theory is correct. The model in effect is the theory, so at best it can provide evidence that the theory is correct. It can never prove the theory correct, even though it can provide very good evidence that a theory is incorrect.
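To make that progression concrete, here is a minimal sketch in Python, with hypothetical numbers throughout: the closed-form vacuum range first, then the same theory with one refinement, a crude quadratic air drag, bolted on, the same way ballistics programs grew:

import math

G = 9.81  # m/s^2; nominal gravity (itself not quite constant everywhere)

def vacuum_range(v0, angle_deg):
    """Closed-form range from the simple quadratic (vacuum) theory."""
    theta = math.radians(angle_deg)
    return v0 ** 2 * math.sin(2.0 * theta) / G

def range_with_drag(v0, angle_deg, k=5e-5, dt=0.001):
    """Same theory with one refinement: quadratic air drag, integrated
    numerically. The drag constant k is made up for illustration."""
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt        # drag opposes horizontal motion
        vy -= (G + k * v * vy) * dt  # gravity plus vertical drag
        x += vx * dt
        y += vy * dt
    return x

v0, angle = 300.0, 45.0  # hypothetical muzzle velocity (m/s) and elevation
print(f"vacuum theory: {vacuum_range(v0, angle):7.0f} m")
print(f"with drag:     {range_with_drag(v0, angle):7.0f} m")

The two answers differ substantially; each added refinement moves the model closer to reality without ever making it reality, which is exactly the essay's point.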
Global Warming/Climate Change theory states (simply put) that as CO2 in the atmosphere increases, temperature increases. When you see the output of a computer model, the model will show that when the input of CO2 increases, the temperature increases. The output of these computer models proves nothing other than that the program was written properly. It does not prove the theory is correct. It doesn’t even provide evidence for the theory unless the output closely tracks real-world temperature.
This brings us to the models used in global warming theory. When you see a graph of the predicted warming due to increasing CO2, you will see a single line. What they typically do not tell you is that this single line is almost always an average of approximately 100 different computer models, each run many times using different sets of numbers. The simple fact that this is an average of 100 models should immediately raise a question in any rational person’s mind: if there are 100 models being used, which model is the correct one? Each of those models is supposed to represent the same theory, and there can be only one correct model for any theory, yet we are given the output of over 100 different models as evidence of Global Warming.
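A minimal sketch of what that ensemble averaging looks like. The 100 "model" outputs below are invented random numbers, purely to show how a tidy single line can hide a wide spread:

import random

random.seed(42)  # reproducible hypothetical "models"

# 100 invented model projections of warming (deg C) for some target year
models = [random.gauss(2.0, 0.8) for _ in range(100)]

ensemble_mean = sum(models) / len(models)
print(f"ensemble mean: {ensemble_mean:5.2f} C  <- the single line on the chart")
print(f"model spread:  {min(models):5.2f} C to {max(models):5.2f} C")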
If the calculated distance of travel of an artillery round is 2 miles in our previous example, but the measured distance was half a mile, there is something seriously wrong with either the theory or its execution. Keep in mind that we have actual measurements of temperature dating back to the development of accurate thermometers. Thus, the first step in checking the accuracy of a computer model of temperature is hindcasting. That is, can you feed it data from the years prior to 2001 and get accurate predictions for 2001-2013? Most of the models used today are dismal failures at this first step.
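A hedged sketch of such a hindcast-style check; every number below is invented for illustration, and a real test would use actual station records and actual model output:

# Fit a linear trend to pre-2001 "observations", extrapolate to
# 2001-2013, and compare against held-out values. All data invented.
years_train = list(range(1979, 2001))
obs_train = [0.015 * (y - 1979) for y in years_train]  # fake warming trend

# Ordinary least-squares slope, computed by hand
n = len(years_train)
mx = sum(years_train) / n
my = sum(obs_train) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years_train, obs_train))
         / sum((x - mx) ** 2 for x in years_train))

obs_test = {y: 0.33 + 0.003 * (y - 2001) for y in range(2001, 2014)}  # fake "pause"

for y in sorted(obs_test):
    predicted = my + slope * (y - mx)
    print(f"{y}: predicted {predicted:.2f} C, observed {obs_test[y]:.2f} C")

If the predictions run well above the held-out observations, as they do with these invented numbers, the model fails the hindcast.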
In global warming theory, only a cursory look is required to compare the output of the global warming models with the actual measurements over the last 15 years and determine that the predicted warming is between two and three times the measured warming. This is very strong evidence that, at the very least, the equations used in this theory are very wrong. Yet several times a month we hear claims about ‘proven’ global warming science, or ‘settled science’, or a ‘97% consensus’, most recently from our own Secretary of State. In the chart below, the red line is the predictions by the climate models, and the green and blue lines are measurements.
Another problem with the theories is the ‘extreme weather’ claims. Most theories and their models claim that there will be more extreme weather events of increasing severity due to increasing global CO2. It is the analysis (or to be more exact, the lack thereof) of this piece of theory that should make the reasonable person question the competency of those espousing the theory. In the US, and most other countries, we have very good records of extreme weather, going back over 150 years. In Europe, the records are even longer. According to global warming theory and models, human generated CO2 started having an effect around 1940-1950. This means that if extreme weather was on the increase, we should have seen a gradual increase starting at about that same time. However, a check of the records (the extreme weather page at noaa.gov is a good place to look) shows that there is no trend in extreme weather since 1950 (or before). That means no change in the severity or number of hurricanes, tornadoes, floods, cyclones, droughts, blizzards, etc. So in this case, we can compare the model output with current measurements, and determine that once again there is a major problem with the model.
The larger problem here is that these scientists made the claim of increasing extreme weather due to increasing CO2, programmed it into their models, but evidently didn’t know that the evidence they needed was already available to prove or disprove this theory.
Competent scientists would have known that the evidence to disprove this part of the theory was available, and would have researched it before publishing theories that are obviously incorrect. Yet these same scientists are sticking to this part of the theory, along with the rest of it, even though it has been disproven. That these ‘scientists’ ignore the fact that their theory has been disproven, and even continue to espouse it, is enough to bring their competency into serious question.
This leads us to the next topic, what some people have called PlayStation climatology. PlayStation climatology is the result of examining a particular piece of global warming theory and then taking it to its worst case. The drastic result of this input is then examined for real-world consequences. A classic example of this is the erroneous scare-tactic report that polar bears are in danger due to shrinking Arctic ice. The idea is that if all the Arctic ice disappears, polar bears will go extinct because they need the ice to survive. Well, first of all, the Arctic ice is not going to disappear; the temperatures in the Arctic regions are still well below the freezing point of sea water. Second, according to global warming theory, Arctic ice has been receding since approximately 1977. However, if we do a little research, we discover that the polar bear population has been on the increase since approximately 1945, and is currently at an all-time high. Yet this claim about polar bears was made in normally reputable sources and labeled science.
As we noted above, the climate models aren’t even close on the temperature predictions, which is what they are intended to predict. How could they possibly be accurate on data that is normally used as just another parameter in the program? Other examples of PlayStation climatology include the disappearing sea turtles (they’re not, at least not due to climate change), decreasing trade winds, increasing droughts, increasing floods, and almost any alarmist prediction you have read in the last decade that is related to global warming. All of these come from nothing more than feeding data back into these same incorrect models to get alarming predictions.
I don’t have the faintest idea how ANY climate model, or so-called GCM, works; Global Coupled Models, I think that stands for. Dunno what couples to what. I have never seen written down any recognizable mathematical equation for any smidgeon of some physical process that is supposed to be going on here on Earth that in some way relates to either weather or climate.
The way I see it, the laws of physics are obeyed instantaneously. Physical systems do not sit and twiddle their thumbs before suddenly deciding to react to some stimulus, force, or whatever.
It may be just attoseconds, or a microscopic fraction of that, for some reaction to happen, and of course the velocity of light and other speed limits determine just how fast effects can propagate, but they don’t wait to be told.
In particular, I would like a GCM or a climate model to start by acknowledging that the earth rotates on its axis about once in 24 hours or so.
So forget all that BS about there being some kind of equilibrium state. There isn’t even any steady state, let alone any equilibrium state. Especially when it comes to thermodynamics, where everything in the system must be at exactly the same temperature for it to be in thermal equilibrium.
And since there isn’t any equilibrium, there are changes in process, and the propagation of those changes takes time. Heat energy (noun) travels quite slowly, in the scheme of things, along a copper bar with a temperature gradient. Even slower in most other materials.
Well I’m not going to expand this into an essay. But I would like one day to come across some list of the various mathematical equations, that these Terra-computers are supposed to be solving, to follow the laws of physics, through a day in the life of planet earth.
I’m not going to hold my breath waiting for such equations. And if I find the word “average”, anywhere in those equations or dissertations, I will go and get a very big cage full of parrots to put all that paper to a more useful application.
Peter humbug is supposed to be good at GCMs, but I’ve never seen any of his equations either.
But today is not a total loss.
Shuji Nakamura today became a new Nobel Laureate in Physics, and I can’t imagine a previous recipient more deserving of that award.
Nakamura is accompanied by two other Japanese scientists in the award. I plead total ignorance of their work, but I’m sure going to look them up.
Way to go Shuji, you certainly had this award coming to you, so congratulations.
Also, I have long thought (for many years) that Professor Nick Holonyak Jr., of the U of Illinois (Champaign-Urbana), inventor of the first Light Emitting Diode (GaAsP), was the most deserving non-recipient of the Nobel Prize in Physics.
This year, it would have been quite appropriate for the Swedish Nobel committee to have awarded the Physics prize jointly to Holonyak and Shuji Nakamura, for the first visible LEDs and the first blue LEDs.
It is unfortunate that they did not choose this opportunity.
I know Nick Holonyak personally, and I’ve met Dr Nakamura (nice guy).
This was a big foot in mouth by the Nobel committee.
They have awarded the Physics prize several times for “technology” rather than “pure physics”. In 2009 it went to George E. Smith (not me) for the invention of the CCD (Charge-Coupled Device), a semiconductor gizmo. Also, Jack Kilby of TI got the Physics prize for the integrated circuit. Bob Noyce, who invented a real integrated circuit, was not recognized, since they don’t award posthumously; they just took too damn long to appreciate the invention of the integrated circuit. Jack Kilby just hand-wired some transistors on a common wafer, instead of cutting them up first, packaging them, and then hand-wiring them up. Noyce’s IC was already wired up on the wafer.
Quite a few Nobels are given to the wrong persons, possibly for academic political reasons. Steven Chu, Obama’s energy czar, got the Nobel in Physics for optical trapping (by laser beams). That was actually invented decades earlier by a Bell Labs guy, who subsequently taught Chu how to do it. His name escapes me (a problem of short-term memory), but he even did it on his own time; Bell Labs didn’t even support it till he had essentially done it. I guess his name was Ashkin.
IPCC AR5 TS.6 lists a lot of very important climate issues, including models and extreme weather, that the IPCC scientists admit they don’t understand well, if at all.
The three most recent 48 hr periods at Bardarbunga:
             10/1-3   10/3-5   10/5-7
M 5.1+          0        0     (5.1, 5.5)
M 4.5-5.0       6        7        2
M 4.0-4.4      10       10        9
M 3.0-3.9       5       23        9
M 2.0-2.9      28       19       26
@Fissure #     20       18        8
@Fis. MaxM      -     1.4@13   2.7@10
Askja #      2grp 26    11        6
(@Fis. MaxM is the largest quake at the fissure, given as magnitude@depth in km.)
The M5.5 at Bardarbunga appears on the far SE side of the caldera, at 8 km depth, longitude 17.35W.
The M5.1 was at 2 km depth.
The biggest at the fissure is an M2.7 at 10 km, unusually large. But there are very few quakes of any size: only 8 on the 3DBulge over 48 hrs.
Askja NE has only 6 in the past 48 hrs, the biggest an M1.2 at 5 km.
South of Bard: little activity, but there is one M2.2 at 7 km.
West of Bard: nothing on 3DBulge, but there is one on the vedur.is map.
Webcams were weathered in last night and this morning.
Reykjanes Peninsula: two clusters of M1.0-2.0 quakes, at longitude 22.4W and 21.30-21.35W.
Tjornes Fracture Zone: M2.0+: 2; M1.0-1.9: 8.
I’ve only been looking at this map of Iceland “North Atlantic” since Pamela Gray’s Oct. 5 comment.
http://en.vedur.is/earthquakes-and-volcanism/earthquakes/atlantic/
Yesterday the quakes offshore to the north showed up. Today, the sub M2.0 quakes along the SW Reykjanes ridge showed up.
They just had a 5.2
Day        Date        Time      Lat     Lon      Depth   M    Quality  Location
Wednesday  08.10.2014  16:00:29  64.674  -17.495  1.1 km  0.9  63.04    4.1 km NNE of Bárðarbunga
Wednesday  08.10.2014  15:30:08  64.678  -17.511  1.4 km  1.1  52.26    4.3 km N of Bárðarbunga
Wednesday  08.10.2014  15:29:01  64.682  -17.470  0.1 km  0.9  90.01    5.4 km NNE of Bárðarbunga
Wednesday  08.10.2014  15:24:14  64.680  -17.491  5.1 km  5.2  99.0     4.7 km NNE of Bárðarbunga
Wednesday  08.10.2014  15:17:51  64.674  -17.487  3.2 km  3.9  99.0     4.2 km NNE of Bárðarbunga
I suggest you not try to do as many news stories as you usually do.
The four most recent 48 hr periods at Bardarbunga:
             10/1-3   10/3-5   10/5-7      10/7-9
M 5.1+          0        0     (5.1, 5.5)  (5.2)
M 4.5-5.0       6        7        2          2
M 4.0-4.4      10       10        9          4
M 3.0-3.9       5       23        9          7
M 2.0-2.9      28       19       26         34
@Fissure #     20       18        8         10
@Fis. MaxM      -     1.4@13   2.7@10     1.5@13
Askja #      2grp 26    11        6        11(a)
S. of Bard      -        -        -       M2.3@1
W. of Bard      -        -        -          9
(a): Askja #: only 4 visible in the 3DBulge data, but at least 11 on the Vatnajokull map.
Askja NE had an M2.5 at 3 km. This cluster may be extending to the north, and the recent activity at the northern Tjornes Fracture Zone may be moving SE toward it.
The fissure has 10 in the past 48 hrs, the biggest an M1.5 at 12 km and an M1.3 at 6 km.
South of Bard: one M2.3 at 1 km.
West of Bard: 6 on 3DBulge, the biggest an M2.1 at 4 km; there are 9 on the vedur.is map.
Webcams were weathered in the last two days and nights when I looked.
Reykjanes Peninsula: M1.0-1.9: 5 quakes.
Tjornes Fracture Zone: M2.0+: 1; M1.0-1.9: 7.
Twitter: John A Stevenson @volcan01010 · Oct 7
#Holuhraun area now 52.3km2 in 34 days => ~1.5km2/day
=> ~one football (soccer) pitch every 7 minutes! #Bardarbunga
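The tweet's arithmetic checks out; here is a quick sketch (the 105 m by 68 m pitch size is a standard assumption, not something in the tweet):

# Checking the tweet's rates against a standard-size football pitch.
area_km2 = 52.3        # lava field area, from the tweet
days = 34              # elapsed days, from the tweet
pitch_m2 = 105 * 68    # ~7,140 m^2, assumed standard pitch

rate_km2_per_day = area_km2 / days
pitches_per_day = rate_km2_per_day * 1e6 / pitch_m2
minutes_per_pitch = 24 * 60 / pitches_per_day

print(f"{rate_km2_per_day:.2f} km^2/day")                  # ~1.54
print(f"one pitch every {minutes_per_pitch:.1f} minutes")  # ~6.7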
The five most recent 48 hr periods at Bardarbunga:
             10/1-3   10/3-5   10/5-7      10/7-9   10/9-11
M 5.1+          0        0     (5.1, 5.5)  (5.2)    (5.2)
M 4.5-5.0       6        7        2          2        6
M 4.0-4.4      10       10        9          4        6
M 3.0-3.9       5       23        9          7       10
M 2.0-2.9      28       19       26         34       34
@Fissure #     20       18        8         10       24
@Fis. MaxM      -     1.4@13   2.7@10     1.5@13    2.0@7
Askja #      2grp 26    11        6        11(a)    11(a)
S. of Bard      -        -        -       M2.3@1    2 M1.1
W. of Bard      -        -        -          9      8, M3.0
(a): again, Askja isn’t showing up in the 3DBulge data, but there are 11 on the map.
Reykjanes Peninsula: M2.0+: 0; M1.0-1.9: 1 quake. (less activity)
Tjornes Fracture Zone: M2.0+: 0; M1.0-1.9: 2. (less activity)
Not much in the news. Estimates are that lava is still flowing at 350 m^3/sec. Total volume erupted is 0.77 km^3.
A helicopter video, not very noteworthy except the final 10 seconds: a wide-angle shot showing the fissure and the distributaries of the lava river.
http://www.mbl.is/frettir/english/2014/10/11/brand_new_images_of_the_eruption
Fissure quake activity is higher than in recent days: two M2.0 quakes, at 7 km and 9 km, and a dozen small ones (M0.7-M1.5) at 12 km.
West of Bardarbunga there are more, larger, and shallower quakes.
Webcams have been socked in day and night for the last several days.
The six most recent 48 hr periods at Bardarbunga:
Nothing in the ruv.is news.
Twitter Picture from Airplane: (Stewart Gill) Oct 12
https://twitter.com/search?q=%23Bardarbunga&src=tyah
Twitter webcam capture: (Jeanne K) Oct 11
https://twitter.com/search?q=%23Bardarbunga&src=tyah
This is a website worth monitoring.
http://icelandreview.com/news
One volcanologist predicts the eruption will end in March: with the collapse of the caldera as the driver of the fissure eruption, by March the caldera subsidence will reach the level of the fissure.
The caldera has subsided 12 meters so far.
Otherwise, there is little change in eruption activity, but there are fewer seismic events around the fissure.
[For simplest tabled data, it is easiest – not glamorous, but easiest! – to use the html “pre” text. .mod]
The seven most recent 48 hr periods at Bardarbunga:
My perception is that seismic activity at the fissure is dropping, but minor activity around Askja, west of Bard, and south of Bard is increasing. The caldera had its biggest quake in a week (an M5.4), and many M4.5+ besides. The number of quakes in the M2.0-M3.9 range is down.
Reykjanes Peninsula: M2.0+: 0; M1.0-1.9: 0 quakes. (less activity)
Tjornes Fracture Zone: max M2.4; M1.0-1.9: 6. (more activity)
N. Atlantic: nothing offshore.
Past eight 48 hour seismic event summaries in the Bardarbunga area:
Notes:
(a): counts from the Vatnajokull map are greater than on 3DBulge.
(b): All the activity in the past 4 hrs is in the Bard and West-of-Bard areas.
(c): 48-hour window, but only 24 hours since the last summary, so there is 24 hrs of overlap with the previous period.
Seismic activity has shifted radically from yesterday: fewer quakes, and they have moved. Almost nothing in the past 24 hrs at the fissure and the Askja region; several quakes west of Bard and in the crater. Finally, the number of M2.0-M4.0 quakes is the lowest it has been in a month.
I haven’t seen anything but black and grey in the webcams all week; they must be socked in by low clouds. Yesterday’s news is that the fissure eruption has not declined.
A decent ground-level picture is at:
http://icelandreview.com/news/2014/10/16/big-quake-hits-bardarbunga-further-subsidence-caldera
Past eight 48 hour seismic event summaries in the Bardarbunga area:
Notes: (a): counts from the Vatnajokull map are greater than on 3DBulge; 3DBulge has a long list of unverified events.
(b): A continuation of the trend of activity shifting from the fissure to west of Bard. At least seven quakes of M2.0-2.8 in the past 8 hours, all shallow, 1-4 km. Watch this place!
(c): 7 of 13 events are in the past 12 hrs, most in the crater west of Bard.
The quakes have changed location!! The crater west of Bardarbunga now has many times more quakes than the fissure, and they are shallow.
Seismic activity has shifted radically from Oct. 15. Almost nothing in the past 48 hrs at the fissure and the Askja region. At least 17 quakes in the crater west of Bardarbunga, at least seven of them in the range M1.9-2.8 in the past 12 hrs. Finally, the number of M2.0-M4.0 quakes is the lowest it has been in a month.
I haven’t seen anything but black (night) and grey (daylight) in the webcams all week. Bard1 and Bard2 show different shades of grey, whatever that means. News from: Icelandreview.com