Snow job in Antarctica – digging out the data source

UPDATE: the question has arisen about “occupied” aka “manned” weather stations in Antarctica (Stevenson Screens etc) versus the Automated Weather Stations. This picture on a postage stamp from Australia, celebrating the Australian Antarctic Territory in 1997, may help settle the issue. Note the Stevenson Screen near the “living pod” on the right.

http://www.cira.colostate.edu/cira/RAMM/hillger/AustralianAntarctic.L102.jpg

Here is the larger photo of the first day of issue card; the Stevenson Screen is also just visible above the snowbank in the lower right. Rather close to human habitation, I’d say. Looks like it’s in the middle of an AHI (Antarctic Heat Island).

Click for larger image

Here’s another picture of a Stevenson Screen close to a building in Antarctica, from the British Antarctic Survey:


Location: Fossil Bluff, Alexander Island

Season: 1994/1995

Photographer: Pete Bucktrout


It seems that folks are all “wild about Harry” over at Climate Audit, with the revelations occurring there, and no good kerfuffle would be complete without some pictures of the weather stations in question. It seems a weather station used in the Steig Antarctic study, aka “Harry”, got buried under snow and also got confused with another station, Gill, in the dataset. As Steve McIntyre writes:

Gill is located on the Ross Ice Shelf at 79.92S 178.59W 25M and is completely unrelated to Harry. The 2005 inspection report observes:

2 February 2005 – Site visited. Site was difficult to locate by air; was finally found by scanning the horizon with binoculars. Station moved 3.8 nautical miles from the previous GPS position. The lower delta temperature sensor was buried .63 meters in the snow. The boom sensor was raised to 3.84 m above the surface from 1.57 m above the surface. Station was found in good working condition.

I didn’t see any discussion in Steig et al on allowing for the effect of burying sensors in the snow on data homogeneity.

The difference between “old” Harry and “new” Harry can now be explained. “Old” Harry was actually “Gill”, but, at least, even if mis-identified, it was only one series. “New” Harry is a splice of Harry into Gill – when Harry met Gill, the two became one, as it were.

Considered by itself, Gill has a slightly negative trend from 1987 to 2002. The big trend in “New Harry” arises entirely from the impact of splicing the two data sets together. It’s a mess.
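A toy example with synthetic numbers (not the real Harry/Gill records) shows how splicing two offset, individually trendless series manufactures a trend:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: two trendless series that differ only by a
# constant offset (all values made up for demonstration).
years_a = np.arange(1987, 1995)          # "Gill-like" segment
years_b = np.arange(1995, 2003)          # "Harry-like" segment
series_a = -20.0 + rng.normal(0, 0.3, years_a.size)   # mean -20 C, no trend
series_b = -18.5 + rng.normal(0, 0.3, years_b.size)   # mean -18.5 C, no trend

years = np.concatenate([years_a, years_b])
spliced = np.concatenate([series_a, series_b])

def trend(x, y):
    """Least-squares slope in degrees C per decade."""
    return np.polyfit(x, y, 1)[0] * 10

print(f"segment A trend: {trend(years_a, series_a):+.2f} C/decade")
print(f"segment B trend: {trend(years_b, series_b):+.2f} C/decade")
print(f"spliced trend:   {trend(years, spliced):+.2f} C/decade")
```

Neither segment has a real trend; the 1.5 C offset alone makes the spliced series appear to warm strongly, which is McIntyre’s point about “New Harry”.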

So not only is there a splice error, but the data itself may have been biased by snow burial.

Why is the snow burial important? Well, as anyone skilled in cold weather survival can tell you, snow makes an excellent insulator and an excellent reflector. The insulating property of snow’s trapped air is why building a snow cave to survive in is a good idea. So is it any wonder, then, that a temperature sensor buried by a snowdrift, or lowered toward the surface by rising snow, would not accurately read the temperature of the free near-surface atmosphere?
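The damping effect can be sketched with the standard periodic-conduction formula, using an assumed snow thermal diffusivity (real snow varies widely with density, so treat the numbers as illustrative only):

```python
import numpy as np

# A surface temperature wave of amplitude A is damped with depth z as
# A * exp(-z/d), with damping depth d = sqrt(2*kappa/omega).
kappa = 2e-7                      # m^2/s, assumed diffusivity for light snow
omega = 2 * np.pi / 86400.0       # rad/s, diurnal cycle

d = np.sqrt(2 * kappa / omega)    # damping depth in metres
for z in [0.0, 0.1, 0.63]:        # 0.63 m = burial depth in the Gill report
    atten = np.exp(-z / d)
    print(f"depth {z:4.2f} m: diurnal swing reduced to {100*atten:.3f}% of surface")
```

At the 0.63 m burial depth noted in the Gill inspection report, the diurnal temperature swing reaching the sensor is essentially gone: the sensor is reading snowpack temperature, not free air.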

As I’ve always said, getting accurate weather station data is all about siting and how the sensors are affected by microclimate issues. Pictures help tell the story.

Here’s “Harry” prior to being dug out in 2006 and after:

Harry AWS, 2006 – Upon Arrival – Click to enlarge.

Harry AWS, 2006 – After digging out – Click to enlarge.

You can see “Harry’s Facebook Page” here at the University of Wisconsin

It seems digging out weather stations is a regular pastime in Antarctica, so data issues from snow burial of AWS sensors may extend beyond “Harry”. It seems Theresa (Harry’s nearby sister) and Halley VI have also been dug out and the process documented. With this being such a regular occurrence, and easily found within a few minutes of Googling by me, you’d think somebody on the Steig et al team or among the Nature peer reviewers would have looked into this and the effect on the data that Steve McIntyre has so eloquently pointed out.

Here’s more on the snow burial issue from Antarctic bloggers:

The map showing Automated Weather Stations in Antarctica:

Click map for a larger image

The Gill AWS in question.

http://amrc.ssec.wisc.edu/images/gill.gif

From Polartrec

Theresa was placed at this location partly to study the air flow in the region. Looking out the window of the plane we can definitely see the air flowing!!! Jim estimates the wind at about 25 miles per hour.

Wind Blown snow near Theresa AWS

Wind blown snow at Theresa

With the temperature around 0F and the wind chill about 20 below, it is obvious this is going to be quite a chore.

George digging out Theresa

Starting to dig out Theresa

The weather station has not been working, so George needs to figure out what is wrong with it and then fix it. The station is almost buried in the snow so we will also need to remove all of the electronics, add a tower section and then raise and bolt all of the electronics and sensors back in place.

George unhooking the electronics box at Theresa AWS

George unhooking the cables.

After refueling the plane, with the fuel in the 55 gallon drums, Jim and Louie helped dig down to the electronics boxes that were completely buried, plus they built us a wind break that made a huge difference in helping us not be so cold. After about 4 hours we are almost through. As I am hanging onto the top of the raised tower in the wind, one bunny boot wedged onto the tower bracing, the other boot wrapped around the tower, one elbow gripping the tower, my chin trying to hold the wind sensor in place and both bare numb hands trying to thread a nut onto the spinning wind sensor, I really appreciate the difficulty of what is normally Jonathan’s job. After checking to make sure Theresa is transmitting weather data, we board the plane and head to Briana, our second station.

Theresa after we are finished.

Notice the difference between this picture and the first one of Theresa.

From Antarctic Diary

More movement

It’s been another flat-out week. The vehicle team have dug up and moved the Drewery building, which was getting so buried that snow was almost up to the windows. Team Met have been on the move too – all the remaining instruments are now bolted securely to the Laws roof, so we headed up to the Halley VI building site to relocate the weather station.

Jules starts digging out the weather station

Only 15km away, the Halley VI site looks a lot like Halley V. It’s flat, white and snowy. Very snowy. The weather station had about 1.5m built up around it!

Jules and Simon recovering the solar panel

In the hole!

The weather station was a survey reference point for the build project so we had to find a suitable replacement. Could this be Antarctica’s first pole-dancing venue?

Penguin Party memories…

After an hour or so sweating it out with shovels, the weather station popped out and was loaded onto the sledge. Like the reference point, the station’s new location had to be precise, as vehicles are banned from the upwind section of the site to keep that area ultra-clean for future snow-chemistry experiments.

Weather station on the move

Driving on a compass bearing and GPS track, we found the new site just under a kilometre away.

The final setup

UPDATE: here’s another buried station story from Bob’s Adventures in cold climes. Apparently this station is used as a reference for some sort of borehole project.

I dig weather stations

My main task for today was to get a start on raising my weather station. I’d installed it 2 years ago, and with the high accumulation at Summit, it’s getting buried. The electronics are all in a box under the snow, and the only things visible at the surface were the anemometer for measuring wind speed and direction, the thermistor for measuring air temperature, and the solar panel to keep the batteries charged.

The buried weather station. The flat green bit is the solar panel, which was about 1.5 meters off the surface when I installed the station. Can you guess why I would mount it facing down?

In the morning I downloaded all the data from the station, and checked to see that it was all in order. Then it was time for digging. I’d carefully made a diagram when I installed the station, so I knew exactly where to dig. A couple of hours later I’d found my box!

At the bottom of the pit with the datalogger electronics.

I brought everything up to the surface, and then was about to fill in the pit, when I realized at least one more scientist at Summit might want to make measurements in it; the pit’s already dug! So tomorrow I’ll help Lora with some conductivity measurements, then fill in the pit, re-bury the box just beneath the surface, and it’ll be ready to go for another 2 years!

And there’s more….

The Australians seem to have AWS problems as well. From the Australian Antarctic Division:

On Monday two groups headed out, with Largy and Denis going up to the skiway to check on the condition of the equipment stored there for the winter and beginning preparations for the coming summer flying season.

Bill, Brian and Ian went up to the Lanyon Junction Automatic Weather Station (AWS) to check its condition and retrieve some of the sensors in preparation for the annual servicing of the various remote units.

Automatic weather station buried 1.5m in snow

A hard life for an AWS – Buried 1.5 metres
Photo: Ian P.
Anemometer

This used to be an anemometer
Photo: Ian P.

And the University of Maine, participating in USITASE, has the same troubles. They write:

We reached our first major destination at the end of today’s travel, the site of the Nico weather station. There are several automatic weather stations spread out over the surface of Antarctica. These stations measure things like temperature, wind speed and wind direction and then relay this data back to scientists via satellite. Anything left on the surface of the snow will eventually be drifted in and buried by blowing snow. This particular weather station (NICO) has not been seen in several years. They tried to locate it via airplane a few years ago and were unsuccessful. Our task was to find the weather station, record its position with GPS, and mark the location with flags so that in the near future, the weather station can be raised and serviced.

We arrived at the coordinates of the station around 10 pm. Our initial scans of the horizon were not productive, so Matthew and John took the lead tractor (with our crevasse-detecting radar) out to survey a grid near our stopping point. The radar should detect a large metal object like a weather station, but the survey was also unsuccessful. After a fine pasta and tomato sauce dinner, John went outside for an evening constitutional. He saw a shiny object out in the distance – further inspection with a pair of binoculars determined that it was the top of the NICO weather station! Several of us marched out to the station, which was actually about a half mile distant, marked the location with bright orange flags and recorded the position via GPS for future reference. Only the top foot or two of the station was still visible. John was in exactly the right place at the right time to see a reflection from this object while we were near the kitchen module, and so allowed us to complete our first task successfully.

Tomorrow, we drive on.

http://www2.umaine.edu/USITASE/moslogs/images03/buried.jpg

http://www2.umaine.edu/USITASE/moslogs/images/AWSsite.jpg


This regular burial and digging out of stations calls into question the use of the whole AWS network as a source of sensitive climate measurements.

Pamela Gray
February 4, 2009 8:38 pm

Tim, some changes have occurred in whether or not dinos were warm or cold blooded. The book is not closed on that topic.

Steve Huntwork
February 4, 2009 8:44 pm

Anthony and Eric:
Sorry, I thought I was posting on this website and not CA.
LOL – oh well, we only wanted to ensure the integrity of the database.
And yes, my postings on CA should have been deleted.

Pamela Gray
February 4, 2009 8:46 pm

April, that thought occurred to me too as I was trying to find the December AIRS CO2 data. The website has changed to reflect a more “environmentalist” attitude and agenda as opposed to reporting simple data to the public. It’s quite a change from just 6 months ago. hmmmmm

Jeff C
February 4, 2009 9:15 pm

Woo-Hoo! Jon finally won one of his pointless arguments. St Mac deleted three words from a post title. I guess AGW is real after all. May as well shut down the blog, Anthony.

Fred Gams
February 4, 2009 9:54 pm

OT but very interesting. Snake size is related to the mean temperature, so the bigger the snake the warmer the temp. The new discovery suggests that the mean temp was 10 degrees warmer than today. Video included in article.
—————
Largest prehistoric snake on record discovered in Colombia
Named Titanoboa cerrejonensis by its discoverers, the size of the snake’s vertebrae suggest it weighed 1140 kg (2,500 pounds) and measured 13 metres (42.7 feet) nose to tail tip. A report describing the find appears in this week’s Nature….
“At its greatest width, the snake would have come up to about your hips. The size is pretty amazing. But our team went a step further and asked, how warm would the Earth have to be to support a body of this size?”
Assuming the Earth today is not particularly unusual, Head and Dr Jonathan Bloch, Assistant Curator of Vertebrate Paleontology at the Florida Museum of Natural History, estimated a snake of Titanoboa’s size would have required an average annual temperature of 30 to 34°C (86 to 93 F) to survive. By comparison, the average yearly temperature of today’s Cartagena, a Colombian coastal city, is about 28°C.

Edward
February 4, 2009 10:00 pm

Tim L (20:04:17) :
I looked into the research on CO2 levels for that same time period after seeing that article about prehistoric snakes. See Link at:
http://www.victoria.ac.nz/antarctic/people/peter-barrett/pdfs/Barrett%202006%20Second%20climate%20shift-Ch%206%20Chapman%20etal%20VUW%20Press.pdf
The study maintains that the release of methane hydrates raised temperatures by at least 5°C, creating the environment to support the largest snake ever.

anna v
February 4, 2009 10:19 pm

MattN (16:34:05) :
09 “Simon: “Why do you energetically challenge findings that suggest supposed warming whilst uncritically accepting any evidence of supposed cooling?””
Because cooling is almost never artificial.

I read about wind chill above, in one of the descriptions of digging out a station.
I know that if my cheap sensor stuck on my windshield gets wet, the temperature drops; also that these sensors are below freezing. What about water evaporating from the ice (sublimation)? Does it not drop the temperature? Is there no ice around the sensor, its metal parts, or the container/protector?
I am completely ignorant on this.

February 4, 2009 10:26 pm

I agree, April: the fewer people involved, the less accountability.

Edward
February 4, 2009 10:28 pm

GISS includes the Siple, Antarctica location in its dataset at 75.9S 84.2W.
See the link at:
http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=700892840009&data_set=1&num_neighbors=1
It looks like an (incomplete) 22 year solar cycle! No warming there.

Edward
February 4, 2009 10:34 pm

GISS also includes the Byrd AWS station at 80.0S 119.4W in its dataset. This station was installed in 1980 and worked for about 8 years. Beginning in 1988, the station was unable to report data for 29% of the time between 1988 and 2008.
Here is an example of missing data from this Byrd WMO number 89324 location:
1988 April to Dec
1989 Mar-Dec
1990 Nov+Dec
1992 July-Sept, Nov+Dec
1996 May-Dec
1999 June-Oct
2000 Oct-Dec
2001 Jan-Sept, Nov+Dec
2002 May-Oct
2003 Sept-Dec
2004 June-Nov
2005 June-Dec
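Tallying the gaps Edward lists gives roughly the quoted fraction (the exact figure depends on which months count toward the denominator):

```python
# Missing months per year, read off the list above (ranges inclusive).
missing = {
    1988: 9,   # Apr-Dec
    1989: 10,  # Mar-Dec
    1990: 2,   # Nov, Dec
    1992: 5,   # Jul-Sep, Nov, Dec
    1996: 8,   # May-Dec
    1999: 5,   # Jun-Oct
    2000: 3,   # Oct-Dec
    2001: 11,  # Jan-Sep, Nov, Dec
    2002: 6,   # May-Oct
    2003: 4,   # Sep-Dec
    2004: 6,   # Jun-Nov
    2005: 7,   # Jun-Dec
}
total_missing = sum(missing.values())
total_months = 21 * 12            # 1988-2008 inclusive
frac = total_missing / total_months
print(f"{total_missing} of {total_months} months missing ({frac:.0%})")
```

The listed gaps alone come to about 30% of the 1988-2008 record, consistent with the roughly 29% Edward cites.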

February 4, 2009 10:38 pm

George E. Smith (16:41:08) :
Well here is what Mr_X failed to realize. That one atmosphere pressure at
-78.5 C sublimation point is the TOTAL ATMOSPHERIC PRESSURE; it is NOT the partial vapor pressure of CO2.

Not so I’m afraid you’ve been misinformed, it is indeed the partial pressure
Therefore I hereby declare, I am not an idiot, and my assertion was not total nonsense.
I never said you were an idiot just that the claim of CO2 ice at Vostok was nonsense, which it is.
So when Mr_X arrives here, I would appreciate an apology.
Not forthcoming since there’s nothing to apologize for.
Reply: That would have been Phil. And this was the post. Moderators have much better search tools available. For better or worse ~ charles the moderator.
Indeed it was.

E.M.Smith
Editor
February 4, 2009 10:43 pm

George E. Smith (16:41:08) : So at -90 deg C, where Vostok can reach at times, the equilibrium pressure is about 0.3-0.4 atmospheres TOTAL PRESSURE. Therefore at around -90 C at Vostok, and one atmosphere total air pressure, solid CO2 is quite stable, so precipitation of CO2 snow is quite possible, and in fact it is possible any time the air temperature drops below -78.5 deg C.
Um, I hate to say this George, but I think you may have missed something…
Per this chart:
http://en.wikipedia.org/wiki/File:CO2HydrPhaseDiagram.jpg
if we can generalize from what we think is happening on Mars… the CO2 / water clathrate phase has a temperature at 1 bar of about -55 C as I read it. So I’m pretty sure you will get a solid CO2/[6 or 8]H2O clathrate formed. At least until you reach the CO2 solid point of about -78.5 C.
So I think you will need to accept ‘CO2 snow’ at a higher temperature; it just won’t be pure CO2 snow… There is probably some mass transfer rate limit as the CO2 will need to either absorb into the extant water snow or find some very scarce water in the air to join with… But I’ll leave the rate calculation stuff to someone with better chemical skills in that area.
I hope this doesn’t hurt your case too much 😉
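For readers following the vapor-pressure argument, a rough Clausius-Clapeyron sketch, anchored at the 1-atm sublimation point and using an assumed enthalpy of sublimation of about 25.2 kJ/mol (an illustrative textbook value, not a measured one), reproduces the 0.3-0.4 atm figure at -90 C and shows how far below it the ambient CO2 partial pressure sits:

```python
import numpy as np

# Clausius-Clapeyron sketch for CO2 sublimation. dH is an assumed
# representative value; treat the results as order-of-magnitude only.
R = 8.314              # J/(mol K)
dH = 25.2e3            # J/mol, assumed enthalpy of sublimation
T0, p0 = 194.65, 1.0   # K, atm: the 1-atm sublimation point (-78.5 C)

def p_eq(T):
    """Equilibrium vapor pressure (atm) over solid CO2 at temperature T (K)."""
    return p0 * np.exp(-(dH / R) * (1.0 / T - 1.0 / T0))

print(f"equilibrium pressure at -90 C: {p_eq(183.15):.2f} atm")
# Ambient CO2 partial pressure is only ~0.0004 atm (~400 ppm of 1 atm),
# orders of magnitude below equilibrium even at -90 C, so deposition of
# pure CO2 frost would require far lower temperatures still.
```

On this sketch, the 0.3-0.4 atm number is the equilibrium pressure of pure CO2, which is Phil’s point: at a partial pressure of ~0.0004 atm, Vostok’s -90 C is nowhere near cold enough for CO2 snow.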

Neil Crafter
February 4, 2009 10:49 pm

Chris V
Do you really think that a weather station’s temperature sensors when buried by snow would measure colder than the surface air? Most people here think that due to the snow’s insulating effects the temperature sensor is more likely to read warmer than the surface air. Do you have a mechanism to explain your contention?
Simon
Your argument about digging the stations out of the snow in the 1950’s is rather dubious, as the automated weather stations did not come into play until much more recently, and the manned stations that need to be read every day would be regularly kept free of any accumulating snow, otherwise the staff down there couldn’t read the temps!

February 4, 2009 11:04 pm

The BAS station Halley is an interesting one in regard to variations in temperature readings. It sits on the Brunt Ice Shelf which is apparently moving westward at about 700m per year. The weather station is apparently on the station building itself.
Here’s a link to the BAS site. The station has been rebuilt a number of times and work is currently in progress on an innovative structure built on skis. Imagine the effect that moving a station 700 m might have.
http://www.antarctica.ac.uk/living_and_working/research_stations/halley/index.php

Manfred
February 4, 2009 11:17 pm

Chris V. asked:
“But how does burial affect the temperature trend?”
1. when these stations are installed, they are well above the snow. this may last for a few years until they get buried for the first time. so the warming bias is shifted away from the beginning of the record, resulting in a positive trend, especially for stations with a short life span.
2. the aws project started collecting data in 1980. the measurements before did not include snow buried stations. adding the aws data set after 1980 to the data pool should result in a snow cover warming bias after 1980.
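Manfred’s first point can be illustrated with made-up numbers: a trendless series that picks up an assumed warm bias only in its later, buried years acquires a positive least-squares trend:

```python
import numpy as np

# Toy illustration (synthetic data, not a real station record): the series
# has no climate trend, but gains an assumed +1 C warm bias once buried.
rng = np.random.default_rng(1)
years = np.arange(1980, 2003)
true_temp = -25.0 + rng.normal(0, 0.3, years.size)   # trendless air temperature
buried = years >= 1995                               # buried in the later years
measured = true_temp + np.where(buried, 1.0, 0.0)    # warm bias while buried

slope_true = np.polyfit(years, true_temp, 1)[0] * 10
slope_meas = np.polyfit(years, measured, 1)[0] * 10
print(f"true trend:     {slope_true:+.2f} C/decade")
print(f"measured trend: {slope_meas:+.2f} C/decade")
```

The bias alone contributes roughly half a degree per decade of spurious warming in this toy case, without any change in the actual air temperature.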

Jon Jewett
February 4, 2009 11:23 pm

Re: CO2 in solid form at 1 Atm. Seems to be two questions: can it exist at 1 Atm and can it form naturally at 1 Atm.
In a previous lifetime.
I was a Cargo Engineer on a 125,000 cubic M liquefied natural gas tanker. We used to load LNG at a port on the East Coast of Borneo (the jungle paradise port of Bontang), some 8 miles north of the equator. We would have a mix of water ice and CO2 ice form on the cargo lines. Sort of cool when we’d disconnect the cargo lines and the ice would pop and sizzle on the steel deck. LNG is carried at 1 Atm plus about 2 psi and at -163 C, more or less depending on the mixture with heavier hydrocarbons. As I recall we ran about -158 C.
Also, I have used CO2 fire extinguishers. The gas is compressed in the cylinder as a liquid. When discharged, one product is CO2 snow.
Finally, at the Maxi-Mart Grocery Store, they have a cooler full of CO2 ice for you to buy by the pound.
So….it will form and it will exist at 1 Atm.
(I went to a trade school, but I am fully qualified to portray a doctor on TV. And I stayed in a Holiday Inn Express last night!)
Best regards,
Steamboat Jack

Denis Hopkins
February 4, 2009 11:31 pm

Re: In the pictures of before and after digging out at the beginning of the article.
If it was up to its head in snow… where was all the snow in the after picture? It was down to its ankles all around for miles… Surely they did not dig out the whole area!

carlbrannen
February 4, 2009 11:37 pm

Couldn’t they, like, solve this problem by arranging for the station to raise itself? It seems like they have a pretty good idea how fast they get buried, I don’t even think you’d have to go so far as to install a sensor. Are there stepper motors that will operate in cold weather?

EricH
February 4, 2009 11:47 pm

Anthony and all,
On a related subject, Arctic ice and its thickness, it may be a good idea to keep an eye on http://www.benhadow.com and http://www.catlinarticsurvey.com.
Ben Hadow and team are walking to the North Pole over the next three months and regularly, and frequently, manually drilling to find ice thickness; then relating it to satellite data to try to determine long term trends.
Only problem, they seem to have a mindset which says “Arctic ice thinning. We will find out when it will disappear”.
This should help to answer the question, “Is Arctic ice widening in extent but thinning by being melted from below?” And only three months away unless the results are “adjusted”.

February 5, 2009 12:15 am

Horace (18:01:45) :
. . . and Steig et al said they relied only a tiny bit on the AWS’s?

Since the Steig et al paper depends on the satellite data, and the data series comes from 3 separate satellites carrying three different instruments, I wonder if Steig et al have any of the concerns they and others have mooted about UAH and RSS temperature records with regard to orbital drift, calibration etc.
After all, the back extrapolation of the satellite data to a theoretical temporal data point as far distant prior to the start of the record as the length of the record itself is going to amplify any error quite a lot.
Steig now says the ‘Harry’ and ‘Racer Rock’ data will affect the conclusion by no more than 0.02C and is therefore “minor”. But since the paper adduces a warming trend of 0.1C this is a 20% error. And since the uncertainty is 0.08C and the adduced warming trend from the interpolation and extrapolation is also 0.08C, we are left wondering what science of substance is left.
Also, since Steig and Gavin claimed that the AWS data was not used in the reconstruction, why would the bad data affect the outcome at all? And why was Steve McIntyre able to show a graph where the ‘Harry’ Station data *exactly* matches the reconstruction from the satellite data?

E.M.Smith
Editor
February 5, 2009 12:21 am

Richard M (17:11:28) :
IMO, the education system is responsible for creating a couple of generations of folks who are not critical thinkers/problem solvers.

I think you may be onto something here… Mr. McGuire (H.S. chem / physics teacher) always made us work out the answer for ourselves. I once made the mistake of asking ‘What is gasoline?’. About 3 weeks later I had an acceptable answer to give him… (He had handed me a CRC Handbook and suggested some library time…)
In a recent article “kitchen experiments” were discussed . I’m not at all surprised that this went right over many heads. It’s not in a book. Not too surprising that in response to these real life examples we got references to more papers.
That was me. “Kitchen Science” came from the way I was taught science. That it could be done anywhere, by anyone, with the equipment to hand. Again, Mr. McGuire would, when a question was asked, say “Well, let’s see.” and run a lab experiment on the spot. He stressed repeatedly that the lab was the final arbiter of what would actually happen.
We were taught that Science was a way of thinking and that everyone ought to do it. As often as possible. It was a way to find and test answers, by anyone, in any field, at any time. Nowhere in the scientific method we were taught was there a ‘peer review’ step…
Any lab work that “didn’t happen right” was a cause for more enquiry into what we might learn about what really happened. (In college it was a cause for copying the answer from the book… a culture shock for me.) He constantly stressed that we could learn by doing whenever a question came up. And we did.
Mr. McGuire even gave us a lesson on how to make your own glassware with a Bunsen burner! We had to turn tubing into a pipette, put a pouring lip on a test tube, and add a handle to a beaker. (I can still do a decent job of working glassware) And he made most all our reagents by re-crystallization of poorer grades and direct production. (The result from the ‘fermentation’ experiment was distilled to make the alcohol used in the organic section…)
His ‘buddy’, the biology teacher, had the same attitude. (I specifically remember a lab on ‘rotting’… and the reaction to what week old milk smelled like…every student had to collect their own ‘stuff’ and had their own stack of 6 Petri dishes and ‘rotting stuff’ to observe and manage).
We were told what ‘equipment’ was used by folks like Darwin, Einstein, Lavoisier, Leyden and others (i.e. a notebook, pen, and brain were most valuable; everything else was to enhance the senses or change the test environment and often optional… but you could make your own if needed.) I remember making a Leyden jar from a Mason type canning jar and foil… Heck, even in Radio Class we took apart old radios (tube type! dawn of the transistor era…) and re-made them stage by stage (diode / headphones, add 1st RF, add audio & speaker, etc.)
Strangely, never once in all my chemistry, physics, biology, genetics, etc. classes from high school through college did anyone say that science required anything called peer review or publishing… It was at most an afterthought; something you could do if you wanted to tell the world about what you had learned… Truth was what you found & Science was how you did it; publishing was for sharing if you ever wanted to. Times change, I guess.
No wonder Al Gore is telling kids to quit listening to older people.
Isn’t AlGore an older person? Maybe we need to advertise his age with his speeches 😉
At any rate, I refuse to give up my right to ‘do science’ when, where and how I wish. I will continue doing “Kitchen Science” and teaching anyone else how to do it; and following the example of Mr. McGuire…
Per the arctic stations self lifter idea: Might I suggest hollow aluminum (or hollow concrete or ferrocement) rather than solid for the rings? If you get the density low enough it will tend to ‘float’ on the snow & ice rather than sink… still need the ‘stepper’ but the physics will be going your way instead of fighting you 😉 Heck, even a ‘ship hull’ shape might work. They would get trapped in the ice pack but didn’t sink as long as they were not crushed.
(picture a ship hull with a blended pyramid deck structure to shed snow and support the equipment mast… add some outrigger lift / straighten spars and you ought to be ‘good to go’!)

papertiger
February 5, 2009 1:18 am

Less clear is:
In the full reconstructions, we used the complete weather station data (1957-2006) from all 42 of the READER database locations listed in Table S2.
However Table S2 has 46 series (including Harry). So is Harry in the full reconstruction or not?
[Response: Table S2 says it has “List of the 42 occupied weather stations … and four automatic weather stations (AWS)” (i.e. 46 entries). Only the 42 occupied stations are used to provide the data back to 1957. The AVHRR data, or the AWS data are used only for calculating the co-variance matrices used in the different reconstructions. Thus the reconstruction can either use the covariance with AVHRR to fill the whole interior back to 1957 (the standard reconstruction), or the covariance with the AWS data to fill the AWS locations back to 1957. – gavin]

Calculating the covariant matrices – thems some impressive sounding words. How about a little humility?
Wikipedia calls it estimation of covariance matrices. Sound about right?
Let’s look at the definition of covariance matrix. According to Wiki –
In statistics and probability theory, the covariance matrix is a matrix of covariances between elements of a vector. It is the natural generalization to higher dimensions of the concept of the variance of a scalar-valued random variable.
Did you get that? In English –
It’s an unnecessarily complex mathematical expression created to mask the prejudices of the author from review.
Let’s input some random variables to demonstrate.
Jose, Manuel, and Cesar, are Jim’s friends.
Through the use of a covariance matrix applied to the random variable, in this case the first names of Jim’s friends, Prof X guesses that Jim lives in Mexico.
Only much later does Steve McIntyre through reverse engineering discover that Jim lives in California.
By that time the reporters have repeated Prof X’s Mexican origin theory so often that it’s common wisdom. Screeching activists march in the streets demanding Jim be deported.
Jim decides to change his name to Jaime because it is easier than correcting the record.
Close to the truth.
Here’s the nut of the thing.
Prof X can’t say “We didn’t use Harry” and ” the AWS data are used only for calculating the co-variance matrices used in the different reconstructions.”
These two statements are contradictory.
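Setting the rhetoric aside, the object itself is simple; a minimal numpy sketch with made-up station series shows all a covariance matrix records, namely which series vary together:

```python
import numpy as np

# Synthetic example: three short "station" anomaly series (made-up numbers).
# Each row is one station, each column one time step.
stations = np.array([
    [ 0.2, -0.1,  0.4, -0.3,  0.1],   # station A
    [ 0.3,  0.0,  0.5, -0.2,  0.2],   # station B (moves with A)
    [-0.1,  0.2, -0.4,  0.3,  0.0],   # station C (moves against A)
])

C = np.cov(stations)    # 3x3 matrix of pairwise covariances
print(C)
# C[i, j] > 0 means stations i and j tend to vary together, C[i, j] < 0
# that they move oppositely. Roughly speaking, this kind of co-variation
# structure is what the Steig et al reconstructions lean on.
```

Whether the method is sound is a separate argument; the matrix itself is not exotic.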

Richard Heg
February 5, 2009 1:33 am

A Bit OT but still the same region.
ScienceDaily (Feb. 5, 2009) — Increasing greenhouse gases could delay, or even postpone indefinitely the recovery of stratospheric ozone in some regions of the Earth, a new study suggests. This change might take a toll on public health.
http://www.sciencedaily.com/releases/2009/02/090204131625.htm

E.M.Smith
Editor
February 5, 2009 1:43 am

KuhnKat (18:16:30) : ask Gavin whether the AWS station data were used to help compute the infill in the “manned” station data. I don’t know, but SOMETHING has to give the warming, and what was available previously could not with standard practices.
I’m slowly getting a working familiarity with the mindset of GISS and how they process data in GISStemp. It isn’t pretty.
A key factor is something called the ‘Reference Station Method’ and the Antarctic report sounds like the same thing. Data are regularly ‘made up’ based on nearby data (mostly in space, but sometimes in time); other large chunks of real data are ‘disposed of’ for various inexplicable reasons.
Often (especially in the ‘anomaly’ mode) station real data are replaced with data based on what some other stations are doing, far removed. This process can be repeated several times.
While the FORTRAN is rather hard to decipher, and the technique is horrid
(I don’t use that word lightly. I’m OK with ugly code that works well and is basically maintainable. I’d fire anyone who repeatedly did what GISS does… things like scribbling scratch files into the source archive, keeping multiple code copies that can get out of sync, repeatedly mutating data formats, and moving data from file to file to file gratuitously…),
I’m going to paste a bit of the comments from two pieces of code that I think matter here. The first is trimSBBX.f (which ‘trims’ the ‘boxes’ of grid data) and the second is zonav.f, the program that finds average anomalies over a zone. That sounds similar to what they did in Antarctica. I’ve bolded the bits of the comments where they talk about discarding and shifting data.
C**** This program trims SBBX files by replacing a totally missing
C**** time series by its first element. The number of elements of the
C**** next time series is added at the BEGINNING of the previous record.

[…]
C**** AVG(1->NM=INFO(4)) is a full time series, starting at January
C**** of year IYRBEG=INFO(6) and ending at December of year IYREND.
C**** NSt = # of stations contributing to the sub box (0 for ocnfile)
C**** NstMn = # of stations months contributing to the sub box (oc:#ok)
C**** Dmin = distance of center from nearest contributing station (km)
So if a time series is totally missing in a ‘sub box’, it is filled in with its first element (replicated), and the data that follow are then somehow used to adjust for any drift this would cause, by adding some of the following data into preceding records. I have not fully worked out this code yet and cannot point to a smoking gun, but at this point it just looks to me like making up data so that a curve does not have jumps in it.
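For what it’s worth, here is my reading of that trim comment as a small Python sketch. The names `XBAD` and `trim_series` are my own labels, and this is only my interpretation of the comment, not the actual FORTRAN logic:

```python
XBAD = 9999.0  # missing-data flag; FLOAT(INFO(7)) in the FORTRAN comments

def trim_series(series):
    """If a time series is totally missing, store only its first
    element (the missing flag) instead of the whole record; otherwise
    keep the series as-is.  My reading of the trimSBBX.f comment."""
    if all(v == XBAD for v in series):
        return series[:1]
    return series

print(trim_series([XBAD, XBAD, XBAD]))  # collapses to a single flag value
print(trim_series([XBAD, 1.5, XBAD]))   # partially valid series kept whole
```

On this reading the trim is a storage-size trick for all-missing records, but the record-shuffling it requires is exactly the kind of format mutation I complained about above.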
Zones are belts of 30 degrees each, so everything within 30 degrees of the South Pole is one ‘belt’. This next bit of comment is from the code that finds the average anomalies. Of particular interest are the bold bits, where it states that a regional block being added into the zonal mean has its values shifted so that the mean over any overlapping period does not change (that, and the cavalier way real data are discarded, perhaps to leave gaps that are later filled in by averaging, interpolation, or ??? or maybe just left out of the computations entirely…)
At any rate, IMHO, the whole ‘reference station method’ (RSM) process is seriously broken and anything using this approach is at best a polite fiction, at worst… [I’ll save you the snip…].
One major problem is that the RSM is based on the idea that correlations have been shown to be a valid proxy, yet it’s a linear fit adjustment that is done, not a ‘correlation function’ that is used in the code. I don’t think ‘linear fit’ has ever been shown in peer reviewed publications to be valid. [sorry, couldn’t resist banging THEM with the peer thing 😉 ]
Another is that repeated application of this method has not been shown to be valid. (Not every transform remains valid when applied over and over…)
And finally, I can see no rationale for tossing out real data in one step, then creating false in-fill anomalies in another to make up for it. (Well, maybe finally+1: the whole idea of taking coastal data as a valid reference for an inland station is just the inverse of the San Francisco / Lodi problem. If the coast is moderated, and inland is freezing its tail off, falsely moderating inland temperatures based on projections from the coast is, er, just wrong…)
From zonav.f:
C**** This program combines the given gridded data (anomalies)
C**** to produce AVERAGES over various LATITUDE BELTS.
C****
C**** Input file: unit 11 (=output of job NCARSURF GRIDDING)
C****
C**** 11: Record 1: INFOI(1),...,INFOI(8),TITLEI header record
C**** Record 2: AR(1->MONM0),WTR(1->MONM0),NG grid point 1
C**** Record 3: AR(1->MONM0),WTR(1->MONM0),NG grid point 2
C**** etc.
C****
C**** Output files: unit 10
C****
C**** 10: Record 1: INFO(1),...,INFO(8),TITLEO,TXT header record
C**** Record 2: DATA(1->MONM),WT(1->MONM),TITLE1 belt 1
C**** Record 3: DATA(1->MONM),WT(1->MONM),TITLE2 belt 2
C**** etc.
C**** DATA(1->MONM) is a full time series, starting at January
C**** of year IYRBEG and ending at December of year IYREND.
C**** WT is proportional to the area containing valid data.
C**** AR(1) refers to Jan of year IYRBG0 which may
C**** be less than IYRBEG, MONM0 is the length of an input time
C**** series, and WTR(M) is the area of the part of the region
C**** that contained valid data for month M.
C**** NG is the total number of non-missing data on that record.
C**** TITLE1,… describe the latitude belts.
C****
C**** INFO(1),...,INFO(8) are 4-byte integers,
C**** All TITLEs and TXT are 80-byte character strings,
C**** all other entries in records 2,3,… are 4-byte reals.
C**** INFO 1 and 5 are irrelevant
C**** 2 = KQ (quantity flag, see below)
C**** 3 = MAVG (time avg flag: 1 - 4 DJF - SON, 5 ANN,
C**** 6 MONTHLY, 7 SEAS, 8 - 19 JAN - DEC )
C**** 4 = MONM (length of each time record)
C**** 6 = IYRBEG (first year of each time record)
C**** 7 = flag for missing data
C**** 8 = flag for precipitation trace
C**** INFO(I)=INFOI(I) except perhaps for I=4 and I=6.
C**** In the output file missing data are flagged by
C**** the real number XBAD = FLOAT( INFO(7) )
C****
C**** JBM zonal means are computed first, combining successively
C**** the appropriate regional data (AR with weight WTR). To remove
C**** the regional bias, the data of a new region are shifted
C**** so that the mean over the common period remains unchanged
C**** after its addition. If that common period is less than
C**** 20(NCRIT) years, the region is disregarded. To avoid that
C**** case as much as possible, regions are combined in order of
C**** the length of their time record. A final shift causes the
C**** 1951-1980 mean to become zero (for each month).

C****
C**** All other means (incl. hemispheric and global means) are
C**** computed from these zonal means using the same technique.
C**** NOTE: the weight of a zone may be smaller than its area
C**** since data-less parts are disregarded; this also causes the
C**** global mean to be different from the mean of the hemispheric
C**** means.
C****
C?*** Input parameters (# of input files, time period)
C?*** Output parameters (output time period, base period)
PARAMETER (IYBASE=1951,LYBASE=1980)
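The merging rule described in that comment block can be sketched in Python. This is my own reconstruction from the comments, not the actual zonav.f logic: the name `combine` is mine, regions get equal weight here instead of area weight, and I’ve omitted the final step that re-zeroes the 1951-1980 mean:

```python
NCRIT = 20 * 12  # the comment says the common period must be >= 20 years

def combine(zonal, region):
    """Fold one region into the running zonal series (None = missing).
    The region is first shifted so that the mean over the months both
    series cover is unchanged by its addition; if the overlap is too
    short, the region is disregarded entirely."""
    common = [i for i in range(len(zonal))
              if zonal[i] is not None and region[i] is not None]
    if len(common) < NCRIT:
        return zonal  # too little overlap: the region is thrown away
    shift = (sum(zonal[i] for i in common) -
             sum(region[i] for i in common)) / len(common)
    out = []
    for z, r in zip(zonal, region):
        if r is None:
            out.append(z)
        elif z is None:
            out.append(r + shift)
        else:
            out.append((z + r + shift) / 2)  # equal weights for brevity
    return out
```

Note the consequence: any real offset between a region and the zone is defined away by the shift, and a region with a short overlap simply vanishes from the zonal mean.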
If this is not of interest, let me know and I’ll stop posting ‘code bits’ and save it for a final report…
Reply: Do you have my phone number? I’m not sure I sent it. I’m back from Brazil. Steve McIntyre’s site may be a better place for this. I’ll email number tonight. ~ charles the moderator.