Comprehensive network analysis shows Climategate likely to be a leak

This lends credence to WUWT’s previous analysis by our own Charles the moderator: The CRUtape Letters™, an Alternative Explanation.

Climate-Gate: Leaked

by Lance Levsen, Network Analyst – courtesy of Small Dead Animals

http://www.swfwmd.state.fl.us/conservation/indoors/leak.jpg

Introduction

Sometime in mid-November 2009, ten million teletypes started their deet-ditta-dot chatter, reeling off the following headline: “Hackers broke into the University of East Anglia’s Climate Research Unit….”

I hate that. It annoys me because, just like everything else about climate-gate, it’s been ‘value-added’: simplified and distilled. The contents of FOIA2009.zip demand more attention to detail, and as someone once heard Professor Jones mutter darkly, “The devil is in the details…so average it out monthly using TMax!”

The details of the files tell a story that FOIA2009.zip was compiled internally and most likely released by an internal source.

The contents of the zip file hold one top-level directory, ./FOIA. Inside that, it is broken into two main directories, ./mail and ./documents. Inside ./mail are 1073 text files ordered by date. The files are named in order, with increasing but not sequential numbers. Each file holds the body, and only the body, of an email.

In comparison, ./documents is highly disorganized. MS Word documents, FORTRAN, IDL and other computer code, Adobe Acrobat PDFs and data are sprinkled through the top directory and several sub-directories. It’s the kind of thing that makes a co-worker’s disorganized desk look like the spit and polish of a boot-camp floor.

What people are missing entirely is that these emails and files tell a story themselves.

The Emails

Proponents of the hacker meme are saying that s/he broke into East Anglia’s network and took emails. Let’s entertain that idea and see where it goes.

There is no such thing as a private email. Collecting all of the incoming and outgoing email is simple on a mail server. With Postfix, the configuration is a single directive, always_bcc=<email address>, and Sendmail and Exim can be configured to do the same. Those are three of the main mail servers in use in the Unix environment, and two of them, Sendmail and Exim, are or were in use as the external mail gateways and internal mail servers at the University of East Anglia (UEA).
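For example, with Postfix the whole mechanism is one line in the main configuration file. This is a sketch only; the archive address below is a hypothetical example, not anything taken from UEA’s systems:

```
# /etc/postfix/main.cf -- blind-copy every message passing through the
# server to one archive account (hypothetical address)
always_bcc = mail-archive@uea.ac.uk
```

Every message the server accepts, inbound or outbound, then lands in that one mailbox as a side copy, with no action required from any user.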

When a mail server receives an email for someone@domain.net, it checks that it is authoritative for that domain. This means that a server for domain.net will not accept email for domain.ca. The mail server will usually then run checks on the email for spam and viruses, and run other filters. It will then check whether to route the email to another server or to drop it into a user’s mailbox on that server. In all examples examined in the released emails, the mail gateway forwarded the emails to another server.

The user then has a mail client that s/he uses to read email. Outlook Express, Eudora, Apple Mail, Outlook, Thunderbird, mutt, pine and many more are all mail clients.

Mail clients use one of two methods of reading email. The first is called POP, which stands for Post Office Protocol. A mail client reading email with POP logs into the mail server, downloads the email to the machine running the mail client, and then deletes the original email from the user’s spool file on the mail server.

The second protocol is called IMAP, Internet Message Access Protocol. IMAP works by accessing the mailboxes on the mail server and doing most of the actions there. Nothing is actually downloaded onto the client machine. Only email that is deleted and purged by the mail client is gone. Either protocol allows the user the opportunity to delete the email completely.

Most email clients are set up to read email with POP by default, and POP is more popular than IMAP for reading email.

The released emails are a gold mine for a system administrator or network administrator to map. While none of the released emails contain full headers, several include replies that quote the headers of the original emails. An experienced administrator can build an accurate map of the email topology to and from the CRU over the period in question, 1998 through 2009.

Over the course of that time, UEA’s systems administrators made several changes to the way email flows through their systems, and the users made changes to the way they accessed and sent email.

The Users

Using a fairly simple grep1 we can see that from the start of the time-frame, 1999, until at least 2005, the CRU unit accessed its email on a server called pop.uea.ac.uk. Each user was assigned a username on that server, and from the released emails we can link those usernames to individual people.
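A hypothetical reconstruction of the grep behind footnote 1: search every file in the mail archive for references to the POP server. The sample message created below is a stand-in so the sketch runs end to end; the filename and username are invented, not taken from the leak:

```shell
# Create one stand-in message (hypothetical filename and username),
# then search the archive tree for the POP server's hostname.
mkdir -p FOIA/mail
printf 'I read mail on pop.uea.ac.uk as user f055\n' > FOIA/mail/0926010576.txt
grep -r "pop.uea.ac.uk" FOIA/mail/
```

Run against the real ./FOIA/mail directory, each matching line pairs a filename (and hence a date) with a username, which is exactly the mapping footnote 1 records.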

The previously referenced grep yields more useful information. For instance, we know that Professor Davies was using QUALCOMM Windows Eudora Light Version 3.0.3 (32) in September of 1999 (ref email 0937153268.txt). If you look at the README.txt for that version, you can see that it requires a POP account and doesn’t support IMAP.

As mentioned previously, POP usually deletes email on the server after it is downloaded. Modern POP clients do have an option to leave email on the server for some number of days, but Eudora Light 3.0.3 did not. We can say that Professor Davies’ emails were definitely removed from the server as soon as “Send/Recv” finished.

This revelation leaves only two scenarios for the hacker:

  1. Professor Davies’ email was archived on a server and the hacker was able to crack into it, or
  2. Professor Davies kept all of his email from 1999 and he kept his computer when he was promoted to Pro-Vice Chancellor for Research and Knowledge Transfer in 2004 from his position as Dean of the School of Environmental Sciences.

The latter scenario requires the hacker to know how to break into Prof. Davies’ computer, and to actually get into it, to retrieve those early emails. If that were true, then the hacker would also have had to get into every other uea.ac.uk computer involved to retrieve the emails on those systems. Given that many mail clients use a binary format for email storage, and given the number of machines the hacker would have had to break into to collect all of the emails, I find this scenario very improbable.

That leaves the alternative: the mail servers at uea.ac.uk were configured to collect all incoming and outgoing email into a single account. As that account filled up, the administrator would naturally want to archive it off to a file server where it could be saved.

This is a simple setup. You run a cron job that starts a shell script to stop the mail server, move the mail spool file somewhere else, null the live spool, and restart the mail server. The account would reside on the mail server; the archive file could be on any server.
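A minimal sketch of the spool-rotation step such a cron job would run. All paths here are hypothetical demo paths, not anything from UEA’s systems, and a real job would also stop and restart the MTA around the move:

```shell
# Stand-ins for the archive account's spool and a file-server path.
SPOOL=./demo-spool
DEST=./archive-$(date +%Y%m%d).mbox

# Demo mail so the sketch runs end to end.
printf 'From admin Fri Nov 20 1998\nSubject: test\n' > "$SPOOL"

mv "$SPOOL" "$DEST"   # move the accumulated spool to the archive
: > "$SPOOL"          # recreate an empty live spool
```

The `mv` plus truncate pair is the whole trick: the archive grows one dated mbox per run, and the live spool starts empty again.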

Alternatively, you could use a procmail recipe to process the email as it comes in, but that may demand a bit too much processing power for a very busy account.

This also helps to explain the general order of the ./mail directory. Only a computer would be able to reliably export bodies of email into numbered files in the FOIA archive. As the numbers are in order not just numerically but also by date, the logical conclusion is that a computer program numbered the emails as they were processed for storage. This is extremely easy to do with Perl and the Mail::Box modules.
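The numbered filenames are consistent with Unix epoch timestamps; that is an inference, not something stated in the archive itself. As a sanity check (GNU date assumed): email 1123622471.txt carries the header “Date: Tue Aug 9 17:21:11 2005”, and if the sender sat at UTC-4, that instant is 21:21:11 GMT, which converts back to exactly the filename’s number:

```shell
# Convert the header date (assuming a UTC-4 sender) to epoch seconds;
# the result matches the filename 1123622471.txt.
epoch=$(date -u -d "Tue, 9 Aug 2005 17:21:11 -0400" +%s)
echo "$epoch"
```

A program archiving mail could derive such names mechanically from each message’s date, which would produce exactly the increasing-but-not-sequential numbering seen in ./mail.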

The Email Servers

I’ve created a Dia diagram2 of the network topology for email only, as reconstructed from the released emails. Here’s a JPEG of it:

CRU's network for email 	  from 1998 thru 2009.
click to enlarge

The first thing that springs to mind is that the admins did a lot of fiddling with their email servers over the course of ten years. 🙂 The second thing is the anomaly: right in the middle, from 2006 to 2009, there is a Microsoft Exchange server. Normally this wouldn’t be that big of a blip, except we’ve already demonstrated that the servers at UEA were keeping a copy of all email in and out of the network, and admins familiar with MS Exchange know that it too is a mail server of sorts.

It is my opinion that the MS Exchange server was working in conjunction with ueams2.uea.ac.uk. I base this on the fact that ueams2.uea.ac.uk appears both before and after the MS Exchange server, and it changes neither its IP address nor the type of mail server installed on it. There is only a minor version update, from Exim 4.51 to 4.69; see Debian’s changelog between those Exim versions.

I’ve shown that the emails were collected from the servers rather than from the users’ accounts and workstations, but I haven’t shown which servers were doing the collecting. There are two options: the mail gateway or the departmental mail servers.

As demonstrated above, I believe the numbers in the filenames correspond to the order in which the emails were archived. If so, the missing numbers represent other emails not captured in FOIA2009.zip.

I wrote a short Bash program3 to calculate the variances between the numbers in the email filenames. The result is staggering: that’s a lot of email outside of what was released. Here’s a graph of the variances in order, as well as a graph with the variances numerically sorted. Graph info is down below.
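A sketch of the gap calculation in footnote 3: list the numeric filenames in order and print the difference between each neighbouring pair. The three filenames below are dummies so the sketch runs end to end; on the real archive you would point it at ./FOIA/mail:

```shell
# Create three dummy epoch-style filenames, then print the gap between
# each successive number.
mkdir -p mail-demo
touch mail-demo/0826209667.txt mail-demo/0826209777.txt mail-demo/0827365097.txt

ls mail-demo | sed 's/\.txt$//' | sort -n |
awk 'NR > 1 { print $1 - prev } { prev = $1 }'
```

Each output line is one “variance”: the number of intervening emails between two released ones, if the numbering is indeed a running counter or timestamp.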

[Figure: Variance from each email number to the previous email number]
[Figure: Variances sorted and plotted]

The first graph is a little hard to read, mostly because the first variance is 8,805,971. To see a little better, just lop off the first variance and rerun gnuplot; for simplicity, that graph is here. The mean of the variances is 402,839.36, so the average number of emails between each released email is 402,839. Less directly applicable but still useful, the standard deviation is 736,228.56, which you can visualize from the second graph.
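The mean and standard deviation above can be computed in one awk pass over the variance list. This sketch uses the population form of the standard deviation and a small demo input; the article’s figure may have been computed with the sample form instead:

```shell
# Demo input: one variance per line (stand-in for variance_results.txt).
printf '2\n4\n4\n4\n5\n5\n7\n9\n' > gaps-demo.txt

# Accumulate sum and sum of squares, then report mean and population sd.
awk '{ s += $1; ss += $1 * $1; n++ }
     END { m = s / n; printf "mean=%.2f sd=%.2f\n", m, sqrt(ss / n - m * m) }' gaps-demo.txt
```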

I realize that variance without a reference is useless; the reference in this instance is the number of days between emails. Here is a grep of the emails with their dates of origin.

I do not see the administrators copying the email at the departmental level; I see it happening at the mail gateway level. This is logical for a few reasons:

  • The machine name ueams2.uea.ac.uk implies that there are other departmental mail servers with names like ueams1.uea.ac.uk (or even ueams.uea.ac.uk), and maybe a ueams3.uea.ac.uk. If true, you would need to copy email from at least one other server with the same scripts. This duplication of effort is inelegant.
  • There is a second machine you would have to copy email from, the MS Exchange server, so you would need a third set of scripts to create a copy of email. Again, this would be unlike an administrator.
  • Departmental machines can be outside the purview of Administration staff or allow non-Administrative staff access. This is not where you want to be placing copies of emails for the purposes of Institutional protection.
  • As shown by the email number variances, and assuming they are representative of the email volume passing through UEA’s systems, that is far too much traffic for a departmental mail server; it looks much more like an institutional mail gateway.

So given the assumptions listed above, the hacker would have needed access to the gateway mail server and/or the file server where the emails were archived. That archive would most likely live on an administrative file server: it would not be optimal for an administrator to clutter up a production server open to the Internet with sensitive archives.

The Documents

The ./FOIA/documents directory is a complete mess. There are documents from Professor Hulme, Professor Briffa, the now famous HARRY_READ_ME.txt, and many others. There seems to be no order at all.

One file in particular, ./FOIA/documents/mkhadcrut is only three lines long and contains:

	  tail +13021 hadcrut-1851-1996.dat | head -n 359352 | ./twistglob > hadcrut.dat
	  # nb. 1994- data is already dateline-aligned
	  cat hadcrut-1994-2001.dat >> hadcrut.dat

Pretty simple stuff: take everything in hadcrut-1851-1996.dat starting at line 13021, keep only the first 359,352 of those lines, run them through a program called twistglob in this directory, and dump the results into hadcrut.dat. Then append all of hadcrut-1994-2001.dat to the bottom of hadcrut.dat.
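The `tail +13021` in that script is the old syntax for `tail -n +13021`, meaning “start at line 13021”. A small demonstration of the same slicing on synthetic data:

```shell
# Build a 100-line toy file, then take lines 13 onward and keep the
# first 5 of those: lines 13 through 17.
seq 1 100 > demo.dat
tail -n +13 demo.dat | head -n 5
```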

….Except there isn’t a program called twistglob in the ./FOIA/documents/ directory. Nor is there the resultant hadcrut.dat or the source files hadcrut-1851-1996.dat and hadcrut-1994-2001.dat.

This tells me that the collection of files and directories in ./documents isn’t so much a shared directory on a server as a dump directory for someone who collected all of these files. The originals would have come from shared folders, home directories, desktop machines, workstations, profiles and the like.

Remember the reason that the Freedom of Information requests were denied? In email 1106338806.txt, Jan 21, 2005 Professor Phil Jones states that he will be using IPR (Intellectual Property Rights) to shelter the data from Freedom of Information requests. In email 1219239172.txt, on August 20th 2008, Prof. Jones says “The FOI line we’re all using is this. IPCC is exempt from any countries FOI – the skeptics have been told this. Even though we (MOHC, CRU/UEA) possibly hold relevant info the IPCC is not part our remit (mission statement, aims etc) therefore we don’t have an obligation to pass it on.”

Is that why the data files, the result files and the ‘twistglob’ program aren’t in the ./documents directory? I think this is a likely possibility.

If Prof. Jones and the UEA FOI Officer used IPR and the IPCC to shelter certain things from the FOIA, then it makes sense that things are missing from the ./documents directory. Second, it supports the idea that ./documents is in such disarray because it was a dump folder: a folder used explicitly to collect information for release pursuant to an FOI request.

Conclusion

I suggest that it isn’t feasible for the emails in their tightly ordered format to have been kept at the departmental level or on the workstations of the parties. I suggest that the contents of ./documents didn’t originate from a single monolithic share, but from a compendium of various sources.

For the hacker to have collected all of this information s/he would have required extraordinary capabilities. The hacker would have to crack an Administrative file server to get to the emails and crack numerous workstations, desktops, and servers to get the documents. The hacker would have to map the complete UEA network to find out who was at what station and what services that station offered. S/he would have had to develop or implement exploits for each machine and operating system without knowing beforehand whether there was anything good on the machine worth collecting.

The only reasonable explanation for the archive being in this state is that the FOI Officer at the University was practising due diligence. The UEA was collecting data that couldn’t be sheltered and they created FOIA2009.zip.

It is most likely that the FOI Officer at the University put it on an anonymous ftp server or that it resided on a shared folder that many people had access to and some curious individual looked at it.

If, as some say, this was a targeted crack, then the cracker would have needed back doors and access to every machine at UEA, not just the CRU. It simply isn’t reasonable for the FOI Officer to have kept the collection on a CRU system where CRU people had access; he would have used a UEA system.

Occam’s razor holds that “the simplest explanation or strategy tends to be the best one”. The simplest explanation in this case is that someone at UEA found the archive and released it into the wild: FOIA2009.zip wasn’t the work of some hacker, but a leak from UEA by a person with scruples.

Footnotes

1 See file ./popaccounts.txt

2 See file ./email_topography.dia

3 See file ./email_variance.sh

4 See file ./gnuplotcmds

Notes

Graph Information

Graphs were created with gnuplot using a simple command file4 for input. I use a stripped-down version of the variants_results_verbose.txt file; it’s the same data, just stripped of comments and filenames. The second graph is a numerically sorted version: $> sort -n ./variance_results.txt > variance_sorted_numerically.txt.

Assigned Network Numbers for UEA from RIPE.NET

RIPE.NET has assigned 139.222.0.0 – 139.222.255.255, 193.62.92.0 – 193.62.92.255, and 193.63.195.0 – 193.63.195.255 to the University of East Anglia for Internet IP addresses.

RIPE.NET admin contact for the University of East Anglia: Peter Andrews, MSc, BSc (Hons), Head of Networking at the University of East Anglia. (LinkedIn; Peter isn’t in the UEA directory anymore, so I assume he is no longer at UEA.)

RIPE.NET Tech Contact for the University of East Anglia: Andrew Paxton

Current Mail Servers at UEA

A dig for the MX record of uea.ac.uk (email servers responsible for the domain uea.ac.uk) results in the following:

	  $> dig mx uea.ac.uk

	  ; <<>> DiG 9.6.1-P2 <<>> mx uea.ac.uk
	  ;; global options: +cmd
	  ;; Got answer:
	  ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 737
	  ;; flags: qr rd ra; QUERY: 1, ANSWER: 2, AUTHORITY: 13, ADDITIONAL: 13

	  ;; QUESTION SECTION:
	  ;uea.ac.uk.			IN	MX

	  ;; ANSWER SECTION:
	  uea.ac.uk.		50935	IN	MX	2 ueamailgate01.uea.ac.uk.
	  uea.ac.uk.		50935	IN	MX	2 ueamailgate02.uea.ac.uk.

The IP addresses for the two UEA email servers are:

ueamailgate01.uea.ac.uk. 28000 IN A 139.222.131.184

ueamailgate02.uea.ac.uk. 28000 IN A 139.222.131.185

Test connections to UEA’s current mailservers:

	  $> telnet ueamailgate01.uea.ac.uk 25
	  Trying 139.222.131.184...
	  Connected to ueamailgate01.uea.ac.uk.
	  Escape character is '^]'.
	  220 ueamailgate01.uea.ac.uk ESMTP Sendmail 8.13.1/8.13.1; Mon, 7 Dec 2009 01:45:42 GMT
	  quit
	  221 2.0.0 ueamailgate01.uea.ac.uk closing connection
	  Connection closed by foreign host.

	  $> telnet ueamailgate02.uea.ac.uk 25
	  Trying 139.222.131.185...
	  Connected to ueamailgate02.uea.ac.uk.
	  Escape character is '^]'.
	  220 ueamailgate02.uea.ac.uk ESMTP Sendmail 8.13.1/8.13.1; Mon, 7 Dec 2009 01:45:49 GMT
	  quit
	  221 2.0.0 ueamailgate02.uea.ac.uk closing connection

About Me

I’ve been a Unix, Windows, OS X and Linux systems and network administrator for 15 years. I’ve compiled, configured, and maintained everything from mail servers to single sign-on encrypted authentication systems. I run lines, build machines and tinker with code for fun. You can contact me via: lance@catprint.ca.

Lance Levsen,

December, 2009




256 Comments
Svein
December 7, 2009 9:10 pm

There’s definitely something about this “indusieumgresium” dude. Do a Google search, and you’ll see that he has posted that video to almost every video site on the entire internet (slight exaggeration), all on Nov 19.

AnonyMoose
December 7, 2009 9:45 pm

co2isnotevil (18:38:05) : The reason some of the hours are off may be because the messages are to and from a wide variety of time zones.

When the last part of the time in the email contains a number such as “-0400”, that indicates the timezone. The filename ctime value matches the time sent. We don’t know whether the filenames were created by recently processing the emails, or by an email archive method.
I consider it unlikely that the university’s email server is archiving with the timestamp of when the message was sent, as it is likely that many people sending to the university will happen to send at nearly the same time as others. So the filename timestamps were probably created as part of organizing this batch of emails.

AndyW
December 7, 2009 10:18 pm

I don’t think that a whistleblower would try to hack into the Realclimate website as well, I think that tends to suggest it was a hacker who got in, got the info and then attempted to force it to be shown on realclimate.
Andy

December 7, 2009 10:48 pm

Why I doubt the FOI 2009 information was put together as part of an FOI release is that it contains only incriminating material and nothing else. That’s not what you provide in such a request; you provide useful and useless information that fulfills the request.
I haven’t been involved in an FOI release, but I have been in a Justice Dept-challenged merger. It eventually went all the way to the Supreme Court. Justice requested all files, so we gave them all files. This was in the days when PCs weren’t common and all records were on paper. We just filled about two dozen boxes; they picked them up, xeroxed them, and returned them in about 10 days. None of our folks sorted through the files to find incriminating material, to make sure they found it. God forbid! Nor would anyone in their right mind do it for an FOI request.
This material was most likely put together by a disaffected IT worker who wanted to torpedo Phil Jones and/or the whole warmist agenda. He certainly waited for the right moment.

crosspatch
December 7, 2009 10:49 pm

The filenames were created from the date, of that I am certain. Because the day of year and minutes and seconds match. The only thing that does NOT match is the hour which seems to vary. This is very clever as it removes any artifact of the person’s own timezone. It would go like this:
Parse the date, there are several date parsing routines around, into an array. Randomly “adjust” the “hour” value. Convert the date to Unix time. The resulting Unix time will result in a date exactly like what we are seeing here. The date, minutes, and seconds will match up but the hour will be off masking any timezone artifact of the “adjuster’s” computer.
If this wasn’t done, say the date was always 5 hours off, this might indicate that the person who renamed the file to a Unix date string was using a computer set to a timezone 5 hours different from GMT. That is initially what I thought, until I looked at additional files and noted that the time varied but ONLY varied in the hour value; the minutes and seconds were always consistent. This makes any consistent bias from the computer’s timezone impossible.
The original file name of the email had to be erased. I just don’t understand why they weren’t numbered sequentially unless MANY emails were renamed this way and only the “pertinent” files selected for inclusion into the zip archive. If they had originally been numbered sequentially, one would know how many were skipped and roughly how many there were in total. This way there is no way to tell and there is no way to tell where it was done.

jallen
December 8, 2009 1:12 am

Slightly different take:
They are the residual emails of a batch which had already been *sanitized* from the CRU systems, in order to illegally prepare an incomplete response for a future (likely successful) FOIA request. The emails in question were *not* going to be provided under a FOIA request.
These are deleted emails from a sanitized batch which were foolishly or purposely archived and/or discovered by an insider or whistleblower (perhaps the sanitizer himself). The insider then had pangs of conscience or an axe to grind and released them surreptitiously.

jallen
December 8, 2009 1:28 am

Also: The leaker may enjoy protection under the UK’s Public Interest Disclosure Act of 1998, which was enacted to protect whistleblowers.

Roger Knights
December 8, 2009 2:43 am

“This material most likely put together by a disaffected IT worker who wanted to torpedo Phil Jones and/or the whole warmist agenda. He certainly waited for the right moment.”
I think it would have been more damaging to have released the material a couple of months earlier, if the intent was to derail Copenhagen and prevent the warmists from getting their ducks in a row by having Clinton globe-trot to wangle deals with India and China, NZ commit to CO2 reductions, etc.

Thomas Jones
December 8, 2009 2:43 am
Chris Schoneveld
December 8, 2009 2:56 am

Richard (12:34:32) :
“Since climategate is only one word, searching for climategate or “climategate” should have no difference,”
Well, try it out and you will see the different outcomes:
Climategate without quotes gives: 37,200,000
“Climategate” gives: 2,670,000

Ian Summerell
December 8, 2009 3:04 am

Inside job; it looks like it to me.
I worked in the civil service for six years, and there was one room with a computer server. No one in the offices knew what it did; I think it was a server to monitor our work. The only person allowed in the room had an MI5 pass.
Was it the work of MI5, and did they leak the files?

BobC
December 8, 2009 3:15 am

I have seen it reported that 30-40 Russian students study at UEA. It is not inconceivable that one of them leaked the files compiled by the FOI officer, so the Russian connection is not disproven by this analysis. It is made more likely given that the original server (for the leaked files) in Russia was one used by a local university, which a student from that university would have had access to.

Mike
December 8, 2009 4:20 am

It always looked like the data had been gathered by the UEA, and I could never understand where the idea of a “hacker” came from. If it is the UEA themselves who have been spreading this rumour, then it is yet more proof of their fraudulent behaviour, and one can see why someone on the inside might have got fed up with them and blown the whistle.
But there is still the possibility that the FOI officer’s PC/data was hacked.
As for the allegations that the Russian state were involved, it just shows the paranoia that is endemic in the UEA!

December 8, 2009 4:58 am

Bobc, as far as I can see the only reason anyone is suggesting a Russian involvement is that the files just happened to be placed on a Russian Server. The whole internet knows the reputation of Russia regarding the web, on a (village) forum I run, we block anyone signing up with a Russian email address because they are notorious for spam.
… or is it, could it be, that the real enemy of the warmers, is the old “cold-war enemy?”

Abendigo
December 8, 2009 5:31 am

Despite the “Evil Hackers” front the CRU is putting on this, they must be frantically searching for the leaker. Any word or insight on this?

Syl
December 8, 2009 6:25 am

Alan Shore (16:40:52) :
“Thirdly, he didn’t even consider the idea that in all likelihood the documents were attachments to the emails. ”
Since the data files are NOT attachments to the emails it seems the mighty warmers haven’t even bothered to check either the emails or the data themselves. Why? Afraid of what you may find perhaps?
Tut Tut

Hangtime55
December 8, 2009 9:54 am

I knew that the emails from the CRU were leaked. In all of the posts I’ve been responding to, I ALWAYS used the “hacked/leaked” phrase.
Mr. Paul Hudson, a weather presenter and climate correspondent for the BBC, says he was FORWARDED the chain of e-mail material on October 12th, 2009, more than five weeks before it was ‘hacked’ and made public on the internet. So how could Russian hackers be responsible for breaking into the university computers when Hudson was FORWARDED the same data?
This brings up another question , if Hudson was forwarded the ClimateGate files on October 12, 2009 , could there be other recipients of the same information ?
Why it was said that the hacked/leaked emails came from a Russian server could be a clue. I am only assuming that around the same time the ClimateGate files were forwarded to Mr. Paul Hudson of the BBC, other copies could have been sent, the same day or before or after Hudson received his copy.
Nevertheless, again, I knew the CRU files were leaked, and this should be the end of Rajendra Pachauri, chairman of the Intergovernmental Panel on Climate Change (IPCC), and his sad and pitiful attempt to distract the world from the conspiracy within the ClimateGate files.
Pachauri is taking a page out of George W. Bush’s playbook, chasing ‘imaginary’ hackers or terrorists while the plot behind the scenes continues to grow.

Richard
December 8, 2009 11:55 am

Chris Schoneveld (02:56:53) :
Richard (12:34:32) :
“Since climategate is only one word, searching for climategate or “climategate” should have no difference,”
Well, try it out and you will see the different outcomes

This does not change the fact of my statement. It SHOULD have no effect. Comprende?
If it does, that shows there is something wrong with the search mechanism, which is acting funny with Climategate: 310,000,000 hits sometimes and 30,000,000 at others, for example.

Richard
December 8, 2009 11:58 am

PS – Lance Levsen – that’s a real neat job, quite convincing that it’s not a hack.

Richard
December 8, 2009 12:11 pm

crosspatch (12:48:52) :
“But water accounts for about 60% of the greenhouse effect, while CO2 accounts for 26%. ”
I find that difficult to believe. CO2 is 0.038% of the atmosphere. The spectrum across which it absorbs IR is much narrower than H2O. In order for CO2 to provide half the total greenhouse warming of Earth, there would have to be more CO2 than water vapor in the atmosphere. CO2 is a much less efficient greenhouse gas than is water vapor

Your reasoning makes sense. I plucked the figure out of Wikipedia, where it says CO2 is 9-26%; I guess I just copied the 26. So that would make the average 17.5%.
Still, this seems big, if water vapour is 10 times more abundant than CO2 and water absorbs radiation over a greater range of the spectrum. Maybe it’s in the infra-red that CO2 absorbs more. That would account for it being a more efficient greenhouse gas.

December 8, 2009 1:58 pm

Richard, Crosspatch,
Here’s a plot of the atmospheric transmittance between the surface and space. The different colors represent which gas is doing most of the absorption. This has been compiled from the complete HITRAN 2008 database. Gas color codes are on the left. The gray line is representative of the average energy flux leaving the planet.
http://www.palisad.com/co2/absorb.gif
Simulations show that about 1/3 of the GHG absorption is CO2 and 2/3 is from water vapor. Overall, about half of the surface radiation is absorbed by the atmosphere.
These are all average values representing a wide range of conditions. If the atmosphere is very dry, the CO2 fraction approaches and can even exceed half of the effect. If the atmosphere is very moist, the CO2 contribution can drop to about 10% of the total effect. For surface covered in heavy clouds, nearly 100% of the surface energy is absorbed by the atmosphere (and clouds), independent of GHG concentrations.
George

Chuck Hakkarinen
December 8, 2009 7:15 pm

crosspatch (09:32:25) :
No, the filenames of the emails are the unix epoch time of the email. The missing numbers are elapsed time between the emails. The filenames are the datestamps. I believe this is the date when the file was archived as there is generally a couple of hours difference between that date stamp and the date/time stamp in the headers of the email.
For example:
1123622471.txt
1123622471 is unix time August 9, 2005 21:21:11 GMT
The date/time stamp on that email is:
Date: Tue Aug 9 17:21:11 2005
Notice the hour is different but the minutes/seconds are the same.
^^^^^^^^^^^^^^^^^
If the unix time is 2100 GMT and the email date/time stamp is 1700, on 9 Aug, then this suggests to me that this particular email was sent from the Eastern (Summer) Time zone of the USA or Canada, which happens to be GMT-4 in August. Having not read any of the emails in the zip file, can someone confirm or refute my interpretation?

Richard
December 8, 2009 9:12 pm

co2isnotevil (13:58:14) : From that graph I would say that ozone is an even more potent GHG than CH4.

Daniel Fierro Sydney
December 9, 2009 6:26 am

Treason. All these politicians who attend Hoaxenhagen should be sacked and tried for treason. Especially Australia’s Kevin Rude.

JMANON
December 9, 2009 9:51 am

I accepted at the start that this had to be an inside job.
But who? Someone who happened on a file or directory being compiled by somebody else for some nefarious purpose? Or the leaker, compiling it himself?
This analysis has been very insightful, but it doesn’t really answer who compiled it, when, and why.
I have to suspect that this folder was carefully managed for leaking.
Why?
Because suppose I am a researcher at UEA and I want to hide stuff from FOI disclosure. This is a pretty serious action and I’d try everything else first, as the emails evidently imply: nobble the UEA FOI officer, and so on. Only as the position became desperate enough would I start deleting stuff, but I might instead choose simply to move any incriminating or sensitive material to a new directory somewhere, to be retrieved once the problem had passed or, if necessary, to stay out of jail: “Oh, I’ve just found it all. Don’t know how it got there, but you know what filing is like.”
But if that were the case, I’d just go through all my material and move anything and everything suspect over to the hideaway stash. I wouldn’t spend time filtering it.
You might argue that that is what we have, but I suggest it is not.
Why? Because we only have single copies of emails, with multiple replies embedded in them, which means only the copy containing all the replies was selected. Originals, and copies in the inbox with fewer replies, don’t seem to appear.
If I were creating a stash it wouldn’t matter that I had collected every version in there; disc space is not a problem, time is: time spent selectively saving and selectively deleting so that only the most complete copy remained.
That suggests to me that someone took care to compile this, someone familiar enough to know what to include and what not, and with the time to do it.
I suspect anyone merely discovering a hideaway folder would have a sneak preview and then release pretty much all of it as it stood, as quickly as possible, so as not to keep copies on machines subject to inspection (an insider leaking).
So it still raises the question of who compiled the folder, and under what circumstances. Did they go through it afterwards and trim out a lot of fat, and if so, why?
It makes me think that someone careful, familiar enough with the material, and with time enough, selected only the latest copy of each email, the one with the most replies.
That brings us back to the purpose of the folder, and I am not convinced it is simply a hideaway for material incriminating under FOI, kept in case it has to be produced to stay out of jail.
If it were, we might have expected to find the missing raw data in there too, for example, all safe together.
I don’t know how solid the duplicate-email point is; I didn’t read all of the emails, but the searchable website suggests it holds: search for a Phil Jones thread, for example, and the earlier, shorter copies don’t turn up.
I also don’t know how significant this is. I may be reaching.
So the nature of the emails in there requires a bit of thought to construct a realistic scenario, one that accounts not just for what is in there but for what is not. Some could be eliminated as obviously private and some as unrelated, but why only the copies with the most replies?
If we can answer that, we might know something more.
Since its release there are probably few corners as yet unexamined, and though the implications of some of that material have yet to be fully developed, we probably shouldn’t expect more from this batch.
On the other hand, if it was compiled with a view to leaking, and compiled carefully by someone with sufficient knowledge to know already what is significant, then much of what is there must have been deliberately selected. The possibility that whoever did it was familiar enough with the material to appreciate the significance of the individual items argues that some of the most damaging material has yet to be released.
If so, then what remains will probably be released near the end of Copenhagen, and will probably be material that is not too voluminous and does not require much evaluation for its full import to be appreciated within a day or so.
Of course, I could just be indulging in wishful thinking, hoping for more and more dramatic revelations… but if this was a carefully planned release by someone with time to sift through the documents and with sufficient knowledge to appreciate all they contained, then I’d have to suspect a new release soon.