The Climategate email network infrastructure

Computer Room 1, University of East Anglia Jan 13, 2009 - somewhere in here is the email and the data

Guest Post by David M. Hoffer

Since the release of both ClimateGate 1 and 2, there has been considerable confusion regarding how the emails were obtained, how the FOIA requests were managed, and what was or wasn’t possible in that context.  There is no simple answer to those questions.

The ClimateGate emails span a period of nearly two decades.  During that time period, email systems evolved substantially in terms of technology, implementation, operational procedures, and the job descriptions of those responsible for them.  Other technologies, such as backup and archive systems and the supporting infrastructure for legal compliance, also changed (as did the laws themselves).  Questions about what was and wasn’t possible, even for an action as simple as deleting an email, have completely different answers depending on the era and on the technology implemented at any given time.  With so many moving targets, it is impossible to draw any conclusions with 100 percent certainty.

This article is written to cover the basics of how email systems and their supporting infrastructure work, and how they have evolved over time.  With that as a common background, we can then discuss everything from the simple questions of who could delete what (and when), to how the emails might have been obtained, and, possibly most interesting of all, some serious questions about the manner in which the FOIA requests were handled at the CRU.

EMAIL 101

There are many different email systems, and many different ways for end users to access them.  The basics, however, are common to all of them.  Each user has a “client” that allows them to access their email.  It could be an internet browser based client such as the ones used by Hotmail and Gmail, or it could be an email client that runs on your desktop computer like Outlook or Eudora.  For the purposes of this discussion I am going to describe how things work from the perspective of an email client running on a desktop computer.

The email client connects to an email server (or servers in a very large implementation).  To send an email to someone on a different email server, the two servers must “talk” to each other.  In most cases they do so over the internet.  How the clients interact with the servers, however, is part of understanding why deleting an email that you sent (or received) is not straightforward.  The reason is that an email is never actually “sent” anywhere.  Once you write an email, it exists on the disk drive of the computer the client software is installed on.  Press “send” and it goes… nowhere.  It is still there, exactly as it was before you “sent” it.

A copy, however, has now been sent to the email server you are connected to.  That email server makes yet another copy and sends it to the email server the recipient is connected to.  That email server then makes still one more copy and sends it to the email client on the recipient’s computer, which in turn writes it to the local hard drive.  There are now a minimum of four copies of that one email.
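The path just described can be sketched as a toy model.  Nothing here is any real mail system’s API; the location names are purely illustrative.  The point is simply that each hop *adds* a copy, so "sending" never moves anything:

```python
# A toy model of the email path described above: each hop stores its own
# copy, so "sending" an email only ever adds copies -- it never moves one.
# All location names are illustrative, not any real mail system's API.

def send_email(body):
    copies = []
    copies.append(("sender client disk", body))     # the original never leaves
    copies.append(("sender mail server", body))     # copy made on "send"
    copies.append(("recipient mail server", body))  # copy made server-to-server
    copies.append(("recipient client disk", body))  # copy written by the client
    return copies

copies = send_email("the data is attached")
print(len(copies))  # -> 4: the minimum number of copies of one email
```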

image

But wait, there may be more copies.  When researchers first started exchanging information via email, they were commonly years ahead of the rest of the world.  Most large organizations had central IT shops, but they ran financial applications for the most part; email was a curiosity at best.  Many researchers were left to run their own email systems, and it wasn’t that hard to do.  Solaris was the UNIX operating system in vogue in those days, and Solaris came with a pretty good email system built in called Sendmail.  There were many other options too.  The bottom line was that early email systems were frequently run by researchers on their own computers.

As time went on, email became more common, and more important.  Data volumes, performance, and security were all moving beyond the skill set of anyone but a full time IT (Information Technology) professional.  Researchers began giving up ownership of their own email systems, and central IT shops took over.  Email was becoming mission critical, and a lot of data was being stored in email systems, along with records of contract negotiations and other important “paper” trails.  Losing email was becoming a painful matter if important information disappeared as a result.  As a consequence, the systems that protected the data on email systems also began to mature and to be run professionally by IT departments.

The early email systems were just a single server with local hard drives.  As they grew in capacity and overall usage, plain old hard drives could no longer keep up.  Storage arrays emerged which used many hard drives working together to increase both capacity and performance.  Storage arrays also came with  interesting features that could be leveraged to protect email systems from data loss.  Two important ones were “snapshots” and “replication”.

Snapshots were simply point in time copies of the data.  By taking a snapshot every hour or so on the storage array, the email administrator could recover from a crash by rolling back to the last available snapshot and restarting the system.  Some storage arrays could keep only a few snapshots; others could maintain hundreds.  For recovery purposes, each snapshot behaved like a full copy of the data!  Not only could a storage array store many copies of the data, consider the question of deletion.  If an email was received and then deleted after a snapshot, even by the central IT department itself, the email would still exist in the last snapshot of the data, no matter what procedure was used to delete it from the email system itself.
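That deletion property is easy to demonstrate with a toy model.  This is a sketch, not how any real storage array implements snapshots (most use copy-on-write internally), but the behavior an end user sees is the same: delete from the live system, and the snapshot copy survives.

```python
import copy

# Toy snapshot model: the array keeps point-in-time copies of the mail store.
# Deleting an email from the live store does nothing to earlier snapshots.
class MailStore:
    def __init__(self):
        self.live = {}        # message id -> body
        self.snapshots = []   # list of point-in-time copies

    def snapshot(self):
        self.snapshots.append(copy.deepcopy(self.live))

    def delete(self, msg_id):
        self.live.pop(msg_id, None)   # gone from the live system only

store = MailStore()
store.live["msg1"] = "delete this after reading"
store.snapshot()                      # hourly snapshot fires
store.delete("msg1")                  # user (or even IT) deletes the email
print("msg1" in store.live)           # -> False
print("msg1" in store.snapshots[-1])  # -> True: still in the last snapshot
```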

What if the storage array itself crashed?  Since the storage arrays could replicate their data to other storage arrays, it wasn’t uncommon to have two arrays and two email servers in a computer room so that no matter what failed, the email system could keep on running.  What if the whole computer room burned down?  Replication to another storage array at a completely different location is also very common, and should the main data centre burn down, the remote data centre would take over.  Keep in mind as you think this through that the ability of the storage arrays to replicate data in this fashion is completely and totally independent of the email system itself.

Early email systems were, as mentioned before, most often a single server with internal hard drives.  A modern “enterprise class” email system would be comprised of many servers and storage arrays more like this:

image

If you recall that just sending an email makes, at minimum, four copies, consider what “one” copy on a large email system actually translates to.  In the figure above, there are two copies on the storage arrays in the data centre.  If snapshots are being used, there may be considerably more.  Plus, there is at least one more copy being replicated to a remote data center, which also may have regular snapshots of data.  That’s a LOT of copies of just one email!  And we haven’t even started talking about backup and archive systems yet.

Let’s return to the question of deleting email.  It should be plain to see that in terms of current email technology, deleting an email just from the email system itself is not a simple task if your intention is to erase every single copy that ever existed.

As an end user, Phil Jones is simply running an email client connected to an email server run by somebody else.  He has no control over what happens on the server.  When he deletes an email, it is deleted from his email client (and hence the hard drive on his computer), and from his view of his emails on the email server.  Technically it is possible to set up the email server to also delete the email on the server at the same time, but that is almost never done, and we’ll see why when we start discussing backup, archive, and compliance.

On the other hand, are we talking about what was most likely to happen when Phil Jones deleted an email in 2009?  Or what was most likely to happen when Phil Jones deleted an email in 1996?  The answers would most likely be entirely different.  In terms of how email systems have been run in the last ten years or so, however, while it is technically possible that Phil Jones hit delete and erased all possible copies of the email that he received, this would have done nothing to all the copies on the sender’s desktop and on the sender’s email server… and backup systems.  Let’s jump now into an explanation of additional systems that coexist along with the email system, and make the possibility of simply deleting an email even more remote.

Backup Systems

Just as we started with email and how it worked at first and then evolved, let’s trace how backup systems worked and evolved.  There are many different approaches to backup systems, but I’ll focus here on the most common, which is to make a copy of data to a tape cartridge.

At first, backup was for “operational” purposes only.  The most common method of making a backup copy of data for a server (or servers) was to copy it to tape.  The idea was that if a disk drive failed, or someone deleted something inadvertently, you could restore the data from the copy on tape.  This had some inherent problems.  Suppose you had a program that tracked your bank account balance, but for some reason you wanted to know what the balance was a week ago, not what it is today.  If the application didn’t retain that information and just updated the “current” balance as it went, you would have only one choice: restore the data as it existed on that specific day.  To do that, you’d need one tape for each day (or perhaps one set of tapes for each day in a larger environment).  That adds up to a lot of tape, fast.  Worse, as data started to grow, it was taking longer to back it up (and the applications had to be shut down during that period), while the overnight window when people didn’t need their applications running kept shrinking as companies became more global.

Several approaches emerged, and I will be covering only one.  The most common by far is an approach called “weekly full, daily incremental”.  The name pretty much describes it.  Every weekend (when the backup window is longest), a full copy of the data is made to tape.  During the week, only what changed that day is copied to tape.  Since the changes represent a tiny fraction of the total data, they can be backed up in a fraction of the time a full copy takes.  To restore to any given day, you would first restore the last “full copy” and then add each daily “incremental” on top until you reached the day you wanted.
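The restore procedure just described can be sketched in a few lines.  Day 0 stands for the weekend full; days 1 through 6 each get an incremental tape.  The labels are illustrative, but the logic is the standard one: a restore always needs the full plus every incremental up to the target day.

```python
# Sketch of which tapes a restore needs under "weekly full, daily incremental".
# Day 0 is the weekend full; days 1..6 each get an incremental tape.
def tapes_needed(target_day):
    """Return the tapes to restore, in order, to recover day `target_day`."""
    return ["full (weekend)"] + [f"incremental day {d}" for d in range(1, target_day + 1)]

print(tapes_needed(0))  # -> ['full (weekend)']
print(tapes_needed(3))  # -> the full plus incrementals for days 1, 2 and 3
```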

This worked fine for many organizations, and larger ones bought “tape libraries” which were exactly what they sound like.  They would have slots for dozens, sometimes hundreds, of tape cartridges, several tape drives, and a robot arm that could change tapes for both backup processes and for restore processes.  The problem was that the tape library had to be as close as possible to the servers so that data could be copied as fast as possible (performance degrades sharply with distance).   The following depicts the email system we’ve already looked at, plus a tape backup system:

image

By making regular copies of data to tape, which was a fraction of the cost of disk storage, the IT department could have copies of the data exactly as it existed on any given day, going as far back as the capacity of the tape library (or libraries) would allow.  Now try deleting an email from, say, a year ago.  In addition to all the copies on disk, there are at least 52 copies in the tape library.  And since we have a tape library, it is easy to make still more copies, automatically, and most organizations do.

Disaster Recovery

What if there was a major flood, or perhaps an earthquake that destroyed both our local and remote data centers?  In order to protect themselves from disaster scenarios, most IT shops adopted an “off site” policy.  Once the backup was complete, they would use the copy of the data on tape to make… another copy on tape.  The second set of tapes would then be sent to an “off site” facility, preferably one as far away as practical from the data centers themselves.

image

Consider now how many copies of a given email exist at any given time.  Unlike that financial application whose “current account balance” is constantly changing, email, once received, should never change.  (But it might, which is a security discussion just as lengthy as this one!)  Provided the email doesn’t change, there are many copies in many places, and no end user would have the security permissions to delete all of them.  In fact, in a large IT shop, it would take several people in close cooperation to delete all the copies of a single email.  Don’t organizations ever delete their old data?

Data Retention

That question can only be answered by knowing what the organization’s data retention policy is.  Many organizations just kept everything, until the cost of constantly expanding their storage systems and tape libraries, and the cost of housing off site tapes, started to become significant.  Many organizations decided to retain only enough history on tape to cover themselves from a tax law perspective.  If the retention policy was implemented correctly, any tapes older than a certain period would be removed from the tape library and discarded (or possibly re-used and overwritten).  The copies in the offsite storage facility would also be retrieved to be either destroyed or re-used, so that the offsite data and the onsite data matched.
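A retention sweep of that kind is conceptually simple.  This sketch assumes a seven-year retention period and invented tape labels; the point is only that expiry is a property of the tape’s age against the policy, applied identically to onsite and offsite sets.

```python
from datetime import date, timedelta

# Sketch of a retention sweep: tapes older than the retention period are
# pulled for destruction or reuse, and the matching offsite copies are
# recalled so both sets stay in step. The 7-year period and the tape
# labels are illustrative assumptions, not any real site's policy.
RETENTION = timedelta(days=7 * 365)

def expired(tapes, today):
    return [t for t in tapes if today - t["written"] > RETENTION]

onsite = [
    {"label": "FULL-2001-W03", "written": date(2001, 1, 20)},   # ~8 years old
    {"label": "FULL-2008-W50", "written": date(2008, 12, 13)},  # ~1 month old
]
today = date(2009, 1, 13)
to_destroy = expired(onsite, today)
print([t["label"] for t in to_destroy])  # -> ['FULL-2001-W03']
```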

Archive

As email systems grew, the backup practices described above became problematic.  They were designed for general purpose applications with ever changing data, and how long people wanted to keep their email often conflicted with the retention periods set for financial purposes.  As the amount of data in an email system started to grow exponentially due to ever larger attachments, graphics, and sheer volume, the expense and pressure on even an “incremental” backup window became enormous.  That’s where archive started to emerge as a strategy.  The storage arrays that supported large email systems were very expensive because they had to be ultra reliable as well as ultra high performance.  But 99% of all emails were being read on the day they were sent… and never again.  Only if something made an older email important… evidence of who said what and when from a year ago, for example… would an email be accessed again after it was a few days old.  So why house it on the most expensive storage the organization owned?  And why back it up and make a copy of it every week for years?

Many organizations moved to an “archive”, which was simply a way of storing email on the cheapest storage available.  If someone needed an email from a year ago, they would have to wait minutes or perhaps hours to get it back.  Not a big issue, provided it didn’t need to be done very often.  Some organizations used low performance, low cost disk; some even went so far as to write the archive to tape.  So, for example, the email you sent and received in the last 90 days might open and close in seconds, but something from two years ago might take an hour.  Not only did this reduce the cost of storing email data, but it had the added benefit of removing almost all the email from the email system and moving it to the archive.  Since the archive typically wasn’t backed up at all, the only data the backup system had to deal with in its weekly full, daily incremental rotation was the last 90 days.  This left an email system, with the integrated backup and archive systems, looking something like this:
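The tiering policy just described reduces to a single age test per message.  This is a minimal sketch; the 90-day cutoff matches the example above, but the field names and dates are invented for illustration.

```python
from datetime import date, timedelta

# Toy tiering policy from the paragraph above: anything older than 90 days
# moves off the expensive primary array into a cheap archive. The cutoff
# and field names are illustrative assumptions.
ARCHIVE_AFTER = timedelta(days=90)

def tier(emails, today):
    primary, archive = [], []
    for msg in emails:
        if today - msg["received"] > ARCHIVE_AFTER:
            archive.append(msg)   # old mail: cheap, slow storage
        else:
            primary.append(msg)   # recent mail: fast, expensive storage
    return primary, archive

today = date(2009, 1, 13)
emails = [
    {"id": 1, "received": date(2009, 1, 1)},  # 12 days old: stays on primary
    {"id": 2, "received": date(2008, 6, 1)},  # ~7 months old: archived
]
primary, archive = tier(emails, today)
print([m["id"] for m in primary])  # -> [1]
print([m["id"] for m in archive])  # -> [2]
```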

image

For most IT shops, if you ask them how many copies of a given email they have if it was sent a year ago, they can’t even answer the question.  Lots.

What does that mean in terms of FOIA requests?  Plenty.

Compliance

The world was rolling along quite nicely using these general techniques to protect data, and then the law got involved.  Enron resulted in Sarbanes-Oxley in the United States and similar laws in other countries.  FOIA came into existence in most western countries.  Privacy laws cropped up.  Suddenly IT had a new problem, and a big one.  The board of directors was suddenly asking questions about data retention.  The IT department went from not being able to get a meeting with the board of directors to having the board shining a spotlight on them.  Why?

Because they (the board of directors) could suddenly go to jail (and some did) because of what was in their email systems.  Worse, they could even go to jail for something that was NOT in their email system.  The laws in most jurisdictions took what you could delete, and what you could not delete, to a whole new level.  Worse still (if you were a member of the board of directors), you could be held responsible for something an employee deleted and shouldn’t have… or didn’t delete and should have.  Bingo.  The board of directors is suddenly no longer interested in letting employees decide what they can and cannot delete, and when.  The same applied in most cases to senior management of public institutions.

Back to the original question.  Could Phil Jones have deleted his emails?  When?  In the early days, when his emails were all in a server run by someone in his department?  Probably.  When the email system moved to central IT and they started backing it up regularly?  No.  He would only be able to permanently delete any given email provided that he had access to all the snapshot copies on all the storage arrays, plus the archive, plus all the backup tapes (offsite and onsite).  Fat chance without the express cooperation of a lot of people in IT, and the job of those people, under laws such as FOIA, SOX and others, was expressly to prevent an end user such as Phil Jones from ever doing anything of the sort, because management had little interest in going to jail over something someone else deleted and shouldn’t have.

So…did CRU have a backup system?  Did they send tapes off site?  Did they have a data retention policy and what was it?  Did they have an archive?  If they had these things, when did they have them?

With all that in mind, now we can look at two other interesting issues:

  • What are the possible ways the emails could have been obtained?
  • Were the proper mechanisms to search those emails against FOIA requests followed?

Short answer: No.

In terms of how the emails could have been obtained, we’ve seen various comments from the investigation into ClimateGate 1 that they were most likely obtained from accessing an email archive.  This suggests that there was, at the very least, an email archive.  Without someone laying out a complete architecture drawing of the email systems, archive system, backup system, data retention policies and operational procedures, we can only guess at how the system was implemented, which options were available, and which were not.  What we can conclude, however, is that at some point in time an archive was implemented.  Did it work like the description of archives above?  Probably.  But there are many different archive products on the market, and some IT shops refer to their backup tapes as an archive, just to confuse matters more.

In addition, without knowing how the investigators came to the conclusion that the emails were obtained from the archive, we don’t have any way to assess the quality of their conclusions.  I’m not accusing them of malfeasance, but the fact is that without the data, we can’t determine whether the conclusions are correct.  Computer forensics is an “upside down” investigation in which the “evidence” invariably points to an innocent party.  For example, if someone figured out what Phil Jones’ username and password were, and used them to download the entire archive, the “evidence” in the server logs would show that Phil Jones did the deed.  It takes a skilled investigator to sort out what Phil Jones did (or didn’t do) from what someone using Phil Jones’ credentials did (or didn’t do).  So let’s put aside what the investigators say they think happened and just take a look at some of the possibilities:

Email Administrator – anyone who had administration rights to the email system itself could have made copies of the entire email database going back as far as the oldest backup tapes retained with little effort.  So…who had administration rights on the email system itself?  There’s reason to believe that it was not any of the researchers, because it is clear from many of the emails themselves that they had no idea that things like archives and backup tapes existed.

Storage Administrator – In large IT shops, managing the large storage arrays that the application servers are attached to is often a job completely separate from application administration jobs such as running the email system.  Since the storage administrator has direct access to the data on the storage arrays, copying the data from places such as the email system and the archive would be a matter of a few mouse clicks.

Backup Administrator – This again is often a separate job description in a large organization, though it might be rolled in with storage administration.  The point, however, is that whoever had backup administration rights had everything available to copy with a few mouse clicks.  Even in a scenario where no archive existed, and copying the data required restoring it from backup tapes going back 20 years, this would have been a snap for the backup administrator.  Provided the tapes were retained for that length of time, of course, the backup administrator could simply have used the backup system itself, and the robotics in the tape library, to pull every tape there was with email data on it and copy the emails to a single tape.  This is a technique called a “synthetic full” and could easily run late at night, when it would just look like regular backup activity to the casual observer.  The backup administrator could also “restore” data to any hard drive s/he had access to… like their personal computer on their desk.
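The merge at the heart of a "synthetic full" can be sketched with dictionaries standing in for tape contents.  This is a conceptual model only; real backup software works at the block or file level, but the consolidation logic is the same: apply each incremental over the last full, newest last, and the result is one complete copy produced without ever touching the live email server.

```python
# Sketch of a "synthetic full": merge the last full backup with each daily
# incremental, newest last, to produce one consolidated copy -- without
# touching the live email server. Dictionaries stand in for tape contents.
def synthetic_full(full, incrementals):
    merged = dict(full)
    for inc in incrementals:   # apply in chronological order
        merged.update(inc)     # later copies supersede earlier ones
    return merged

full = {"msg1": "v1", "msg2": "v1"}
incrementals = [{"msg2": "v2"},   # Monday: msg2 changed
                {"msg3": "v1"}]   # Tuesday: msg3 arrived
print(synthetic_full(full, incrementals))
# -> {'msg1': 'v1', 'msg2': 'v2', 'msg3': 'v1'}
```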

Truck Driver – yes, you read that right, the truck driver.  Google keywords like “backup tapes stolen truck” and see what you get.  The results are eye-popping.  The companies that specialize in storing tapes off site for customers send a truck around on a regular basis to pick up the weekly backup tapes.  There have been incidents where entire trucks (and the tapes they were carrying) were stolen.  Did anyone steal CRU’s tapes that way?  Probably not.  The point, however, is that once the tapes leave your site and are entrusted to another organization for storage, they can be copied by anyone from the truck driver to the janitor at the storage site.  Assembling 20 years of email from backup tapes could be a real hassle, of course.  On the other hand, an offsite storage facility frequently offers, as part of the service it provides to clients, great big tape libraries for automating the copying of tapes.  Encryption of backup tapes was a direct response to incidents in which tapes with valuable (and/or embarrassing) information wound up in the wrong hands.

But encryption has only been common for a few years.  That raises an interesting theoretical question.  The last email release ends in 2009, and the rest of the release is, in fact, encrypted.  One can only wonder: does the CRU encrypt their backup tapes, and if so, when did they start doing that?

Administrative Foul Up – One of the biggest “cyber crimes” in history occurred when a company doing seismic processing for oil companies cycled the tapes back to their customers for the next round of data, and sent old tapes to different customers.  One of their customers figured it out, and started checking out the data they were being sent which was from their competitors.  It wasn’t the first time it happened, and it wasn’t the last time.

Janitor – Let’s be clear, I’m not accusing anyone, just making a point.  There’s an old saying about computer security.  If you have physical access, then you have access.  Anyone with physical access to the computer room itself, and the right technical skills, could have copied anything from anywhere.

The FOIA Requests

There are dozens of emails that provide glimpses into both how the email systems at CRU were run, and how FOIA requests were handled.  Some of them raise some very interesting questions.  To understand just how complex compliance law can be, here’s a brief real world story.  Keep in mind as you read this that we’re talking about American law, and the CRU is subject to British law which isn’t quite the same.

In the early days of compliance law, a large financial services firm was sued by one of their clients.  His claim was that he’d sent instructions via email to make changes to his investment portfolio.  The changes hadn’t been made and he’d suffered large losses as a result.  His problem was that he didn’t have copies of the emails he’d sent (years previous), so his legal case was predicated upon the financial firm having copies of them.  To his chagrin, the financial firm had a data retention policy that required all email older than a certain date to be deleted.  The financial firm figured they were scot-free.  Here’s where compliance law starts to get nasty.

A whistle blower revealed that the financial firm had been storing backup tapes in a closet, and had essentially forgotten about them.  A quick inspection revealed that a number of the backup tapes were from the time in question.  The financial services firm asked the judge for time to restore the data from the tapes, and see what was on them that might be relevant.  The judge said no.

The judge entered a default judgment against the financial services firm, awarding the complainant $1.3 billion in damages.  The ruling of the court was that the financial services firm was guilty by virtue of the fact that they had told the court the data from that time period had been deleted, but it hadn’t been.  They had violated their own data retention policies by not deleting the data, and were guilty on that basis alone.  A wake-up call for the financial industry… and everyone else subject to compliance law, which includes FOIA requests.

Suddenly deleting information when you said you hadn’t was a crime.  Not deleting information when you said you had, was a crime.  Keeping information could wind up being used against you.  Not keeping information that it turns out you were required to keep (by the tax department for example) could be used against you.  No one serious about compliance could possibly take the risk of allowing end users to simply delete or keep whatever they wanted.  From their own personal accounts certainly, but not from the company email server.  Ever.

In that context, let’s consider just a few words from one email in which Phil Jones, discussing with David Palmer whether or not he’d supplied all the email in regard to a specific FOIA request, says “Eudora tells me…”

These few words raise some serious questions.  Eudora is an email client, similar to the more familiar Outlook.  So, let us ask ourselves:

Why was David Palmer relying on Phil Jones to report back all the emails he had?  Compliance law in most countries would have required David Palmer to have the appropriate search done by the IT department.  This would have captured any emails deleted by Phil Jones that were still retained by the CRU under their data retention policy.

Was David Palmer aware of the proper procedure (to get the search done by IT)?  If not, was he improperly trained and who was responsible for properly training him in terms of responding to FOIA requests?  If he was aware… well then why was he talking to Phil Jones about it at all?

Phil Jones specifically says that “Eudora tells me” in his response to Palmer.  Since Phil Jones evidently did the search from his own desktop, the only emails he could search were the ones he had not deleted.  But that doesn’t mean he found all the emails subject to the FOIA request, because email that he did delete was more than likely retained on the CRU email server according to their data retention policies.  As in the case of the financial company, the CRU may well have said they didn’t have something that they did.  In fact, we can surmise this to be highly likely.  There are multiple emails in which, for example, Phil Jones says he is going to delete the message right after sending it.  But we now have a copy of that specific message.  Did he send it and then forget to delete it?  Probably not.  The more likely answer is that he did delete it, not realizing that the CRU data retention policy resulted in a copy being left on the server.  If the CRU responded to an FOIA request and didn’t include an email that met the FOIA parameters because they failed to search all their email, instead of just the email that Phil Jones retained in his personal folder… well, in the US, there would be some prosecutors very interested in advancing their careers…
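The gap described above can be made concrete with a hypothetical illustration.  The message bodies and IDs are invented; the point is structural: a keyword search run only over what the user still retains in his client can return fewer responsive emails than the same search run over the server’s retention copies.

```python
# Hypothetical illustration of the gap described above: a search run only
# over what the user still retains in his client misses emails he deleted
# but that the server's retention copies still hold. All message contents
# here are invented for the example.
retained_by_user = {"msg1": "station data discussion"}
retained_by_server = {
    "msg1": "station data discussion",
    "msg2": "station data -- delete after reading",  # deleted client-side,
}                                                    # retained server-side

def foia_search(store, keyword):
    """Return the ids of all messages in `store` matching `keyword`."""
    return sorted(mid for mid, body in store.items() if keyword in body)

print(foia_search(retained_by_user, "station data"))    # -> ['msg1']
print(foia_search(retained_by_server, "station data"))  # -> ['msg1', 'msg2']
```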

“Eudora tells me” is even more curious from another perspective.  Why “Eudora”?  Why didn’t he say that he’d searched all his email?  Why specify that he’d used the search capabilities in Eudora?  Personally, I have three email systems that I connect to, and a different email client for each one.  Searching all my email and searching all my email in just one email client are two completely different things.  Most interesting!

While I am comfortable discussing with IT shops how to architect email systems to protect data and properly service legal requirements such as FOIA requests, I have to admit that I wouldn’t know where to start in terms of submitting one.  If I did, I just might ask for all the emails they have pertaining to FOIA requests, and I’d be specific about wanting all the email, regardless of it being in the main email system, the archives, or on backup media, and all the email that ever existed, regardless of having been deleted by end users.  Then for the capper, I’d ask for their data retention policy and see if they managed to meet it in their response.

Just sayin’

dmh


151 Comments
PaulH
November 30, 2011 1:13 pm

Excellent article and discussion. Puts me in mind of the book “The Cuckoo’s Egg: Tracking a Spy Through the Maze of Computer Espionage”
http://www.amazon.com/Cuckoos-Egg-Tracking-Computer-Espionage/dp/1416507787

davidmhoffer
November 30, 2011 1:24 pm

mrrabbit;
In others words, while replication in IT circles may not be snapshots – a snapshot can be replication.>>>
Ahem. No. Well… maybe. Ask three experienced IT managers what a snapshot is and you will get four answers. So to have the discussion, we’d have to make certain we both mean the same things when we use the same terms. That said…
A “snapshot” is a point in time copy of the data, in most cases predicated upon the original data that the “snapshot” was taken of.
Replication, on the other hand, is a physical copy of the data that is physically separate in all respects from the original data.
Here is the important distinction. Suppose that I have a large email system. Suppose further that I replicate data to a remote site. In addition, I also snapshot my data every hour. Now…something bad happens.
Bad Thing Example 1: My email system gets infected with a virus, and scrambles all the data on disk. Guess what happened to the data replicated to the remote site? Bad news, it is scrambled too. The replication tool makes no distinction between good data and bad data. If something gets scrambled on one site, then it gets scrambled on the replica. Delete from one, delete from the other. BUT, since we’ve also been running snapshots, we can simply roll back to the last good snapshot (which is less than an hour ago) and with a few mouse clicks, we’re up and running.
Bad Thing Example 2: My primary data centre burns down. All the snapshots in the world will not help me, they are ALL gone. Well, from that site anyway. If my replication software was doing what it was supposed to, all my data has been replicated to another storage array and I can just mount that to a server and start running again. (assuming I wasn’t so unlucky as to be in the data centre when it burned down 😉 )
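The two failure modes above can be sketched in a few lines of Python. This is a toy model, not any vendor’s replication or snapshot API: replication blindly mirrors every write (good or scrambled), while a snapshot is a point-in-time copy you can roll back to.

```python
# Toy model of replication vs. snapshots (hypothetical, no vendor semantics).
class Storage:
    def __init__(self):
        self.data = {}        # live data on the primary array
        self.snapshots = []   # point-in-time copies, kept on the same array
        self.replica = {}     # physically separate copy at a remote site

    def write(self, key, value):
        self.data[key] = value
        self.replica[key] = value  # replication mirrors EVERY change, good or bad

    def take_snapshot(self):
        self.snapshots.append(dict(self.data))  # freeze a point-in-time copy

    def rollback(self):
        self.data = dict(self.snapshots[-1])    # restore the last good copy


store = Storage()
store.write("mail1", "hello")
store.take_snapshot()
store.write("mail1", "####scrambled####")  # a virus scrambles the data...
assert store.replica["mail1"] == "####scrambled####"  # ...and the replica too
store.rollback()  # but the hourly snapshot gets us back
assert store.data["mail1"] == "hello"
```

Note the trade-off it illustrates: the snapshot survives the virus but not the burning data centre, and the replica survives the fire but not the virus.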

November 30, 2011 1:24 pm

But perhaps the zip-to-txt conversion simply failed, losing some part of the email address
(some 8-bit ASCII conversion?).
——-?? ??.WWW > ???.WWW
So ——-?? ??.WWW is somebody?
I’m only guessing.
Ilkka.

davidmhoffer
November 30, 2011 1:28 pm

mrrabbit;
– delete the wrong snapshot – try to recover from the other snapshots – only to wonder why they end up with nothing.>>>
Yup, if you are messing with snapshots, one had best be familiar with exactly how that specific snapshot implementation works; they are NOT all the same. In some, delete any given snapshot and all the snapshots older than that one go poof and disappear. In other implementations, deleting one affects only that one.
The other thing that messes IT up is performance. In some architectures, more than two or three snapshots will bring the storage array to its knees. In others, you can have hundreds without a performance impact.
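The two deletion behaviours described above can be sketched as a toy function (hypothetical, matching no specific vendor): in a chained implementation, deleting a snapshot takes every older snapshot with it; in an independent implementation, only the one you deleted disappears.

```python
# Toy illustration of the two snapshot-deletion semantics described above.
def delete_snapshot(snapshots, index, chained):
    """Return the surviving snapshots after deleting snapshots[index].

    chained=True  : snapshots older than the deleted one are lost too
                    (they depended on blocks the deleted snapshot held).
    chained=False : only the one snapshot is removed.
    """
    if chained:
        return snapshots[index + 1:]
    return snapshots[:index] + snapshots[index + 1:]


snaps = ["09:00", "10:00", "11:00", "12:00"]
# Delete the 10:00 snapshot under each policy:
assert delete_snapshot(snaps, 1, chained=True) == ["11:00", "12:00"]
assert delete_snapshot(snaps, 1, chained=False) == ["09:00", "11:00", "12:00"]
```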

davidmhoffer
November 30, 2011 1:33 pm

dmacleo says:
November 30, 2011 at 10:27 am
question on exchange-outlook relationship where non-cached is used.
where is the local copy kept?
there is no ost file like cached mode, is it a temp file deleted at some point?>>>
I’m not up to speed on their latest implementation, but in general the term “cache” means a temporary means of storing some subset of the data for performance purposes. So the ultimate place that your emails are stored should be the same either way. I’d think there’s someone here with deep knowledge of how Outlook works these days who could comment further?
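The general idea of a cache can be put in rough Python terms. This is a toy client, nothing Outlook-specific: the server remains the authoritative store either way; the cache is just a local copy kept for speed.

```python
# Toy mail client illustrating what "cache" means here (hypothetical,
# not Outlook's actual OST/non-cached implementation).
class MailClient:
    def __init__(self, server):
        self.server = server  # authoritative store: the mail lives here
        self.cache = {}       # temporary local subset, kept for performance

    def read(self, msg_id):
        if msg_id not in self.cache:          # cache miss: fetch from server once
            self.cache[msg_id] = self.server[msg_id]
        return self.cache[msg_id]             # later reads are served locally


server = {"m1": "message body"}
client = MailClient(server)
assert client.read("m1") == "message body"   # first read populates the cache
assert "m1" in client.cache                  # local copy exists for speed...
assert "m1" in server                        # ...but the server is still the store
```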

davidmhoffer
November 30, 2011 1:43 pm

mikerossender;
Document requests can only be served against you for documents under your control.>>>
Correct. I only made the point about the copies being in both the sender’s environment and in the recipient’s environment to illustrate how the system works. Of course, this makes (for example) Michael Mann’s emails even more interesting than they would be otherwise. There very well could be email trails that no longer exist at the CRU but still do at Penn State, with time stamps showing they existed at the time an FOIA request was submitted. If so, it would indicate that the email was deleted after the FOIA request came in, which would mean, in North America, jail time; I’ve no idea about Britain.

November 30, 2011 1:46 pm

http://www.ResearchResearch, into the FOIA ripper; one minute and I found a new one, about deleting emails:
http://www.ecowho.com/foia.php?file=3000.txt&search=www.ResearchResearch
Ilkka.

davidmhoffer
November 30, 2011 1:46 pm

mikerossander;
If you delete the email on your desktop, it will be flagged for deletion on your institution’s server copy and overwritten the next time the computer decides it needs to use that particular piece of the disk. If a message was sent, received and deleted all within the time between incrementals, it may not be captured for backup at all.>>>
It is POSSIBLE to set up an email system to work exactly like that. It is UNLIKELY that any system designed to current best practices and to meet current compliance law would do so.
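The scenario mikerossander describes, where a message sent, read, and deleted between incremental backup runs never reaches backup media at all, can be sketched as a toy nightly job (a simplified assumption, not any real mail server’s behaviour):

```python
# Toy sketch of why a short-lived email can slip between incremental backups.
def nightly_incremental(mailbox, last_backup_time, now):
    """Back up only messages still present that appeared since the last run."""
    return [m for m in mailbox if last_backup_time <= m["created"] < now]


mailbox = [{"id": 1, "created": 2}]                # sent at t=2 ...
mailbox = [m for m in mailbox if m["id"] != 1]     # ... deleted at t=5

# The nightly incremental runs at t=10, covering everything since t=0:
captured = nightly_incremental(mailbox, last_backup_time=0, now=10)
assert captured == []   # the message existed, but no backup ever saw it
```

As the reply notes, a system built to current best practices (journaling or archiving every message at the point of send/receive) closes exactly this gap.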

Berndt Koch
November 30, 2011 1:51 pm

A couple of other things to consider if you are trying to delete a specific email.. in general people will ‘reply’ to an email or forward an email quoting the original (as can be seen from both sets of Climategate emails) so if you REALLY want to delete your original email you also have to find all the emails where your original email was quoted or forwarded.. and all their copies..
Also emails can be sent to multiple people, so multiple servers in different IT shops with different backup and archive policies..
I’m not sure the term delete really applies.. maybe reduce the number of copies by 1 or 2 % would be better!
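The arithmetic behind that joke can be made explicit with a back-of-envelope sketch (illustrative numbers only; the per-server backup count is an assumption): every recipient, quoting reply, and forward adds a live copy, and each live copy typically lands on backup media too.

```python
# Back-of-envelope count of how many copies of one email exist
# (toy model; backups_per_server is an assumed illustrative figure).
def copy_count(recipients, replies_quoting, forwards, backups_per_server=2):
    # sender's copy + one per recipient + every reply/forward quoting the text
    live = 1 + recipients + replies_quoting + forwards
    # each live copy also ends up on that server's backup media
    return live * (1 + backups_per_server)


# One email to 5 people, quoted in 3 replies and forwarded twice:
assert copy_count(5, 3, 2) == 33   # "deleting" it means finding all 33
```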

davidmhoffer
November 30, 2011 1:54 pm

mikerossander;
There is, in fact, a reward offered for anyone who can recover once-overwritten data from any reasonably modern media. It has gone uncollected for several years.
Again, thank you for a great post.>>>
I wasn’t aware of that, can you post a link? I’d like to read what their terms and conditions are. In terms of your overlapping blobs explanation, yup, that methodology is pretty tough to use these days. There are other methods though. We’re straying into some pretty exotic technology however, and I haven’t reviewed it in detail for several years so I may very well be behind the times on that one. I also haven’t been working with military organizations for quite some time, so all the really cool stuff that I got to hear about at the risk of getting shot for hearing it is no longer accessible to me (wah!)

Joe Public
November 30, 2011 1:57 pm

I’ve no need to buy Detective Fiction Novels; I just read the enormous variety of educational postings on WUWT.
Thank you for a very instructive read.

Dan Murphy
November 30, 2011 1:59 pm

David,
Excellent post, and great comments (and commentators!)
The only thing missing that I would have expected to see in the general discussion of this issue is the recent emergence of “de-duplication” technology. I would expect that there were often several or more users with the same e-mail on their local systems. Suppose that Michael Mann copies both Kevin Trenberth and Phil Jones on an e-mail. They would both have copies of that e-mail on their local systems, and backups of their local systems would have copies of the same e-mail. Would you kindly take a moment to address how de-duplication software would impact the situation from the standpoint of the release of ClimateGate e-mails, and the attempts to delete e-mails to avoid FOIA requests?
Thanks in advance,
Dan Murphy

Joe Public
November 30, 2011 2:00 pm

davidmhoffer says: November 30, 2011 at 1:00 pm
RE: “Do you think that the authorities (UEA and Norfolk Police) know who did this and how?”
“Not a clue, ….. that’s the domain of high end security consultants and government agencies with three letter acronyms.”
And the delay in releasing their findings might, just might, be deliberate.

davidmhoffer
November 30, 2011 2:01 pm

Mike Wilson;
I did not see any reference to how many companies are virtualizing their tape systems. I helped IBM test their first VTS in Texas while it was still in beta in the mid 1990s>>
Do you remember the product name? My guess is that it is the VTS (or the more common term VTL) that they used to rebrand from FalconStor. A few years ago, they bought a company named Diligent and renamed its product ProtecTIER. ProtecTIER is both a VTL and a deduplication platform for use as a backup target.
Your description of using a VTL between the backup servers and the tape libraries is pretty much on the money. I decided to leave VTL, disk to disk, and deduplication out of the article as it was getting long enough as it was.

davidmhoffer
November 30, 2011 2:05 pm

PaulH;
Puts me in mind of the book “The Cuckoo’s Egg: Tracking a Spy Through the Maze of Computer Espionage”
http://www.amazon.com/Cuckoos-Egg-Tracking-Computer-Espionage/dp/1416507787
Wow, thanks! I knew the basics of the story from being involved with certain customers…but I digress. Anyway, I didn’t know it had come out as a book. Putting it on my reading list!

40 Shades of Green
November 30, 2011 2:18 pm

Great article. One of the most informative ever on WUWT, which is saying a lot. Thanks.

Mike Wilson
November 30, 2011 2:34 pm

David,
It was an all IBM product. It was VTS model B10, which was v1.0 of IBM Virtual Tape Server running inside an IBM 3494 ATL. The library manager driving the VTS ran a specialized version of TSM (Tivoli Storage Manager) under the covers. The tape drives storing the data on the backend in the ATL were 3 IBM 3590 Magstar drives storing a whopping 10GB uncompressed per cartridge. The virtual drives presented to the system were 16 3490 tape drives.

Mike Wilson
November 30, 2011 2:48 pm

David,
Also, this VTS product has today evolved into the current TS7700 line of products. You can google it if you are interested.

davidmhoffer
November 30, 2011 2:49 pm

Mike Wilson;
It was an all IBM product. It was VTS model B10, which was v1.0 of IBM Virtual Tape Server running inside an IBM 3494 ATL. The library manager driving the VTS ran a specialized version of TSM (Tivoli Storage Manager) under the covers. The tape drives storing the data on the backend in the ATL were 3 IBM 3590 Magstar drives storing a whopping 10GB uncompressed per cartridge. The virtual drives presented to the system were 16 3490 tape drives.>>>
Yup, that was well before the FalconStor product for sure. To complicate the discussion, TSM is an “incremental forever” architecture rather than a “weekly full, daily incremental” one. In an incremental forever architecture, tape reclamation and save set consolidation are very important to do, but they are very hard on the tape drives, cartridges and the library itself. So doing those processes in a virtual environment such as the VTS is a HUGE benefit. Today’s IBM ProtecTIER can present several virtual libraries and thousands of virtual tape cartridges, plus it does deduplication too.
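The reclamation step mentioned above can be sketched generically (a toy model, not TSM’s actual algorithm or thresholds): in an incremental-forever scheme, expired save sets accumulate on a tape until it is worth copying the still-valid data forward to a fresh tape and reclaiming the old one.

```python
# Toy sketch of tape reclamation in an "incremental forever" backup scheme
# (generic illustration; the 50% threshold is an assumed example value).
def needs_reclamation(tape, threshold=0.5):
    """True when enough of the tape holds expired save sets to justify reclaiming it."""
    expired = sum(1 for s in tape if s["expired"])
    return expired / len(tape) >= threshold


def reclaim(tape):
    """Copy the still-valid save sets forward to a fresh tape."""
    return [s for s in tape if not s["expired"]]


# A tape of 10 save sets where every other one has expired:
tape = [{"id": i, "expired": i % 2 == 0} for i in range(10)]
assert needs_reclamation(tape)
assert len(reclaim(tape)) == 5   # only the live data moves to the new tape
```

On physical drives each reclamation means mounting and streaming whole cartridges, which is why doing it against virtual tape is such a win.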

KnR
November 30, 2011 2:56 pm

I think you left one out: ‘cleaners’. Every office has them and most people don’t even know who they are, and given they’re low-paid sub-contractors to sub-contractors who are normally treated poorly, they have no loyalty to those they clean for. They make for damn good ways of getting access to sometimes even the most secure room or building; everyone has bins, and these people do need access to do their job, and you can’t expect important people to clean their own toilets now, can you? As for the idea that these people are vetted, well, that costs money and takes time, and who would do that for just ‘stupid cleaners’?

davidmhoffer
November 30, 2011 2:56 pm

Dan Murphy;
The only thing missing that I would have expected to see in the general discussion of this issue is the recent emergence of “de-duplication” technology.
Would you kindly take a moment to address how de-duplication software would impact the situation from the standpoint of the release of ClimateGate e-mails, and the attempts to delete e-mails to avoid FOIA requests?>>>
I just knew someone would ask that one….
De-duplication is a very old technology (late 70’s) that everyone forgot about, and has now been “invented” again. I’ll try and respond later tonight or else tomorrow AM. I know what the answers are, just condensing them into something understandable and less than a hundred pages is a bit of a challenge.
Stay tuned!

Mike Wilson
November 30, 2011 3:05 pm

davidmhoffer,
Yes, it is absolutely getting crazy out there. IT professionals can barely keep up with what is going on, so you know Phil Jones had no clue. He doesn’t even have a clue concerning his own profession.
Thanks for the informative article!

Crispin in Waterloo
November 30, 2011 3:09 pm

@R Barker:
>…I wonder what gems might be revealed in the all.7z file(s).
+++++
We had a discussion in the livingroom (including a high end IT server expert) and posit the following:
The encrypted file may be something as simple as a whole backup file from early in the ‘game’ containing something of great importance, perhaps from other sources, showing collusion to manipulate the public with falsified temperature data. It might be necessary to crack the code and then decrypt the (original) result.
It may be a carefully assembled set of data including the deleted programmes and original temperature data (which were claimed to be unavailable only after they were asked for). I think that is what most people expect – lots of UEA emails.
It may be a nested set of encrypted files with sequential revelations, each level requiring a serious brute force attack to open the hard way. Crack it once and you get a trove plus another huge encrypted file: (1(2(3(4(5(6)))))) with the most important ‘reveal’ sitting at position 6.
The way things are going, the last seems most likely. We have no idea what is going on behind the scenes re threats to publish a password. Sending the target a pwd would only open level 1 but serve to hint at what will happen (and more) if they do not, for example, come forward to admit the fraud and perfidy and take down the whole sorry mess from the IPCC to RC. The additional mails in CG2 are so damning to those who have papered over their asses since CG1, there are no doubt more layers of revelation still to be released.

Dan Murphy
November 30, 2011 3:22 pm

David,
Thanks, I’ll look for your de-duplication post. I had not realized that it was “resurrected” technology!
Dan

davidmhoffer
November 30, 2011 3:38 pm

Dan Murphy;
Thanks, I’ll look for your de-duplication post. I had not realized that it was “resurrected” technology!>>>
Yup. We were doing the equivalent of de-duplication, snapshots, server virtualization, storage virtualization, and many other things in the 70’s and 80’s. Then servers and storage became uber cheap, and we forgot about all those nifty tools because it was more work to run them than they were worth. Then the number of servers we needed started to explode, and the amount of data started to explode…. and so we “invented” them all over again.