People send me stuff. Here we have another case of “value-added” adjustments that increase the slope, much as with the temperature record.
This email forwarded from Steve Case reads as follows:
The University of Colorado’s Sea Level Research Group just published the 2013 Release #1 of their Global Mean Sea Level Time Series.
I discovered that these periodic releases are on the net all the way back to 2011 Release #1. So I downloaded all nine of them.
2012 release #1 has 628 entries, running up to January of 2011, so I used Excel’s SLOPE function to calculate the rate of sea level rise over those same 628 entries in each of the nine releases.
What I found is that the rate of sea level rise has been bumped up twice since then, once in 2011 and again in the current release. Here’s a link to a graph that illustrates the point:
http://oi45.tinypic.com/2vmenpv.jpg
Coupled with the GIA increase of 0.3 mm/yr that was made prior to these nine releases, the rate of sea level rise has been bumped up 0.43 mm/yr in the last few years.
This sort of thing has been going on more or less regularly, and it seems to go only one way.
Here are the links to the data:
http://sealevel.colorado.edu/files/2011_rel1/sl_ns_global.txt
http://sealevel.colorado.edu/files/2011_rel2/sl_ns_global.txt
http://sealevel.colorado.edu/files/2011_rel3/sl_ns_global.txt
http://sealevel.colorado.edu/files/2011_rel4/sl_ns_global.txt
http://sealevel.colorado.edu/files/2012_rel1/sl_ns_global.txt
http://sealevel.colorado.edu/files/2012_rel2/sl_ns_global.txt
http://sealevel.colorado.edu/files/2012_rel3/sl_ns_global.txt
http://sealevel.colorado.edu/files/2012_rel4/sl_ns_global.txt
http://sealevel.colorado.edu/files/2013_rel1/sl_ns_global.txt
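For anyone who wants to repeat the check described in the email, the slope calculation is straightforward to sketch in Python. Excel’s SLOPE is an ordinary least-squares fit; the sketch below assumes (but does not verify) that each release file keeps a two-column decimal-year / GMSL-in-mm layout, loadable with `np.loadtxt`. The synthetic data at the end is purely for demonstration.

```python
# Sketch of the email's Excel SLOPE() check, assuming each release file
# has (decimal year, GMSL in mm) columns; load real files with np.loadtxt.
import numpy as np

def trend_mm_per_year(t, gmsl, n_entries=628):
    """Ordinary least-squares slope over the first n_entries points (mm/yr)."""
    t = np.asarray(t)[:n_entries]
    gmsl = np.asarray(gmsl)[:n_entries]
    slope, _intercept = np.polyfit(t, gmsl, 1)
    return slope

# Synthetic demonstration: a series rising at exactly 3.1 mm/yr.
t = np.linspace(1993.0, 2011.0, 628)
gmsl = 3.1 * (t - 1993.0)
print(round(trend_mm_per_year(t, gmsl), 2))  # 3.1
```

Running the same function over the first 628 rows of each of the nine release files would reproduce the comparison in the linked graph.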
![sl_ns_global[1]](http://wattsupwiththat.files.wordpress.com/2013/01/sl_ns_global1.png?resize=533%2C372&quality=75)
![2vmenpv[1]](http://wattsupwiththat.files.wordpress.com/2013/01/2vmenpv1.gif?resize=503%2C355)
“MieScatter says:
January 25, 2013 at 3:26 am
I can’t find the sampling frequency, but the resolution is ~4 km. 12.5 full orbits in a day at ~4 km resolution is up to 10,000 measurements per day (assuming all ocean). ”
————————————–
For the latest satellite, I believe that the sampling rate is 20 samples per second. I’m not sure about the earlier instruments. After allowing for land, ice, and some rejected noise, they presumably get around 1,000,000 measurements a day.
======================
“The error in the AVERAGE value if an individual measurement has a 10 cm error and you have 60,000 independent measurements would be (in metres) 0.1 / SQRT(60000) = 0.0004 m or 0.4 mm.”
—————————————
Surely more like 100 cm+ — waves, tides, water temp, winds, etc
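For what it’s worth, the arithmetic in the quoted claim is easy to reproduce; whether the per-measurement error is really 10 cm, and whether the 60,000 measurements are truly independent, is exactly what’s in dispute here.

```python
# Standard error of the mean of N independent measurements: sigma / sqrt(N).
# Figures below are the quoted claim's, not verified mission numbers.
import math

sigma_m = 0.10                          # assumed per-measurement error, metres
n = 60_000                              # assumed independent measurements
standard_error_mm = sigma_m / math.sqrt(n) * 1000.0
print(f"{standard_error_mm:.2f} mm")    # ~0.41 mm
```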
=======================
“you can read the altimeter papers if you want, but they illustrate that there’s no reason to suspect fraud in the precision they report.”
Yes, reading them is a good idea. There are a number of potential error sources not allowed for in your calculations: satellite orbit errors, attitude errors (the radar altimeter may not always be pointed exactly straight down), ionospheric delay modeling, etc.
I don’t think “fraud” is the right term. But “unrealistic” might be applicable.
On top of which, I’m pretty sure that the user handbook, while excellent, doesn’t fully explain their methods. It’s about time I reread the thing, but that won’t happen for a few months.
Peter says:
January 25, 2013 at 1:09 am
However, I’m sure they reference their data against known surface heights (large salt pans) and then use this to correct satellite drift (the adjustment). They must give a reason / data for their correction.
===============================
I’m 98% sure that they use DORIS (a sort of “inverse GPS” network of fixed ground stations) for orbit determination. I think they augment DORIS with GPS, but I may have my satellites confused. Anyway, they know the position of the satellite pretty well. Unfortunately, when you are trying to measure variations in a moving, irregular surface to a fraction of a millimeter, even very small biases, instrument drifts, and data handling errors can cause problems unless you have VERY long timespans to average over.
Is anyone keeping a list of Global Warming scientists (cough, cough) or individuals who have been involved in falsifying data, tampering with data, adjusting data; whether directly or by manipulating measurement methods or calibrations of measurement equipment?
Such a list would be beneficial to organizations looking to hire scientists or individuals of integrity. No organization wants to knowingly hire an unethical person, knowing it could lead to faulty work or litigation for incompetent work.
Altimeters are calibrated to an “absolute” coordinate system through GPS. This was traditionally done with GPS-surveyed tide gauges. Salt pans are not used (as far as I know; they certainly didn’t used to be). Even if they were, they’d need GPS measurements to see how they’re moving in an absolute coordinate system tied to the earth.
That’s one issue I have with the GIA correction. It’s taking the measurements back from an absolute reference to being referenced to the sea floor. I can understand why alarmists would be interested in this, but I don’t think it’s appropriate for a primary dataset.
I’m also concerned by recent reports from the GPS community that the location of the earth’s center is not nearly as well known as I thought, and, worse, that the error is not even systematic. If the Z coordinate is off, then your value for sea level will also be off, because of the N-S asymmetry of the oceans. That wouldn’t matter for trends, but apparently the error is not necessarily constant. So imagine that your best estimate of the earth’s center is actually drifting north: GPS-determined locations would then show a slow drift, and “calibrated” sea level would appear to be rising even if it were not.
The GPS community really, really needs to nail down their coordinate system. Else all precise measurements and calculations based on it will be in error.
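The worry in the two paragraphs above can be made concrete with a toy calculation. It is illustrative only: the latitude bands and ocean fractions below are invented round numbers, not real hypsometry. The idea is that a northward Z-shift of the reference frame changes apparent radial heights by roughly -dz·sin(latitude), and because the southern hemisphere holds more ocean, the ocean-area-weighted mean does not cancel to zero.

```python
# Toy model of a reference-frame Z-drift leaking into "global mean sea
# level". Latitude bands and ocean fractions are illustrative only.
import math

# (band latitude in degrees, fraction of that band that is ocean)
bands = [(-60, 0.95), (-30, 0.85), (0, 0.75), (30, 0.65), (60, 0.55)]

def apparent_gmsl_shift(dz_mm):
    """Ocean-area-weighted mean apparent height change for a northward
    Z-shift of dz_mm; cos(lat) approximates each band's surface area."""
    num = sum(frac * math.cos(math.radians(lat)) * (-dz_mm) * math.sin(math.radians(lat))
              for lat, frac in bands)
    den = sum(frac * math.cos(math.radians(lat)) for lat, frac in bands)
    return num / den

# A 1 mm northward drift reads as a small spurious sea level rise:
print(f"{apparent_gmsl_shift(1.0):.3f} mm")  # ~0.093 mm per mm of drift
```

The exact number is meaningless; the point is that the sign is positive, matching the commenter’s scenario in which a northward frame drift masquerades as sea level rise.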
Ric Werme says:
January 25, 2013 at 5:50 am
The first WUWT readers read about it was in http://wattsupwiththat.com/2011/05/05/new-sea-level-page-from-university-of-colorado-now-up/ where they (rather unhelpfully) reported:
I thought I added to that post a gripe that sea level (a dimension of length) now includes an adjustment that is volume (length cubed).
——————————————–
You’re right about the volume correction, Ric. And I think that even the CU folks have acknowledged that it’s controversial and perhaps inappropriate. It will, I’m pretty sure, cause observed and satellite-measured sea levels to diverge by substantial amounts if it is applied over millennia (e.g. since Roman times). But I somehow got the idea into my head that the volume element is only about 0.03 mm per year and that there is an additional 0.27 mm per year of GIA in the CU computations that isn’t accounted for. Perhaps I’ve simply made that up. I will add it to my (quite long) list of things to check.
Here’s a good review of how sea level is determined with satellites: http://sealevel.jpl.nasa.gov/technology/
Here’s a 15% systematic error that is not mentioned in any measurements: https://c3.nasa.gov/nex/projects/160/
Here’s the real problem that the satellite people must resolve: http://tidesandcurrents.noaa.gov/sltrends/MSL_global_trendtable.html
These are long term sea level trends measured by tide gauge sites around the globe. If you take the average of all 194 sites, there is no trend. These are the measurements that really count as far as coastal communities are concerned. Sea level with respect to the ITRF center of the earth is an academic debate.
In my limited understanding, sea level rise means higher low and high tides on every sea-shore on Earth; sea shore property goes underwater.
This depends on whether that sea shore actually is on a sinking or rising continental plate.
If a glacial isostatic adjustment (GIA) correction is used, an extra rise is added to the measurement: a rise that does not happen physically and cannot be measured from the ground.
As the ocean basins get larger, the sea level drops; sea shore properties get wider beaches.
Sea level is a proxy for non-floating ice extent (ice over land) and temperature.
So, it must be kept going up, or the facade drops to the ground.
Thermometer and satellite low-level atmospheric temperatures have been drifting slowly down for the last decade or so.
HADCRUT4 temperature trend since 2002 is now near -0.5°C (-0.9°F) per century.
What gives?
I once worked for a company that built beaches for the US Army Corps of Engineers. If I knowingly submitted fraudulent data to the USACE, it was a felony punishable by loss of my job, a large fine, and possible jail time.
This is a clear example of the same thing. Fraudulent data was submitted to the federal government (NASA) as part of a tax payer funded project. It should be prosecuted exactly the same way. The person(s) who submitted the data should be terminated, fined, and possibly jailed.
MieScatter says: January 25, 2013 at 3:26 am
osopolitico, sea level is typically measured with a radar altimeter. An example is on the Jason satellite, and the handbook is here:
ftp://podaac.jpl.nasa.gov/allData/ostm/preview/L2/GPS-OGDR/docs/userhandbook.pdf
The error budget shows that an individual measurement carries an uncertainty on the order of centimetres, up to ~11 cm.
If I’ve got it right, then that is one measurement. But your uncertainty in measuring an average value gets smaller as you take more measurements. It crosses the equator ~25 times per day. I can’t find the sampling frequency, but the resolution is ~4 km. 12.5 full orbits in a day at ~4 km resolution is up to 10,000 measurements per day (assuming all ocean). Let’s say 6,000 to represent unfrozen ocean. The average cycle length at which they report a value is just under 10 days, so about 60,000 measurements per value.
There are major problems calibrating satellite instruments to our uncooperative planet, and GRASP will resolve that, giving us an accuracy of 1 mm (i.e., we don’t have that now): “The baselines between RF/Optical phase centers of all sensors on the supremely-calibrated GRASP spacecraft will be known to 1 mm accuracy and stable to 0.1 mm/year, …”
All well and good … but besides the obvious problem, pointed out by many here, of repeatedly measuring something which varies on momentary, daily, monthly and multi-decadal irregular cycles, there is also the problem that these satellite altimeter measurements depend on a calculated TRF (Terrestrial Reference Frame), owing to variations, degradations and irregularities in satellite orbits (as was pointed out previously in an excellent article here on WUWT).
Hence the perceived need for the proposed GRASP mission (to solve the problem!).
As per below:
http://ilrs.gsfc.nasa.gov/docs/GRASP_COSPAR_paper.pdf
Anthony – See http://wattsupwiththat.com/2013/01/24/sea-level-rate-of-rise-shown-to-be-partially-a-product-of-adjustments/#comment-1208537 – my apologies for leaving out the specific reference to “fraud”.
Also _multiple_ comments claiming that these measurements are based on manufactured data:
“…not a measured thing, but a figure introduced from outside”
“…the measurements … are being calibrated to radiative greenhouse theory, not anything real on Earth’s surface”
“This year it seems, the data is being made fit for the next IPCC assessment report…”
“Trying to engineer an acceleration to fit the models.”
These are all claims of fraud, whether or not that particular word is used.
I will note that saying “This sort of thing has been going on more or less regularly and it seems to go only one. way.”, without discussing _why_ adjustments are made, or that they might be justified, does leave unpleasant implications hanging for the reader.
Anthony, this is similar to your point about malleable history. The revisions to the satellites’ data plots are ongoing. The most egregious adjustments came shortly after they killed Envisat and Jason I. Since then, they’ve only had Jason II to play with, and play with it they have. They regularly change the historical plots with Jason II. If anyone wants to take a trip down memory lane with the satellite measurements and what they did with the Jason I plots after it quit measuring sea level, you can go here: http://suyts.wordpress.com/2012/09/01/jason-i-the-other-killed-satellite/
The conflation of the data sets is ludicrous. There is no validity to what they’re doing anyway, but it’s fun to watch them alter history and pretend it’s some reflection of reality.
I would not be concerned about the physics of an individual satellite. Satellite locations are known with great precision. The altimeter will take many measurements of land-based locations whose altitude is very precisely known, so instrument drift can be monitored and corrected.
There are so many other places between the instrument and the “global average sea level” where a bias can be introduced. But staying close to the instrument: I would point out that every phenomenon that would give an incorrect altitude reading (clouds, planes, ships) biases the reading in the upward direction. So if the rejection algorithm has a significant false-negative rate, that would introduce an “instrument-related” bias. And as Dr. Lindzen points out, we are talking about tiny numbers here.
@Robertv
Thanks for that great link to Dr Nils-Axel Mörner’s video on sea levels.
Also ‘nearby’ was Donna Laframboise talking about the IPCC at
Both are 30 minutes long and worth the time.
RobertInAz says: January 25, 2013 at 8:52 am
I would not be concerned about the physics of an individual satellite. Satellite locations are known with great precision. The altimeter will take many measurements of land-based locations whose altitude is very precisely known, so instrument drift can be monitored and corrected.
Nope, they don’t measure the TRF with as “great precision” as they would wish. Currently they resort to cobbling together data from four different systems, including the GPS satellite system, which was not designed for the job; so much so that GRACE has not been as useful as hoped, and they want a new launch (GRASP) to resolve the problems.
Thus, we assess that current state of the art reference frame errors are at roughly the mm/yr level, making observation of global signals of this size very difficult to detect and interpret. This level of error contaminates climatological data records, such as measurements of sea level height from altimetry missions, and was appropriately recognized as a limiting error source by the NRC Decadal Report and by GGOS. (http://ilrs.gsfc.nasa.gov/docs/GRASP_COSPAR_paper.pdf)
The calculation of error to +/- 0.1 mm in the “60,000” sea level readings is potentially false, because it’s not one item measured 60,000 times; it’s 60,000 different things each measured once. You cannot reduce errors in the reading by measuring lots of different things once. For that matter, you can’t reduce errors by measuring the same thing lots of times if the measurement system can only resolve +/- 75 mm: you just get 60,000 readings, each with an accuracy of +/- 75 mm.
And of course exactly the same applies to air temperatures. There’s no justification for averaging thousands of geographically distinct thermometer readings to come up with an average temperature quoted to hundredths of a degree. Each individual reading is unlikely to be accurate to better than +/- 1 degree (and the older ones will be worse, because they weren’t being taken for the same reasons), given siting issues and UHI effects. The next day’s reading records a different temperature; it is not another measurement of the same one, so it cannot increase accuracy. All IPCC temperatures should be given to +/- 1 degree, but then where would they be?
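The disagreement between these comments and the earlier sqrt(N) argument hinges on two standard statistical facts: independent, zero-mean noise does average out, even across readings of different quantities, while a shared bias does not average out at all. A toy simulation (all numbers invented for illustration, using the +/-75 mm figure from the comment above and a hypothetical 2 mm common bias):

```python
# Monte Carlo sketch: 60,000 noisy readings of 60,000 *different* true
# values. Independent noise shrinks the error of the MEAN like 1/sqrt(N);
# a common bias survives averaging untouched. All numbers illustrative.
import random

random.seed(42)
n = 60_000
noise_sigma = 75.0   # +/-75 mm single-reading noise (commenter's figure)
bias = 2.0           # hypothetical 2 mm bias shared by every reading

true_values = [random.uniform(-100.0, 100.0) for _ in range(n)]
readings = [v + random.gauss(0.0, noise_sigma) + bias for v in true_values]

error_of_mean = sum(readings) / n - sum(true_values) / n
print(f"error of the mean: {error_of_mean:+.2f} mm")  # close to the 2 mm bias
```

So the sub-millimetre figure is defensible only for the random component; systematic effects, like the drifts and frame errors discussed elsewhere in this thread, pass straight through to the average.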
Dr. Mörner’s presentation in the YouTube video linked at the top of this thread is priceless…
Re Bjarne Bisballe says: January 25, 2013 at 3:14 am
The approximate 18.6-year lunar nodal cycle is close in length to the 19-year Metonic Cycle (see below), which is the basis for the 19-year tidal epoch used to define the sea level datum. See the American Congress on Surveying and Mapping Bulletin at the NOAA website:
http://tidesandcurrents.noaa.gov/publications/Understanding_Sea_Level_Change.pdf
In particular, note Figure 2. “The variation of Mean Range of tide (1900 – 1996) at Seattle, WA, demonstrates the need for averaging the National Tidal Datum Epoch over 19 years.”
In the US, sea level is not just an academic or climate exercise. The various sea level datums, i.e. sea level with respect to adjacent land, are the basis for demarcation between public lands and private lands, historic land grants, and various land ownership boundaries. Quoting from the ACSM Bulletin:
“The importance of a uniform system of tidal datums for all tidal waters in the U.S, its territories, and trusts was recognized and established by the National Tidal Datum Convention of 1980. As a result, NOAA’s definitions of tidal datums—Mean High Water (MHW), Mean Lower Low Water (MLLW), and LMSL—were authorized as the official policy of the U.S. Federal Government.
“Local mean sea level is a term used to denote the average height of the ocean relative to land. Because the ocean surface is dynamic (being influenced by seasonal-to-decadal oceanographic and meteorological processes), we need to use a long period of observations to determine LMSL. The LMSL for the United States is determined as part of the National Tidal Datum Epoch (NTDE) which is based on 19 years.
“Nineteen years is also the length of the Metonic Cycle of recurrence of the lunar phases. This lunar cycle, first determined by Meton of Athens in 432 BC, captures a long-period change in the amplitude of the tide due to the orbital paths of the Earth and Moon relative to the Sun. The Metonic Cycle was selected because it includes daily, monthly, annual, and decadal changes in the amplitude of tides over 19 years.”
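The 19-year figure quoted above is easy to verify from the lunar constants: 235 synodic months come out almost exactly equal to 19 tropical years, which is why the phase pattern of the tides recurs on that period. The constants below are standard mean values.

```python
# Check of the Metonic relation: 235 synodic months ~= 19 tropical years.
SYNODIC_MONTH_DAYS = 29.530589   # mean lunar synodic month
TROPICAL_YEAR_DAYS = 365.24219   # mean tropical year

metonic_cycle_days = 235 * SYNODIC_MONTH_DAYS
metonic_cycle_years = metonic_cycle_days / TROPICAL_YEAR_DAYS
print(f"{metonic_cycle_years:.4f} years")  # ~19.0002
```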
John Peter says: January 25, 2013 at 1:08 am
So we have the adjustments to sea levels and the surface temperature records and I wonder what else. What about ocean heat content?
========================
That apparently has been rigged too; Joel Shore recently provided a link to a NOAA chart that showed OHC rising these last fifteen years while SST remained flat. This is blamed on CO2, of course.
There can be no doubt we are still within the long “coda” of the Great Melt. That is true by definition, since we are still in the interglacial. So long as we remain in the interglacial and there is still continental ice in the usual places, this will stay the case. If we see a longer-term flattening or, God forbid, a long-term fall … be … very … worried.
The GIA at 0.3 mm/year, ten percent of the total, is particularly galling because it makes the numbers go up by 0.3 mm/year even if every tide gauge in the world stays constant. Because the ocean basins are thought to be getting larger in capacity, it takes 0.3 mm/year of additional water just to keep sea level from falling. The U. of Colorado is therefore re-defining sea level as water capacity, which is perhaps useful if you are trying to account for where all the water in the world is, but totally misleading if you are talking about the effect on coastlines and coastal inhabitants, upon which the climate-change discussion of sea level is based.
it’s a matter of preserving talking points:
3mm rise per year
vanishing ice caps
shrinking glaciers
drowning poley bears
completely exonerated.
of all these, the best one to take away from them is the last so it is also the best protected.
@KR, If it walks like, talks like and looks like??
Fraud or incompetence?
These systemic adjustments with astounding claims of precision, accuracy and certainty cause you no qualms?
What happened to clearly stating what you know, the precision of this knowledge and the acknowledgement of what is not known?
Infinite adjustments = attenuation of trust.
The increase in the slope from 2012_4 to 2013_1 seems to have two causes.
First, we seem to have a real increase in sea level after the decrease seen in 2010.
Secondly, data processing of Jason-2 data has been changed from GDR Version C to D.
This change is documented on the Aviso Homepage. We learn that version D has several data adjustments such as
and some more. More interesting is the result of these corrections, which is, as some might have guessed, an increase in the rate of sea level rise.
I have calculated the slope of 2012_4 vs. 2013_1 in OpenOffice Calc. The results are shown in this graph. The data correction has led to an increase in the slope of 0.05 mm/year in 2012 (2012_4 compared to 2013_1), leading to a total addition of 0.25 mm since 2008 (2012_4 compared to 2013_1).
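The bookkeeping in that last sentence is just the commenter’s slope change accumulated over the interval:

```python
# A 0.05 mm/yr increase in the fitted slope, applied from 2008 to 2013,
# accumulates to the ~0.25 mm offset the commenter reports.
slope_change_mm_per_yr = 0.05
span_years = 2013 - 2008
accumulated_offset_mm = slope_change_mm_per_yr * span_years
print(accumulated_offset_mm)  # 0.25
```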
Strange thing that. Whatever correction is being made to climate data, it always seems to lead in only one direction.
The University of Colorado is remarkably forthright about their re-definition of Global Mean Sea Level (GMSL) as something other than the level of the sea. They write (emphasis added):
[Ref: http://sealevel.colorado.edu/faq#n3113 ]
The University of COLORADO has a sea level study group?