Further Problems with Kemp and Mann

Guest Post by Willis Eschenbach

In my previous post I discussed some of the issues with the paper “Climate related sea-level variations over the past two millennia” by Kemp et al., whose co-authors include Michael Mann (Kemp 2011). However, some commenters rightly said that I was not specific enough about what Kemp et al. have done wrong, so here is what further investigation has revealed. Since there is no archive of their reconstruction results, I digitized their reconstructed global sea level from their Figure S2 (A). First, here is their figure, showing their reconstruction of sea level.

Figure 1. Kemp Figure S2 (A) SOURCE 

I digitized the part of their graph from 1650 onwards to compare it with recent observations. Figure 2 shows the result:

Figure 2. Kemp 2011 reconstructed global sea level change, 1650-2000 

So what’s not to like in these latest results from Kemp and Michael Mann?

The first thing that seems strange is their claim that global sea level rose by 200 mm (8 inches) in the last fifty years (1950-1999). I know of no one else making that claim. Church and White estimate the 1950-2000 rise at 84 mm (three and a quarter inches), and Jevrejeva says 95 mm (three and three-quarter inches), so their reconstruction is more than double the accepted estimates …

The next problem becomes apparent when we look at the rate of sea level rise. Figure 3 shows the results from the Kemp 2011 study, along with the MSL rise estimates of Jevrejeva and Church & White from worldwide tidal gauges.

Figure 3. Kemp 2011 reconstructed rate of global sea level rise, 1650-2000, along with observations from Jevrejeva (red circles) and Church and White (purple squares).

Kemp et al. say that the global rate of sea level rise has risen steadily since the year 1700, that it exceeded 3 mm/year in 1950, that it has increased ever since, and that in 2000 it was almost 5 mm/year.

Jevrejeva and Church & White, on the other hand, say it has never been above 3 mm/year, that it varies up and down with time, and in 2000 it was ~ 2 mm/year. In other words, their claims don’t agree with observations at all.

In addition, the Kemp 2011 results show the rate of sea level rise started increasing about 1700 … why would that be? And the rate has increased since then without let-up.

So we can start with those two large issues — the estimates of Kemp et al. for both sea level and sea level rise are very different from the estimates of established authorities in the field. We have seen this before, when Michael Mann claimed that the temperature history of the last thousand years was very different from the consensus view of the time. In neither case has there been any sign of the extraordinary evidence necessary to support their extraordinary claims.

There are further issues with the paper, in no particular order:

1. Uncertainties. How are they calculated? They claim an overall accuracy for estimating the sea level at Tump Point of ± 40 mm (an inch and a half). They say their “transfer function” has errors of ± 100 mm (4 inches). Since the transfer function is only one part of their total transformation, how can the end product be so accurate?

2. Uncertainties. The uncertainties in their Figure S2 (A) (shaded dark and light pink in Figure 1 above) are constant over time. In other words, they say that their method is as good at predicting the sea level two thousand years ago as it is today … seems doubtful.

3. Uncertainties. In Figure 4(B) of the main paper they show the summary of their reconstruction after GIA adjustment, with the same error bands (shaded dark and light pink) as in Figure S2 (A) discussed above. However, separately in Figure 4(B) they show a much wider range of uncertainties due to the GIA adjustment alone. Surely those two errors add in quadrature, yielding a wider overall error band (the rule is spelled out below).
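For reference, the standard rule for combining two independent errors (my formulation of the step the figure appears to skip):

Total error = sqrt( Reconstruction error^2 + GIA error^2 )

Since the combined error is always at least as large as the larger of its two components, the post-GIA error band has to be wider than the pre-GIA band, not identical to it.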

4. Tidal range. If the tidal range has changed over time, the change would enter their calculations as a spurious sea level rise or fall. They acknowledge the possible problem, but they say it can’t happen, based on computer modeling. However, they would have been better advised to look at the data rather than foolishly placing their faith in models built on sand. The tidal range at Oregon Inlet Marina, a mere ten miles from their Sand Point core location, has been increasing at a rate of 3 mm per year, which is faster than the Kemp reconstructed sea level rise at Sand Point. Since we know for a fact that changes in tidal range are happening, their computerized assurance that they can’t happen rings more than a bit hollow. This is particularly true given the large changes in the local underwater geography in the area of Sand Point. Figure 4 shows some of those changes:

Figure 4. The changes in the channel between Roanoke Island and the mainland, from 1733 to 1990.

Note the shallows between the mainland and the south end of Roanoke Island in 1733, which are noted on charts up to 1860 and which have slowly disappeared since that time. You can also see two inlets through the barrier islands (Roanoke Inlet and Gun Inlet) which have filled in entirely since 1733. The changes in these inlets may be responsible for the changes in the depths off south Roanoke Island, since they mean that the area between Roanoke Island and the mainland can no longer drain out through Roanoke Inlet at the north end as it did previously. The claim that changes of this magnitude would not alter the tidal range seems extremely implausible.

5. Disagreement with local trends in sea level rise. The nearest long-term tide station, at Wilmington, shows no statistically significant change in the mean sea level (MSL) trend since 1937. Kemp et al. say the rate of rise has gone from 2 mm/year to 4.8 mm/year over that period. If so, why has this not shown up at Wilmington (or at any other nearby location)?

6. Uncertainties again, wherein I look hard at the math. They say the RMS (root mean square) error in their transfer function is 26% of the total tidal range. Unfortunately, they neglected to report the total tidal range; I’ll return to that in a minute. Since 26% is the RMS error, the 2 sigma error is about twice that, or 50% of the tidal range (see below). Consider that for a moment. The transfer function relates the foraminiferal assemblage to sea level, but the error is half of the tidal range … so the best case is that their method can’t even say with certainty whether the assemblage came from above or below the mean sea level …
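Spelling out that step (my arithmetic, which treats the reported RMS error as a one sigma error on a roughly normal error distribution):

2 sigma error ≈ 2 x RMS error = 2 x 26% = 52% ≈ 50% of the tidal range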

Since the tides are so complex and poorly documented inside the barrier islands, they use the VDatum tool from NOAA to estimate the mean tidal range at their sites. However, that tool is noted in the documentation as being inaccurate inside Pamlico Sound. The documentation says that unlike all other areas, whose tidal range is estimated from tidal gauges and stations, in Pamlico Sound the estimates are based on a “hydrodynamic model”.

They also claim that their transfer function gave “unique vertical errors” for each estimate that were “less than 100 mm”. This implies that their 2 sigma error was 100 mm. Combined with the idea that their transfer function 2 sigma error is 50% of the tidal range, this in turn implies that the tidal range is only 200 mm or so at the Sand Point location. This agrees with the VDatum estimate, which is almost exactly 200 mm.

However, tides in the area are extremely location dependent. Tidal ranges can vary by 100% within a few miles. This also means that the local tidal range (which is very local and extremely dependent on the local geography) is very likely to have changed over time. Unfortunately, these local variations are not captured by the VDatum tool. You can download it from here along with the datasets. If you compare various locations, you’ll see that VDatum is a very blunt instrument inside Pamlico Sound.

That same VDatum site gives the Pamlico Sound two sigma errors (95% confidence interval) in converting from Mean Sea Level to Mean Higher High Water (MHHW) as 84 mm, and to Mean Lower Low Water (MLLW) as 69 mm.

The difficulty arises because the tidal range is so small. All of their data is converted to a “Standardized Water Level Index” (SWLI). This expresses the level as a percentage of the tidal range, from 0 to 100. Zero means that the sample elevation is at Mean Lower Low Water, 100 means it is at MHHW. The tidal range is given as 200 mm … but because it is small and the errors are large, the 95% confidence interval on that tidal range is from 90 mm to 310 mm, a variation of more than three to one.

Their standardized water level index (SWLI) is calculated as follows:

SWLI = (Sample Elevation – MLLW) / (MHHW – MLLW) x 100     (Eqn. 1)
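To make Equation 1 concrete, here is a minimal sketch in Python. This is my own illustration, not the authors’ code, and the variable names are mine:

```python
def swli(sample_elev_mm, mllw_mm, mhhw_mm):
    """Standardized Water Level Index (Eqn. 1).

    Expresses a sample elevation as a percentage of the tidal range:
    0 means the sample sits at MLLW, 100 means it sits at MHHW.
    """
    return (sample_elev_mm - mllw_mm) / (mhhw_mm - mllw_mm) * 100

# Example: a sample 140 mm above MLLW, in a 200 mm tidal range
print(swli(140, 0, 200))  # 70.0
```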

When adding or subtracting quantities, the errors add in quadrature. The sample elevation error (from the transfer function) is ± 100 mm. The MLLW and MHHW two sigma errors are 69 mm and 84 mm respectively.

So … we can put some numbers to Equation 1. For ease of calculation let’s suppose the sample elevation is 140 mm, MLLW is 0 mm, and MHHW is 200 mm. Mean sea level is halfway between high and low, or about 100 mm. Including the errors (shown as “±” values), the numerator of Eqn. 1 becomes (in mm)

(Sample Elevation – MLLW) = (140 ± 100 – 0 ± 69) 

Since the errors add “in quadrature” (the combined error is the square root of the sum of the squares of the individual errors), this gives us a result of 140 ± 122 mm

Similarly, the denominator of Eqn. 1 with errors adding in quadrature is

(MHHW – MLLW) = (200 ± 84 – 0 ± 69) = 200 ± 109 mm

Now, when you divide or multiply numbers that have errors, you need to first express the errors as a percentage of the underlying amount, then add them in quadrature. This gives us

(140 ± 87%) / (200 ± 55%) x 100

This is equal to (0.7 ± 103%) x 100, or 70 ± 72, where both numbers are in SWLI units (percentages of the tidal range). Since the tidal range is 200 mm, this means that the total uncertainty on our sample is about 72 percent of that, or ± 144 mm. So at the end of all their transformations, the uncertainty in the sample elevation (± 144 mm) is larger than the sample elevation itself (140 mm).
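For anyone who wants to check my arithmetic, here is the whole propagation as a short Python sketch. It is my reading of the procedure, using the standard quadrature rules; the input values are the illustrative ones above, not numbers from any archive:

```python
from math import hypot  # hypot(a, b) = sqrt(a**2 + b**2), i.e. quadrature

# Illustrative values (mm), as in the worked example above
sample, err_sample = 140.0, 100.0  # transfer function 2-sigma error
mllw, err_mllw = 0.0, 69.0         # VDatum MLLW 2-sigma error
mhhw, err_mhhw = 200.0, 84.0       # VDatum MHHW 2-sigma error

# Numerator of Eqn. 1: for sums/differences, absolute errors add in quadrature
num = sample - mllw
err_num = hypot(err_sample, err_mllw)   # ~122 mm

# Denominator of Eqn. 1 (the tidal range)
den = mhhw - mllw
err_den = hypot(err_mhhw, err_mllw)     # ~109 mm

# Division: relative (percentage) errors add in quadrature
swli = num / den * 100                          # 70 SWLI units
rel_err = hypot(err_num / num, err_den / den)   # ~1.03, i.e. ~103%
err_swli = swli * rel_err                       # ~72 SWLI units

err_mm = err_swli / 100 * den                   # back to mm
print(f"SWLI = {swli:.0f} +/- {err_swli:.0f} (+/- {err_mm:.0f} mm)")
# -> SWLI = 70 +/- 72 (+/- 143 mm, which rounds to the ~144 mm above)
```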

All of that, of course, assumes that I have correctly interpreted their very unclear statements about the uncertainties in their work. In any case, how they get a Tump Point two sigma error of about 40 mm (an inch and a half) out of all of that is a great mystery.

Those are my problems with the study. Both the rate and the amount of their reconstructed sea level rise over the last fifty years are much greater than the observations; tidal ranges in the area are varying currently and are quite likely to have varied in the past, despite the authors’ assurances otherwise; and their methods for estimating errors greatly underestimate the total uncertainty.

w.

[UPDATE] One other issue. They say regarding the C14 dating:

High-precision 14C ages (8) were obtained by preparing duplicate or triplicate samples from the same depth interval and using a pooled mean (Calib 5.0.1 software program) for calibration.

This sounded like a perfectly logical procedure … until I looked at the data. Figure 5 is a plot of the individual data, showing age versus depth, from Supplementary Tables DR3 and DR4 here. They have used the “pooled mean” of three samples at 60 cm depth, and three samples at 80 cm depth.

Figure 5. Age and depth for the Sand Point samples in the top metre of the core. Red squares show C14 dates. Horizontal black bars show the 2-sigma uncertainty (95% confidence interval).

Look at the 60 cm depth. The three samples that they tested dated to 1580, 1720, and 1776. None of their error bars overlap, so we are clearly dealing with three samples that are verifiably of different ages.

Now, before averaging them and using them to calibrate the age/depth curve … wouldn’t it make sense to stop and wonder why two samples taken from the exact same one-centimetre-thick slice of the core are nearly two hundred years different in age?

The same is true at the 80 cm depth, where the ages range from 1609 to 1797. Again, this is almost a two-hundred-year difference.
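For what it’s worth, there is a standard check that is normally applied before pooling radiocarbon dates: the Ward and Wilson (1978) chi-squared consistency test, which is what the pooled-mean procedure in Calib is based on. Here is a minimal sketch of it in Python. The ± 25 year (one sigma) errors are hypothetical, since I am eyeballing the error bars off Figure 5 rather than reading them from their tables:

```python
from math import sqrt

def pooled_mean(ages, sigmas):
    """Inverse-variance weighted ('pooled') mean of several age
    determinations, plus the Ward & Wilson (1978) test statistic T.

    If the samples truly share one age, T is distributed as
    chi-squared with (n - 1) degrees of freedom.
    """
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
    err = sqrt(1.0 / sum(weights))
    t = sum((a - mean)**2 / s**2 for a, s in zip(ages, sigmas))
    return mean, err, t

# The three 60 cm dates (years AD), with hypothetical 25-year one-sigma errors
mean, err, t = pooled_mean([1580, 1720, 1776], [25, 25, 25])
print(f"pooled mean = AD {mean:.0f} +/- {err:.0f}, T = {t:.0f}")
# -> pooled mean = AD 1692 +/- 14, T = 33
# The 95% point of chi-squared with 2 degrees of freedom is about 6.0,
# so T = 33 fails the consistency test badly: these should not be pooled.
```

(For illustration I have applied the test to the calendar dates; the paper pools the raw 14C ages before calibration, but the consistency logic is the same.)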

What am I missing here? How does this make sense, to average those disparate dates without first figuring out what is going on?

Comments

Rhoda Ramirez
June 26, 2011 1:09 pm

Does anyone know what Kemp’s scientific reputation was before Mann? Cause I’ll discount anything I read from him after.

Jan v J
June 26, 2011 1:14 pm

verney –
Although Matthew VII 24-26 may be better (more literal).

Wil
June 26, 2011 1:43 pm

Frauenfeld, O.W., Knappenberger, P.C. and Michaels, P.J. 2011. A reconstruction of annual Greenland ice melt extent, 1784-2009. Journal of Geophysical Research 116: 10.1029/2010JD014918.
Background
The authors write that the “total annual observed melt extent across the Greenland ice sheet has been shown to be strongly related to summer temperature measurements from stations located along Greenland’s coast, as well as to variations in atmospheric circulation across the North Atlantic,” and they indicate that the total extent of Greenland ice melt “has been increasing during the last three decades,” with the melt extent observed in 2007 being “the greatest on record according to several satellite-derived records.” Therefore, one could well ask, if these several observations might not constitute a manifestation of what climate alarmists typically describe as one of the major consequences of the supposedly unprecedented rate and degree of recent global warming?
What was done
In an exercise designed to broach this question, Frauenfeld et al. created, as they describe it, “a record of total annual ice melt extent across Greenland extending back approximately 226 years by combining satellite-derived observations with melt extent values reconstructed with historical observations of summer temperatures and winter circulation patterns.”
What was learned
The three U.S. researchers report discovering that “the recent period of high-melt extent is similar in magnitude but, thus far, shorter in duration than a period of high melt lasting from the early 1920s through the early 1960s,” and they say that the greatest melt extent over the last two and a quarter centuries did indeed occur in 2007. However, as they go on to say, “this value is not statistically significantly different from the reconstructed melt extent during 20 other melt seasons, primarily during 1923-1961.”
What it means
Frauenfeld et al. conclude that “there is no indication that the increased contribution from the Greenland melt in the early to mid 20th century … resulted in a rate of total global sea level rise that exceeded ~3 mm/yr.” And they note that this observation suggests that “Greenland’s contribution to global sea level rise, even during multi-decadal conditions as warm as the past several years, is relatively modest,” which is a far, far cry from the catastrophic result climate alarmists claim should occur in the face of their unprecedented global warming claim.

Jeff Alberts
June 26, 2011 2:37 pm

Gerry says:
June 26, 2011 at 7:50 am
And this is supposed to be a peer-reviewed paper? Doesn’t say much for the reviewer given what seem to me to be fairly basic flaws. You would have thought that they would have learnt from the hockey stick fiasco that anything they publish will be subject to independent scrutiny. Maybe since they ‘know they are right’ they don’t care?

It’s not that they don’t care, it’s that they know any rebuttal will most likely not get past the gatekeepers. If it does, it will be so butchered as to be a non-rebuttal.

June 26, 2011 5:09 pm

Look at the 60 cm depth. The three samples that they tested dated from 1580, 1720, and 1776. None of their error bars overlap, so we are clearly dealing with three samples that are verifiably of different ages.
Now, before averaging them and using them to calibrate the age/depth curve … wouldn’t it make sense to stop and wonder why two samples taken from the exact same one-centimetre-thick slice of the core are nearly two hundred years different in age?

This is the surreal world of CAGW in a nutshell…
Take a set of disparate values and calculate an average…
These can be any old numbers provided they have been selected by THE TEAM
Then calculate an anomaly to obscure the underlying values…
Then claim the result represents the GLOBAL situation…
Works with temperatures and sea level every time…
This technique has a name:
Calculate Random Average Post-normal

RoHa
June 26, 2011 5:24 pm

O.K. Time for the easy version again.
Is sea level going up, down, or sideways?
Inquiring minds want to know.

tokyoboy
June 26, 2011 5:30 pm

The reading of a tide gauge is a result of many changes. No. 2415 in the link below is the sea level trend of Osaka over more than 100 years, and shows a 2.6-METER SEA LEVEL RISE. Actually this reflects the land subsidence due to building construction and aquifer depletion, which are conspicuous in the developing times (1920-40s and 1950-70s). Other places (sorry in Japanese) show similar subsidence effects, but a few places exhibit general sea level fall, of which no one is sure about the true causes.
http://cais.gsi.go.jp/cmdc/center/graph/kaiiki5.html
The place-to-place variability is evident also in the places below:
http://cais.gsi.go.jp/cmdc/center/graph/kaiiki3.html
Based on these observations I feel tide gauge records tell nothing about the global sea level trend.

fhsiv
June 26, 2011 6:39 pm

Thank you Berényi Péter for the reference to USGS PP 1773. I enjoyed reviewing the text and graphics of the report while enjoying a Sunday afternoon beer! I was able to learn a little about some hydrogeology on the other side of the continent, and as a bonus, I spotted a few hockey sticks!
The exceedingly well documented paper provides groundwater data from hundreds of wells on the Atlantic Coastal Plain between Virginia and Georgia (including the barrier islands of North Carolina). Also included is a discussion of the hydrogeology of the region provided as background for the ultimate purpose of the paper (groundwater modeling intended to address future water supply issues related to the development of groundwater resources).
Cursory review of the hydrogeologic data combined with application of fundamental engineering principles suggests that if the tidal gauges utilized in the Kemp report were going to be subject to groundwater-extraction-related ground settlement, it would most likely be due to water extraction from the youngest, shallowest and most consolidation-prone Pleistocene sediments. The maps provided in the report indicate that the extent of the shallowest confined aquifer (Yorktown Aquifer) includes the entire area of the North Carolina barrier islands. And interestingly enough, when you get to the end of the report, the modeled well hydrographs for the Yorktown Aquifer take the shape of inverted hockey sticks! Water levels remain nearly unchanged from 1900 to about 1940, then begin to decrease exponentially (coincident with the timing of known development of the aquifer) to about 1990, and then bounce around at the new lowered level up to the present time.
Consolidation of the dewatered aquifer materials would be expected to be relatively rapid after the time of drawdown. However, consolidation of the fine-grained (clayey) sediments in the overlying aquiclude would be expected to continue for a long period of time after the pore pressures were reduced by dewatering, due to their relatively impermeable nature (i.e. it takes a long time for the water to get out of the low-permeability materials). My guess is that this condition has resulted in some engineering issues for public works and private developments on the islands and is probably well documented in local agency records and consultant reports. Someone familiar with the local area could probably shed some light.
I’d research it on the internet if I had more time today. However, my beer is done!

u.k.(us)
June 26, 2011 7:23 pm

Jeff Alberts says:
June 26, 2011 at 2:37 pm
“It’s not that they don’t care, it’s that they know any rebuttal will most likely not get past the gatekeepers. If it does, it will be so butchered as to be a non-rebuttal.”
===
Oh Noes, they do care, yet fear if the truth be spoken, the words will be twisted to foster the agenda.
How’s that?

RoHa
June 26, 2011 10:39 pm

Thanks, Willis. It’s all clear to me now.

Scottish Sceptic
June 26, 2011 10:58 pm

John A says: June 26, 2011 at 3:30 am
“Willis you should write this up as a comment and send it to the journal”
HOW ABSURD!
Willis, you should be Paid to write this up and send it to the journal.

dp
June 26, 2011 11:43 pm

It is hard to imagine we’re talking about these trivial variations like they were some kind of serious problem. This is a non-trivial climate-induced variation: http://en.wikipedia.org/wiki/Lake_Agassiz and it was not a problem. Unemployed people without shoes or a roof over their heads or SUVs in the drive did quite well all around the world while this kind of thing was happening. It is called adapting. It is part of the cycle of life on Earth. When did half a degree and 6″ of sea level rise become a crisis?

Dave Springer
June 27, 2011 7:04 am

Berényi Péter says:
June 26, 2011 at 4:26 am
This interpretation is consistent with the fact the bulk of local sea level rise acceleration (relative to coastal elevation) happened in the late 19th century, when industrial scale drilling for groundwater became feasible.
“It means Kemp et al. possibly detected a local signal unrelated to global sea level change, but caused by recent local anthropogenic effects on coastal elevation.”
Bingo! It took me less than an hour to figure out what was behind their data and I wrote it up a week ago on the first Mann/Kemp thread. The big clue was a great acceleration in the rate of sea level rise circa 1880. This is entirely inconsistent with anthropogenic CO2 driving steric sea level rise. That factor is a consistent linear rise that began in the late 18th century, not the late 19th. Isostatic subsidence caused by unloading of northern glaciers is also not consistent with a spike in the rate of sea level rise like that, as it will also be more or less linear over thousands of years. Thus I wondered what else was happening around 1880 to explain the spike. A bit of digging revealed that between 1850 and 1900 North Carolina’s gigantic tobacco industry was established, engendering huge land and water use changes which would almost certainly affect the coastal aquifer. Aquifer level change can and does cause isostatic sea level change.
The authors and reviewers of this latest hockey stick paper are clearly not competent earth scientists. No competent earth scientist would have overlooked what was glaringly obvious to me and I’m not even an earth scientist just a well-read layman.

KR
June 27, 2011 7:05 am

Willis
Kemp et al state in their conclusions:
“According to our analysis, North Carolina sea level was stable from BC 100 to AD 950. …rose at a rate of 0.6 mm/y from about AD 950 to 1400 as a consequence of Medieval warmth, …sea level was stable from AD 1400 until the end of the 19th century due to cooler temperatures associated with the Little Ice Age. A second increase in the rate of sea-level rise occurred around AD 1880–1920; in North Carolina the mean rate of rise was 2.1 mm/y in response to 20th century warming.”
Where do you get 200 mm in 50 years? The data shows that rise in 100+ years, due to the 2.1 mm/year rate Kemp indicates.
I can find no statement in Kemp et al of a 5 mm/year increase, as you claim – can you please point out where this is stated?

Ryan
June 27, 2011 7:26 am

The problem with salt marsh sediments is that they only record when sea level is rising, and have no way of recording when it is falling. So it is no big surprise that you can take a core at 80 cm depth and get carbon dates of 1600 and 1800 simultaneously. All it means is that sea level was probably more or less the same over that period. So if you can’t measure any increase in sea level over a 200-year period, the technique can’t be very good, can it?

Kelvin Vaughan
June 27, 2011 7:49 am

RoHa says:
June 26, 2011 at 5:24 pm
O.K. Time for the easy version again.
Is sea level going up, down, or sideways?
Inquiring minds want to know.
All three. Especially in a force 9 or above.

Chris D.
June 27, 2011 8:26 am

Hoser says:
June 26, 2011 at 9:15 am
Consider this little guy:
http://www.vims.edu/~jeff/fiddler.htm
Read it all.
Anyone who’s ever visited the salt marshes will know that these things are everywhere.

Ryan
June 27, 2011 9:08 am

@KR
“I can find no statement in Kemp et al of a 5 mm/year increase”
It’s derived from the graph. The last 2 dots are at 1990 and 2000 with a 50 mm difference between them, i.e. 5 mm/yr. This is more than double previous claims from Team AGW based on tide gauge data.

June 27, 2011 9:27 am

RE: Carbon dating,
I see three issues.
1) Given the nature of the area, the odds of a smooth top-down distribution of samples, oldest on the bottom, youngest on the top, are poor at best: storms churn up the sand, and in between storms, animals churn up the sand.
2) I’ve always read that it was improper to do C14 dating on entities that grew in water. This is because of the accumulation of CO2 in the water, both from the air and from sediments washed into the water. Water will always contain a complete mish-mash of both young and old CO2, so the ratio of C12 to C14 is completely unpredictable.
3) Many of their samples are taken after the beginning of the Industrial Revolution. All of that ancient carbon released into the air makes any attempt at C14 dating meaningless at best.

Ryan
June 27, 2011 9:42 am

@KR
By the way, if you really want to see how useless this proxy is then take a look at figure S6 of the original source paper (in the supporting literature), which from 1900 in effect shows the tide gauge data superimposed on the salt marsh proxy. The tide gauge data (which is tightly constrained in its error bands) is roughly linear from 1900 to 2000, whilst the salt marsh proxy shows a “hockey stick” curve totally absent from the measured data. It can hardly be said to be predictive, therefore.

KR
June 27, 2011 12:06 pm

Ryan –
I looked at Fig. S6 – the North Carolina data, a pretty smoothed out curve from the 193 points of data, is still within 1-1.5 standard deviations of the tidal gauge data. The tidal gauge data is well constrained, the NC reconstruction much less so.
In terms of predictive capability, I would definitely go with the tidal gauge data, as it’s far more precise. But in terms of long term reconstruction of the last 2K years, I believe (just my opinion, mind you) that the point of the paper is to establish that foraminiferal data provides a reasonable estimate. And that longer term sea level rise rates do correspond to best estimates of past temperatures.

Don K
June 27, 2011 12:13 pm

“Now, before averaging them and using them to calibrate the age/depth curve … wouldn’t it make sense to stop and wonder why two samples taken from the exact same one-centimetre-thick slice of the core are nearly two hundred years different in age?”
Two possible explanations jump out:
1. A centuries long period of extremely slow sediment depositation. That’s probably not impossible.
2. The samples are from ‘lag deposits’ (aka ‘Tempestites’). Lag deposits form when something — often a tropical storm — reworks sediments, eroding fine material and concentrating the coarser/denser material in a layer at the base of the redeposited sediment. Some lag deposits are widely known because the concentrated material is spectacularly rich in vertebrate bones and teeth — i.e. it forms a ‘bone bed’. It’d be hard to overlook a real bone bed type deposit, but maybe these lag deposits — if that’s what they are — are less obvious.
If I recall correctly, there isn’t much discussion of the geology/geological setting in the Kemp paper.

Dave Wendt
June 27, 2011 2:21 pm

I try to restrain myself from posting comments on these sea level threads, but occasionally my sense of frustration overwhelms my personal resignation to being a lone voice in the wilderness and I am moved to try again.
It is entirely pointless to engage in these tendentious disputes about whose millimeter or tenth-of-a-millimeter speculations about what the global sea level is doing are correct, because measuring the sea level to that kind of precision is both a logical and a physical impossibility.
I have linked to this document on numerous occasions
http://www.osdpd.noaa.gov/ml/ocean/J2_handbook_v1-3_no_rev.pdf
It is the data product handbook for the Jason-2 satellite, which is the latest and greatest of the satellites that have provided the altimetry data for all those lovely graphs of how GMSL has been proceeding upward at 3 mm/yr for thirty years. Due to continuing improvements in the technology it is probably at least an order of magnitude better, and more likely two, than the TOPEX/POSEIDON units from the beginning of the record. When it comes to accuracy, the handbook has this to say:
2.3.1. Accuracy of Sea-level Measurements
Generally speaking OSTM/Jason-2 has been specified based on the Jason-1 state of the art, including improvements in payload technology, data processing and algorithms or ancillary data (e.g. precise orbit determination and meteorological model accuracy). The sea-surface height shall be provided with a globally averaged RMS accuracy of 3.4 cm (1 sigma), or better, assuming 1 second averages.
Even that claim is likely a statistician’s fantasy, because they also admit to not being able to determine significant wave height to better than 0.4 METERS! Since significant waves are present over most of the world’s oceans most of the time, I find their accuracy claims a bit difficult to resolve.
But aside from the technical difficulties involved in the metrology, which have admittedly been subject to profound improvements, there still exists a fundamental logical conundrum when it comes to attempting to measure GMSL. When you have an object isolated in space, it is possible to measure its dimensions with incredible precision, but GMSL is a measurement of height, which by definition must be made in reference to some fixed datum, and there is nothing on the actual physical planet that is reliably fixed at the millimeter level. Archimedes is widely misquoted as saying “Give me a lever and a place to stand and I’ll move the world”. When it comes to measuring GMSL we suffer from the same problem as Archimedes: there is literally no place to stand to make the measurement.
The current sats are tied into the GPS satellite fleet in ways that greatly increase the accuracy of their orbital ephemeris but the whole system is still referenced off ground sites which are subject to their own displacements, which are supposedly accounted for based on measurements from the GPS sats that they are referencing. It’s equivalent to setting up a contractor’s level in a row boat and trying to take readings off a level rod that’s in another rowboat, at least at the millimeter level of precision they are claiming to achieve.
To get around this, both systems are actually based off the reference ellipsoid and the geoid, two completely arbitrary and artificial human constructs which are only vaguely related to the physical reality of the planet. The sea level anomalies posted in the data are actually anomalies from the reference ellipsoid and the undulations of the geoid, and the advent of the GRACE and GOCE satellites has demonstrated that the geoid is neither constant nor well modeled; the geoid model they are using now is significantly different from the one used at the beginning of the record. Even if we were able to magically surmount all these difficulties and arrive at an accurate measure of the Global Mean Sea Level, it would still be worthless information, because the seas are not level and changes in the global mean will tell us nothing reliable about what will happen at any particular piece of local coast. I could go on much longer, but my anger has dissipated and I’m running out of rhetorical gas. Just for my sake, can we give it a rest?