Finally: JPL intends to get a GRASP on accurate sea level and ice measurements

A climate science bombshell: A new proposal from NASA JPL admits to “spurious” errors in current satellite-based sea level and ice altimetry, and calls for a new space platform to fix the problem.

People send me stuff. Today it is a PowerPoint presentation from NASA JPL that touts the new GRASP (Geodetic Reference Antenna in Space) satellite project. I’d say it is more than a bit of a bombshell because the whole purpose of this new mission is to “fix” other mission data that apparently never had a stable enough reference for the measurements being made. This promises to rewrite what we know about sea level rise and acceleration, ice extent and ice volume loss measured from space.

What is most interesting is the admission about the current state of space-based sea level altimetry on the science goals page of the presentation:

The difference between the tide gauge data and the space-based data in the left graph is more than 100%: 1.5 mm/yr versus 3.2 mm/yr. Of course, those who claim that sea level rise is accelerating accept this data without question, but obviously one of the two data sets (or possibly both) is not representative of reality, and JPL’s GRASP team aims to fix the problem they have identified:

TRF errors readily manifest as spurious sea level rise accelerations

That’s a bucket of cold-water reality thrown in the face of the current view of sea level rise. It calls into question this well-known and often-cited graph of sea level rise from the University of Colorado (and its rate of 3.1 mm/yr):

What’s a TRF error? TRF stands for Terrestrial Reference Frame; a TRF error means the benchmark itself is in error, messing up the survey. In land-based geodesy terms: if somebody repeatedly altered the USGS benchmark elevation data for Mt. Diablo, California, so that the elevation of that benchmark kept changing in the data set, then all measurements referencing that benchmark would be off as well.
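To make that concrete, here is a toy calculation (my illustration, not anything from JPL) showing how a reference that drifts in the dataset manufactures an apparent trend out of a perfectly flat sea surface:

```python
# Toy illustration: a benchmark that sinks by 0.5 mm/yr in the dataset makes
# a perfectly constant sea surface appear to rise at 0.5 mm/yr.
years = range(20)
true_sea_level_mm = [0.0 for _ in years]        # the sea surface never moves
reference_drift_mm = [-0.5 * y for y in years]  # the benchmark drifts downward

# Measured height = true height minus reference height
apparent_mm = [s - r for s, r in zip(true_sea_level_mm, reference_drift_mm)]

# Ordinary least-squares slope of the apparent series
n = len(true_sea_level_mm)
xbar = sum(years) / n
ybar = sum(apparent_mm) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, apparent_mm)) \
        / sum((x - xbar) ** 2 for x in years)
print(f"apparent trend: {slope:.2f} mm/yr")  # 0.50, purely from reference drift
```

The entire 0.5 mm/yr “rise” is an artifact of the moving benchmark, which is exactly the kind of spurious signal the presentation warns about.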

USGS Benchmark on Mt. Diablo – Image from geocaching.com

In the case of radio altimetry from space, the measurements depend critically on how radio signals propagate through the ionosphere. Faraday rotation, refraction, and other propagation effects can skew the signal in transit, and if they are not properly corrected for, especially over the long term, they introduce a spurious signal into all the data products derived from the measurements. In fact, the mission summary shows that this affects satellite-derived data for sea level, ice loss, and ice volume in GRACE gravity measurements:
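For a sense of scale, the standard first-order expression for ionospheric group delay shows why this matters at altimeter frequencies; the TEC value below is a typical illustrative number, not one taken from the presentation:

```python
# First-order ionospheric range error for a one-way radio signal:
#   delta_R = 40.3 * TEC / f**2   (meters, with TEC in electrons/m^2, f in Hz)
def iono_range_error_m(tec_el_per_m2, freq_hz):
    return 40.3 * tec_el_per_m2 / freq_hz ** 2

TEC = 10 * 1e16        # 10 TECU, a typical mid-latitude daytime ionosphere
ku_band = 13.575e9     # Ku-band altimeter frequency, Hz
print(f"range error: {iono_range_error_m(TEC, ku_band) * 100:.1f} cm")  # ~2 cm
```

Altimetry missions remove most of this with dual-frequency measurements; the worry raised here is the residual error and any slow drift in it, since even a fraction of a millimeter per year matters when the signal itself is only a few millimeters per year.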

In a nutshell, JPL is saying we don’t have an accurate reference point, and therefore the data from these previous missions likely has TRF uncertainties embedded:

The TRF underlies all Measurement of the Earth

Without a Terrestrial Reference Frame stable enough to put the baseline uncertainty well below the noise in the data, all we have are broad, uncertain measurements. That’s why the plan is to provide ground-based points of reference, something our current satellite systems lack:

To help understand the items in the side panels:

GNSS = Global Navigation Satellite System – more here

SLR = Satellite Laser Ranging  – more here

DORIS = Doppler Orbitography and Radiopositioning Integrated by Satellite – more here

VLBI = Very Long Baseline Interferometry – more here

Taken together, these systems will improve the accuracy of the TRF, and thus the data. It’s rather amazing that the baseline accuracy didn’t come first, because this puts all these other space-based measurement systems into uncertainty until their TRF issues are resolved, and that’s an inconvenient truth. We’ll never look at satellite-based sea level data or GRACE ice volume data in quite the same way again.

PowerPoint here: Poland 2012 – P09 Bar-Sever PR51 (PDF)

More info: http://ccar.colorado.edu/~nerem/EV-2_GRASP-final.pdf

UPDATE: Here’s an estimate of impacts:

Source: http://www.gps.gov/governance/advisory/meetings/2011-06/bar-sever.pdf

126 Comments
Don K
October 31, 2012 2:21 am

feet2thefire says:
October 30, 2012 at 8:42 pm
So let me get this straight:
They have satellites passing over land and sea without any regular initialization and measurement against – on each pass or ten – a solid reference point?
==================
Not really. They actually do have a pretty solid Earth based (“Geocentric”) reference system. Depending on the satellite, they use GPS and/or DORIS (a sort of “backwards” GPS with fixed ground stations broadcasting reference signals) to continuously measure satellite position. The most recent satellites with Radar Altimeters claim average position uncertainties of a few cm. Beyond that, they average many measurements a second made continuously whenever they are not over land or ice to achieve what they hope to be sub mm accuracy.

Don K
October 31, 2012 3:00 am

Mike McMillan says:
October 30, 2012 at 9:42 pm
Why bother getting the baseline correct when you’re throwing in totally bogus ‘corrections’ like the glacial isostatic adjustment (GIA). … They purport to throw it in to account for volume changes, …
They’ve also applied an ‘inverse barometer.’ Given that the barometric pressure variation over 7/10ths of the earth’s surface should average out close to zero, that’s another avenue for mischief.
====================
I believe that there are two components to the CU GIA adjustments. One is the standard adjustment for ongoing changes in land reference elevation and the second — much smaller — is a recent addition to the handling that purports to “correct” for ongoing sinking of the ocean floor caused by the weight of water melted since the peak of the last glaciation. The first part is essential for tidal gauge measurements, but strikes me as being kind of weird for satellite measurements. The second seems to me to be even weirder since I think it will cause satellite and tidal gauge values to diverge over time unless the same “correction” is applied to the tidal gauge data.
BUT — the computations of sea level rise subtract “old” sea levels from “new” and the GIA components — whether appropriate or not — should cancel out (A+X) – (B+X) = A-B.
Inverse barometer is possibly a bit better justified. The problem is that about a third of the earth’s surface is land or ice, and average pressures over those areas may differ from average pressures over unfrozen sea surfaces. And not always by the same amount. It seems likely that without the inverse barometer correction, we’d see a seasonal effect in satellite-measured sea level as the largely land Northern Hemisphere heated and cooled at a different rate than the largely water Southern Hemisphere.
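Don K's cancellation point is easy to check numerically. The sketch below (all numbers invented) also shows the flip side: a correction applied as a rate, rather than a constant offset, does not cancel and feeds straight into the computed trend:

```python
import math

# A constant correction X applied to both epochs cancels in the difference:
#   (A + X) - (B + X) = A - B
A, B = 105.0, 100.0   # made-up "new" and "old" sea levels, mm
X = 7.3               # made-up constant correction, mm
assert math.isclose((A + X) - (B + X), A - B)

# But a correction expressed as a rate adds directly to the reported rate:
rate = 2.0   # mm/yr, underlying trend
gia = 0.3    # mm/yr, an assumed rate-style adjustment
print(rate + gia)  # 2.3 mm/yr reported
```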

Vince Causey
October 31, 2012 4:46 am

Can they “adjust” the satellite data in accordance with the correct reference points, or is the entire dataset no more than junk?

October 31, 2012 4:53 am

Oso Politico wrote:

I don’t know how much credence the NOAA has, but take a look at this link:
http://tidesandcurrents.noaa.gov/sltrends/sltrends.shtml
Notice all of the little green arrows pointing up and indicating a positive rise in sea level. Well, if you look at the box below you will see that ‘green’ is for 0 to 3 mm per year. Zero. So one doesn’t know if there is really some indication of a rise. They could all be zero, for all we know.

It is true, the green arrows are vague
But just hover the mouse on each one
If sea rise here is really a plague
This chart seems to deflate all the fun
For the stations with century scale
There’s no hockey stick giving a scare
Just a gradual trend … and they fail
To show any “accelerate” there
Looks like 2mm here is the rule
As it’s been since a hundred years back
This site serves as a useful new tool
(Look at Kodiak’s big earthquake whack!)
===|==============/ Keith DeHavelle

D.I.
October 31, 2012 4:57 am

Can anyone tell me how they can have a ‘Terrestrial Reference Frame’ accuracy of ~1mm considering the Earth’s crust is in constant movement?

Leo G
October 31, 2012 5:44 am

Mišo Alkalaj (Oct 31 at 12:32 am) says: “If my calculations are at least roughly correct, the sea level rise of 3.9 mm/year should correspond to heat input of 2.2893 W/m2 … expansion coefficient: 0.02315% / °K (average in range 5-30 °C)”
The thermal expansivity of sea water at average temp (~277K) and average pressure (~40MPa) is about 0.0187% / K.
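The two numbers can be checked with a quick script (my sketch of the commenters' back-of-envelope reasoning, not an authoritative calculation). For a uniformly heated column the depth drops out, leaving dh/dt = beta * F / (rho * c_p):

```python
# Steric (thermal-expansion) sea level rise implied by a steady heat flux F.
# These are the commenters' illustrative values, not authoritative figures.
SECONDS_PER_YEAR = 3.156e7
rho = 1025.0    # seawater density, kg/m^3
c_p = 3990.0    # specific heat of seawater, J/(kg K)
F = 2.2893      # assumed heat input, W/m^2

for beta in (2.315e-4, 1.87e-4):  # Mišo's and Leo G's expansivities, 1/K
    dh_mm_per_yr = beta * F * SECONDS_PER_YEAR / (rho * c_p) * 1000.0
    print(f"beta = {beta:.4g} /K -> {dh_mm_per_yr:.1f} mm/yr")
```

Mišo's coefficient gives about 4.1 mm/yr, in the ballpark of his 3.9; Leo G's lower expansivity gives about 3.3 mm/yr, illustrating how sensitive the estimate is to that one coefficient.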

October 31, 2012 5:46 am

A case of ”send more money”.
Greater accuracy would be good but would the alarmists believe a sea level drop?

October 31, 2012 6:08 am

Not that there is any meaning to this but it is strange looking at this slide:
http://wattsupwiththat.files.wordpress.com/2012/10/grasp_mission.jpg?w=640&h=492
one satellite measures “Sea Level Rise” (They don’t measure sea level lowering?)
one satellite measures “Ice Loss” (they don’t measure ice gain?)
one satellite measures “Gravity Changes”
Just strange wording

October 31, 2012 7:23 am

I’m curious as to just how they are going to make the system accurate to within 1 mm. I’ve been frankly enormously doubtful of this sort of precision from the beginning — humans would have a hard time achieving this precision with a really long measuring stick. Sure, interferometry has the capability of being very precise, but the atmosphere is hardly a linear propagation medium, paths are very long, the natural variability of the surfaces being measured with time is large (think ocean waves and tides and storms, think land surface tides, thermal expansion of the surface, and the fact that the surface is covered with vegetation and heat waves of the sort that create mirages and can provide false signals).
If I understand it, TFA above points out that in order to measure the true relative (to an arbitrary point deemed “the center of the Earth” as it moves through space in time) location of the surface, one has to begin by knowing the true relative location of the measuring satellite. The only way to set such a location in a satellite that itself has a constantly varying orbit as inhomogeneous drag forces act on it, many body forces act on it, and those forces themselves effectively vary in space and time (as the Earth isn’t a perfect sphere, tidal pseudoforces are constantly altering with the relative positions of sun, moon, and even planets (given a long integration baseline, even very weak perturbations can add up to millimeter sized changes in expected position)) is to use positions on the Earth as reference points that triangulate the location of the satellite.
Sadly, there are no stable, stationary reference points on the surface of the Earth, and even if there were that does not completely eliminate the problem associated with uncertainties in wave propagation itself across a distance of many powers of ten of kilometers, so that 1 millimeter errors — still comparable to the “signal” one wishes to resolve, and accuracy on the order of one part in a billion — could be obtained.
To give you an idea of magnitude of the problem, let’s assume that a point on the ocean’s surface has a natural variability of one meter. The precise number won’t much matter, it is order of a meter (more some places, less others) as waves and tide operate. Statistically, in order to obtain a measurement of the mean, accurate to one millimeter, one requires many, many random samples drawn from the ensemble of possible snapshots at the point. Furthermore, one has to hold fairly precisely at the sampled point laterally as the satellite passes not directly overhead, but at an ever changing angle, through an atmosphere with ever varying moisture content, thermal profile, and hence index of refraction — even small aiming or obliquity errors cause one to sample the sides of waves over a different latitude and longitude than you think. This lateral pointwise precision needs to be reproducible — you have to hit and sample the same point, orbit after orbit, day after day.
Tidal gauges are perfectly designed for this because they can sample sea surface level at a single fixed geographical location thousands of times a day, day after day, year after year, with no holes. Even so there is clearly a lot of noise on their measured results because the systematic variation of the ocean via tides involves propagating and interfering large scale waves that are constantly being affected by things like weather events, air pressure, storms, and water temperature, and that exhibit chaotic nonlinear properties. One cannot blithely “remove the seasonal and other periodic signals” from the data, because that presumes that you know what they are supposed to be, which begs the question of just what you are measuring. Or rather, one can remove them, but one is left with rather large and irreducible uncertainties — a single tropical storm like Sandy creates an air pressure linked storm surge that affects sea levels everywhere within a thousand miles or so over a period of weeks and creates persistent perturbations in the global sloshing of tidal waves around between the continents with a lifetime order of months. There are order of tens of tropical storms in the major oceans, and uncounted minor ones. All of this has to be averaged out to obtain 1 millimeter accuracy, and I personally don’t think even tidal gauge data can accomplish this, not even with many gauges, on a time frame less than decades.
Satellites are going to be strictly worse than gauges in nearly every respect but one. They cannot possibly sample single points as consistently. They suffer from the same problem of being unable to differentiate e.g. land subsidence, alterations of ocean volume due to a changing ocean floor and the locations of the continents (which frankly I don’t think anybody knows how to even estimate, since it would require measurements we don’t know how to make at points on the ocean floor and land surface all over the world) and so on from “true” SLR due to presumed GW. They suffer from a far worse problem in that if they were tidal gauges, they’d be tidal gauges mounted on a rod attached to a slowly sinking chaotic oscillator that swung them around all day, turned them off every few minutes for hours at a time, sampled only a handful of times when they were on, and got occasionally kicked by a drunken and irascible station keeper as he stumbled on by in the night. Finally, the samples they draw are not iid, nor are they systematic. They are too frequent to avoid autocorrelation, too sparse to be directly integrable as a continuous stream of samples, autocorrelation be damned.
Consequently it is a bitch to do proper statistical analysis and return a fair estimate of the error, because you can’t use the number of samples to compute a standard deviation in the traditional way (too much autocorrelation) but if you use only the samples sufficiently separated to be able to count as iid, you have so few samples that it takes “forever” to get 0.1% accuracy in the mean from data with a natural variation of 1 meter.
One expects to need order of a million independent samples to get this sort of accuracy, and the time interval between the samples needs to be random and at least tens of autocorrelation times apart (neglecting the truly long period signals like the tides themselves). Ocean waves have a period on the order of tens of seconds, so one can sample at most once every few minutes. There are 1440 minutes in a day. One could get at most 1000 independent samples a day, which means that it takes three years to get 1 mm accuracy at any given point.
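The arithmetic in the last two paragraphs is just the standard error of the mean; a minimal check of the numbers:

```python
# Standard error of the mean: SEM = sigma / sqrt(N), so hitting a target
# accuracy requires N = (sigma / target)**2 independent samples.
sigma_mm = 1000.0    # ~1 m natural variability per snapshot
target_mm = 1.0      # desired accuracy of the mean
N = (sigma_mm / target_mm) ** 2
print(f"independent samples needed: {N:.0f}")      # 1000000

samples_per_day = 1000   # upper bound, with samples minutes apart
days = N / samples_per_day
print(f"time required: {days:.0f} days, {days / 365.25:.1f} years")
```

This reproduces rgb's "about three years to get 1 mm accuracy at any given point" (1,000 days is 2.7 years).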
The satellite does do better in one respect that helps with this. It can perform its far, far worse sampling over many spatially separated points at the same time. It has to deal with spatial frequencies as it does so — again, if I were designing this I’d very much use random numbers to generate a monte carlo sampling with some sort of minimum distance of several correlation lengths — and it has to deal with the fact that the sampled area then becomes a curved surface, not a simple radial point — but one can imagine using this sort of thing to both get a better “global average” and to at least think that one is getting much smaller overall errors. This is clearly visible in the graphs — the satellite SLR data has very little noise — too little noise, with systematic variations too perfectly removed IMO.
In the end, though, it would be very disturbing if the satellite curves and the tidal gauge curves diverge, as they are apparently doing (although the presented graphs don’t show the extension of the tidal gauge data presumably to “hide the divergence”). Tidal gauges have their flaws, but they also are fully expected, collectively, to reflect true SLR as well as anything else, and their sources of error, as noted above, are straightforward and a simple average over the sites will almost certainly yield a highly accurate picture of SLR globally. The ocean can hardly rise “only in the middle” and not on all of the contributing coasts and gauge sites.
rgb

Resourceguy
October 31, 2012 7:44 am

It’s now a race to spend money with alarmism in order to head off massive overspending by alarmism that threatens the viability of the country.

P. Solar
October 31, 2012 9:20 am

>>
Unfortunately they betray their bias and expected outcome in this presentation, they are assuming as a given that both are accelerating.
If they were unbiased, the wording would be something like “Determine if sea level is changing” and “Determine if Ice mass is changing, and the direction of those trends.”
>>
Hey man, they’re asking for big bucks; they’re not going to say something silly like that in the prospectus. The fact that they put a spotlight on the problem seems to say more about their intent.
Don’t imagine Hansen is running the whole of NASA, now.
There is nothing that prevents them from finding a negative acceleration (which is undoubtedly the case since even the bogus rise is slowing now).

Neil Jordan
October 31, 2012 9:47 am

Re Mišo Alkalaj says: October 31, 2012 at 12:32 am
Kev-in-Uk says: October 31, 2012 at 1:32 am
Leo G says: October 31, 2012 at 5:44 am
Assumption of an average depth, volume, and area results in a cylindrical ocean with linear depth vs. volume relationship. In actuality, the depth vs. volume relationship is nonlinear and described by the “hypsographic curve” which can be obtained from online references. Surface area vs. volume is also nonlinear. To throw some more fat into the fire, the porosity of the above-sea level soil needs to be taken into consideration as the assumed rising water must first fill in the pore spaces before the assumed increasing water volume manifests itself as an actual rise in water surface elevation.
Re rgbatduke says: October 31, 2012 at 7:23 am
Tide gages include mechanical and hydraulic damping (e.g. a stilling well) to minimize the effect of waves and rapid water movement. For example, see:
http://www.stevenswater.com/water_level_sensors/index.aspx

Bean
October 31, 2012 10:36 am

All this sounds a great deal like the discussions that occurred when GPS was first going up. The difference then was between the physicists who supported a system that required very precise measurement of the orbital parameters and computation of orbits based on understood physical phenomena versus the math-statistics camp that supported orbital error determination and reduction by use of Kalman filter techniques. Everything was proposed, including laser retroreflectors on GPS satellites, in an attempt to identify and correct multiple sources of orbital error. In the end, it all came down to the problem now being discussed – precise and accurate terrestrial reference frames. GPS was highly self consistent but all that was not very useful until it was tied to a terrestrial reference frame whose position was extremely well determined.
Measuring absolute sea level from the center of the geoid is nice to know with respect to temperature versus volume calculations. But absolute sea level measurement with a highly consistent satellite system is not as useful as sea level determined relative to a terrestrial reference frame since what is really desired in terms of sea level information is the relative comparison of sea level to a terrestrial reference frame. More or less, we need the satellite equivalent of tide gauges and that depends on multiple accurate and precise measurements relative to a terrestrial reference frame.

Duster
October 31, 2012 11:00 am

Anthony, you say,

… if somebody messed with the USGS benchmark elevation data from Mt. Diablo California on a regular basis, and the elevation of that benchmark kept changing in the data set, then all measurements referencing that benchmark would be off as well. …

In point of fact, the displacement along the San Andreas fault is between 30 and 40 mm per year. In addition, the entire North American Plate is in motion at roughly 1.15 cm!! per year. There are also vertical shifts associated with both the Sierra and Coast Ranges that regularly affect altitude estimates based on the Mt Diablo datum and all the other data (datums to those who find that clearer) that are employed in the State Plane system. So, in fact correction is needed regularly. There is no such thing as a fixed point on the planet’s surface, either horizontally or vertically.

Duster
October 31, 2012 11:12 am

In case my previous post is a little unclear, every “motion” measured on this planet is a relative motion. Regardless of whether a geologist claims sea level changes are isostatic or eustatic, ANY measurement will contain components of both forms, plus errors introduced due to essential uncertainties about the precise position and elevation of any datum used for mapping.

Crobar
October 31, 2012 12:12 pm

Read the darn presentation; the errors that can be mitigated by the proposed system are of the order of 0.45 mm/year. This is almost an order of magnitude less than the sea-level signal.

October 31, 2012 12:55 pm

It amazes me how many comments were made out of complete ignorance that these measurements obviously have DOD implications. Let’s pretend for a change we live in the real world and not some fantasy fueled by the fossil fuel industries. In the real world our Navy has the largest Naval Base in the world next to ground that is subsiding, next to an ocean having twice the amount of sea level rise as the global average. Sea level rise or fall doesn’t affect a ship at sea, but ships aren’t built at sea and they don’t spend their whole lives there. Having accurate data on future sea level changes is essential for the Navy’s future planning.
I’ve heard one person say we have NOAA, so why do we need the NSIDC, talking about sea ice. NOAA uses the NCDC to collect weather information all over the world and archives it. NOAA is a branch of the Department of Commerce, so someone figured it was a good idea to gather weather information to help commerce. The NSIDC gets its arctic sea ice information from the NIC, which is the Navy. The NIC is interested in daily navigation, so it is biased towards sea ice being present. The NSIDC is interested in archiving the information and will examine the data in more detail. There are obvious national security issues involved in arctic sea ice and that’s why there are satellites measuring it.
Let’s keep it simple and show me any of those NASA satellite programs that were done for alarmists, like some of you have claimed! I haven’t seen any climate program that didn’t have its origin in commerce or national defense. Is it really such a big deal if some university gets to analyze the data for climate purposes, or GISS, which is part of NASA, gets to analyze temperature data from the NCDC? Denmark manages to make arctic sea ice maps, so why shouldn’t we? These agencies are looking at data that already exists, using computers, so it isn’t like it involves a huge expense.
As far as a three year lifespan for a satellite goes, that’s a normal prediction. IceSat2, which is to be launched in 2016, has such a three year lifespan and seven years of fuel. NASA tends to low ball the number, but a satellite can be taken out once it is functioning in orbit. The arctic sea ice might not even be there when it’s launched, but there are other reasons, such as ice sheets, topography and vegetation data.
My take is that this proposed JPL satellite is looking for acceleration in sea level rise, which doesn’t mean just increasing, but increasing at a faster rate. With 97% of Greenland’s surface melting and NYC flooded in 2012, you might want to spend some money to avoid spending much more in future expenses, if proper planning isn’t done. Remember, it isn’t just the victims of a disaster that foot the bill.

Kev-in-Uk
October 31, 2012 1:50 pm

Gary Lance says:
October 31, 2012 at 12:55 pm
With respect, I’m not sure I am following your logic re the Naval base/ships issue. I am assuming you are referring to dry docking of ships? And in that instance, I would agree that if we had say a few feet of sea level rise, this could present a problem to dry docks that could be overtopped by rising seas. However, as both a yachtsman and an engineer, I fail to see how this point could remotely warrant the slight ‘alarmist’ tone in your post. (apologies if I misread that tone!)
Firstly, a general SLR of 2mm per year would equate to 300mm or about 1 foot, in 150 years. Now, I am pretty sure that all the current ships in use will have been replaced by then, probably built in bigger and newer yards!
Secondly, dry docks and harbours can be ‘raised’ relative to the adjacent sea level quite easily, I would have thought. And again, on a timescale of adding 1 foot of ground around a dry dock, and 1 foot of extra height on the dock gates, every 100 or so years – this is hardly a majorly technical task (IMHO)
To put the same SLR in terms of the recent storm surges seen via Sandy – do you not think that the overtopping protection likely to have been ‘designed’ into dry docks from tidal and storm surges would not already more than cover the anticipated 1 foot of SLR in 100 years (or whatever)?
Just trying to inject some perspective to your naval issue…….
regards

October 31, 2012 6:37 pm

Kev-in-Uk says:
October 31, 2012 at 1:50 pm
The information about Norfolk comes from the Navy. Perhaps you should stop the video and review the charts of the reinsurer near the end. I have found these insurance charts in google images. The charts don’t involve payouts based on claims, but are talking about increases in incidents. The insurance charts are clear evidence of more extreme weather related events.

The IPCC report claimed the arctic would be ice free around 2100. Few think it will not be ice free by 2020 and some believe it will be ice free by 2015. We don’t have experience with large ice sheets melting and there is data to support them melting faster than previously thought.
If the ships aren’t going to be around when sea level becomes a problem, then they will be replaced by ships built in Norfolk. It’s a good idea to know just how fast past sea level rise has been and what can be expected in the future. Our present trend of negative Northern Hemisphere snow cover anomalies in June is going to make those every-150-year Greenland melts a new reality. People are going to quickly catch on and realize those sea level rises by the IPCC are conservative estimates. So far, ice sheets have contributed very little to sea level rise. To put the snow cover anomaly in perspective, Greenland has an ice sheet of 1.7 million square kilometers and 2012 had an anomaly of 5.8 million square kilometers, with nothing in the trend to suggest things will get better.
http://vortex.accuweather.com/adc2004/pub/includes/columns/climatechange/2012/590x558_07091839_figure5a.png
My point was a simple one and I asked for an example of a climate program that didn’t originate out of concerns for commerce and defense.

D Böehm
October 31, 2012 6:59 pm

Sea level rise has not been accelerating. It is currently on its long term trend line. However, the more recent decadal trend has been declining. This is in line with ocean heat content (OHC). And the long term rise since the LIA has been constant, despite a large rise in CO2. This is also supported by ARGO data, which falsifies the models. Even NOAA shows that SL anomalies are very minor.
There is no empirical evidence to support the belief in catastrophic AGW, or in AGW for that matter. If the globe was warming any more than its natural global warming trend since the LIA, the warming would show up in the OHC and the sea level. But it doesn’t.
Readers may disregard all of Gary Lance’s wild-eyed arm waving. He is simply wrong. There is nothing unusual or unprecedented occurring.

October 31, 2012 8:18 pm

D Böehm says:
October 31, 2012 at 6:59 pm
It’s amazing NASA relies on satellite data, when they could just use your mouth. Wishful thinking isn’t data to support your position. Hint: Either the ice stops melting because it stopped warming or it ran out of ice. There is nothing to suggest it should stop warming and there are only suggestions that the warming should be a certain amount over a certain period of time.
You brought up AGW here and I just said ice sheets are melting and sea level rise is going to happen faster than expected. The only evidence needed is simple observation, but that requires finding video information that actually shows you the changes in Greenland.
It isn’t that hard to calculate the amount of radiative forcing produced by Milankovitch Cycles over a time period, but that amount of radiative forcing is enough to drive us in and out of ice ages.
The unusual or unprecedented has just started, but the concept of frequency is beyond your cognitive capacity. Get used to paying for it and keep telling yourself it’s all natural! The bill for exceptional weather isn’t going to ask what you believe or care one way or the other. All the voodoo science in the world won’t protect you from that!

D Böehm
October 31, 2012 8:32 pm

Gary Lance opines:
“The unusual or unprecedented has just started…”
No, it has not. There is no empirical, testable evidence for your baseless assertion. The climate Null Hypothesis has never been falsified. Get educated, puppy.

October 31, 2012 8:46 pm

Let’s presume, for the sake of discussion, that sea levels are rising (or for that matter lowering) sufficiently to have an effect on humans. Would it be more pragmatic to direct our energies and resources to adapting ourselves to a higher (or lower) sea-level or for us to attempt to adapt the planet in order to accommodate our preferences?

Neil Jordan
October 31, 2012 9:09 pm

Re Gary Lance says: October 31, 2012 at 6:37 pm
Kev-in-Uk says: October 31, 2012 at 1:50 pm
The National Research Council Marine Board report “Responding to Changes in Sea Level – Engineering Implications”
http://www.nap.edu/openbook.php?isbn=0309037816
provides guidance on responding to sea level rise predicted as of the 1987 publication date. Harbor structures are addressed. See Page 107 for dry docks, wet docks, and floating dry docks. The report summary provides an overview of main points of sea level rise discussed here. Hampton Roads in Figure 1.1 shows a relative sea level rise of 3.6 mm/year, or about 1.2 ft per century, based on data before 1987.
The 1987 predictions predate recent sea level measurements, so this report would also provide a useful comparison between what was predicted and what has occurred.

November 1, 2012 5:50 am

Tide gages include mechanical and hydraulic damping (e.g. a stilling well) to minimize the effect of waves and rapid water movement. For example, see:
http://www.stevenswater.com/water_level_sensors/index.aspx

Sure, but in the end this is just a mechanical way to average over or filter out short period fluctuations. One might just as well not have such things, record the levels on a granularity of (say) 0.1 seconds, and apply numerical filters. At least in the latter case one would have more control over precisely what one’s frequency cut-off looked like, and one could ensure that it was precisely the same gauge to gauge, where mechanical devices (especially ones that have some ability to be mechanically tuned) are almost certainly all going to be slightly different in the way they damp out the short time stuff.
This isn’t intended to be critical — I’m sure that the mechanical damping mechanisms do in fact serve their purpose, and that the tide gauges are in fact pretty accurate. They are also apples to apples across a very long time series of data, which is almost more important than their accuracy or how perfectly they dampen and filter the noise relative to the signal. They are still susceptible to a wide range of sources of non-predictable “noise” — ordinary variations in atmospheric pressure lift or press down the ocean by amounts comparable to the annual or even decadal signal on a timescale of hours to days. One “assumes” that these effects average out, and of course in a long enough time base, and with enough independent samples they will, but in the meantime one feature of “randomness” is how surprisingly much “order” is embedded in it, how easy it is to find the fluffy little sheep bounding along in the clouds, or the big dipper outlined in the stars.
That’s why personally I’d prefer to have the microscopic data, noise and all, for a long, detailed timeseries, with damping only of effects less than order of a second or so. Then form the full Fourier spectrum of the result down to the upper bound frequency cut-off. Then identify the primary structure, fit its shape, and look for the asymptotic value of the zero-frequency component that is, in fact, the best estimate of the SL. And I’d probably do this in a long time window (effectively coarse graining intervals of the series) to TRY to get the secular slow time variation of the zero-frequency component, oxymoronic as that is.
Which is, no doubt, very similar to what is actually done with a lot of the timeseries data of this sort, and is as good as it gets given the data, as long as there are no confirmation-bias thumbs on the scales or sources of long period systematic error. Which, sadly, seem to exist in abundance in climate science. It isn’t the completely automated filters applied to timeseries that bother me, in other words, it is the corrections for things like UHI or systematic measurement error in the temperature series or the difficulty of determining an absolute/reliable frame of relative reference in the SLR satellite series.
rgb
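The approach rgb sketches (keep the raw samples, filter numerically, read the level off the zero-frequency component) looks roughly like this on synthetic data; every number below is invented for illustration:

```python
import math, random

# Synthesize a raw gauge series: a constant true level, a swell whose period
# is a whole number of samples (so it averages out exactly), and noise.
random.seed(0)
n = 4096
true_level_mm = 1234.5
series = [true_level_mm
          + 500.0 * math.sin(2 * math.pi * t / 128)  # swell, period 128 samples
          + random.gauss(0, 50.0)                    # short-period noise
          for t in range(n)]

# The zero-frequency (DC) component of the discrete spectrum, X[0]/n, is
# just the sample mean, which is the level estimate rgb describes.
dc = sum(series) / n
print(f"recovered level: {dc:.1f} mm")  # within a few mm of 1234.5
```

In practice one would compute the full spectrum (e.g. np.fft.rfft) over successive long windows and track the DC bin through time, to get the slow "zero-frequency" variation rgb mentions, oxymoronic as that is.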
