New website gives you the real deal on sea level rise and rates

New analysis and graphing tools for sea-level data at SeaLevel.info

Guest essay by David Burton

www.SeaLevel.info now has interactive regression analysis (line/curve fitting) and visualization (graphing) tools available for mean sea level (MSL) measurements from over 1200 tide gauges, plus spreadsheets which combine various subsets of that data. This article is intended as a primer on how to use these new tools.

But first, a few notes:

Note #1: This is a work in progress. I already have a large “to-do list,” but suggestions & corrections are nevertheless very welcome.

Note #2: These tools are my free contribution to the community. There’s no charge to use them.

Note #3: These tools are ideologically neutral.

One of the inspirations for this site was Paul Clark’s famous WoodForTrees, which provides some similar tools for temperature data. Paul is a lifelong “Green” activist, but to his credit, the usefulness of his site is independent of his ideology.

The data shows what the data shows, regardless of my opinions, Paul’s, or anyone else’s. I hope these tools are useful to everyone doing sea-level work, regardless of their views on climate issues.

Nevertheless, in the interest of full disclosure, I consider myself a “lukewarmer.” I am a signer of the Global Warming Petition. In my opinion, the weight of the evidence indicates that anthropogenic global warming is real, modest and benign, and anthropogenic CO2 emissions are beneficial to both human agriculture and natural ecosystems.

Site overview

When you visit SeaLevel.info, you’re greeted with a picture of Maui kitesurfers, and a menu on the left.

[Screenshot: the SeaLevel.info home page, with the navigation menu at left]

1. On the “Home” page you can scroll down to see the latest announcements, mostly about the site. There are also links to the usual “About” and “Contact” pages (which might get combined, when I get a round tuit).

2. The “Resources” page is a large compilation of climate-related resources, with an emphasis on sea-level. It includes search tools, glossaries, a climate conversion factor cheat sheet, and links to other tools (like WoodForTrees). It has a lot of information about sea-level (of course), but also about sea ice, floods, climate reports, climate cycles, temperatures, greenhouse gases, energy, the great North Carolina 2010-2012 legislative sea-level kerfuffle, and more.

3. The “Blogs etc.” page is a list of top climate blogs, plus some other useful climate sites, and a little bit of doggerel. I’ve listed blogs from both sides of the climate debate, though I’ve excluded a few prominent ones which are so heavily censored to prevent dissent that I judge them to be useless (or worse).

4. The “Papers” page has a list of about 120 papers and articles in four categories: Sea-level rise acceleration, How long a sea level record should be for long-term trend analysis, “Science [is a mess],” and “Other.”

5. The “Feedbacks” page has information about feedback systems in general, and climate feedbacks in particular. It is intended to contain a complete list of all significant known or hypothesized climate-related feedback mechanisms. I have a degree in Systems Science, from long ago, which is the discipline that studies such things, and I still remember a little of it.

6. The “Data” page is the topic of the rest of this article.

The Data Page

Sealevel.info has coastal Mean Sea Level (MSL) measurement data from more than 1200 tide gauges. (Currently, all of the data is from PSMSL and NOAA.) On the Data page you can:

1. Search for a tide gauge

2. View or download Excel-compatible spreadsheets.

3. View thumbnail pages corresponding to the spreadsheets.

It begins with a few links:

[Screenshot: the links at the top of the Data page]

1. Search. You can search for a tide gauge location by PSMSL station number, NOAA station number, or full or partial station location (e.g., “Honolulu”). Currently it just finds one gauge; I plan to add the ability to find a list of stations (e.g., search for “NC, USA” and find all the North Carolina tide gauges).

[Screenshot: the tide-gauge search box]

2(a). Spreadsheets. If you click on “Spreadsheets & thumbnails” (or just scroll down a bit), you’ll see the section where you can select a Microsoft Excel-compatible spreadsheet with sea-level measurement data from various sets of tide gauges, or a “thumbnail” page with little versions of the MSL graphs for the corresponding tide gauges:

[Screenshot: the spreadsheet and thumbnail selection section]

Although the spreadsheets are regular web pages, they have additional information embedded in Excel-compatible HTML-export format, including formulae for calculating averages, medians, etc., and full-precision (unrounded) numbers. The spreadsheets are compatible with Excel 2003 and later.

The easiest way to download a spreadsheet and view it in Excel, all in one step, is to run Excel, and then in the “Open” dialog paste the spreadsheet’s URL in place of the file name. (Alternately, you can right-click the spreadsheet web page and “save as” a file on your computer, and then open the saved file in Excel.)
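If you'd rather script your analysis than use Excel, the same HTML spreadsheet pages can also be read directly into a data-analysis environment. Here's a minimal Python sketch using pandas; the URL is one of the site's spreadsheet pages (substitute whichever one you picked from the Data page), and the table layout this sketch assumes may differ slightly from the actual file:

```python
# Minimal sketch: load one of the sealevel.info HTML spreadsheets with pandas.
# Requires pandas plus an HTML parser such as lxml or html5lib.
import pandas as pd

# Substitute whichever spreadsheet URL you chose from the Data page.
url = "http://www.sealevel.info/MSL_NOAA2013_60yr.html"

# read_html() returns every table it finds on the page; the MSL data table
# is assumed here to be the largest one.
tables = pd.read_html(url)
msl = max(tables, key=len)

print(msl.shape)    # (rows, columns)
print(msl.head())   # first few tide-gauge rows
```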

Here’s a screenshot from one of the spreadsheets, as viewed in a web browser (not in Excel). These spreadsheets are quite large, so this screenshot is severely truncated, in both length and width:

[Screenshot: an excerpt from one of the spreadsheets, as displayed in a web browser]

This excerpt shows the first 11 spreadsheet columns for 13 northern European tide gauges (out of 375 gauges in that spreadsheet).

Scroll to the bottom of each spreadsheet to see summary info (averages & medians).

If you click a column header, it will sort the spreadsheet by that column, but for any manipulation more involved than that you’ll need to download the spreadsheet and manipulate it in Excel. (If you need to do that but you don’t own a copy of Microsoft Excel, then email me and I’ll convert the spreadsheets that you need into a different format for you, for use in a different spreadsheet program.)

If you click a link in the “Location” column (“column D”) of the spreadsheet, it will take you to the sealevel.info analysis page for that tide gauge. If you click a link in the “NOAA stn” column (“B”) it will take you to NOAA’s page for that gauge. If you click a link in the “PSMSL stn” column (“C”) it will take you to PSMSL’s page for that gauge.

In the “Trend” and “±95% CI (trend)” columns you can see the result of linear regression (i.e., fitting a line to the data) for these 13 stations. At four of the stations sea-level is falling, at seven it is rising, and at the other two there’s no statistically significant trend (i.e., the 95% confidence interval is larger than the linear trend).

In the “Accel” and “±95% CI (accel)” columns you can see the result of quadratic regression (which detects acceleration or deceleration). At two of these 13 locations sea-level rise has been accelerating slightly. At one location sea-level rise has been decelerating slightly. At the other ten locations there has been no statistically significant acceleration or deceleration.
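That bookkeeping is easy to reproduce programmatically: a fitted coefficient is counted as statistically significant only if its magnitude exceeds its 95% confidence interval, exactly as described above. A minimal sketch, assuming the spreadsheet has been loaded into the pandas DataFrame msl from the earlier sketch, and assuming the column labels match the screenshot (they may differ in the actual file):

```python
import pandas as pd

def classify(value, ci, pos, neg):
    """A coefficient counts as significant only if |value| exceeds its 95% CI."""
    if abs(value) <= ci:
        return "not significant"
    return pos if value > 0 else neg

# Column names below are assumed from the screenshot; adjust to the real headers.
trend_class = [classify(v, ci, "rising", "falling")
               for v, ci in zip(msl["Trend"], msl["±95% CI (trend)"])]
accel_class = [classify(v, ci, "accelerating", "decelerating")
               for v, ci in zip(msl["Accel"], msl["±95% CI (accel)"])]

print(pd.Series(trend_class).value_counts())
print(pd.Series(accel_class).value_counts())
```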

At the bottom of the spreadsheet you can see the calculated averages and medians (for the whole spreadsheet).

If you download the spreadsheet and load it into Excel, you can do interesting tests. You could, for example, delete gauges which have less than fifty years of data (since the literature indicates that at least 50-60 years of data is needed to determine a robust sea-level trend from a single tide gauge). The Average and Median cells are determined by Excel formulae, so they will update automatically in Excel, when you make such changes.
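Here's a sketch of that "delete the short records" test done in Python rather than Excel. The column names are hypothetical; substitute whatever the spreadsheet actually uses for the start and end of each gauge's record, and drop the summary rows at the bottom first if they were picked up:

```python
# Keep only gauges with 50+ years of data, then compare the medians.
# "First year" / "Last year" are hypothetical column names; adjust as needed.
record_years = msl["Last year"] - msl["First year"] + 1
long_records = msl[record_years >= 50]

print(len(msl), "gauges total;", len(long_records), "with 50+ years of data")
print("median trend, all gauges: ", msl["Trend"].median(), "mm/yr")
print("median trend, 50+ years:  ", long_records["Trend"].median(), "mm/yr")
```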

2(b). Thumbnails. Corresponding to the spreadsheets we also have “thumbnail” pages, which contain tiny versions of the graphs for all the tide gauges listed in the corresponding spreadsheets. Here’s an example:

[Screenshot: a thumbnail page, with a tool-tip shown over the first thumbnail]

In this screenshot you can see that the mouse cursor is hovering over the first thumbnail, which causes it to pop up a “tool-tip” showing the station number and location name. If you click on a thumbnail it will take you to the analysis page for that tide gauge.

 

The Individual Tide Gauge Analysis Pages

For any tide gauge which you select by any of the above means (by searching, or by clicking on a thumbnail, or by clicking on the “Location” in one of the spreadsheets), you’ll be taken to the analysis page for that tide gauge. Here’s an example (Honolulu), graphed with the default options:

[Screenshot: the Honolulu tide-gauge analysis page, graphed with the default options]

There are many options available. You may:

a. View a graph of sea-level measurements from that location (of course). That’s the blue graph trace.

b. Follow links to the NOAA and PSMSL pages for that tide gauge.

c. Follow links to view the “next” or “previous” tide gauge.

d. Calculate linear and quadratic regressions (line/curve fitting).

e. View graphical representations of the linear and/or quadratic regressions.

f. View confidence intervals and/or prediction intervals for the regressions.

g. View or suppress a juxtaposed graph of CO2 measurements, to see what effect CO2 has had on sea-level rise or fall at that location.

h. Apply boxcar or triangle smoothing to the sea-level graph.

i. Adjust the appearance of the graphs (color scheme, line thickness, etc.)

j. Adjust the date ranges used in the regression calculations and/or displayed on the graph.

k. Select the data source (PSMSL and/or NOAA).

l. Save or bookmark your customized graph.

Note the regression results (d). Here’s a close-up:

[Screenshot: close-up of the regression results for Honolulu]

As you can see, the slope and acceleration calculated are:

slope = 1.434 ±0.211 mm/yr

acceleration = -0.01004 ±0.01454 mm/yr²

From the slope we can see that over the 110 year history of that tide gauge, sea-level has risen at an average rate of 1.434 ±0.211 mm per year, i.e., 4.8 to 6.5 inches per century.  (That’s very typical, by the way.)

The negative sign on the “acceleration” indicates that the rate of sea-level rise at Honolulu has decelerated, but 0.01 mm/yr² is a very small number (which, if it persisted for a century, would cause the rate of sea-level rise to change by only 1 mm/yr), and the confidence interval is broader than the rate of deceleration, so it should properly be reported as “no statistically significant acceleration or deceleration” (which is also typical).
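For reference, here is the arithmetic behind those two statements, written out as a small Python sketch (the input numbers are just the Honolulu regression results shown above):

```python
# Honolulu regression results from the close-up above.
slope = 1.434        # mm/yr
slope_ci = 0.211     # +/- mm/yr (95% CI)
accel = -0.01004     # mm/yr^2

MM_PER_INCH = 25.4

# Trend, expressed in inches per century, at the low and high ends of the CI.
low = (slope - slope_ci) * 100 / MM_PER_INCH
high = (slope + slope_ci) * 100 / MM_PER_INCH
print(f"{low:.1f} to {high:.1f} inches per century")           # ~4.8 to 6.5

# If the fitted acceleration persisted for a century, the *rate* of rise
# would change by acceleration x 100 years.
print(f"rate change over 100 years: {accel * 100:.1f} mm/yr")   # ~ -1.0
```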

The various options that you select all become part of the URL for the web page. For instance, if you check the “thick” line weight option, then “&thick” gets added to the URL. So this is the link to the same graph, but with thicker traces:

http://www.sealevel.info/MSL_graph.php?id=1612340&thick

Here it is without the green CO2 graph juxtaposed (the “&co2=0” at the end suppresses it):

http://www.sealevel.info/MSL_graph.php?id=1612340&thick&co2=0

Here it is in black & white, with thick traces (several other color schemes are also available):

http://www.sealevel.info/MSL_graph.php?id=1612340&thick&colors=2

Black & white is useful for preparing graphs for printed material that won’t be printed in color. Here’s how it looks:

[Screenshot: the Honolulu graph in black & white, with thick traces]

Here’s almost the same thing, but with 3-month boxcar smoothing. The smoothing doesn’t affect the regression analyses (except for prediction intervals), but it makes the graph prettier:

http://www.sealevel.info/MSL_graph.php?id=Honolulu&colors=2&thick&boxcar=1&boxwidth=3

Note that in that URL we used the station name (“id=Honolulu”) instead of the station number (“id=1612340”). You may also use the PSMSL station number (“id=155”), or the old-style coast-station code (“id=760-031”). They all produce the same graph:

[Screenshot: the same graph with 3-month boxcar smoothing]
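For the curious, boxcar smoothing is just a centered moving average of the monthly values, and triangle smoothing is the same idea with linearly tapered weights. Here's a minimal Python sketch of the general technique (an illustration only, not the site's actual code); the example values are made up:

```python
import numpy as np

def smooth(values, weights):
    """Weighted centered moving average (edges use zero padding; a sketch only)."""
    w = np.asarray(weights, dtype=float)
    return np.convolve(values, w / w.sum(), mode="same")

def boxcar(values, width=3):
    return smooth(values, np.ones(width))                 # equal weights

def triangle(values, width=3):
    return smooth(values, np.bartlett(width + 2)[1:-1])   # e.g. [0.5, 1, 0.5]

# Made-up monthly MSL values (mm), just to show the effect:
monthly = np.array([10.0, 12.0, 9.0, 14.0, 11.0, 13.0])
print(boxcar(monthly, 3))
print(triangle(monthly, 3))
```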

Here it is with quadratic regression graphed instead of linear, plus I’ve restricted it to using data from 1930 on, and I’ve graphed it with the regression curves extended to 2100, using thick traces, an alternate color scheme, and minimal smoothing:

http://www.sealevel.info/MSL_graph.php?id=Honolulu&quadratic=1&thick&co2=0&linear=0&lin_ci=0&quad_pi=0&quad_ci=1&colors=3&g_date=1900/1:2099/12&c_date=1930/1:2019/12&boxcar=1&boxwidth=2

It looks like this:

[Screenshot: Honolulu with quadratic regression, data from 1930 on, curves extended to 2100]

Note that in that screenshot I’ve hovered the mouse cursor over a point on the graph to view the exact value (in this case, the high-end of the 95% confidence interval for 2100).

Also note that the calculated trend is slightly different (because we’ve excluded data before 1930). The data which was excluded from the calculations is graphed with a lighter shade of blue, and a red double-dagger footnote is added to the page:

Light blue data is excluded from regression calculations

This version is almost the same, but the green CO2 graph is included, and the confidence intervals have been replaced with prediction intervals (dotted orange):

http://www.sealevel.info/MSL_graph.php?id=Honolulu&quadratic=1&thick&linear=0&lin_ci=0&g_date=1900/1:2099/12&c_date=1930/1:2019/12&boxcar=1&boxwidth=2&quad_pi=1&quad_ci=0&colors=3

Note: There’s a link on the right side of the page, to a short video by Dr. Gerard Verschuuren, which explains the difference between “confidence intervals” and “prediction intervals.”

[Screenshot: the same graph with CO2 included and prediction intervals (dotted orange) instead of confidence intervals]
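Since every option is just a URL parameter, you can also build these graph links programmatically, which is handy when you want the same view for many stations. A minimal sketch; the parameter names are simply the ones that appear in the example URLs above, not a complete or official list:

```python
BASE = "http://www.sealevel.info/MSL_graph.php"

def graph_url(station, flags=(), **options):
    """Assemble a graph URL from a station id/name, bare flags, and key=value options.

    Only parameters seen in the example URLs above (thick, co2, colors, boxcar,
    boxwidth, quadratic, linear, g_date, c_date, ...) are assumed here.
    """
    parts = ["id=%s" % station]
    parts += list(flags)                                   # bare flags, e.g. "thick"
    parts += ["%s=%s" % (k, v) for k, v in options.items()]
    return BASE + "?" + "&".join(parts)

# Reproduce the smoothed black & white Honolulu example from earlier:
print(graph_url("Honolulu", flags=("thick",), colors=2, boxcar=1, boxwidth=3))
```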

Cautions

These tools can produce regressions and graphs for more than 1200 tide gauges, but for many of them I’ve never even looked at the graphs. Doing regression calculations and generating graphs without manual “sanity checks” has pitfalls.

For instance, some of the MSL data at some locations has been identified by PSMSL and/or NOAA as being of questionable accuracy, and these tools do not currently take that into account. (It’s on the to-do list.)

Earthquakes can also cause distortions in the trend, sometimes extreme ones. Consider the case of Seward, AK:

[Screenshot: the Seward, AK tide-gauge record, showing the abrupt 1964 shift]

March 27, 1964 was the date of the Great Alaska Earthquake (magnitude 9.2!). Seward got a full meter of sea-level rise, all at once. So if you’re going to calculate the MSL trend for Seward, you should start with the data after that date, like this:

http://www.sealevel.info/MSL_graph.php?id=9455090&c_date=1964/5:2019/12&co2=0&thick

[Screenshot: Seward, AK, with the regression restricted to data after the 1964 earthquake]

Note that NOAA did that on their web page for Seward:

[Screenshot: NOAA's sea-level trends page for Seward]

So, when using these tools to analyze sea-level data, I recommend that you also click on the NOAA and PSMSL links, to view their pages about the tide gauges of interest, and I recommend that you look in particular for warnings about questionable data, apparent datum shifts, and earthquakes.
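One crude way to automate part of that sanity check is to scan a gauge's monthly record for abrupt steps, like the 1964 jump at Seward, and then confirm any hits against the NOAA and PSMSL station notes. A minimal sketch; the threshold and the example data are made up for illustration:

```python
import pandas as pd

def flag_jumps(monthly_msl, threshold_mm=200.0):
    """Return the dates where monthly MSL changes by more than threshold_mm.

    monthly_msl: pandas Series of monthly mean sea level (mm), indexed by date.
    This is a crude screen only; possible datum shifts and earthquakes should
    be confirmed against the NOAA/PSMSL station notes, as recommended above.
    """
    jumps = monthly_msl.diff().abs()
    return jumps[jumps > threshold_mm]

# Example with made-up data containing one Seward-like step of ~1000 mm:
idx = pd.date_range("1963-12", periods=8, freq="MS")
series = pd.Series([7000, 7010, 6995, 7005, 8005, 8010, 7998, 8002],
                   index=idx, dtype=float)
print(flag_jumps(series))
```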

Future

There are still many rough edges to these tools. I have an ever-growing to-do list, and I look forward to WUWT readers growing the list even more.

“Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.”

– John Adams

“Fallacies do not cease to be fallacies because they become fashions.”

– G. K. Chesterton

“Save me, O LORD, from lying lips and from deceitful tongues.”

– Psalm 120:2

Warmest regards,

Dave

www.sealevel.info

165 Comments
Dave in Canmore
January 21, 2017 9:08 am

Great site Dave! I just happened to find this the other day by accident. Had to love this aside: “Paul is a lifelong “Green” activist, but to his credit… ” It’s a shame that so many people have ruined the reputation of being an environmentalist that one finds oneself saying “He’s an environmentalist, but really he’s okay!”
Thanks for your contribution !

Reply to  Dave in Canmore
January 21, 2017 1:36 pm

Thanks, Dave. Yes, I find myself talking about environmentalists (and lawyers) the way Old Southerners used to deny their racism: “but some of my best friends are ___greens___” (fill in the blank).

Joe Ebeni
Reply to  daveburton
January 21, 2017 2:07 pm

Great site. CONGRATS. Perhaps I, as a conservative conservationist, can again support an EPA that truly focuses on scientifically supported programs for clean air, water and soil. Perhaps wasted money can now be applied to real “green” programs that make a difference in citizens lives.

South River Independent
Reply to  daveburton
January 22, 2017 2:18 pm

Not all Southerners are racists. There are as many racist Northerners.

Reply to  daveburton
January 23, 2017 12:40 am

Very true. Nor even most, S.R.I. That’s why I wrote, “used to.”

January 21, 2017 9:09 am

DB great start on a valuable tool. Would appreciate adding one datum where available, and keep those stations flagged: estimate of vertical land motion. According to Nils Axel Moerner, there are about 70 PSMSL tide gauges with diff GPS vertical land motion within 10km. About 30 of these are ‘newish’ and about 40 are long record. Climate change isn’t responsible for plate tectonics (Seward, Juneau in my book essay) or GIA.

tomwys1
Reply to  ristvan
January 21, 2017 10:26 am

Dave has done a beautiful job putting this all together for easy access. Notice that sea level rise rates in tectonically inert places are clearly linear while CO2 has had a massive 38% acceleration since 1880. Dave’s graphics pull the rug out from under those who claim that there is ANY effect of CO2 on the rate of sea level rise.
THANK YOU Dave, for being might “go to” guy for easy access to this information!!!

tomwys1
Reply to  tomwys1
January 21, 2017 10:29 am

It was supposed to be “my” vs “might,” but you have done a mighty service for all, and my fossilized brain assured that that was duly recorded as such!!!

Robert
Reply to  ristvan
January 21, 2017 1:07 pm

‘Climate change isn’t responsible for plate tectonics.’ (?!?)
Says you! 😉

Reply to  Robert
January 21, 2017 1:31 pm

Yup, sez I 🙂

Reply to  ristvan
January 21, 2017 1:30 pm

Some data pointers from my files concerning the suggested VLM addition where reliably estimated. NOAA tech report NOS-CO-OPS 065 gives VLM via JPL GPS estimates for 16 US tide gauges (TGs). Wöppelmann et al in GPC 2007 gives 224 globally (with a nice map), with 160 within 15 km of TG. Table one gives specifics for a geographically dispersed 28 globally. PSMSL website has a good discussion, with links to CGPS@TG and to SONEL for the data.

Reply to  ristvan
January 21, 2017 2:13 pm

Thanks, guys.
Yes, ristvan, vertical land motion (VLM) is difficult to pin down.
I currently have no CORS or SONEL (GPS-based VLM) data for any of the gauges, but if you scroll waaay to the right in the spreadsheets you’ll find Peltier’s VM2 & VM4 PGR estimates for most of the tide gauges (though it’s not his latest versions).
I agree that GPS-based VLM estimates would be a useful addition to both the spreadsheets and the individual tide gauge analysis pages, and that adding Peltier’s (and perhaps some others’) PGR estimates to the individual tide gauge analysis pages would also be helpful.
However, a couple of years ago, during the NC sea-level kerfuffle, I looked briefly at CORS data for vertical land motion in the vicinity of NC’s tide gauges. Frankly, the data looked like garbage. The numbers were all over the place.
I hope it’s better now, but the bottom line is that I’m hesitant to trust claimed sub-millimeter precision for attempts to determine vertical land motion from GPS data. Maybe if the GRASP or E-GRASP mission ever flies it will improve matters.
Ristvan, would you mind sending me an email, so I have your address, for follow-up conversation?

Reply to  ristvan
January 21, 2017 2:58 pm

DB, on its way very shortly. Glad to support your magnificent endeavor. I agree with you that CORS and related is ‘garbage’. There are laughable peer reviewed papers trying to back VLM out of tide gauges using ‘regional’ tide expectations. That is as stupid as BEST regional temperature expectations. The NOAA tech paper I referenced is in part an example. That’s why I carefully said diff GPS (ordinary GPS does not have the requisite vertical resolution, so you need an additional terrestrial fixed source signal). Hence Diff.

Don K
Reply to  ristvan
January 21, 2017 3:46 pm

Yes, ristvan, vertical land motion (VLM) is difficult to pin down.

That’s probably because VLM is extremely difficult to measure — especially when the amounts of motion are very small. Conventional surveying is surely pretty much useless because the reference points are likely moving up or down along with the tidal gauge. Likewise differential GPS. That allows us to see motion over a region with considerable accuracy, but even after using it, we won’t know the vertical motion of the region.
The long term answer is probably measurement of “elevation” using satellites, but there are a bunch of problems with that, not the least of which is that the RMS error in satellite radial distance estimation at any given time seems typically to be several cm. It’s also genuinely harder to measure elevation than “lat-lon” using satellite references.
I expect that eventually there will be good estimates of VLM for all the gauges anyone cares about, but getting them looks to be something that requires observations for a good many years using sophisticated equipment.

Reply to  ristvan
January 21, 2017 5:08 pm

DK, agree. But diff GPS is the best we got now. If is good enough for flat plain Midwestern farmers applying fertilizer from their $200k tractors based on minute soil elevation/nutrient gradients by acre, is good enough for me. I trust self interested farmers, since am one. Now, my farm is not flat plain (400 feet delta elevation, all contours in SW Wisconsin Uplands), and our most expensive new tractor is ‘only’ $100k. My light haying 4wd compact Ford 1810 with 770 loader bucket cost $19k back in 1985. Used. Still going strong. No GPS. My comment still applies.

rgbatduke
Reply to  ristvan
January 24, 2017 7:20 am

I have no time to actually discuss this (teaching two classes, busy busy busy) but in addition to VLM correction (which in some locations is critical, as is evident from the enormous differences site to site in mean SLR in a presumably approximately hydrostatic ocean, with some places up to 8 mm/year rise and others 1 to 2 of fall) it would be enormously useful to:
a) Use the data on the average seasonal rise/fall by month, coupled with the average seasonal TEMPERATURE by month, to compute what one might call the “local thermal expansion coefficient”. Warm water expands and floats on top of cold. In my back yard in Marshallberg NC (less than 10 miles from the Beaufort gauge) the water off of my dock is too shallow to even float my Carolina Skiff at all by the highest of tides in the winter, because the water itself is cold, dense, and around ten inches to a foot shallower than it is in the summer ALL the time.
b) Take the mean sea surface temperature data (ideally measured AT each of the tide gauges, where somebody was brilliant enough to put a thermometer on them as well as measuring the tides, using local second source data where this isn’t available) and compute the trend expected from thermal expansion alone from the temperature trend in the local ocean (probably a very slight warming most places).
c) Detrend the tide gauge with this. This is the “irrelevant” part of SLR as far as melting icecaps are concerned. Yes, it would be lovely to correct for VLM as well, but that is damn difficult given that the entire crust of the Earth experiences a tidal motion vertically (IIRC) of some 18 cm every six hours, and we don’t HAVE a tide gauge for that that reaches down to the center of mass of the Earth, accurate to 18 cm. I’m not sure that most of the GPS based determinants can even resolve this gross motion (I should stick a good GPS out there somewhere and see). I tend to think of GPS resolution as order meters (at best), not centimeters, although with better clocks and more of them in more satellites, they may eventually get there. For perspective, a 1 ns resolution clock only resolves distances at 3×10^8 x 10^-9 = 0.3 m or 30 cm. To do 3 cm scale resolution, you need 100 independent clock based rules (variance reduces like square root of the number of samples) or clocks with 0.1 ns resolution and satellites whose positions are known within cm as well. Not easy to accomplish, at least not yet, although they are working on it.
But thermal is “easy”, and probably more important as it will be SIMILAR across the sites, not different. Might eventually want to add things like salinity and/or local density, which is even more relevant than temperature per se — the temperature variation is CAUSED by density variation.
rgb

Reply to  ristvan
January 24, 2017 12:19 pm

Uh oh. A big homework project from the professor.

Steve Fraser
January 21, 2017 9:20 am

Great work, Dave!
I especially appreciate the comments on Seward, and reading the gauges with a sense of geologic perspective. We were there this summer, and the bus driver commented about how that earthquake affected the soils in the area. Large, flat areas resulted… massive subsidence and landslides.
I look forward to the continuing development of your work.

Philip
January 21, 2017 9:27 am

I would be wary of any sea level measurements on the Hawaiian islands.
These are volcanos, forced upwards by magma rising from the Hawaiian hotspot in the Pacific plate. As the plate moves the volcano away from the hotspot, it slowly subsides back towards the ocean floor.
Sea level changes here have more to do with volcanos and plate tectonics than anything else.
For an intro see https://en.m.wikipedia.org/wiki/Hawaiian%E2%80%93Emperor_seamount_chain
Note the long chain of now submerged “islands”, which is the direction all of the current islands are going, with the possible exception of the big island itself.

Michael Jankowski
Reply to  Philip
January 21, 2017 10:31 am

There’s caution to be had with any sea level measurements. But at least that can be made with “raw” sea level data. Something tells me most of the “adjustments” would tend to go one way, as with adjustments to the temperature record.

January 21, 2017 9:52 am

Hi Dave – I will be visiting your site soon (-:
May I suggest that you add Colorado University’s Sea Level Research Group to your analysis, but more than that, include links to their historical data found on the Internet Archive’s Wayback Machine
https://web.archive.org/web/*/http://sealevel.colorado.edu/
As you know, unlike the tide gauges, CU rewrites their historical data on a regular basis.
Here’s the earliest data I can find there: 2004 Release 1.2

Reply to  Steve Case
January 21, 2017 2:55 pm

Thanks, Steve! I have a brief note there, with one of your graphs. But I don’t emphasize the satellite altimetry data, because I do not trust it.
http://www.sealevel.info/resources.html#satellite
However, to their credit, at least U.Colorado doesn’t block data archiving with robots.txt and/or server tricks. I wish that was always the case for NASA and NOAA. I hope that President Trump will issue an executive order, ordering that government and government-supported web sites STOP DELETING old versions of “their” (our!) data sets, to hide the extent and nature of their alterations, and STOP using robots.txt and server configuration tricks to BLOCK the archiving of old versions of their data and web pages by The Wayback Machine and other “archiving” services.
http://www.sealevel.info/noaa_blocking_wayback_machine_tight_cropped.png
If you know what a robots.txt file is then look at the NOAA National Climate Data Center (NCDC) robots.txt file:
http://www.ncdc.noaa.gov/robots.txt
Do you see it? The first thing in it is NOAA’s blanket prohibition on the archiving of NCDC data files!
What possible excuse can NOAA have for such behavior?

Reply to  daveburton
January 21, 2017 3:31 pm

DB, 100% with you. This is a major drain the swamp agenda item. The only imaginable reason is to hide their incompetent continuing fudges. Karl got caught anyway.

Reply to  daveburton
January 22, 2017 1:14 am

Yes, CU are serial data manipulators with zero transparency. Totally agenda driven. Satellite sea levels are a farce and they rig the results to fit their agenda and remove access to anything they previously showed.
One example is you no longer get the chance to view the data without “inv. barometer correction”. Why that would be relevant to a global average seems unexplained and there is no way to compare to the no_IB version to see what effect it has because they’ve scrubbed that version and will not even provide it upon request.
Second they add in their hypothetical adjustment for ocean basins getting deeper. Since there is no measurement of that it is totally speculative and means their “mean sea level” is now floating, phantom like, above the waves.
This physically unreal version of “sea level” is explained by the claim that they want it to be an index which reflects global warming and thus needs to be adjusted to account for deeper oceans. Pretty much says it all about where they are coming from.

Reply to  daveburton
January 22, 2017 4:02 am

Greg: “it… means their “mean sea level” is now floating, phantom like, above the waves.”
Haha, what a great turn of phrase! I’m going to plagiarize that.
This is how I’ve been saying it: “…many researchers add 0.3 mm/yr GIA to the calculated average, to account for estimated post-glacial sinking & broadening of the ocean basins. I do not, because that sum is not truly sea-level rise. That sum is what it is estimated that the rate of sea-level rise would be were it not for the sinking of the ocean floor.”
My version is dry and forgettable. Your version paints a word picture which is vastly more memorable.
I like to credit my sources. Drop me an email, please, so I know who to credit, when I steal your creative work.

Reply to  daveburton
January 25, 2017 12:43 am

Thanks for the reply and links. I hadn’t gone back to check and only found your reply on a pop up on WUWT’s front page.
By the way, the robots got through to some of the GISS table data sets. I’ve used the WayBack Machine and have gleaned about 70 world monthly GHN and Met Station only pages for my files. Interesting stuff: they change the data for nearly every entry several times a year. Email me if you want copies.
stacase at hotmail dot com

The Original Mike M
January 21, 2017 10:00 am

I’d like to see a multi-year mean trend variation like NOAA’s but add a way to vary how many years are averaged together. They use 50 years but a select-able feature to allow a shorter duration might be useful too. Example – https://tidesandcurrents.noaa.gov/sltrends/50yr.htm;jsessionid=1A6A5C0998FB0224B593746686597B70?stnid=9410170

Bloke down the pub
January 21, 2017 10:15 am

Double-Plus Good.

Sean Peake
January 21, 2017 11:00 am

Impressive!

Roy Spencer
January 21, 2017 11:13 am

Cool! I can tell this took some work. I hope it helps keep the scientists honest. 😉

catweazle666
Reply to  Roy Spencer
January 21, 2017 6:41 pm

“I hope it helps keep the scientists honest. ;-)”
The scientists are honest.
It’s the “scientists” that aren’t.

January 21, 2017 11:26 am

Excellent work.

January 21, 2017 11:40 am

A suggested site information addition at some point, discussion of the closure problem. I sped read resources and checked the glossary. Did not find it. There are three somewhat dodgy recent closure papers cited in essay PseudoPrecision concerning SLR. Closure uncertainty bounds SLR, and shows that sat alt must be too high. Closure is the idea that to a first order approximation, SLR should equal the sum of thermosteric rise (e.g. inferred from ARGO) and ice cap mass loss (e.g. Inferred from GRACE, using the post 2013 Antarctic diff GPS measured GIA rather than the previous faulty modeled GIA. McIntyre has a past post with details). Doesn’t address acceleration, but does bound current SLR rate from climate change.
Masterful work so far. Kudos.

Don K
Reply to  ristvan
January 21, 2017 4:33 pm

Masterful work so far. Kudos.

I agree.
I also agree about the closure problem — which I think deserves a lot more attention than it gets. It’s difficult to see how the CU satellite folks could screw up a rate of change, but it seems likely that they have. I did read a paper once dealing with reanalysis of the TOPEX/Poseidon data that came up with a reasonable 2.0 mm/yr SLR estimate, but with a huge (±2.0 mm/yr) uncertainty due to uncertainties in ionospheric delay models. That would seem consistent with the tidal gauges. Jason would seem to have better delay measurement on paper. But maybe not in practice.
I’ve been trying to read the IPCC Sea Level writeups in the ARs. Unfortunately they have a very high EGO (Eyes Glaze Over) quotient, so I can manage only a few pages a week. It’ll be many years before I finish. My impression is that “they” actually have been plugging away at their budget for how much of SLR is due to what, and they think they are making good progress.
In addition to closure, I think isostasy deserves a good deal more attention than it gets. Virtually all SLR computations seem to include isostasy corrections (usually referred to as GIA even in the tropics) of one sort or another. I’m having a good deal of difficulty convincing myself that the current isostasy models aren’t possibly introducing large, unacknowledged, uncertainties into the results.

Reply to  Don K
January 21, 2017 5:14 pm

DK, agree on both. Think the observational closure problem is more important, given Feynman’s dictum.

Ross King
January 21, 2017 11:47 am

Great job!
Just a quick thought …. Has anyone done the equivalent for the rise in LAND levels at specific measuring stations?
Not sure what it’d prove, beyond further accentuating the ‘restless Earth’ picture … to confound those who would have us believe it can all be computer-predicted from the basis of “Settled-Science”.
It astonishes me that here we are agonizing over a mm. or two of sea-level rise (and similar trivial observations) when we can’t predict where the next e’quake is going to happen and when.

Crispin in Waterloo
January 21, 2017 12:16 pm

I am impressed by both the completeness and scope of choices to examine particular portions of data sets.
Thanks so much.

CoRev
January 21, 2017 12:25 pm

Great job Dave.
To your “to do” list have you considered adding a coastal search, e.g US East v. US West? Or adding any other major geographically definable areas for comparison?

Reply to  CoRev
January 21, 2017 3:13 pm

Yes, I totally agree, and that’s on my to-do list. “Coastline codes” are probably the simplest way to do it. That’s the first part of the ccc-sss style station ID.
E.g., 960 is the coastline code for the U.S. Atlantic coast:
960-021 for Fernandina Beach, FL:
http://www.sealevel.info/MSL_graph.php?id=Fernandina
960-060 for Wilmington, NC:
http://www.sealevel.info/MSL_graph.php?id=Wilmington
960-121 for The Battery (NYC):
http://www.sealevel.info/MSL_graph.php?id=Battery
I have a list of coastline codes, but they unfortunately do not tell which coasts are which, for a particular country (and the USA has five coastline codes, plus Puerto Rico, the Virgin Islands, etc.).
http://sealevel.info/coastline_codes.csv

Mike Fayette
January 21, 2017 12:25 pm

Great site! I hope WUWT links to it on their reference pages!

Mark from the Midwest
January 21, 2017 12:33 pm

Nice site …. Off Topic, anyone got any insight / info into William Happer for the WH Science Advisor? My back-channel rumor mill channels just cranked up two magnitudes.

H. D. Hoese
January 21, 2017 12:38 pm

Great. For those of us that live near sea level, I just saw an osprey catch a mullet, we, among others, should find this extremely useful. Facts have an infinite shelf life, but they do get dusty and encrusted.

fah
January 21, 2017 12:47 pm

Excellent idea and it looks good. One thing I would like to see included, although it looks much harder, is similar attention paid to local subsidence data. I have tried to look into the data at various times and it can be tough tracking it down, although when it is available it seems to be significant. It looks like a good deal of fairly long term GPS based data is becoming available, at least in the U.S. Worldwide data seems the result of scattered studies and is hard to find sometimes. Collection and providing local and regional subsidence data would be a big help to go along with the tide gauge data.

Reply to  fah
January 21, 2017 3:25 pm

I agree, fah. The fact that the best/longest tide gauge records are from cities (harbors!) means that there’s potentially a lot of groundwater pumping (water wells) near them, which could cause subsidence and exaggerated local sea-level rise.

Dodgy Geezer
January 21, 2017 1:30 pm

…Nevertheless, in the interest of full disclosure, I consider myself a “lukewarmer.” …… In my opinion, the weight of the evidence indicates that anthropogenic global warming is real, modest…
I would be very interested to know what evidence supports your opinion that anthropogenic global warming is real. My understanding is that it is hard to determine what happens to the extra CO2 humans have put into the atmosphere since the 1970s, and only possible to associate warming with extra CO2 by modelling, and assuming that CO2 is the main temperature driver.
I understand that CO2-driven warming should be associated with a detectable tropospheric ‘hot-spot’ and increased humidity levels, and that neither of these have been detected. If this is true then I would say that it was premature to associate increased CO2 with any warming. Incidentally, I would call for President Trump to make funds available for a detailed study of the troposphere so that a definitive finding can be obtained…

Reply to  Dodgy Geezer
January 21, 2017 1:46 pm

DG, not DB but would like to offer some thoughts. It is not hard to determine what happens to anthropogenic CO2 since 1958 (Keeling curve). About half goes into carbon sinks and half remains resident in the atmosphere; Salby is grossly and egregiously wrong. We know it is a GHG. We also know from ‘grey earth’ first principles that by itself, a doubling produces 1.1-1.2K of warming. Lindzen uses 1.2; Monckton’s three part error series last year at WUWT provides data calculating 1.16K (C). What we don’t know is feedbacks. The absence of a tropical troposphere hot spot does not prove CO2 is not causing some warming; what it proves is that the CMIP5 models of that warming are wrong. FWIW, I am also a weak lukewarmer. Observational ECS ~1.65 means there is a weak positive feedback, about half of modeled. Bode sum f about 0.26-0.3 rather than 0.65. That means there is no C in CAGW. Only a little beneficial aGW. No accelerating SLR. Thriving polar bears. Greening.
Think it far better to be as precisely simple as possible than erroneously challenge demonstrable basics like no anthropogenic CO2, no possible CO2 warming. What you point to is the attribution problem I comment on frequently here.

Dodgy Geezer
Reply to  ristvan
January 21, 2017 2:18 pm

…About half goes into carbon sinks and half remains resident in the atmosphere, Salby is grossly and egregiously wrong….
Thanks for your response. That point was one I was looking for direction on. Is there any definitive work showing that this is the case?
My (admittedly weak) understanding of ECS calculations suggests that they do involve a lot of (unknown?) assumptions, and I don’t know enough to determine whether these are realistic or not. In particular, the impacts of cloud formation seem to be unmeasured and unknown, and they could be a strong negative feedback. First principles are easy to understand, but the question here is whether in practice the raw extra heating is completely mitigated by complex natural feedbacks or not. The assumption that observational recent heating must be CO2 driven, so we can calculate basic sensitivities from that seems to me to be not valid until we can identify pretty much exactly how the climate works – which we have not yet achieved.
FWIW, I am probably classed as a deep sceptic – but happy to go where the data leads. If I can be sure of the data! Having watched this field with John Daly I have spent over 20 years deeply suspicious of most official data. I can well imagine becoming a lukewarmer (or a True Believer!) but I would like to see the proof…

Reply to  ristvan
January 21, 2017 2:35 pm

USGS has a carbon sinks publication with decent estimates. Google will take you there. Basic technique is assume natural sources/sinks are in rough natural equilibrium (must be so at preindustrial ~280 ppm for centuries), estimate recent anthropogenic annual emissions from fossil fuels and cement, then see what fraction of that shows up in the Keeling curve. For ECS, best source is Lewis and Curry 2014. You can get it plus a great discussion at Climate Etc. They used only IPCC values, and calculated different time frames to wash out natural variability as much as possible. As for CO2 ‘control knob’ we have instead a strong disproof in the pause. Since except for a now rapidly cooling 2015-16 El Nino blip it has not warmed this century. Yet this century had ~35% of the total increase in CO2 ppm since 1958 (Keeling curve). More simple unassailable skeptical talking points for you. Regards.

Reply to  ristvan
January 21, 2017 3:33 pm

Agreed, ristvan, though atmospheric physicist Wm Happer has found evidence that the base-level warming effect of CO2 is commonly overestimated by about 40%. Of course that doesn’t mean that ECS isn’t about 1.6 and TCR about 1.0, it just means the base-level forcing from CO2 might account for a bit less of it, and positive feedbacks a bit more of it.

DMA
Reply to  ristvan
January 21, 2017 3:40 pm

Ristvan
If human emissions are about 1/30th of all emissions for any year and CO2 is mostly well mixed, how can 1/2 of the human emissions be left when the sinks are done? Shouldn’t it be equal to or less than 1/30 of the total?
Hertzberg and Schreuder, 2016 concludes with “Nothing in the data supports the supposition that atmospheric CO2 is a driver of weather or climate, or that human emissions control atmospheric CO2.” I haven’t got to read any more than the introduction to this paper but they wouldn’t agree that 1/2 of human emissions remain each year or they would have found some evidence that it controls the atmospheric content. I don’t think it makes sense to assume that all the new CO2 we find in the atmosphere each year comes from fossil fuels. Natural sources are huge, temperature sensitive, and always active. We know CO2 follows temp by 300 to 800 years from the ice cores. Earth started out of the little ice age some 300 years ago. Why couldn’t most of the new CO2 we see lately be from the warming of the oceans in the last couple hundred years, and the fact that it aligns with increased human emissions a coincidence? I don’t think our methods of determining CO2 levels present centuries ago are accurate enough to preclude spikes that lasted less than maybe 100 years.

Reply to  ristvan
January 21, 2017 4:38 pm

DB, a data correction meant in the best way. If ECS >1.2 CO2 alone, then feedback must be weak positive so TCR MUST be >~1.2. The Lewis and Curry paper estimates ~1.3. Consistency matters.

Dodgy Geezer
Reply to  ristvan
January 21, 2017 4:39 pm

..Basic technique is assume natural sources/sinks are in rough natural equilibrium (must be so atbpreindustrial ~280 ppm for centuries), estimate recent anthropogenic annual emissions from fossil fuels and cement, then see what fraction of that shows up in the Keeling curve…
Immediately I have a problem. We know that CO2 concentration has varied in the past, though we don’t know why. How can we be sure that there is no natural change going on now? And how can we be sure that it has been 280ppm for centuries? DMA below voices some typical queries.
I can see that the simplest assumption to make is that natural forces have been fairly static and that variation is due to humans – that’s certainly a starting point for investigating the phenomenon – but the cost and impact on our society of CO2 limitation is so high that I would really like something better than a ‘reasonable assumption’ – particularly now that the runaway heating predicted is not occurring.
I would like to see independent work aimed at addressing these huge unknowns. Instead, I see repetitive work intended to justify the early guesses, dipping into frankly fraudulent attempts to keep the hypothesis from failing. In such circumstances I am far less willing to give researchers any benefit of doubt…

Reply to  ristvan
January 21, 2017 4:43 pm

DMA, your math logic is unfortunately faulty. Let me try to explain. You start from delta to natural sources (your word, emissions), but exclude equivalent delta to natural sinks. Wrong math. Your basic equation is wrongly unbalanced. Think about it.

TA
Reply to  ristvan
January 21, 2017 5:27 pm

Dodgy Geezer wrote: “The assumption that observational recent heating must be CO2 driven, so we can calculate basic sensitivities from that seems to me to be not valid until we can identify pretty much exactly how the climate works – which we have not yet achieved.”
That’s the way I see it, too, Dodgy. Too many assumptions. All the warming since 1979 could be natural, just going by the recent past, when it warmed just as much from 1910 to 1940, no CO2 required.

Reply to  ristvan
January 22, 2017 3:25 am

DMA, the great majority of natural CO2 sources & sinks are directly connected and almost exactly balanced. For instance, when a blade of grass grows, about half of its “dry mass” (mass not counting water) is carbon, taken from CO2 in the air. But that blade of grass generally dies and rots, releasing its CO2 back into the atmosphere, typically within a year, so unless the total amount of vegetation growing on the Earth is increasing or decreasing there’s no net change to atmospheric CO2 levels.
However, when mankind unlocks CO2 from fossil fuels, and releases it to the atmosphere, it does tend to increase atmospheric CO2 levels. However, there are two main negative (stabilizing) feedbacks which reduce the extent (AR5 estimates by ~55%):
CO2 Absorption By Water Feedback. Higher atmospheric CO2 levels increase CO2 absorption by water bodies (mainly the oceans), removing CO2 from the atmosphere. AR5 estimates that this effect currently removes about 26% of anthropogenic CO2 emissions from the atmosphere (Fig. 6.1), but that’s a very rough estimate. (Note: Since the oceans contain about 50 times as much CO2 as the atmosphere, the absorption of atmospheric CO2 by the oceans affects the oceans much less than it affects the atmosphere.)
CO2 Fertilization Feedback (“greening”). Higher CO2 levels increase plant growth rates, which reduces atmospheric CO2 levels. AR5 estimates that this effect currently removes about 29% of anthropogenic CO2 emissions from the atmosphere (Fig. 6.1), but that’s a very rough estimate. (Note: on p. 6-3 they give slightly different numbers: 2.5 / 9.2 = 27% went into the biosphere.)
(Note: the sum of the amounts of CO2 taken up by fertilization/greening and water absorption is estimated by AR5 to be 55% of anthropogenic CO2 emissions, and that estimate is not as rough as either of the two addends.)

Dodgy Geezer
Reply to  ristvan
January 22, 2017 8:37 am

…Think it far better to be as precisely simple as possible than erroneously challenge demonstrable basics like no anthropogenic CO2, no possible CO2 warming. What you point to is the attribution problem I comment on frequently here….
A lot of the above comments illustrate my position very well.
Note that I am NOT claiming that there is no human CO2 warming. I am claiming that we just don’t know at present – the data is poor, and I believe that a lot of it has been both presented and altered for political reasons rather than to determine the truth.
My belief that we have a lot of fraudulent data does not mean that I believe that the ‘real’ data (supposing it to exist) would not show a connection between human CO2 and warming. I just don’t hold to a line which seems to say – “A lot of this data is dodgy, but there must be something there…”. I don’t believe we have any proof either way at the moment, and so I hold that denying the link, believing it wholeheartedly, or believing it slightly are all flawed positions…

Robert from oz
January 21, 2017 1:44 pm

Great site, good idea, but whenever I see “NOAA” is it possible to trust the figures?
Sea levels have been rising since the end of the last ice age; doesn’t matter what side of the fence you’re on, it’s an accepted matter of fact.

Reply to  Robert from oz
January 21, 2017 2:13 pm

NOAA doesn’t fiddle tide gauge records; if they did the Coast Guard would come after them. Unlike temperature records where they fiddled and Rep. Lamar Smith went after them and ran into an Obama inspired stone wall–until yesterday.

Reply to  ristvan
January 21, 2017 3:42 pm

The only “fiddling” NOAA does with individual tide gauge records, that I’ve noticed, is seasonal detrending (which they properly and prominently disclose on their web pages). In other words, they “subtract off” the “seasonal signal.”
My code does that, too, and finds numbers very similar to NOAA’s.
I’ve looked pretty closely at their trend analyses for quite a few tide gauges, and I’ve seen no evidence of anything other than good work by competent oceanographers.
Some of the other material on NOAA’s site is pretty appalling. But not the individual tide gauge analyses, AFAIK.

Justthinkin
January 21, 2017 2:10 pm

Just a thought…. Any idea what happens to sea level in the Atlantic, when the largest island (Greenland) in the world tips over and sinks? (as was suggested by some greenie nuts)

Dodgy Geezer
Reply to  Justthinkin
January 21, 2017 2:20 pm

Santa’s elves come riding in on unicorns and put it back again?

Reply to  Justthinkin
January 21, 2017 2:20 pm

Nah, that was Guam, not Greenland. And the greenie nut was a Democratic congressman. Hank Johnson 2010. It’s on youtube for eternal enjoyment. He later said he was being sarcastic. Hard to know. The Admiral he was addressing in the hearing replied deadpan, “We don’t anticipate that.”

Reply to  ristvan
January 21, 2017 3:48 pm

That Admiral was amazing. I cannot imagine how he kept a straight face.
You want to know what else is amazing? Believe it or not, ol’ Hank was the UPGRADE.
I’m not kidding. He was the sane and sober candidate who replaced Cynthia McKinney, because she was too crazy even for Georgia’s 4th congressional district, which I think must have been specially gerrymandered to contain all of the funny farms in the Peach State.

Reply to  ristvan
January 21, 2017 4:51 pm

Wow. That I did not know.

January 21, 2017 2:48 pm

Thanks for all the information on NOAA. I will look into it more.

Editor
January 21, 2017 2:55 pm

Thanks for all your work, well done.
w.

January 21, 2017 3:00 pm

Wow!
Great job Dave. I’ll be spending some time looking at all the cool things you can do on your new site.
Many thanks.

Justthinkin
January 21, 2017 3:11 pm

And as an afterthought, what about the Bay of Fundy, highest tide change in the world? And it happens every 12 hours?

Reply to  Justthinkin
January 21, 2017 8:06 pm

http://sealevel.info/MSL_graph.php?id=970-001
Thumbnail:
http://sealevel.info/thumbnails/970-001_tn.png
Note that these are monthly averages of mean sea level which are being graphed, so those giant tides mostly get averaged out.

Reply to  Justthinkin
January 22, 2017 12:51 pm

Fundy does not have just the highest tides, it also has the lowest tides, i.e. the largest tidal variation, due to a unique geographic formation.
That may be interesting in terms of reducing %age error but it will not affect long term rise.
It is also famous for an unusual tide bore which apparently goes up one side and comes down the other. I know someone who has witnessed this but not seen it myself.

January 21, 2017 3:36 pm

Excellent site! When you average all this tide gauge data, what do you get?
I would guess +6 inches (152.4 mm) per century…

Reply to  J. Philip Peterson
January 21, 2017 4:01 pm

Thanks. It depends on which set of tide gauges you examine. There are tide gauge records as short as ten years listed, which is way, Way, WAY too short to determine a robust trend. So I suggest just looking at gauges which have 50+ or 60+ years of data.
A couple of the spreadsheets were pre-pruned, to include just stations with 50+ or 60+ years of data. Alternately, you can download a spreadsheet, load it into Excel, and delete whatever you wish. I’d suggest deleting everything with less than about 50-70 years of data, plus gauges which obviously have “issues” with their data (like Galveston, which is sinking, due to the compacting fill which elevated it a century ago).
Here’s NOAA’s 2013 list of stations, with updated data, but pruned to include only stations with 60 years of data. The median (which is probably better than the average) is 1.21 mm/yr. With Peltier’s VM2 adjustments, it’s 1.4 mm/yr:
http://www.sealevel.info/MSL_NOAA2013_60yr.html
I think the “right” number is about 1.5 mm/yr, probably a bit less. See:
http://www.sealevel.info/avgslr.html

Don K
Reply to  daveburton
January 21, 2017 4:48 pm

“So I suggest just looking at gauges which have 50+ or 60+ years of data.”
I think one should also consider ignoring all tidal gauges in tectonically active areas — which is, unfortunately, a very large percentage of the Earth. e.g. the entire Pacific Ocean except Australia (vulcanism), the North Atlantic (vulcanism, glacial isostasy), the Eastern Mediterranean, etc, etc, etc.

Reply to  daveburton
January 21, 2017 10:18 pm

The “Greenland Effect,” a/k/a the “European Problem,” is the fact that European tide gauges seem to measure noticeably lower rates of SLR than most other places in the world, due to slight changes in the Earth’s gravity field as melting of the Greenland ice sheet slowly redistributes mass from the ice sheet into the oceans. When Dr. Jerry Mitrovica investigated the Greenland Effect, his paper’s reviewers made him use a list (suggested by Dr. Bruce Douglas) of about two dozen high quality gauges from around the world which are thought to be unaffected by distortions from PGR and tectonic activity. (Were you one of Mitrovica’s reviewers, Don?)
That seems interesting enough that I made that list of tide gauges into one of the spreadsheets on my site:
Spreadsheet: http://www.sealevel.info/MSL_mitrovica23_trendtable.html
Thumbnails: http://www.sealevel.info/MSL_mitrovica23_thumbnails.html

Grant
January 21, 2017 3:39 pm

Look, Dave’s making America great again already. Well done.

Richard Barraclough
January 21, 2017 4:02 pm

Let me add my thanks. It’s too late at night for me to absorb all the detail, but it has been bookmarked for later.

Bill Illis
January 21, 2017 4:13 pm

Thanks very much Dave. I’ll be around.
Add in a link to the Sonel GPS station database at:
http://www.sonel.org/-Vertical-land-movement-estimate-.html?lang=en
You can match up a tide gauge station from PMSL with the GPS vertical land movement station on this page. A little tough to do but this is how PMSL wants it set-up.
http://www.sonel.org/spip.php?page=cgps
Just noting that they have now moved to a new world-wide vertical land movement model ULR6. This is the only image for it, which is not that useful, but anyway.
http://image.slidesharecdn.com/balluv201507081730upmcjussieu-room107-151211153453/95/ballu-v-201507081730upmcjussieuroom107-3-638.jpg

Reply to  Bill Illis
January 23, 2017 12:57 am

Thank you, Bill! Clearly that info belongs in the spreadsheets.

a_generalist
January 21, 2017 4:25 pm

This is awesome information! Thanks for creating this.

J_Bob
January 21, 2017 4:30 pm

Great Tool !!!!
I often use the Holgate-9 sea level graph:
http://www.climate4you.com/images/Holgate-9_Since1900-NEW.gif
to explain that a constant sea level rise rate is counter to the current opinions as to significant global warming, based on:
1- Melting ice sheets
2- Ocean temperature rise (sea water expands with temperature slightly below freezing)
Look forward to using it.

Reply to  J_Bob
January 22, 2017 5:23 am

Good idea, J_Bob! I’ve just added a “Holgate-9” spreadsheet to the site; it’s the 4th one down:
http://www.sealevel.info/data.psp#spreadsheets

Reply to  daveburton
January 22, 2017 5:45 am

BTW, you may notice in the spreadsheet that three of the nine sites show very slight but statistically significant sea-level rise acceleration. But if you adjust the analysis periods for those three you’ll discover that in all three cases the acceleration occurred more than 90 years ago. None of the nine measurement records have recorded any sign of an acceleration attributable to anthropogenic GHGs.

Scott Rabone
January 21, 2017 4:37 pm

Hi David – great website, thanks! Between 2007 and 2010 a series of “high quality” sea level measuring stations were installed around the coast of New Zealand. If you get around to adding more data sets to your site, the data from these ones can be found here: http://www.linz.govt.nz/sea/tides/sea-level-data/sea-level-data-downloads It’s hard to make sense of the raw data so it would be great to be able to use your graphing tools.

Reply to  Scott Rabone
January 23, 2017 1:20 am

Thank you, Scott. PSMSL does have some of those. You can start here and use the “next” arrow at the upper right corner to step to other NZ gauges:
http://www.sealevel.info/MSL_graph.php?id=690-001
It is interesting that they seem to have recent data for Auckland, even though PSMSL’s data for Auckland ends in year 2000.
PSMSL has:
Auckland
Cape Roberts, Antarctica
Gisborne
Napier
Port Chalmers
Tauranga
Wellington
PSMSL doesn’t have (or has less than 10 years data for):
Boat Cove, Raoul Island
Castlepoint
Charleston
Fishing Rock, Raoul Island
Kaikoura
Korotiti Bay, Great Barrier Island
Lottin Point
Manukau
North Cape
Owenga, Chatham Islands
Puysegur
Sumner
In addition, PSMSL has:
Port Lyttelton
Lyttelton II
Timaru Harbour
Dunedin
Dunedin II
Bluff/Southland Harbour
Westport Harbour
Nelson
Port Taranaki
Whangarei Harbour (Marsden Point)

tony mcleod
January 21, 2017 4:48 pm

Good job David. Impressive array of resources in one place.
I have one correction – ” but the global total is always between 14 and 24 million square kilometers, with hardly any significant long-term trend”
This isn’t the case. The global total is well under 14 at the moment and, in the Arctic at least, there is a pronounced long-term trend.

Reply to  tony mcleod
January 22, 2017 6:09 am

Thank you, Tony. Yes, I wrote that before the sharp 2016 decline in reported global sea ice extent, which occurred coincidentally with the demise of the DMSP F17 & F19 satellites that were measuring it. I’m skeptical that the decline is real, but you’re correct that “hardly any significant trend” is not a proper description for 2016’s reported ice extent numbers.
However, the global total is not under 14 at the moment. DMI shows Arctic sea ice extent at about 14 million square km, and NSIDC says about 13 million square km. DMI apparently doesn’t analyze the southern hemisphere, but NSIDC shows Southern Ocean sea ice extent at the moment at about 3 million square km. The sum is still well above 14 million sq km.

tony mcleod
Reply to  daveburton
January 22, 2017 1:41 pm

True, my mistake. I confused the area graph, which is showing about 13, with the extent graph.

hunter
January 21, 2017 4:53 pm

Thank you for putting this together. It is very impressive.

RoHa
January 21, 2017 5:15 pm

An awful lot of hard, meticulous, work must have gone into amassing and collating that data, and much more hard work into devising methods for making it accessible. That is a science site!
Yet (probably through exhaustion) you have omitted the essential item. Namely, the Doom scale. How can we use your data to find out whether we are mildly doomed, normally doomed, very doomed indeed, apocalyptically run-screaming-for-the-hills-while-civilization-collapses-around-you doomed, or worse?
After all, that is what all this science is for, isn’t it?

JohnWho
Reply to  RoHa
January 22, 2017 4:19 pm

RoHa –
I believe Dave is using a “Doomed”-down methodology.
/grin

January 21, 2017 6:18 pm

The continents are all moving around horizontally, and the vectors given seem to indicate as much as a few inches per year in motion.
Gravity strongly affects what is referred to as “sea level.” How much does it do so, one might ask?
As much as 100 meters up and down, simply from gravimetric effects!
In other words, variations in gravity caused by differences in the density of the Earth cause the ocean to puddle up or be drawn down by well over three hundred feet, and this can be over distances as small as a few thousand miles, as it is between the central Indian Ocean (-100 meters), and the Western Pacific near New Guinea (+ 80 meters). For a whopping variation of some 180 meters, or over 590 feet, in the actual height of the ocean as measured from the center of the Earth.
My understanding is that the model that has been made by geodesists, called the Earth Gravitational Model, is accurate everywhere to less than one meter. I do not know how much less, but if it was millimeters, I think they would have said that instead of “less than a meter”.
So there is that.
Then, back to the moving continents, and the gravity that moves along with them, and in their wake, and the shift from large quakes, such as the Indian Ocean quake of December 26, 2004, or the Japanese quake of March 11, 2011.
In the first, some 1700 miles of sea crust was displaced about 50 feet horizontally, and “several meters” vertically upwards over that same distance. “Several meters” is about as accurate as most of the estimates I have seen were given, and that one event is thought to have raised the entire sea level of the world by 0.1 millimeters, by lowering the capacity of the Indian Ocean basin. The whole Earth is said to have rung like a bell, with an amplitude of motion of about a centimeter!
The Japanese one has been variously estimated as having even larger movements, with one estimate by the Japanese Agency for Marine-Earth Science given as moving the Pacific plate by over 50 meters, and raising the seabed by over 7 meters. Parts of Japan moved over two meters closer to the US, and a 250 mile stretch of coastline dropped by two feet! The entire axis of the earth shifted by as much as ten inches, and the length of the day and tilt of the Earth were altered. The Earth is now spinning faster, and each day is now 1.8 milliseconds shorter. Sound waves from the quake altered the orbit of the GOCE satellite, making it the first time an earthquake was measured from space. Some think it may have caused at least one glacier to slip by half a meter.
One thing to get from this is that such things are always occurring gradually, as a result of small slippages and the cumulative effects of all the motions of the Earth. The Pacific plate in that area moves nearly four inches a year.
Measuring sea level changes to tenths and hundredths of a millimeter over the whole Earth, knowing all this?
Not to mention what it implies about stuff no one may have ever thought of?
Pphfft!

Reply to  Menicholas
January 21, 2017 6:22 pm

Although to tell the truth, I never felt a thing, even though I am a light sleeper and was right here in Fort Myers.

TA
Reply to  Menicholas
January 22, 2017 5:40 am

A very interesting post, Menicholas.

TomRude
January 21, 2017 6:33 pm

The CBC, besides still running an anti-Trump campaign, is also an arch-supporter of the global warming scare.
Among its fieriest peddlers, Quirks and Quarks’ Bob McDonald: http://www.cbc.ca/news/technology/science-trump-belief-1.3944523
“Donald Trump has stated clearly that he believes climate change is a hoax and that vaccines cause autism, two topics that have been clearly proven by science to be untrue.”
Too many “clearly” Bobby… and a straw man: climate change? No, alarmist climate change stuff that Bobby pushes 24/7/365 on CBC.
“False belief systems have a habit of remaining in the public mind even after they have been proven wrong. Conspiracy theorists still believe the moon landings were faked, even though we have recent photos of the Apollo landing sites from the Lunar Reconnaissance Orbiter, which is circling the moon at this moment.”
Ah, Bob McDonald uses the discredited Lewandowsky study, and since Bob McDonald is not informing but propagandizing, he carefully avoids offering the reference for his little novelty act.
Shame!

January 21, 2017 6:37 pm

Much desired tool. Thanks, Dave. Should raise more questions than answers.
Graphing San Diego from 1980 we see peaks during the ’82-’83, ’97-’98, and ’15-’16 very strong El Niño periods. Up the coast at Crescent City there are peaks in ’83 and ’98, but not in the ’15-’16 period.
1980-2017 trend at San Diego is +1.83mm/yr. At Crescent City it’s −1.48mm/yr.

Reply to  verdeviewer
January 22, 2017 6:21 am

Yes, San Diego’s sea-level is strongly positively correlated with ENSO.
On the other side of the Pacific, just the opposite occurs: sea-level is strongly negatively correlated with ENSO.
The sea-level records from San Diego and Kwajalein look like mirror images of each other:
http://www.sealevel.info/1820000_Kwajalein_San_Diego_2016-04_vs_ENSO.png
The moral of the story is: oceans slosh.

January 21, 2017 8:18 pm

Dave Burton
Really a useful site! Thank you for your work.
I used your tool for a hypothetical search of Italian tide gauge stations, so I tried “Venice Italy” and got a “no data” page. Next I wrote “Venice”, with the same result. The last try was to use the original name, i.e. “Venezia”, and I got the plot of the Venice tide gauge station named “Venezia Arsenale”. Fine! At this point your site had (almost :-)) no more secrets for me, so I tried “Napoli” (i.e. Naples) and got a fine plot of the … Annapolis tide gauge station! Fine, but not so useful for the supposed research on Italian stations.
Is there a way to distinguish Napoli from Annapolis through your tools? I can of course go to the PSMSL site
and get the plot there (and also the dataset) but I would like to use your site.

Reply to  Franco Zavatti
January 22, 2017 6:48 am

Ah, yes, I definitely need to get a search working which will return multiple results!
Also, the Prev/Next arrows at the top should follow the coastlines; they need much improvement.
In the meantime, you can use this cumbersome process to find such gauges:
1. Find some other Italian gauge.
2. Click on the PSMSL ID at the top, to view PSMSL’s info about that location.
3. Use their map to find the gauge you’re actually interested in, and click on the marker.
4. Note the PSMSL station ID for that station (129 for Naples).
5. On the sealevel.info site search for the PSMSL station ID:
http://www.sealevel.info/MSL_graph.php?id=129

Reply to  daveburton
January 22, 2017 7:00 am

Thank you and congrats again for your effort. Franco

Phu
January 21, 2017 10:18 pm

Deceptive chart, using meters instead of inches to make results look less aggressive. At least it clearly shows sea level is rising. Peaks are higher than previous peaks, lows are lower than previous lows. I’d expect a SCIENTIFIC view to be less manipulative.

Reply to  Phu
January 23, 2017 1:36 am

Even in the USA, most sea-level researchers use meters (or millimeters), rather than inches. Did you notice that NOAA’s graphs show sea-level in meters?
The fact that sea-level is rising (very slowly) at most (though not all) of the best long-term tide gauges is well-known. What you need to understand is that >70 years of heavy carbon emissions from heavy fossil fuel use, which have driven CO2 levels from 310 ppmv to over 400 ppmv, have nevertheless had no detectable effect on the rate of sea-level rise.

January 22, 2017 2:06 am

Kudos to Dave Burton for putting this together. I imagine there was quite a lot of work went into doing this.
You ask for comments so here are some points:
1/ I tried Honolulu but mis-spelt it as Honalulu. When I clicked “Click thumbnail here for a downloadable, bookmarkable image” I just got an error message:
“Sorry, you need to have JavaScript enabled in your web browser to use this feature.”
I do have JS enabled. The lack of a valid station was the error, not JS. If you need to test for JS, maybe do it in the onload event and test for it explicitly. Also consider using NOSCRIPT HTML tags on the entry page to display this error message.
2/ Implement a proper filter, not “boxcar” smoothing. See my article on how running averages distort the data and for information on better options.
https://climategrog.wordpress.com/2013/05/19/triple-running-mean-filters/
3/ The quadratic fit for acceleration is not valid. You are nailing down one of the variables which should be free: the starting date for the acceleration. This must be a free variable to assess when the acceleration started; otherwise you are imposing your own preconceptions on the data.
Jevrejeva’s papers on tide gauges find no acceleration in the 20th century, but there was acceleration between 1880 and 1900.

By fixing the year zero as 1960.95 you are assuming that there is an attributable CO2 effect and going to look for evidence. This is exactly what the alarmists have been doing all along. You should look for an acceleration across all the data *without assumptions* of the cause and then one can assess whether it coincides with human emissions or whatever.

4/ What is the CO2 line supposed to show us? Presumably you intend to allow a comparison and infer whether there is some temporal correlation. In that case you need to work out the right physical variable.
Assuming there is some thermal expansion due to the CO2 GHG effect, it should be some modified log of the CO2 ratio, not straight ppmv. Also, since mean sea level is cumulative, you should have some metric of the cumulative radiative effect of increasing CO2, i.e. an integral of the log ratio to the assumed “pre-industrial level” that was constant for thousands of years before we discovered “dirty” coal 😉
One would then want to see the two on comparable scales; the MSL data is squashed into a fraction of the vertical axis and does not allow even a realistic visual comparison of form.
A lot of people (sceptics) have produced this kind of graph to pooh-pooh the idea of CO2 affecting various climate variables, and it is very flawed. You must present two things which are at least theoretically related. Ideally you would also present some kind of correlation coefficient between the two datasets, with an indication of significance for the number of independent data points available.
Aside from those points it looks like it could shape up into being a very handy tool. Thanks for the effort in putting this together.

Reply to  Greg Goodman
January 22, 2017 3:05 am

PS In fact, if you want to fit something, it should be a function which could be seen to represent, at least crudely, the assumed cumulative effect of GHG. A parabola does not fit this, so fitting a parabola is not really informative.

Reply to  Greg Goodman
January 22, 2017 3:19 am

Re point 3, my mistake, it seems that the start date is free. Honolulu shows negative acceleration.
Could you explain why you are using 2·A as the acceleration? Surely it is simply A.

Reply to  Greg Goodman
January 22, 2017 1:30 pm

OK, I get it: d²x/dt² = 2·A
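For anyone else who hits the same question: with a quadratic fit y = B + M·x + A·x², the second derivative is d²y/dx² = 2·A, so the reported acceleration is twice the fitted quadratic coefficient. A minimal sketch of such a fit, with hypothetical monthly data (this is not the site’s actual code):

import numpy as np

# Hypothetical monthly MSL series: decimal years and sea level in mm
years = np.arange(1905.0, 2017.0, 1.0 / 12.0)
msl_mm = 3.0 * (years - 1960.95) + 2.0 * np.random.randn(years.size)

x = years - 1960.95                 # centre the time axis on an arbitrary epoch
A, M, B = np.polyfit(x, msl_mm, 2)  # coefficients, highest order first

print(f"rate at epoch: {M:.3f} mm/yr")
print(f"acceleration:  {2 * A:.5f} mm/yr^2  (2*A, not A)")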

Reply to  Greg Goodman
January 22, 2017 11:51 pm

No worries, Greg. David Appell, PhD had the same confusion (but was too proud to admit it).
I’ve added the error-handling bug that you identified (#1) to my to-do list; thanks.
The CO2 graph is included because the only basis for anyone to predict accelerating SLR is that GHG (mainly CO2) levels are going up, which the worriers think should cause accelerating SLR. That’s why the IPCC calls their sea-level rise scenarios “emissions scenarios.”
I have an option to disable the CO2 trace, because you often won’t want that on your graphs. But it’s turned on by default because the (lack of) relationship between CO2 and SLR is the defining issue which (dis)connects SLR to arguments over efforts to curtail CO2 emissions.
Wildly accelerated sea-level rise is usually claimed by climate alarmists to be the #1 negative consequence of burning fossil fuels. Without a connection between elevated CO2 and accelerated sea-level rise, there’s no justification for even mentioning sea-level when talking about anthropogenic climate change.

Reply to  Greg Goodman
January 24, 2017 10:51 pm

That error handling bug (#1) is now fixed.

Dermot O'Logical
January 22, 2017 3:28 am

Admittedly skim reading, so apologies if I missed the explanation, but why exclude data before 1930 for calculation when you want to show what the data shows?

Reply to  Dermot O'Logical
January 23, 2017 12:21 am

Dermot O’Logical asked, “why exclude data before 1930 for calculation when you want to show what the data shows?”
By default, these graphs do not exclude data before 1930 for calculations.
But you can use the options to configure that, if you wish. You might want to do so, because the most famous sea-level paper of all, Church & White 2006, “A 20th century acceleration in global sea-level rise,” reported a “clear change of slope at ~1930” — when CO2 was under 310 ppmv.
They also wrote that “much of the acceleration occurr[ed] in the first half of the 20th century rather than a smooth acceleration over the whole period.” That was actually an understatement. When I reanalyzed their data I discovered that all of the acceleration was prior to 1930. After 1930, their data actually showed a small (statistically insignificant) deceleration in the rate of sea-level rise (doi:10.1007/s11069-012-0159-8).
So the “20th century acceleration” in sea-level rise which they found obviously was not due to the big increase in CO2 levels since 1945. It could not have been, because all of the acceleration predated 3/4 of the CO2 increase!
The relevant question for the climate debate, to help answer policy questions about what we should or should not do about CO2 emissions, is what effect anthropogenic CO2 actually has on sea-level rise. If you’re going to look for sea-level rise acceleration as a consequence of anthropogenic CO2, then it only makes sense to examine the period of time when CO2 levels were rising substantially. So it is reasonable to exclude data before 1930, or even, perhaps, before 1945.

Reply to  daveburton
January 24, 2017 5:34 am

“So it is reasonable to exclude data before 1930, or even, perhaps, before 1945.”
The trouble with this kind of approach is that you are excluding information. You would need to show that there was no rise before AGW, then an accelerating rise since the GHE is considered significant.
If you preselect a period there is a likelihood that there will be some rise or accel which is not due to AGW but happens in that period, leading to spurious assumptions.
You need to work out what AGW says the forcing calculation should be and then compare to the longest records and see whether there is some matching both to periods with negligible AGW and recent periods where it should be strongest.
Either you sum the forcing over time as I did, or you go to a second-order approximation and work out what the feedbacks are to increasing temp (primarily Planck f/b) and subtract that from the sum of the basic forcing.
In view of the lack of understanding of climate that will quickly become speculative and open to question.
For my money the simple first approximation of the integral of radiative forcing is the most useful first step to get an overview.
All this is done with arbitrary scaling, so even if there is a match in form it is not proof. You’d then need to look at OHC, depth of warming, etc., and it quickly gets unhelpful.

Reply to  daveburton
January 24, 2017 7:00 am

Currently, nothing is excluded from the calculations. That’s clearly a mistake for cases like Seward, so one of my “to-do items” is to default to starting the regressions for such sites after the earthquake or other discontinuity. When I get the proverbial round tuit.

cerescokid
January 22, 2017 5:41 am

Dave
A terrific and useful website. I’ve spent a little time comparing the graphs of NOAA and PSMSL to see how close the trends are for each site.
I have a question about Gibara, Cuba. The trend for 40 years is remarkably similar. But NOAA stopped at 2011, and PSMSL continued to 2014 with a straight-up rise of 1 meter in 2012.
Obviously, there is an error. But my question is whether you know of any coordination and reconciliation by individuals in these two systems. I’m surprised that such a large error is allowed to stay in their systems.

Reply to  cerescokid
January 22, 2017 6:48 am

The data is what is recorded. That is proper. If there are QA issues, this kind of thing needs to be checked against metadata like site and equipment changes, which are hopefully logged.
DB did specifically draw attention to this issue.

Reply to  Greg
January 22, 2017 7:07 am

The Gibara problem is my own bug. My code combines data from two sources, and it’s not quite as easy as that sounds. PSMSL’s data is in mm, unadjusted. NOAA’s is in meters, referenced to a different baseline value, and the version I prefer to use is seasonally adjusted (i.e., with a calculated seasonal cycle subtracted off).
Usually, my code successfully translates from one to the other, so that I can combine the two data sources. Obviously my code broke for Gibara. I’ve not yet had time to find the bug, but I’m sure it’s my bug.
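Roughly, the conversion looks something like this (a simplified sketch with hypothetical inputs, not my actual code, and ignoring the seasonal-adjustment wrinkle): convert NOAA’s metres to millimetres, then estimate the datum offset from the months where the two series overlap.

import numpy as np

def combine_sources(psmsl_mm, noaa_m):
    """Merge two {decimal_month: MSL} dicts; PSMSL in mm, NOAA in metres."""
    noaa_mm = {t: v * 1000.0 for t, v in noaa_m.items()}       # metres -> mm
    overlap = sorted(set(psmsl_mm) & set(noaa_mm))
    if not overlap:
        raise ValueError("no overlapping months to estimate the datum offset")
    offset = np.mean([psmsl_mm[t] - noaa_mm[t] for t in overlap])
    merged = {t: v + offset for t, v in noaa_mm.items()}        # shift NOAA onto the PSMSL datum
    merged.update(psmsl_mm)                                     # prefer PSMSL values where both exist
    return merged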

Reply to  cerescokid
January 22, 2017 6:58 am

That’s a bug in my code. It’s already on my to-do list, but I haven’t gotten to it yet.
Workaround for now is to change the data source to “PSMSL” (at the bottom of the options):
http://www.sealevel.info/MSL_graph.php?id=Gibara&datasource=psmsl

Jeremy Shiers
January 22, 2017 6:19 am

Dave a great site which I shall definitely be using from now on
In 2014 I fitted a linear trend at all sea-level observation sites in PSMSL:
http://jeremyshiers.com/sealevels/20140814/rlr_monthly/summary_rlr_monthly.html
though these results are static
Looking at the rates of SLR for all sites, it is clear this is affected by the number of years of data:
Santana in Brazil was -300 +/- 200 mm/year but only one year of data
Pulupandan, Negros Occidental in Philippines was 120 +/- 50 mm/year again with one year of data
Miyake Sima in Japan had a rate of 7.7 +/- 0.7 mm/year, but there is a clear step in the data which the metadata explains is the result of an earthquake.
Tide gauges are sometimes moved, by several meters vertically; the reported figures are adjusted to take account of these moves, though a tiny error in the adjustment would affect the SLR of the adjusted figures.
Adjustments are sometimes made retrospectively and can be huge. My favorite is Workington in the UK, which was adjusted from +7.85 to −7.24 during 2011.
Histogramming rates of SLR at locations with more than 60 years of observations shows rates bunched much more tightly around 1 mm/year compared to using all sites. And using sites such as Brest (206 years) shows no sign that I can see of recent acceleration in SLR.
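The record-length filtering described here is easy to sketch (hypothetical data layout, not Jeremy’s actual script): keep only stations with at least 60 years of data, then histogram the fitted trends.

import numpy as np
import matplotlib.pyplot as plt

# stations: list of (name, years_of_data, trend_mm_per_yr) tuples (assumed layout)
def plot_trend_histogram(stations, min_years=60):
    trends = [trend for _, years, trend in stations if years >= min_years]
    plt.hist(trends, bins=np.arange(-5.0, 10.5, 0.5))
    plt.xlabel("MSL trend (mm/yr)")
    plt.ylabel("number of stations")
    plt.title(f"Stations with >= {min_years} years of data (n = {len(trends)})")
    plt.show()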

Reply to  Jeremy Shiers
January 22, 2017 6:53 am

Another thing that can lead to bias in attempting to draw global trends or other figures is that land movements vary a lot in both directions and are not evenly distributed among the sites we have; many of the longer records are centred on Europe or N. America.
This is quite a difficult subject as a quick look at the literature will reveal.
DB’s new site will, however, provide a handy tool for looking at specific records.

Reply to  Jeremy Shiers
January 22, 2017 6:59 am

“And using sites such as Brest (206 years) shows no sign that I can see of recent acceleration in SLR”
There are three major breaks in that record. You cannot just ignore that and assume the various sections are compatible.

Reply to  Jeremy Shiers
January 22, 2017 11:20 am

Very interesting web page, Jeremy; thank you for the link!
I included sites with at least a decade of MSL data, but the literature indicates that 50-60 years are needed to deduce a robust trend from a single tide gauge. Workington has just under 24 years, which is nowhere near long enough:
http://www.sealevel.info/MSL_graph.php?id=Workington
Skagway, AK, USA is presumably experiencing PGR:
http://www.sealevel.info/MSL_graph.php?id=Skagway
Miyake Sima, Japan was apparently affected by tectonic activity:
http://www.sealevel.info/MSL_graph.php?id=Miyake
PSMSL has added another note since you made your page. It says, “The apparent datum shift occuring during 2000 has been investigated. What was originally thought of as a datum error is now recognised as a real event. On July 14th 2000 Mount Oyama on the island began a series of eruptions. By September of that year the island had been completely evacuated. After a 4 year period of volcanic emissions residents were allowed to return permanently in February 2005. Because this was a real event the data point flags have been removed. However, the station flag remains.”
Tribeni, India is astonishing. I wonder what’s going on there?
http://www.sealevel.info/MSL_graph.php?id=Tribeni
Felixstowe is interesting. The old data that you show isn’t listed by PSMSL, at all, now:
http://www.sealevel.info/MSL_graph.php?id=Felixstowe
http://www.psmsl.org/data/obtaining/stations/214.php

Reply to  daveburton
January 23, 2017 3:02 am

Some of the graphs that Jeremy pulls out demonstrate precisely why you cannot just “average” a ton of garbage and hope that it will mean something. You must do some serious QA pre-selection.

Med Bennett
January 22, 2017 6:37 am

Nice work – great site! I’ve just been arguing with a bunch of alarmists over a sea level story on the New Yorker magazine’s Facebook page – one of the few liberal sites that allow images in the comments. This will come in handy!

Reply to  Med Bennett
January 22, 2017 7:17 am

Look at Battery NY
y = B’ + M·x + A·x²
y = -114.824 + 2.983·x + 0.00412·x² mm
That’s 0.4 mm/year/century “acceleration”. Sod all.
No accel, no AGW.

Ron Richey
January 22, 2017 6:51 am

How long did it take you to develop this tool?
Amazing to the average Joe’s like me who visit here.
Thanks for creating it.
RR

Reply to  Ron Richey
January 23, 2017 12:30 am

I haven’t been keeping track, Ron. It’s been a big project, which took longer than I expected. A lot of other things have gotten neglected.

Reply to  daveburton
January 23, 2017 2:57 am

Engineering adage: add up everything you need to do and work out how long it will take; then double it and add 10% 😉

Reply to  daveburton
January 23, 2017 8:07 am

Is that before or after you show it to a bunch of other people, and collect their feature requests? 😉
Another engineering adage: the last 10% takes 90% of the work.

John MacDonald
January 22, 2017 12:13 pm

Dave Burton: You have created a wonderful site. Thank you.
1. I see some West Coast Canadian stations in the GLOSS list. Perhaps a next step could be to add a separate station list for all of Canada. Or have I just not found the East Coast stations yet?
2. Check Elfin Cove, AK. Two errors occur when displaying the data.

Reply to  John MacDonald
January 23, 2017 12:38 am

Thank you, John.
Those warnings are “debug code.” Elfin Cove has only 10 years of data, and a lot of very short data records trigger those warnings.
It’s because my code does some of the calculations two or more ways, and compares the results, to try to make sure I’ve done it right. For very short records the calculations often don’t produce exactly the same answers. When I figure out which way is best, I’ll use it and suppress the warnings.

January 23, 2017 2:51 am

To get a more rigorous comparison of the form of potential CO2 “forcing” and changes in MSL, I calculated the cumulative sum (integral) of the simple 5.35 · ln(CO2/280) formula of the IPCC and rescaled both plots to full height (i.e. physically arbitrary scaling, but allowing a better comparison).
Dave’s graphs flatten the MSL, which does not really show the form of the variation.
I have also applied a 12-month gaussian filter to get rid of the fuzz.
http://climategrog.files.wordpress.com/2017/01/co2_battery_msl.png
data sources and code here:
https://climategrog.wordpress.com/co2_battery_msl/
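In outline, the calculation is just this (a sketch only, with hypothetical input arrays; the actual data sources and script are at the link above):

import numpy as np
from scipy.ndimage import gaussian_filter1d

def compare_forms(co2_ppmv, msl_mm):
    """Monthly CO2 (ppmv) and monthly MSL (mm), both as 1-D arrays."""
    forcing = 5.35 * np.log(co2_ppmv / 280.0)         # W/m^2, simple IPCC expression
    cum_forcing = np.cumsum(forcing) / 12.0           # integrate monthly values -> W·yr/m^2
    msl_smooth = gaussian_filter1d(msl_mm, sigma=12)  # ~12-month gaussian to remove the fuzz

    def rescale(y):                                   # normalise each to full height (0..1)
        return (y - y.min()) / (y.max() - y.min())

    return rescale(cum_forcing), rescale(msl_smooth)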

Reply to  Greg Goodman
January 23, 2017 2:52 am

The curvature in the middle portion of the graph is completely the wrong way around, though I’m sure a warmist would look at that and declare it is a “perfect” match.

Reply to  Greg Goodman
January 23, 2017 7:37 am

I don’t think it’s correct to integrate the CO2 forcing. It’s CO2 levels (well, the log of CO2 levels) that affects temperatures, not the integral of the log.
If all you could measure was CO2 emissions, so your “forcing” had to be CO2 emissions (rather than levels), then you could integrate it, to approximate its effect on levels. But CO2 levels already represent integrated CO2 emissions, to the exact, correct extent that nature “integrates” (accumulates) them. If you integrate levels, the result of that calculation seems divorced from anything physical.

Reply to  Greg Goodman
January 23, 2017 4:21 pm

Dave, if you convert solar radiation to heat (hot water) using a rooftop water heater, it matters how long the sun is shining during the day. The temperature increase is the integral of the incoming radiation over time.
That’s essentially what I am doing here.

Reply to  Greg Goodman
January 23, 2017 7:58 am

The “flattened” scaling of the vertical axis is simply a consistent vertical scale, intended to be broad enough to “work” for most sites, and similar to what NOAA uses.
The way my code draws the vertical axis, there are always nine labeled points. By default, they are 0.15 meters apart, which is what NOAA generally uses. When that would result in any of the traces not fitting on the graph, the vertical axis increments are increased: to 0.20, 0.25, 0.30, etc., per increment, as necessary, to make everything fit.
0.15 meters per increment was NOAA’s choice for most of their MSL graphs, but I think it was a reasonable one, which is why I used it, too.
It’s a compromise, of course. If you make the default increment much larger, so that fewer graphs require non-standard scaling and the different sites’ graphs are more consistent, then for the most typical graphs everything looks almost like a horizontal line. Not good.
If you make the increment as small as you possibly can for each graph, so that the traces are scaled to “use” maximum possible vertical range, then when you compare the graphs of different sites most of them would look pretty much the same. The graph thumbnail sheets would become quite misleading, because the graphs for the sites with the highest rates of SLR would look just like the graphs for the sites with lowest positive rates of SLR. Only the sites where MSL is falling would look distinctly different — also not good.
http://www.sealevel.info/MSL_global_thumbnails5.html
I guess I could add the vertical scaling to the list of options, and let the user override it, if he wishes.
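In rough pseudocode, the rule is this (a sketch of the logic described above, not the site’s actual source):

def choose_increment(data_range_m, n_intervals=8, default=0.15, step=0.05):
    """Nine labelled ticks = eight intervals; grow the increment until everything fits."""
    k = 0
    inc = default
    while inc * n_intervals < data_range_m:
        k += 1
        inc = round(default + k * step, 2)   # 0.15, 0.20, 0.25, 0.30, ... metres
    return inc

print(choose_increment(1.0))   # -> 0.15 (fits in the default scale)
print(choose_increment(1.4))   # -> 0.2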

Reply to  daveburton
January 23, 2017 3:54 pm

I used gnuplot to produce the graph above. It automatically scales the data as large as possible to the nearest axis tickmarks.
A typical, more rigorous, way would be to “normalise” both variables to standard deviations.
If the idea is to allow the user to make a visual comparison of the two the noise needs to be removed from the MSL data with a suitable low-pass filter and the scaling needs to be compatible.
Unless we can assume that climate will equilibrate in a decade or so, and thus say dRad = λ·dT, we do need to integrate the forcing power term to get an energy quantity which can be compared to the change in thermal energy to which thermal expansion is attributed.
Since IPCC says it takes many hundreds of years to reach a new equilibrium, then that second option is the correct way to go.
However, IMO, the main thing is filtering the noise and proper scaling to enable a realistic comparison. Just sticking ppmv on top of a flattened MSL is misleading.
Since both are rising there will be some similarity but a proper comparison shows that it does not go further than that and the form of the changes are not suggestive of a direct causal relationship.
All climate variables go either up or down and thus can be suggested to be either correlated or anti-correlated to CO2 if we draw straight lines through any and all data.

Reply to  daveburton
January 23, 2017 5:41 pm

Greg wrote, “…Unless we can assume that climate will equilibrate in a decade or so…”
W/r/t temperatures, it is generally expected that about 2/3 of the effect of a forcing will be seen within twenty years. That’s why ECS is generally estimated to be about 1.5 times TCR.

Reply to  daveburton
January 24, 2017 1:25 am

Thanks Dave,
My study of the tropical response to the Mt Pinatubo eruption found much faster equilibration, which implies very low climate sensitivity to radiative forcing, but this is not what the IPCC etc. say.
https://climategrog.wordpress.com/2015/01/17/on-determination-of-tropical-feedbacks/
what is the source for that 20y figure?

Reply to  daveburton
January 24, 2017 3:14 pm

Thanks for the reply Dave , but you said:

W/r/t temperatures, it is generally expected that about 2/3 of of the effect of a forcing will be seen within twenty years.

When I ask where you got that, you send me to a Google search for a definition of TCR.
TCR is the effect of a 1% per year increase over 70 years (a doubling), measured at the end of the 70 years. That does not answer the question of your 20y claim. On a quick check of a few results there, I do not see any mention of your 20 years. What are you basing that on?
thx.

Reply to  daveburton
January 24, 2017 3:22 pm

https://www.ipcc.ch/ipccreports/tar/wg1/fig9-1.htm
IPCC graphs do not seem to settle in 20y!

Reply to  daveburton
January 24, 2017 3:23 pm

[embedded image]

Reply to  daveburton
January 24, 2017 3:37 pm

This is what I get from a google search. Is it not what you get?
http://www.sealevel.info/tcr_screenshot.png

Reply to  daveburton
January 24, 2017 4:55 pm

Here’s a 2011 paper in which they define an alternate term, “Transient Climate Sensitivity” (TCS), which they say “may be expected to be very similar to the TCR.” They report the finding that “…surface temperatures respond quite quickly to a change in radiative forcing, reaching a quasi-equilibrium on the timescale of a few years… prior to a much slower evolution to the true equilibrium (e.g., Held et al. 2010)…”
They also say, “As long as the CO2 doubles over a time period short enough for deep ocean temperature to remain far from equilibrium (less than 100 years, for example), the response to that doubling will likely be nearly independent of the emissions path.”

Reply to  daveburton
January 24, 2017 8:34 pm

Following the lead of global warming climatologists, one might assert that the ratio between a basketball player’s height and the circumference of his big toe is “the height-to-big-toe-circumference sensitivity,” implying that this ratio is a constant, even though that implication is preposterous, as there is no reason to believe that this ratio is a constant.

Reply to  daveburton
January 26, 2017 2:01 am

Thanks Dave, I had not seen the text you highlight, though it does appear lower down in the results I get. I would have seen that it was a WP entry and ignored it without reading, since WP is totally unreliable for anything about climate. However, if you read it carefully, it just says the TCR is the change in temp over 20y centred on the end of the 1%-per-year-for-70-years increase. So that is just an arbitrary period over which they measure the slope; it is not the time to settle.
” … is defined as the average temperature response over a twenty-year period centred at CO2 doubling in a transient simulation with CO2 increasing at 1% per year.”
“Over the 50–100 year time-scale, the climate response to forcing is likely to follow the TCR;”
So they are expecting the 20y slope to be representative of the change over 50-100 y. That is more in keeping with the IPCC graph I posted. It seems that you misinterpreted what that 20y period represented.
I think this justifies the calculation I did as a first approximation of accumulated energy, to be compared to thermal expansion of the oceans and thus the MSL time series. However, if you prefer to assume instantaneous (one year) equilibration and just take the instantaneous forcing instead of the sum, the result is not hugely different, so probably not worth a more detailed argument in this context.
The main question is putting the two variables full-scale on the graph. I realise why you are using the common, one-size-fits-all scaling for MSL, and NOAA does the same when just presenting MSL.
The problem arises when you put another data set on the same graph with a different vertical scaling. The result is misleading as I have detailed above.

Catcracking
Reply to  Greg Goodman
January 23, 2017 2:57 pm

Just look at the actual plot of the data for the Battery, not the manipulated one shown by Greg. The real plot tells a clear story.
Dave, thank you for this site; it will be helpful for understanding sea level rise.

Reply to  Catcracking
January 23, 2017 4:02 pm

I have produced an “actual plot” as well. What exactly is your objection to the processing, apart from your finding that the result provides less confirmation bias?
Why is one plot more “real” than the other? Please try to make a reasoned criticism.
I too think Dave has made a great site, though he has quite a few things on his TODO list. I’m sure it will get even better.

Reply to  Catcracking
January 23, 2017 4:09 pm

BTW Dave used the 75y “smoothed” data from Law Dome. I used their 20y “smoothed” version. All data is “manipulated”. If you have a technical objection I’ll try to address it.

Reply to  Catcracking
January 24, 2017 2:38 am

The only thing I “manipulated” at the Battery was to filter out the annual noise and use the full range of the vertical scale, so that we can see the longer-term variation more clearly.
If the MSL data is to be presented within two vertical boxes, the CO2 data should be shown on a similar scale. Stretching one and not the other gives a misleading impression that they are very different, like one is flat and the other ramps up.
My graph shows that for much of the record MSL at NY is decelerating just when CO2 forcing is ramping up. One can then argue about what that may indicate but it is a proper way to compare the variability in the two records.

Stephen M
January 23, 2017 12:37 pm

Excellent resource!
One comment: The pages listing data for individual sites present trend and acceleration data in mm/year and mm/year^2, respectively. However, the spreadsheets for a collection of sites (e.g., “NOAA’s 2016 list of 375 long term trend tide stations”) appear to use meters/year^2 for acceleration values, resulting in very small numbers. Putting acceleration data in mm/year^2 rather than meters/year^2 might be less confusing and would allow more significant digits in the spreadsheets.
Thanks!

Reply to  Stephen M
January 23, 2017 5:33 pm

Oh, my, you are right, Stephen! That’s a big, fat, ugly bug. I’ll eradicate it in a few hours.
Thank you, for noticing this, and bringing it to my attention.

Reply to  Stephen M
January 23, 2017 9:46 pm

Fixed, with gratitude. You’re top of the leaderboard for “best bug report” award, Stephen.

Peter s
January 23, 2017 1:37 pm

This information makes a mockery of Al Gore’s latest movie.

Reply to  Peter s
January 23, 2017 4:04 pm

No, Al Gore makes a mockery of Al Gore’s latest movie. Even climate alarmists think it is boring as hell.

Catcracking
January 23, 2017 2:49 pm

As a skeptic, I am still trying to get my arms around this from Rutgers, especially the sea-level rise table for along the shore: 3.5 feet by 2100 (central) and 10.1 feet (collapse). Any comments?
http://geology.rutgers.edu/images/stories/faculty/miller_kenneth_g/Sealevelfactsheet7112014update.pdf

Reply to  Catcracking
January 23, 2017 3:36 pm

The mean sea level (MSL) trend at Atlantic City, NJ, USA is +4.08 mm/year with a 95% confidence interval of ±0.16 mm/year, based on monthly mean sea level data from 1911/9 to 2016/11. That is equivalent to a change of 1.34 feet in 100 years.
http://www.sealevel.info/MSL_graph.php?id=180
The mean sea level (MSL) trend at Sandy Hook, NJ, USA is +4.06 mm/year with a 95% confidence interval of ±0.21 mm/year, based on monthly mean sea level data from 1932/11 to 2016/11. That is equivalent to a change of 1.33 feet in 100 years:
http://www.sealevel.info/MSL_graph.php?id=960-101
The mean sea level (MSL) trend at Cape May, NJ, USA is +4.55 mm/year with a 95% confidence interval of ±0.54 mm/year, based on monthly mean sea level data from 1965/12 to 2016/11. That is equivalent to a change of 1.49 feet in 100 years.
http://www.sealevel.info/MSL_graph.php?id=Cape+May
New York City saw a small amount of apparent sea-level rise acceleration in the late 19th century, but none in the last hundred years. The mean sea level (MSL) trend at The Battery, NY, USA is +3.10 mm/year with a 95% confidence interval of ±0.16 mm/year, based on monthly mean sea level data from 1916/12 to 2016/11.‡ That is equivalent to a change of 1.02 feet in 100 years.
http://www.sealevel.info/MSL_graph.php?id=8518750&c_date=1916/12:2019/12
(‡Light blue data is excluded from regression calculations)
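(The “feet in 100 years” figures are just unit conversions: the trend in mm/yr × 100 yr ÷ 304.8 mm/ft. For example:)

def mm_per_yr_to_ft_per_century(trend_mm_per_yr):
    return trend_mm_per_yr * 100.0 / 304.8    # 304.8 mm per foot

print(round(mm_per_yr_to_ft_per_century(4.08), 2))   # Atlantic City -> 1.34
print(round(mm_per_yr_to_ft_per_century(3.10), 2))   # The Battery   -> 1.02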

Catcracking
Reply to  daveburton
January 23, 2017 5:35 pm

Dave, many thanks for your reply. It appears that the Rutgers claim is severely erroneous. Ergo, I plan to keep my shore house on Barnegat Bay, which did not flood during Superstorm Sandy, but it was close.

Peter S
January 23, 2017 6:24 pm

It is good to see an honest environmentalist at work. I am sure Donald Trump would have no problem funding honest researchers after he reviews, and puts a squeeze on, where the money is going at the moment.

January 23, 2017 7:28 pm

Regression analysis fails to respond to the so-called “problem of induction.” The problem is how, in a logically justifiable fashion, to reach general conclusions from specific instances. This problem has been solved, but most people don’t know of this advance.

Chimp
January 24, 2017 3:42 am

Arctic sea ice headed back toward the Neutral Zone at warp drive:
http://nsidc.org/arcticseaicenews/charctic-interactive-sea-ice-graph/

Bob F
January 24, 2017 6:26 am

Add an alternative SLR metric: “Number of years to half submerge the Statue of Liberty”

Reply to  Bob F
January 24, 2017 6:45 am

+1   😉
Likewise, water volumes should be specified in “Lake Eries,” ice & water areas in “Manhattans,” and heat/energy in “Hiroshimas.”

January 29, 2017 11:19 am

I’m working on a “country/coastline code list” page, for finding tide gauge sites by country & coastline. It’s about 3/4 done:
http://www.sealevel.info/coastline_codes.html

January 30, 2017 3:21 am