Further Problems with Kemp and Mann

Guest Post by Willis Eschenbach

In my previous post I discussed some of the issues with the paper “Climate related sea-level variations over the past two millennia” by Kemp et al. including Michael Mann (Kemp 2011). However, some commenters rightly said that I was not specific enough about what Kemp et al. have done wrong, so here’s what further investigation has revealed. As there is no archive of their reconstruction results, I digitized their estimate of reconstructed global sea level rise as shown in their Figure S2 (A). First, here is their Figure, showing their reconstruction of sea level.

Figure 1. Kemp Figure S2 (A) SOURCE 

I digitized the part of their graph from 1650 onwards, to compare it to recent observations. Figure 2 shows those results:

Figure 2. Kemp 2011 reconstructed global sea level change, 1650-2000 

So what’s not to like in these latest results from Kemp and Michael Mann?

The first thing that seems strange is that they are claiming that globally there has been a sea level rise of 200 mm (8 inches) in the last fifty years (1950-1999). I know of no one else making that claim. Church and White estimate the rise from 1950 to 2000 at 84 mm (three and a quarter inches), and Jevrejeva says 95 mm (three and three-quarters inches), so their reconstruction is more than double the accepted estimates …

The next problem becomes apparent when we look at the rate of sea level rise. Figure 3 shows the results from the Kemp 2011 study, along with the MSL rise estimates of Jevrejeva and Church & White from worldwide tidal gauges.

Figure 3. Kemp 2011 reconstructed rate of global sea level rise, 1650-2000, along with observations from Jevrejeva (red circles) and Church and White (purple squares).

Kemp et al. say that the global rate of sea level rise has risen steadily since 1700, that it exceeded 3 mm per year in 1950, that it has increased ever since, and that in 2000 it was almost 5 mm/year.

Jevrejeva and Church & White, on the other hand, say it has never been above 3 mm/year, that it varies up and down with time, and in 2000 it was ~ 2 mm/year. In other words, their claims don’t agree with observations at all.

In addition, the Kemp 2011 results show the rate of sea level rise started increasing about 1700 … why would that be? And the rate has increased since then without let-up.

So we can start with those two large issues — the estimates of Kemp et al. for both sea level and sea level rise are very different from the estimates of established authorities in the field. We have seen this before, when Michael Mann claimed that the temperature history of the last thousand years was very different from the consensus view of the time. In neither case has there been any sign of the extraordinary evidence necessary to support their extraordinary claims.

There are further issues with the paper, including in no particular order:

1. Uncertainties. How are they calculated? They claim an overall accuracy for estimating the sea level at Tump Point of ± 40 mm (an inch and a half), yet they say their “transfer function” alone has errors of ± 100 mm (4 inches). Since the transfer function is only one part of their total transformation, how can the end product be more accurate than one of its components?

2. Uncertainties. The uncertainties in their Figure S2 (A) (shaded dark and light pink in Figure 1 above) are constant over time. In other words, they say that their method is as good at predicting the sea level two thousand years ago as it is today … seems doubtful.

3. Uncertainties. In Figure 4(B) of the main paper they show the summary of their reconstruction after GIA adjustment, with the same error bands (shaded dark and light pink) as shown in Figure S2 (A) discussed above. However, separately in Figure 4(B) they show a much wider range of uncertainties due to the GIA adjustment. Surely those two errors should add in quadrature, resulting in an overall error band wider than either one alone, as the sketch below illustrates.
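To make that concrete, here is a minimal sketch of the standard rule for combining independent errors. The ± 100 mm figure is the paper’s stated transfer-function error; the ± 50 mm GIA figure is my invented placeholder, since the paper shows the GIA uncertainty graphically rather than as a number, and treating the two as independent is my assumption:

```python
import math

def add_in_quadrature(*errors):
    """Combine independent errors: the square root of the sum of squares."""
    return math.sqrt(sum(e ** 2 for e in errors))

# Hypothetical illustration: a +/- 100 mm reconstruction error combined
# with a (made-up) +/- 50 mm GIA error must give a wider band than either.
combined = add_in_quadrature(100, 50)
print(f"combined 2-sigma error: +/- {combined:.0f} mm")  # +/- 112 mm
```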

4. Tidal range. If the tidal range has changed over time, it would enter their calculations as a spurious sea level rise or fall. They acknowledge the possible problem, but they say it can’t happen, based on computer modeling. However, they would have been better advised to look at the data rather than foolishly placing their faith in models built on sand. The tidal range at Oregon Inlet Marina, a mere ten miles from their Sand Point core location, has been increasing at a rate of 3 mm per year, which is faster than the Kemp reconstructed sea level rise at Sand Point. Since we know for a fact that changes in tidal range are happening, their computerized assurance that they can’t happen rings more than a bit hollow. This is particularly true given the large changes in the local underwater geography in the area of Sand Point. Figure 4 shows some of those changes:

Figure 4. The changes in the channel between Roanoke Island and the mainland, from 1733 to 1990.

Note the shallows between the mainland and the south end of Roanoke Island in 1733, which are noted on charts up to 1860, and which have slowly disappeared since that time. You can also see that there are two inlets through the barrier islands (Roanoke Inlet and Gun Inlet) which have filled in entirely since 1733. The changes in these inlets may be responsible for the changes in the depths off south Roanoke Island, since they mean that the area between Roanoke and the mainland can no longer easily drain out through the Roanoke Inlet at the north end as it did previously. In light of all that, their claim that changes of this magnitude would not alter the tidal range seems extremely implausible.

5. Disagreement with local trends in sea level rise. The nearest long-term tide station, at Wilmington, shows no statistically significant change in the mean sea level (MSL) trend since 1937. Kemp et al. say the rise has gone from 2 mm/year to 4.8 mm/year over that period. If so, why has this not shown up at Wilmington (or at any other nearby location)?

6. Uncertainties again, wherein I look hard at the math. They say the RMS (root mean square) error in their transfer function is 26% of the total tidal range. Unfortunately, they neglected to report the total tidal range; I’ll return to that in a minute. Since 26% is the RMS error, the 2 sigma error is about twice that, or 50% of the tidal range. Consider that for a moment. The transfer function relates the foraminiferal assemblage to sea level, but the error is half of the tidal range … so at best their method can’t even say with certainty whether the assemblage came from above or below the mean sea level …
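The arithmetic behind that statement, as a sketch (assuming, as I do above, that an RMS prediction error can be treated as roughly a 1 sigma value):

```python
rms_fraction = 0.26                    # stated RMS error, as a fraction of the tidal range
two_sigma_fraction = 2 * rms_fraction  # treating RMS error as ~1 sigma
print(f"2-sigma error is about {two_sigma_fraction:.0%} of the tidal range")  # ~52%
```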

Since the tides are so complex and poorly documented inside the barrier islands, they use the VDatum tool from NOAA to estimate the mean tidal range at their sites. However, that tool is noted in the documentation as being inaccurate inside Pamlico Sound. The documentation says that unlike all other areas, whose tidal range is estimated from tidal gauges and stations, in Pamlico Sound the estimates are based on a “hydrodynamic model”.

They also claim that their transfer function gave “unique vertical errors” for each estimate that were “less than 100 mm”. This implies that their 2 sigma error was 100 mm. Combined with the result above that their 2 sigma transfer function error is 50% of the tidal range, this in turn implies that the tidal range is only 200 mm or so at the Sand Point location. This agrees with the VDatum estimate, which is almost exactly 200 mm.

However, tides in the area are extremely location dependent. Tidal ranges can vary by 100% within a few miles. This also means that the local tidal range (which depends strongly on the local geography) is very likely to have changed over time. Unfortunately, these local variations are not captured by the VDatum tool. You can download it from here along with the datasets. If you compare various locations, you’ll see that VDatum is a very blunt instrument inside Pamlico Sound.

That same VDatum site gives the Pamlico Sound two sigma errors (95% confidence interval) in converting from Mean Sea Level to Mean Higher High Water (MHHW) as 84 mm, and to Mean Lower Low Water (MLLW) as 69 mm.

The difficulty arises because the tidal range is so small. All of their data is converted to a “Standardized Water Level Index” (SWLI). This expresses the sample elevation as a percentage of the tidal range, from 0 to 100: zero means the sample is at Mean Lower Low Water (MLLW), and 100 means it is at MHHW. The tidal range is given as 200 mm … but because it is small and the errors are large, the 95% confidence interval on that tidal range runs from 90 mm to 310 mm, a spread of more than three to one.

Their standardized water level index (SWLI) is calculated as follows:

SWLI = (Sample Elevation – MLLW) / (MHHW – MLLW) x 100     (Eqn. 1)

When adding or subtracting quantities, the errors add in quadrature. The sample elevation error (from the transfer function) is ± 100 mm. The MLLW and MHHW two sigma errors are 69 mm and 84 mm respectively.

So … we can put some numbers to Equation 1. For ease of calculation, let’s suppose the sample elevation is 140 mm, MLLW is 0 mm, and MHHW is 200 mm. Mean sea level is halfway between high and low, or about 100 mm. Including the errors (shown as “±” values), the numerator of Eqn. 1 becomes (in mm)

(Sample Elevation – MLLW) = (140 ± 100 – 0 ± 69) 

Since the errors add “in quadrature” (the combined error is the square root of the sum of the squares of the individual errors), this gives us a result of 140 ± 122 mm

Similarly, the denominator of Eqn. 1 with errors adding in quadrature is

(MHHW – MLLW) = (200 ± 84 – 0 ± 69) = 200 ± 109 mm

Now, when you divide or multiply numbers that have errors, you need to first express the errors as a percentage of the underlying amount, then add them in quadrature. This gives us

(140 ± 87%) / (200 ± 55%) *100

This is equal to (0.7 ± 103%) × 100, or 70 ± 72, where both numbers are in SWLI units (percentages of the tidal range). Since the tidal range is 200 mm, the total uncertainty on our sample is about 72 percent of that, or ± 144 mm. So at the end of all their transformations, the uncertainty in the sample elevation (± 144 mm) is larger than the sample elevation itself (140 mm).
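To pull the whole worked example together, here is a short script that simply mechanizes the propagation rules used above (differences combine absolute errors in quadrature; the quotient combines relative errors in quadrature). The inputs are the assumed values from the text; any small difference from my hand-rounded figures is just rounding:

```python
import math

def quad(*errs):
    """Combine independent absolute errors in quadrature."""
    return math.sqrt(sum(e ** 2 for e in errs))

# Assumed 2-sigma inputs from the worked example (all in mm)
sample, e_sample = 140.0, 100.0  # sample elevation, transfer-function error
mllw, e_mllw = 0.0, 69.0         # MLLW and its VDatum conversion error
mhhw, e_mhhw = 200.0, 84.0       # MHHW and its VDatum conversion error

# Numerator and denominator of Eqn. 1, errors in quadrature
num, e_num = sample - mllw, quad(e_sample, e_mllw)  # 140 +/- ~122 mm
den, e_den = mhhw - mllw, quad(e_mhhw, e_mllw)      # 200 +/- ~109 mm

# Division: relative errors combine in quadrature
swli = num / den * 100
e_swli = swli * quad(e_num / num, e_den / den)      # ~72 SWLI units

tidal_range = mhhw - mllw
print(f"SWLI = {swli:.0f} +/- {e_swli:.0f}")                        # 70 +/- 72
print(f"as an elevation: +/- {e_swli / 100 * tidal_range:.0f} mm")  # ~143 mm
```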

All of that, of course, assumes that I have correctly interpreted their very unclear statements about the uncertainties in their work. In any case, how they get a Tump Point two sigma error of about 40 mm (an inch and a half) out of all of that is a great mystery.

Those are my problems with the study. Both the rate and the amount of their reconstructed sea level rise in the last fifty years are much greater than observations; tidal ranges in the area are varying currently and are quite likely to have varied in the past despite the authors’ assurances otherwise; and their methods for estimating errors greatly underestimate the total uncertainty.

w.

[UPDATE] One other issue. They say regarding the C14 dating:

High-precision 14C ages (8) were obtained by preparing duplicate or triplicate samples from the same depth interval and using a pooled mean (Calib 5.0.1 software program) for calibration.

This sounded like a perfectly logical procedure … until I looked at the data. Figure 5 is a plot of the individual data, showing age versus depth, from Supplementary Tables DR3 and DR4 here. They have used the “pooled mean” of three samples at 60 cm depth, and three samples at 80 cm depth.

Figure 5. Age and depth for the Sand Point samples in the top metre of the core. Red squares show C14 dates. Horizontal black bars show the 2-sigma uncertainty (95% confidence interval).

Look at the 60 cm depth. The three samples that they tested dated to 1580, 1720, and 1776. None of their error bars overlap, so we are clearly dealing with three samples that are verifiably of different ages.

Now, before averaging them and using them to calibrate the age/depth curve … wouldn’t it make sense to stop and wonder why two samples taken from the exact same one-centimetre-thick slice of the core are nearly two hundred years apart in age?

The same is true at the 80 cm depth, where the ages range from 1609 to 1797. Again this is almost a two hundred year difference.
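For reference, the standard “pooled mean” machinery (the Ward and Wilson approach that, as I understand it, Calib implements) is an inverse-variance weighted mean together with a chi-square test of whether the replicates could share a single true age. Below is a sketch using the three 60 cm calendar dates from Figure 5 with a hypothetical ± 30 year one sigma error, since I don’t have their lab values (and noting that Calib actually pools the raw 14C ages before calibration, not calendar dates):

```python
import math

def pooled_mean(ages, sigmas):
    """Inverse-variance weighted mean of replicate dates, plus the
    Ward & Wilson test statistic T for whether the replicates are
    consistent with a single true age."""
    weights = [1 / s ** 2 for s in sigmas]
    mean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
    sigma = math.sqrt(1 / sum(weights))
    t = sum(w * (a - mean) ** 2 for w, a in zip(weights, ages))
    return mean, sigma, t

# Three dates from the same 60 cm slice (calendar years AD, Figure 5);
# the +/- 30 year 1-sigma errors are my assumption, for illustration only.
mean, sigma, t = pooled_mean([1580, 1720, 1776], [30, 30, 30])
print(f"pooled mean: {mean:.0f} +/- {sigma:.0f}")  # 1692 +/- 17
# T is compared against chi-square with n - 1 = 2 degrees of freedom;
# the 95% critical value is about 5.99. Here T is far larger, so the
# three dates fail the consistency test and pooling them is dubious.
print(f"T = {t:.1f} (consistent only if T <= 5.99)")  # T = 22.6
```

On those assumed errors, the three dates are wildly inconsistent with a single age, which is exactly the point: the test that justifies pooling appears to fail before the pooling begins.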

What am I missing here? How does this make sense, to average those disparate dates without first figuring out what is going on?

90 Comments

KR
June 27, 2011 5:40 pm

Willis
I found out (by, gasp, asking one of the authors) that they had submitted additional information and code, but that PNAS hasn’t linked it to the article yet – they are working on that. I believe it’s reproducible without that extra data, but I do understand that mileage may vary.
The curve you are looking at is a very low order polynomial fit to the assorted data – the steep curve at the end of the 20th century is a result of fitting it to the severe change in slope at the end of the 19th century, and hence the overshoot at 2010. I personally would have preferred them to place four linear fits to the different slopes they had identified, based upon breakpoint analysis, but given that some of the changes (MWP, LIA) weren’t all that abrupt, I would have to play more with the stats to see which gave a better fit.
Table S1 in the supplemental data does give the raw values with variances, albeit without the GIA and site offsets – it’s not that hard to drop that into Excel and reproduce the raw datapoint curve and do your own fits.

My biggest issue with this column, and with your previous post on the subject, is that rather than starting from the assumption that the authors (of which Mann is again 4th of 6, not the lead) were honestly presenting data, you led out with innuendo and accusations of malfeasance. Where you had questions or points that you thought lacked clarity, you did not ask the authors for clarification (yes, I know you and Mann don’t get along, but there are 6 authors here), or read the references, or look at similar literature (all basic methods for clarifying what you are reading), but instead proceeded with more innuendo and accusations. The transfer function you complained about in the previous post, for example? Reference number 7, fully detailed, yet you didn’t bother to check the references, and just claimed it to be undisclosed with more innuendo. Yes, you would have to pay to read the article where it was published – not all journals are free, and everyone I know has to cough up the subscription prices for numerous journals to keep current.
You simply and straightforwardly attacked a paper due to (in my opinion) who one of the authors was, and because it presented a ‘hockey stick’ – regardless of the fact that the data was processed in a well established fashion for sea level reconstructions. That’s not science, Willis – that’s an agenda.
Given that approach, I really don’t expect you to treat the authors, their methods, or their data with any actual consideration of what’s presented, but rather with more polemic. I have to say that I’ve seen much better from you, and have enjoyed previous discussions on this site – I am quite disappointed.

June 27, 2011 8:01 pm

RoHa says:
June 26, 2011 at 5:24 pm
“O.K. Time for the easy version again.
Is sea level going up, down, or sideways?
Inquiring minds want to know.”

It’s only the sand that’s going up, down, and sideways, with every good storm. The C14 samples only tell us that warmmongers do not possess inquiring minds. In fact, I doubt any warmer will dare to take more than a cursory dismissive glance at this page. Even the trolls won’t touch it.

Ryan
June 28, 2011 2:58 am

KR: “In terms of predictive capability, I would definitely go with the tidal gauge data, as it’s far more precise. But in terms of long term reconstruction of the last 2K years, I believe (just my opinion, mind you) that the point of the paper is to establish that foraminiferal data provides a reasonable estimate.”
I think you can clearly see that it has NO capability for hindcasting: look at the tide gauge data for the last 100 years, and if you had used the North Carolina data and zeroed it to the present day, then by 1900 you would already be seeing a drop 50% larger than the real measured data show. If it is 50% out in hindcasting over 100 years, how accurate will it be after 2,000 years? You couldn’t use this data to give any clue about the sea level – you couldn’t even tell if the sea level was higher or lower (and this is BEFORE we get into the size of those massive error bars – bearing in mind that you need to go out BEYOND those marked to reach just a 90% confidence interval).
It really is time you opened your mind to the possibility that this paper is a pile of bull-crap.

KR
June 28, 2011 6:13 pm

Willis – Sadly, just the polemical response I expected.
Another successful interaction in the world of climate science obfuscation…
“…you might even believe that they chose that particular polynomial fit by chance or something, rather than picking it because of its ‘ski-jump’ shape.”
Rather than a better statistical fit across the several thousand years with minimum assumptions? Hmmm….
“Michael Mann has lied, concealed critical parts of his work and data, and destroyed evidence.”
“And no, reference number 7 did not give the details of the three transfer functions that they used in the Kemp 2011 study.”
Actually, it specifically gave the depth/species ratios you were whining about as ‘made up’ in your last post.
“Lack of transparency. They didn’t publish their data. They didn’t publish their code. They didn’t publish their ‘transfer functions’ (despite your claims to the contrary).”
Heh. Give it a week or so on the ‘code’ publishing – the author I contacted, Vermeer, was quite surprised that the data wasn’t linked yet. And the transfer function you complained about is published – in a clear reference you apparently hadn’t looked at until I mentioned it. Besides – reading the paper, I believe I could (with some time in the swamps) replicate and test this paper’s results, the major real concern with a well written paper.
[various complaints about the science]
It’s clear that you have either (a) not read the methods, or (b) do not understand them. The authors have specifically addressed concerns about GIA, wetland modification, etc., using methods standard to the field of Foraminifera sea level reconstructions, but you are simply emphasizing uncertainties without addressing how the authors approached them. Weak sauce…
So yes – I have questioned your motives, based upon what you have written – an attack post lacking in scientific content. As I said before, you have attacked a paper by Kemp et al (with Mann a contributing, but not lead author) on sea level from paleo records with additional work to see how it matches up to temperature reconstructions – apparently (since I don’t read minds) because one of the authors is a convenient Ad Hominem target, and more importantly because it clearly indicates climate change. Again – not science, but polemic. And the scientific validity of polemic is, well, zero.

T.Haugland
June 29, 2011 12:32 am

Has there been any research on how tectonic plate movement might influence sea level?
My guess is that most of the measured change can be explained that way…
Makes a lot more sense than temperature at least…

Ryan
June 29, 2011 2:50 am

KR: you know something, I agree with you on the Dr Mann thing. Kemp et al. have put their own names to this pile of poop and therefore they are all tarred with the same brush. It is unfair to single out Dr Mann as being particularly corrupt in this case – the whole team on this paper are liars.
Normal practice, in attempting to use a proxy to extend our understanding of something beyond the instrumental record, would be to ensure the proxy fits well within the instrumental period. This is something they have bent over backwards to avoid doing, with all manner of graphs that attempt to obscure the truth by failing to superimpose the hindcasts as they should. Figure S6 of the supporting document is as close as they get to showing it done properly, and it shows that the proxy is wildly inaccurate going back 100 yrs. It can’t possibly be relied upon to hindcast back 2,000 years – that should have been the only conclusion of the paper, but they claimed the opposite. They are LIARS. All of them are LIARS. There is plenty of reason to believe that the way the paper has been presented is to cover up the fact it is based on a LIE, from the way they have tried to obscure the weakness of the proxy to hindcast within the instrumental period, to the way they have tried to use change point regression analysis in a manner in which it can’t be reliably used, to the way they have used claimed “tide gauge data” which is actually 50% computer modelling based on assumed global temperatures, to the way they sought to avoid a genuine peer review process.
This paper is a travesty of the truth. It should be studied and picked apart by anybody with an interest in the truth until every lie within it has been exposed. It should be held up as a tragic example of how low certain scientists will stoop to promote certain false ideas. It should be used to pillory all the scientists that had the audacity to put their names to it. These people should never have the right to put their name to any scientific paper ever again, let alone any paper that is specifically related to AGW. They are condemned by their own words and actions.

kadaka (KD Knoebel)
June 29, 2011 5:22 am

Willis,
The earliest mention I can find of the transfer functions is this Horton et al 2006 paper (no Kemp). No SI. Free download from NC state site:
http://dcm2.enr.state.nc.us/slr/Horton%20et%20al%202006%20diatoms.pdf
Discussion section is about development and use, lots of details, type of transfer function used mentioned, but the elusive actual transfer function isn’t there. From the conclusions:

We developed the first diatom-based transfer function for the east coast of North America to reconstruct former sea levels with a precision of ±0.08 m. We applied the transfer function to construct a relative sea-level curve from fossil assemblages from Salvo, North Carolina. These results suggest a sea-level rise of 0.7 m over the last c. 150 years, at an average of 3.7 mm year⁻¹, which is consistent with existing sea-level data, and illustrates the utility of the transfer function approach.

There is mention of assorted caveats with using the method, and possible “cherry picking” among parameters for the best results which might be justified. Well, they say they were first, thus I shall presume the usual warnings against using the “first of” something apply.
This was referenced by:
Engelhart, S.E., B.P. Horton, and A.C. Kemp. 2011. Holocene sea level changes along the United States’ Atlantic Coast. Oceanography 24(2):70–79, doi:10.5670/oceanog.2011.28.
Free download, no SI. Transfer functions not mentioned by name, but much discussion of the methodology of relative sea level (RSL) reconstructions with math on error calculations. From the Concluding Remarks:

Comparisons between geological observations and tide-gauge data reveal a 10-20-cm sea level rise during the twentieth century in addition to long-term changes driven by land level changes.

Hope this helps.
Curious. Someone comes up with something “new” and “scientific,” and from that comes a rotating assemblage of co-authors churning out a nigh-endless stream of numerous papers for numerous journals which are basically just different presentations of the same data. Has such credential-padding long been a behavior in science, or is this an example of the spreading of Hockey Team Publishing Syndrome?

kadaka (KD Knoebel)
June 30, 2011 3:15 am

Willis,
I’m beginning to think these “transfer functions” are not something that can be printed out and analyzed. Following the Horton trail I found this 2005 paper, free download of this “postprint” version (note: link given for data repository on pg 30 doesn’t work):
http://repository.upenn.edu/ees_papers/39/
From page 11:

Foraminifera-based transfer functions have been developed using a unimodal-based technique known as weighted averaging partial least squares via the program CALIBRATE, release 0.70, 1997 (Juggins and ter Braak, 1997).

Reference citation:

Juggins, S. and ter Braak, C.J.F. 1997: CALIBRATE, Department of Geography, University of Newcastle.

Searching by the names yielded many mentions of the program with different version numbers, years (<2000), and often "unpublished software." It's apparently discipline-specific and used to make educated guesses about environmental variables like moisture and salinity based on the amounts of assorted micro-critters like diatoms.
Just tonight I Googled this (used only "Juggins", didn't see this with both names):
http://www.staff.ncl.ac.uk/staff/stephen.juggins/software.htm
CALIBRATE was an old DOS program by Juggins, elsewhere I've seen it mentioned as a C++ program. Some time ago it was replaced by C2, Windoze (XP, Vista), which also replaces another program mentioned in that Horton paper (M.A.T.). Free download and use, limitation is max 75 samples without license.
Thus the vaunted "transfer functions" may be nothing more than an input file of a proprietary format used for a piece of proprietary software. Black box functioning, just the results for output. Also, see this discussion. Alternatives in R and something called “paltran” are mentioned, one R package is “rioja” which is also by Juggins. But, the paltran version doing WA-PLS gives different results than rioja or C2. Not a good sign when a method can’t be properly replicated by others.
The WA-PLS method was laid out in much mathematical detail in a much-cited book chapter circa 1993 (Hydrobiologia), link to paywall of just that chapter. But what appears to be the same chapter appeared in another book (Multivariate Environmental Statistics), free download of chapter:
http://www.biometris.wur.nl/NR/rdonlyres/71EBBDE7-DE7B-4956-9BEA-2B807C34A8EF/52117/terBraak1993JugginsBirksVoetWAPLS.pdf
So if you could acquire the “transfer functions” you may just have some files for an old piece of DOS software that you’d also have to dig up, and still not know exactly what was being done without getting the source code or much likely-verboten reverse software engineering. Going by the licensing of C2 and the Kemp2011 SI saying the North Carolina dataset has 193 surface samples, you may also need the paid-for version which might not be available if you need the original CALIBRATE program.
You may have better luck waiting for KR to provide that “freely available” info. 😉

Wolfgang Flamme
July 1, 2011 6:58 pm

Willis, FYI:
Just stumbled upon this thread … over at Prof. Rahmstorf’s blog ‘KlimaLounge’ he pointed me to an archive of data and code since I was interested in the proxy-DCA too:
http://www.pik-potsdam.de/sealevel/en/data.html
Unfortunately the Kemp reconstruction part is missing from the archive (it’s confined to the results of his RSL reconstruction), so I guess they’ve documented only their part of the work package. I just asked Rahmstorf whether he knows if Kemp will fill in his part of the documentation as well.
Anyway let’s wait and see what will be available when they’ve finished adjusting the documentation part of the paper.