Sea level rise: what’s the worst case?

Reposted from Judith Curry’s Climate Etc.

by Judith Curry

Draft of article to be submitted for journal publication.

Well, I hope you are not overdosing on the issue of sea level rise.  But this paper is somewhat different, a philosophy of science paper.  Sort of how we think about thinking.

I would appreciate any comments, as well as suggestions as to which journals I might submit to.  I have two in mind, but am open to suggestions (and I may need backups).

Thanks in advance for your comments.

Sea level rise: What’s the worst case?

Abstract. The objective of this paper is to provide a broader framing for how we bound possible scenarios for 21st century sea level rise, in particular how we assess and reason about worst-case scenarios. This paper integrates climate science with broader perspectives from the fields of philosophy of science and risk management. Modal logic is used as a basis for describing the construction of the scenario range, including modal inductivism and modal falsificationism. The logic of partial positions and strategies for speculating on black swan events associated with sea level rise are described. The rapidly advancing front of background knowledge is described in terms of how we extend partial positions and approach falsifying extreme scenarios of 21st century atmospheric CO2 concentrations, warming and sea level rise. The application of partial positions and worst-case scenarios in decision making strategies is described for examples having different sensitivities to Type I versus Type II errors.

  1. Introduction

Sea level rise is an issue of significant concern, given the large number of people who live in coastal regions. The concern over sea level rise is not so much about the 20 cm or so that global mean sea level has risen since 1900. Rather, the concern is about projections of 21st century sea level rise based on climate model simulations of human-caused global warming.

Scientists and policy makers using projections of sea level rise are susceptible to making both Type I and Type II errors. An overestimation of a given impact is a Type I error (i.e., a false positive), while an underestimation of the impact is a Type II error (false negative). While we do not yet know the outcome of 21st century sea level rise, and hence Type I and II errors are correctly regarded as potential errors, we can assess errors in reasoning that lead to potential Type I or II errors.

The Intergovernmental Panel on Climate Change (IPCC) assessments have focused on assessing a ‘likely’ range (>66% probability) in response to different emissions concentration pathways. Brysse et al. (2013) argue that the IPCC consensus-building process has effectively resulted in a focus on the avoidance of Type I (false-positive) errors. A case in point is the assessment of sea level rise in the IPCC AR4 (2007). The AR4 deliberately excluded dynamic ice sheet melt from its projections of future sea level rise because future rates of dynamic ice sheet melt could not be projected with any confidence – a Type II error.

Curry (2011, 2018a) raises a different concern, that the climate change problem has been framed too narrowly, focusing only on human-caused climate change. In the context of this framing, the impacts of long-term natural internal variability, solar variations, volcanic eruptions, geologic processes and land use are relatively neglected as a source of 21st century climate change. This narrow framing potentially introduces a range of both Type I and II errors with regards to projections of 21st century climate change, and leaves us intellectually captive to unchallenged assumptions.

Oppenheimer et al. (2007) contend that the emphasis on consensus in IPCC reports has been on expected outcomes, which then become anchored via numerical estimates in the minds of policy makers. Thus, the tails of the distribution of climate impacts, where experts may disagree on likelihood or where understanding is limited, are often understated in the assessment process. Failure to account for both Type I and Type II errors leaves a discipline or assessment process in danger of misrepresentation and unnecessary damage to society and human well-being.

In an effort to minimize Type II errors regarding projections of future sea level rise, there has been a recent focus on the possible worst-case scenario. The primary concern is the potential collapse of the West Antarctic Ice Sheet, which could cause 21st century global mean sea level rise to be substantially above the IPCC AR5 (2013) likely range of 0.26 to 0.82 m. Recent estimates of the maximum possible global sea level rise by the end of the 21st century range from 1.5 to 6 meters (as summarized by LeCozannet et al., 2017; Horton et al., 2014). These extreme values of sea level rise are regarded as extremely unlikely, or so unlikely that we cannot even assign a probability. Nevertheless, these extreme, barely possible values of sea level rise are now becoming anchored as outcomes that are driving local adaptation plans.[1]

Reporting the full range of possible outcomes, even if unlikely, controversial or poorly understood, is essential for scientific assessments for policy making. The challenge is to articulate an appropriately broad range of future scenarios, including worst-case scenarios, while rejecting impossible scenarios.

This paper integrates climate science with broader perspectives from the fields of philosophy of science and risk management. The objective is to provide a broader framing of the 21st century sea level rise problem in context of how we assess and reason about worst-case scenarios.

  2. Searching for black swans

Projections of future sea level rise are driven by climate-model generated projections of surface temperature in response to scenarios that increase atmospheric greenhouse gases. What type of climate change or sea level rise events, not covered by the current climate assessment reports, could possibly occur?

Potential surprises relative to background knowledge are often referred to as ‘black swans.’ There are two categories of black swan events (e.g. Aven and Renn, 2015):

  • Events or processes that are completely unknown to the scientific community (unknown unknowns).
  • Known events or processes that were ignored for some reason or judged to be of negligible importance by the scientific community (unknown knowns; also referred to as ‘known neglecteds’).

Efforts to avoid surprises begin with a fully imaginative consideration of possible future outcomes. Two general strategies have been employed for articulating black swan events related to climate change:

  • Statistical extrapolation of inductive knowledge beyond the range of limited experience using fat-tailed probability distributions.
  • Physically-based scientific speculation on the possibility of high impact scenarios, even though we can neither model them realistically nor provide an estimate of their probability.

2.1 Dismal theorem and fat tails

In a seminal paper, Weitzman (2009) articulated the dismal theorem, implying that the evaluation of climate change policy is highly sensitive to catastrophic outcomes, even if they occur with vanishingly small, but fat-tailed,[2] probability. The dismal theorem contrasts sharply with the conventional wisdom of not taking seriously extreme temperature change probabilities because such probability estimates are not based on hard science and are statistically insignificant.

Weitzman argued that probability density function (PDF) tails of the equilibrium climate sensitivity, fattened by structural uncertainty using a Bayesian framework, can have a large effect on the cost-benefit analysis. Weitzman’s analysis of the equilibrium climate sensitivity (ECS) was based on the IPCC AR4 (2007) assessment that ECS was ‘likely’ (>66% probability) to be in the range 2 to 4.5 °C with a best estimate of 3 °C, ‘very unlikely’ (<10% probability) to be less than 1.5 °C, and that values substantially higher than 4.5 °C could not be excluded. Proceeding in the Bayesian paradigm, Weitzman fitted a Pareto distribution to these values, resulting in a fat tail with a 0.05% probability of ECS exceeding 11 °C and a 0.01% probability of exceeding 20 °C.
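
To make the fat-tail point concrete, the sketch below contrasts the survival probabilities of a fat-tailed Pareto distribution with those of a thin-tailed normal distribution. The parameter values are illustrative assumptions chosen for this sketch, not Weitzman’s actual calibration.

```python
# Sketch: fat-tailed vs thin-tailed ECS distributions (illustrative only).
# The Pareto shape/scale and the normal mean/sd below are assumptions for
# this example, not Weitzman's actual calibration.
from scipy.stats import norm, pareto

B, SCALE = 3.0, 1.5   # Pareto tail index and lower support bound (deg C)
MU, SD = 3.0, 1.5     # normal calibration with a similar central value

for ecs in (4.5, 6.0, 11.0, 20.0):
    p_fat = pareto.sf(ecs, B, scale=SCALE)   # P(ECS > ecs), polynomial decay
    p_thin = norm.sf(ecs, loc=MU, scale=SD)  # P(ECS > ecs), exponential decay
    print(f"P(ECS > {ecs:4.1f} C): Pareto {p_fat:.1e} vs normal {p_thin:.1e}")
```

Even at 20 °C the Pareto tail retains probability mass many orders of magnitude above the normal tail; this slow polynomial decay is what drives the dismal theorem’s sensitivity to catastrophic outcomes.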

Subsequently, the IPCC AR5 (2013) modified its assessment of ECS, dropping the lower bound of the ‘likely’ range to 1.5 °C (and to 1.0 °C for the ‘very likely’ range, >90%), and more clearly defining the upper range with a 10% probability of exceeding 6 °C. Most significantly, the IPCC AR5 stated that no best estimate for equilibrium climate sensitivity can now be given, because of a lack of agreement on values across assessed lines of evidence and studies. While intuitively it might seem that a lower bound would be good news, Freeman et al. (2015) considered a family of distributions using the AR5 parameters and found that both the lowering of the lower bound and the removal of the best estimate actually fatten the ECS tail.

Annan and Hargreaves (2006) and Lewis and Curry (2015) have criticized high values of ECS derived from estimated PDFs, owing to unjustified assumptions and inappropriate statistical methods. The uncertainty surrounding ECS is not intrinsic to ECS itself, but rather arises from uncertainties in the parameters used to calculate ECS, e.g. external forcing data and magnitude of ocean heat uptake.

Curry (2018a) argues that given the deep uncertainty surrounding the value of climate sensitivity, we simply do not have grounds for formulating a precise probability distribution. With human-caused climate change, we are trying to extrapolate inductive knowledge far outside the range of limited past experience. While artificially-imposed bounds on the extent of possibly ruinous disasters can be misleading (Type II error), statistical extrapolation under conditions of deep uncertainty can be equally misleading (Type I error).

2.2 Physically-based scenario generation

Rather than sampling from a probability distribution, physically-based scenario generation develops different possible future pathways from coherent storylines that are based on particular assumptions.

Formally, each possible future can be regarded as a modal sentence (Betz, 2009), stating what is possibly true of our climate system. Betz articulates two general methodological principles that may guide the construction of the scenario range: modal inductivism and modal falsificationism. Modal inductivism states that a certain statement about the future is possibly true if and only if it is positively inferred from our relevant background knowledge. Modal falsificationism further permits creatively constructed scenarios to be accepted as long as the scenarios cannot be falsified by being incompatible with background knowledge. Modal inductivism is prone to Type II errors, whereas modal falsification is prone to Type I errors.

Betz (2009) argues that modal inductivism explains the controversy surrounding the conclusions in the IPCC AR4 regarding sea level rise (e.g. Oppenheimer et al. 2007). The AR4 summary statement anticipated a likely rise in sea level of 18-59 cm by the year 2100. This result was derived from climate model-based estimates and did not include the potential for increasing contributions from rapid dynamical processes in the Greenland and West Antarctic ice sheets. Although the AR4 recognized the possibility of a larger ice sheet contribution, this possibility is not reflected in its main quantitative results. Betz argues that the possible consequences of rapid ice-dynamical changes were not included because there was no model that could infer positively the ice-dynamical changes.

2.2.1 Modal inductivism: scenario generation by climate models

The IPCC Assessment Reports provide projections of future climate using global climate models that are driven by scenarios of future greenhouse gas emissions. Limitations of the IPCC projections of future climate change are described in the IPCC AR5 (2013; Sections 11.3.1 and 12.2.3). Internal variability places fundamental limits on the precision with which future climate variables can be projected. There is also substantial uncertainty in the climate sensitivity to specified forcing agents. Further, simplifications and parameterizations induce errors in models, which can have a leading-order impact on projections. Also, models may exclude some processes that could turn out to be important for projections.

Apart from these uncertainties in the climate models, there are three overarching limitations of the climate model projections employed in the IPCC AR5 (Curry, 2018a):

  • The scenarios of future climate are incomplete, focusing only on emissions scenarios (and neglecting future scenarios of solar variability, volcanic eruptions and multi-decadal and longer term internal variability).
  • The ensemble of climate models does not sample the full range of possible values of ECS, covering only the range 2.1 to 4.7 °C and neglecting values between 1 and 2.1 °C, even though values between 1.5 and 2.1 °C fall within the IPCC AR5 likely range.
  • The opportunistic ensemble of climate model simulations used in the IPCC assessment reports does not provide the basis for the determination of statistically meaningful probabilities.

In summary, existing climate models provide a coherent basis for generating scenarios of climate change. However, existing climate model simulations do not produce decision-relevant probabilities and do not allow exploration of all possibilities that are compatible with our knowledge of the basic way the climate system actually behaves. Some of these unexplored possibilities may turn out to be real ones.

2.2.2 Modal falsification: alternative scenario generation

Smith and Stern (2011) argue that there is value in scientific speculation on policy-relevant aspects of plausible, high-impact scenarios, even though we can neither model them realistically nor provide a precise estimate of their probability.

When background knowledge supports doing so, modifying model results to broaden the range of possibilities they represent can generate additional scenarios, including known neglecteds. Simple climate models, process models and data-driven models can also be used as the basis for generating scenarios of future climate. The paleoclimate record provides a rich source of information for developing future scenarios. Network-based dynamical climatology can also be used as the basis for generating scenarios. More creative approaches, such as mental simulation and abductive reasoning, can produce ‘what if’ scenarios (NAS 2018).

In formulating scenarios of future climate change, Curry (2011) raises the issue of framing error, whereby future climate change is considered to be driven solely by scenarios of future greenhouse gas emissions. Known neglecteds include: solar variability and solar indirect effects, volcanic eruptions, natural internal variability of the large-scale ocean circulations, geothermal heat sources and other geologic processes. Expert speculation on the influence of known neglecteds would minimize the potential for missing black swans events that are associated with known events or processes that were ignored for some reason.

The objective of alternative scenario generation is to allow for and stimulate different views and perspectives, in order to break free from prevailing beliefs. Construction of scenarios that provide plausible but unlikely outcomes can lead to the revelation of unknown unknowns or unknown knowns.

  3. Scenario justification

As a practical matter for considering policy-relevant scenarios of climate change and its impacts, how are we to evaluate whether a scenario is possible or impossible?  In particular, how do we assess the possibility of potential black swan scenarios?

Confirmation (verification) versus falsification is at the heart of a prominent 20th century philosophical debate. Lukyanenko (2015) argues that verification and falsification each contain contradictions and ultimately fail to capture the full complexity of the scientific process.

If the objective is to capture the full range of policy-relevant scenarios and to broaden the perspective on the concept of scientific justification, then both verification and falsification strategies are relevant and complementary. The difference between modal inductivism and modal falsificationism can also be thought of in terms of the allocation of burdens of proof. Consider a contentious scenario, S. According to modal inductivism, the burden of proof falls on the party that says S is possible. By contrast, according to modal falsificationism, the party denying that S is possible carries the burden of proof. Hence verification and falsification play complementary roles in scenario justification.

The problem of generating a plethora of potentially useless future scenarios is avoided by subjecting the scenarios to an assessment as to whether the scenario is deemed possible or impossible, based on our background knowledge. Further, some possible scenarios may be assigned a higher epistemic status if they are well grounded in observations and/or theory.

Under conditions of deep uncertainty, focusing on the upper bound of what is physically possible can reveal useful information. For example, few if any climate scientists would argue that an ECS value of 20 °C is possible. But what about an ECS value of 10 or 6 °C? We should be able to eliminate some extreme values of ECS as impossible, based upon our background understanding of how the climate system processes heat and carbon in response to gradual external forcing from increasing atmospheric carbon dioxide.

3.1 Scenario verification

Betz (2010, 2012) provides a useful framework for evaluating scenarios relative to their degrees of justification, and for evaluating outcomes against our background knowledge. A high degree of justification implies high robustness and relative immunity to falsification.

Below is a classification of future climate scenarios based upon ideas developed by Betz:

  • Strongly verified possibility – supported by basic theoretical considerations and empirical evidence
  • Corroborated possibility – it has happened before
  • Verified possibility – consistent with relevant background knowledge
  • Unverified possibility – climate model simulation
  • Borderline impossible – consistency with background knowledge is disputed (‘worst case’ territory)
  • Impossible – inconsistent with relevant background knowledge

Climate model simulations are classified here as unverified possibilities. Oreskes (1994) has argued that verification and validation of numerical models of natural systems is impossible. However, there is a debate on this topic in the philosophy of science literature (e.g. Katzav, 2014), with the argument that some climate models may be regarded as producing verified possibilities for some variables (e.g. temperature).

The epistemic status of verified possibilities is greater than that of unverified possibilities; however, the most policy-relevant scenarios may be the unverified possibilities and the borderline impossible ones (potential black swans). Clarifying what is impossible versus what is possible is important to decision makers, and the classification provides important information about uncertainty.

As an example, consider the following classification of values of equilibrium climate sensitivity (overlapping values arise from different scenario generation methods and different judgment rationales):

  • <0 °C: impossible
  • >0 to <1 °C: implies negative feedback (unverified possibility)
  • 1.0-1.2 °C: no-feedback climate sensitivity (strongly verified, based on theoretical analysis and empirical observations)
  • 1.15 to 2.7 °C: empirically-derived values based on energy balance models with verified statistical and uncertainty analysis methods (corroborated possibilities)
  • 2.1 to 4.7 °C: derived from climate model simulations (unverified possibilities)
  • >4.5 to 10 °C: borderline impossible (for equilibration time scales of a few centuries)
  • >10 °C: impossible (for equilibration time scales of a few centuries)

There is a strongly verified anchor on the lower bound — the no-feedback climate sensitivity, which is nominally ~1 °C. Determination of ECS from observational data is represented by the Lewis and Curry (2018) analysis, the values from which are regarded as corroborated possibilities. The climate model range reported by the IPCC AR5, 2.1 to 4.7 °C, is classified as unverified possibilities. The borderline impossible range is open to dispute. Annan and Hargreaves (2006) argue for an upper bound of 4.5 °C. The IPCC AR5 assigned a 10% probability to values exceeding 6 °C. None of the ECS values cited in the AR5 extend much beyond 6 °C (one tail extends to 9 °C), although in the AR4 several long-tailed distributions were cited, extending beyond 10 °C.
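
As a compact illustration (not part of the paper’s argument), the classification above can be encoded as a lookup that returns every epistemic class whose range contains a given ECS value; the overlapping ranges reflect the different scenario generation methods and judgment rationales noted above.

```python
# Sketch: the ECS classification above as a lookup table. Overlapping ranges
# (from different generation methods) mean a value can carry several labels.
ECS_CLASSES = [
    ((float("-inf"), 0.0), "impossible"),
    ((0.0, 1.0), "unverified possibility (implies negative feedback)"),
    ((1.0, 1.2), "strongly verified (no-feedback sensitivity)"),
    ((1.15, 2.7), "corroborated possibility (energy balance estimates)"),
    ((2.1, 4.7), "unverified possibility (climate model range)"),
    ((4.5, 10.0), "borderline impossible"),
    ((10.0, float("inf")), "impossible"),
]

def classify_ecs(ecs_c: float) -> list:
    """Return every epistemic class whose range contains ecs_c (deg C)."""
    return [label for (lo, hi), label in ECS_CLASSES if lo <= ecs_c <= hi]

print(classify_ecs(2.5))  # falls in both the corroborated and model ranges
```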

It is rational to believe with high confidence a partial position (Betz, 2012) that equilibrium climate sensitivity is at least 1 °C and lies between 1 and 2.7 °C, the range encompassing the strongly verified and corroborated possibilities. This partial position, with a high degree of justification, is relatively immune to falsification. It is also rational to provisionally extend one’s position to values of equilibrium climate sensitivity up to 4.7 °C (the range simulated by climate models), although these values are vulnerable to the growth or modification of background knowledge and to improvements in climate models, whereby portions of this extended position may prove to be false. This argument shows why a rational proponent should be interested in adopting a partial position with a high degree of justification: it ensures that the partial position is highly immune to falsification and can be flexibly extended in many different ways when constructing a complete position.

If values beyond 10 °C are impossible, then the fat tail values generated by Weitzman (2009) are impossible. Is it physically justified to eliminate extreme outcomes of ECS? Because of the nature of the definition of equilibrium climate sensitivity, with a very long equilibration timescale and the possibility of very long timescale feedbacks, attempting to identify an impossible threshold may be an ill-posed problem. Because of the central role that ECS plays in Integrated Assessment Models used to determine the social cost of carbon, this issue is not without consequence.

Even if it is impossible to falsify high values of ECS owing to ambiguities in the definition of ‘equilibrium,’ there are physical constraints on how rapidly temperature or sea level can change by 2100 in response to CO2 doubling.

3.2 Scenario falsification and expert judgment

While scientific theories can never be strictly verified, they must be falsifiable. This leads to a corollary that predicted outcomes based on a theory of change should in principle be falsifiable.

How do we approach falsifying extreme scenarios? Extreme scenarios can be evaluated based on the following criteria:

  1. Evaluation of the possibility of each link in the storyline used to create the scenario.
  2. Evaluation of the possibility of the outcome, in light of physical constraints and the possibility of the inferred rate of change.

The first criterion is mechanistic, whereby individual processes and the links among them are evaluated. The second criterion is an integral constraint on the scenario outcome, related to the possibility of the outcome itself and the rate of change required to achieve the outcome over a specified period.

Assessing the strength of background knowledge is an essential element in assessing extreme scenarios. Extreme scenarios are by definition at the knowledge frontier. Hence the background knowledge against which extreme scenarios are evaluated is continually changing, which argues for frequent re-evaluation of extreme scenarios.

Scenario falsification requires expert judgment, assessed against background knowledge. This raises several questions:

  • Which experts and how many?
  • By what methods is the expert judgment formulated?
  • What biases enter into the expert judgment?

Expert judgment encompasses a wide variety of techniques, ranging from a single undocumented opinion, to preference surveys, to formal elicitation with external validation (e.g. Oppenheimer et al., 2016).

Expert judgment plays a prominent role in the IPCC process. The multiple lines of evidence surrounding equilibrium climate sensitivity used in the IPCC’s expert judgment are quite clear. However, sea level rise projections are a much more complex situation for expert judgment, owing to their dependence on projections of ice sheet behavior with relatively few lines of evidence and a great deal of uncertainty. Hence sea level rise projections have been heavily dependent on expert judgment.

Issues surrounding the process of expert judgment are revealed in context of an expert elicitation on sea level rise conducted by Horton et al. (2014), which presented results of a broad survey of 90 experts. Gregory et al. (2014) criticized several aspects of the elicitation. The first criticism addresses the issue of ‘which experts?’ The respondents were a subset (18%) of the 500 experts whom Horton et al. identified; the other 82% could not be contacted, declined to respond, or supplied incomplete or inconsistent responses.

While overall the elicitation provided results similar to those cited by the IPCC AR5, Figure 2 of Horton et al. shows that several of the respondents placed the 83rd percentile for global mean sea level rise by 2100 under RCP8.5 higher than 2.5 m, i.e. more than 1.5 m above the AR5 likely range, with the highest estimate exceeding 6 m. Gregory et al. argue that such high values are physically untenable. They state that there is a large difference in rigor between the IPCC assessment and an expert elicitation. An expert elicitation is opaque; the respondents are not asked to justify their responses, and we cannot know how they arrived at their conclusions. The IPCC assessment process is designed to avoid Type I errors, whereas the elicitation produced several expert opinions that arguably make a Type II error.

Curry (2011a) argues that because of the complexity of the issues in climate science, individual experts use different mental models for evaluating the interconnected evidence. Biases can abound when reasoning and making judgments about such a complex problem. Bias can occur by excessive reliance on a particular piece of evidence, the presence of cognitive biases in heuristics, failure to account for indeterminacy and ignorance, and logical fallacies and errors including circular reasoning.

Research in cognitive psychology shows that powerful and sometimes subtle biases play a significant role in scientific justification. Tversky and Kahneman (1974) identified numerous cognitive biases that pervade everyday and scientific thinking, and that often compete with inductive and deductive forms of logical reasoning.

  4. Sea level rise scenario verification and falsification

A comprehensive summary of recent sea level rise projections is provided by Horton et al. (2018). In assessing these projections for application to decision making, a broader framing of possible climate change scenarios is provided here, that includes natural climate variability and geologic processes.

4.1 Scenario generation

Physically-based scenarios of future sea level change are derived from the following methods: extrapolation of recent trends, semi-empirical approaches based on past relationships of sea level rise with temperature, and process-based methods using models.

Sea level rise projections are directly tied to projections of surface temperature, which are based upon simulations from global climate models that are forced by different emissions scenarios.

Most assessments have focused on bounding the likely range (>66%). Since the IPCC AR5 was published in 2013, new scenario and probabilistic approaches have been used for 21st century sea level rise projections. However, these new projections are based on the same climate model simulations used in the IPCC AR5.

Of particular note: the NOAA Technical Report entitled Global and Regional Sea Level Rise Scenarios for the United States (NOAA, 2017) provides a range of global mean sea level rise scenarios for the year 2100. The worst-case upper-bound scenario for global sea level rise (the H++ scenario) is 2.5 meters by the year 2100. The lower bound scenario is 0.3 meters by the year 2100.

Here we critically evaluate both bounds: the worst-case scenario and the lower-bound best-case scenario.

4.2 Worst-case scenario

The worst-case scenario is judged to be the most extreme scenario that cannot be falsified as impossible based upon our background knowledge (Betz, 2010). Strategies for generating worst-case sea level rise scenarios include: process modeling that employs the worst-case estimate for each component, estimates based on the deglaciation of the last ice age and the previous interglacials, and expert judgment.

Most of the recent estimates of the worst-case scenario for global sea level rise in the 21st century range from 1.5 to 3.0 meters, with the recent NOAA Report (NOAA, 2017) using a value of 2.5 meters. In the expert elicitation study of Horton et al. (2014), 5 of the 90 respondents cited a value exceeding 3 m, with the highest value exceeding 6 m. These values imply rates of sea level rise as high as 50-100 mm/year by the end of the 21st century. For reference, the current global rate of sea level rise is about 3 mm/year. Are these scenarios of sea level rise by 2100 plausible? Or even possible?
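
A back-of-envelope check makes the implied rates explicit. The sketch below assumes, purely for illustration, that the rate of rise accelerates linearly from today’s ~3 mm/yr between 2018 and 2100; that assumption is mine, not part of the cited studies.

```python
# Sketch: end-of-century rates implied by worst-case totals, assuming (for
# illustration only) linear acceleration from ~3 mm/yr between 2018 and 2100.
R0 = 3.0              # current rate of rise, mm/yr
YEARS = 2100 - 2018   # planning horizon

for total_m in (1.5, 2.5, 3.0, 6.0):
    total_mm = total_m * 1000.0
    # total = (R0 + r_end) / 2 * YEARS, so solve for the end-of-century rate
    r_end = 2.0 * total_mm / YEARS - R0
    print(f"{total_m:.1f} m by 2100 implies an end rate of ~{r_end:.0f} mm/yr")
```

Under this assumption, totals of 2.5-3 m land in the 50-100 mm/yr range quoted above, while 6 m implies end-of-century rates well beyond it.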

4.2.1 Worst-case storylines

Worst-case scenarios have been developed around storylines of irreversible reduction in ice mass of the Greenland and/or West Antarctic ice sheets. Worst-case scenarios for 21st century sea level rise have been developed in different ways: convening an expert committee to develop extreme scenarios (e.g. Katsman et al., 2011), conducting a large expert assessment survey (Horton et al., 2014), or combining process models or expert assessment of ice sheet contributions with climate model projections (e.g. Bamber and Aspinall, 2013).

The primary concern over future sea level rise in the 21st century is related to the potential collapse of the West Antarctic Ice Sheet (WAIS). For the WAIS, marine ice shelves and tongues that buttress inland, grounded ice are believed to be critical for the ice-sheet stability. Marine Ice Sheet Instability (runaway retreat of the ice sheet) could be initiated if the buttressing effect of this ice is lost from erosion by a warming ocean or altered circulation in coastal seas.

The most vulnerable region of the WAIS is the Amundsen Sea sector. Scenarios for increased ice discharge from this region have been articulated by Pfeffer et al. (2008), based on kinematic constraints on the discharge of glaciers. DeConto and Pollard (2016) introduced new instability mechanisms related to marine ice-cliff instabilities and ice-shelf hydrofracturing (rain and meltwater-enhanced crevassing and calving). Their high-end estimate exceeded 1.7 m of sea-level rise from Antarctica alone by 2100 under the RCP8.5 scenario.

The most extreme 21st century sea level rise scenarios from process-based models are reported by Schlegel et al. (2018). They assessed how uncertainties in snow accumulation, ocean-induced melting, ice viscosity, basal friction, bedrock elevation, and the presence of ice shelves impact the future sea level contribution from the Antarctic ice sheet. They found that over 1.2 m of Antarctic ice sheet contribution to global mean sea level is achievable over the next century, but not likely, as this increase is tenable only in response to unrealistically large melt rates and continental ice shelf collapse. As an extreme worst case, a plausible combination of model parameters produced simulations of 4.95 m of sea level rise from the Antarctic ice sheet by 2100.

The highest scenarios elicited by Horton et al. (2014) – exceeding 6 m – predate these sophisticated ice sheet model simulations, and do not appear to be justified by process-based models but rather by top-down semi-empirical methods that relate sea levels to global mean surface temperatures during current and previous interglacials. Hansen et al. (2016) considered sea level and rates of sea level rise during the late Eemian (the previous interglacial, about 124,000 years ago) as justification for predictions of several meters of sea level rise in the 21st century.

Another possible storyline relates to newly discovered geothermal heat fluxes in the vicinity of the Greenland and Antarctic ice sheets (e.g. DeVries et al. 2017), although these processes have not yet explicitly figured into worst-case sea level rise scenarios.

4.2.2 Worst-case constraints

While associated with physically plausible mechanisms, the actual quantification of the worst-case scenarios for 21st century sea level rise remains highly speculative. As a check on scenarios developed from process models and/or more speculative methods, integral constraints on basic physical processes provide a rationale for potentially falsifying extreme scenarios.

Deglaciation following the last ice age provides an opportunity to examine the stability of marine ice sheets and possible rates of sea level rise. The most rapid deglaciation, Meltwater Pulse 1A (MWP-1A), occurred around 14.5 ka BP. Recent research by Deschamps et al. (2012) constrained this rapid melting to a period of ~340 years. The most probable value of sea level rise during this period was between 14 and 18 m, implying that the rate of sea-level rise exceeded 40 mm/yr during this pulse. Two conflicting scenarios have been proposed for the source of MWP-1A – a northern scenario, with partial melting of the large North American and Eurasian ice sheets, and a southern scenario that points to an Antarctic source. If the northern scenario is correct, then MWP-1A is not a very useful constraint on possible 21st century sea level rise.
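
The implied rate is simple arithmetic on these two constraints:

$$\frac{14\ \text{to}\ 18\ \text{m}}{\sim 340\ \text{yr}} \approx 41\ \text{to}\ 53\ \text{mm/yr},$$

consistent with the statement that the rate exceeded 40 mm/yr during the pulse.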

Additional insights of relevance to the current configuration of the ice sheets are provided by the last interglacial (~130 to ~115 ky ago; the Eemian). Kopp et al. (2009) estimated that the late Eemian sea level highstand exceeded present values by at least 6.6 m (95% probability), but was unlikely (33% probability) to have exceeded 9.4 m. Kopp et al. concluded that present ice sheets could sustain a rate of global sea level rise of about 56–92 cm per century for several centuries, with these rates potentially spiking to higher values for shorter periods. Kopp et al. inferred that achieving global sea level in excess of 6.6 m higher than present likely required major melting of both the Greenland and the West Antarctic Ice Sheets.

Rohling et al. (2013) provide a geologic/paleoclimatic perspective on the worst-case scenario for 21st century sea level rise by examining the past 5 interglacial periods. They investigated the natural timescales and rates of change of ice-volume adjustment to a disequilibrium state, relative to a forcing increase. Sea level rise exceeding 1.8 m by 2100 would require rates larger than those at the onset of the last deglaciation, even though today’s global ice volume is only about a third of that at the onset of the last deglaciation. Starting from present-day conditions, such high rates of sea level rise would require unprecedented ice-loss mechanisms without interglacial precedents, such as catastrophic collapse of the West Antarctic Ice Sheet or activation of major East Antarctic Ice Sheet retreat.

An alternative strategy for falsifying ice loss scenarios relates to identifying physical constraints on specific ice loss mechanisms. Pfeffer et al. (2008) falsified extreme scenarios based on kinematic constraints on glacier contributions to 21st century sea level rise. They found that a total sea-level rise of about 2 meters by 2100 could occur under physically possible glaciological conditions but only if all variables are quickly accelerated to extremely high limits. They concluded that increases in excess of 2 meters are physically untenable.

The most extreme process-based sea level rise scenarios (e.g. DeConto and Pollard 2016; Schlegel et al., 2018) are derived from linking atmospheric warming with hydrofracturing of buttressing ice shelves and structural collapse of marine-terminating ice cliffs in Antarctica. Prediction of 21st century contributions depends critically on uncertain calibration to sea level rise in the Pliocene (about 3 million years ago) and debated assumptions about the Antarctic contribution to sea level rise during the Eemian.

Worst-case scenarios for 2100 and collapse of the West Antarctic Ice Sheet are driven by the RCP8.5 greenhouse gas concentration scenario. An additional constraint on the worst-case sea level rise scenario is therefore an assessment of whether RCP8.5 is itself a possible scenario. RCP8.5 is an extreme scenario that may be impossible, given its unrealistic assumptions and constraints on recoverable fossil fuel supply (e.g. Wang et al., 2016). Ritchie and Dowlatabadi (2017) explain that RCP8.5 contains a ‘return to coal’ hypothesis, requiring increased per capita coal use that is based on systematic errors in coal production outlooks. Here, RCP8.5 is classified as borderline impossible.

Scenarios of 21st century sea level rise exceeding about 1.8 m require conditions without natural interglacial precedents. These worst-case scenarios require a cascade of events, each of which is extremely unlikely to borderline impossible, based on our current knowledge base. The joint likelihood of these extremely unlikely events arguably crosses the threshold to impossible.

How to rationally make judgments about the possibility of extreme scenarios remains a topic that has received too little attention.

4.3 Best-case scenario

There has been much less focus on the possible best-case scenario, which is defined here as the lowest sea level rise for the 21st century that cannot be falsified as impossible based upon our background knowledge. Consideration of the best case is needed to provide bounds on future sea level rise. Further, verification/falsification analysis of the best case can provide important insights into uncertainty and the possible impacts of known neglecteds.

Parris (2012) recommends a lower bound of 0.2 m for 21st century global mean sea level rise, which is basically the observed rate of sea level rise during the 20th century. NOAA (2017) recommends that this value be revised upward to 0.3 m, because the global mean sea level rise rate as measured by satellite altimeters has averaged 3 mm/year for almost a quarter-century.

It is difficult to defend an argument that it is impossible for the 21st century sea level rise to occur at the same average rate as observed in the 20th century, especially since many if not most individual tide gauge records show no recent acceleration in sea level rise (e.g. Watson 2016).

Is it possible for global sea level to decrease over the 21st century? Kemp et al. (2018) provided an estimate of mean global sea level for the past 3000 years. There are several periods with substantial rates of sea level decline, notably 1000 to 1150 AD and 700 to 400 BC. Century-scale sea level decreases of the magnitude determined by Kemp et al. (about half the magnitude of the 20th century rate)[3] are not sufficient to completely counter the likely sea level rise projected by the IPCC AR5. Given the thermal inertia of the oceans and ice sheets, it is arguably impossible for global mean sea level to decrease on the time scale of the 21st century.

However, it is possible for 21st century sea level rise to be less than in the 20th century. Possible scenarios of solar variations, volcanic eruptions and internal variability associated with large-scale ocean circulations could combine to reduce the 21st century rate of sea level rise relative to the 20th century. The relative importance to sea level change of human-caused warming versus natural climate variability depends on whether equilibrium climate sensitivity is on the low end (<2 °C) or the high end (>4 °C) of current estimates.

The recent acceleration in global mean sea level rise since 1993 is attributed to increased melting of the Greenland ice sheet (e.g. Chen et al. 2017). This acceleration in Greenland melt has been largely attributed to natural variability associated with large-scale ocean and atmospheric circulation patterns – the Atlantic Multidecadal Oscillation (AMO) and the North Atlantic Oscillation (NAO) (e.g. Hahn et al. 2018). A future transition to the cool phase of the AMO and/or the positive phase of the NAO would slow down (or possibly even reverse) the mass loss from Greenland, and hence slow down the rate of global sea level rise. Such a scenario for the Greenland mass balance is regarded as a corroborated possibility, since we saw such a scenario recently, during the 1970s and 1980s (e.g. Fettweis et al., 2013). Such a scenario in the 21st century would significantly reduce sea level rise, potentially for several decades, with its relative importance for Greenland depending on whether equilibrium climate sensitivity to CO2 is on the low end or the high end of current estimates.

An additional best-case scenario relates to the recent finding by Barletta et al. (2018) that the ground under the rapidly melting Amundsen Sea Embayment of West Antarctica is rising at a rate of more than 4 cm per year. This rise is acting to stabilize the West Antarctic Ice Sheet. Ice loss spurs uplift of the sea floor (isostatic rebound), which is occurring rapidly owing to low viscosity under the Amundsen Sea Embayment. Such processes have a strong direct impact on West Antarctic Ice Sheet evolution at the centennial time scale. Gomez et al. (2015) articulate a negative feedback process whereby the combination of bedrock uplift and sea surface drop associated with ice sheet retreat significantly reduces ice sheet mass loss.

4.4 Possibility distribution

Given the deep uncertainty associated with projections of 21st century sea level rise (e.g. Horton et al., 2018), a possibility distribution (e.g. Mauris 2011) provides a way to stratify the current knowledge base about the likelihood and possibility of a range of 21st century sea level outcomes. As an example, LeCozannet et al. (2017) constructed a possibility distribution and diagram for projections of 21st century sea level rise outcomes.

Here, a possibility diagram is constructed (Figure 1) under different assumptions from those used by LeCozannet et al. The variable U denotes the outcome for 21st century sea level change. U is cumulative, so that an 80 cm outcome must necessarily first pass through lower values of sea level rise. Values less than U therefore represent partial positions for U. The function π(U) represents the state of knowledge of U, distinguishing what is necessary and possible from what is impossible.

π(U) = 1: nothing prevents U from occurring; U is a completely possible value and may be regarded as necessary

π(U) = 0: U is rejected as impossible based on current background knowledge

Intermediate values of π(U) reflect outcomes for which there would be no particular surprise if U occurs, and no particular surprise if it does not. Following the classification introduced in section 3.1, values of U are assigned the following values of π(U) (Figure 1):

  • π(U) ≥ 0.9: sea level rise up to 0.3 m; corroborated possibilities
  • 0.9 > π(U) > 0.5: sea level rise exceeding 0.3 m and up to 0.63 m; verified possibilities contingent on ΔT, based on the IPCC AR5 likely range (but excluding RCP8.5)
  • 0.5 ≥ π(U) > 0.1: sea level rise exceeding 0.63 m and up to 1.6 m; unverified possibilities
  • 0.1 ≥ π(U) > 0: sea level rise between 1.6 and 2.5 m; borderline impossible
  • π(U) = 0: sea level rise exceeding 2.5 m; impossible based upon background knowledge
  • π(U) = 0: negative values of sea level change; impossible based upon background knowledge

Figure 1: Possibility diagram of projections of cumulative 21st century sea level rise.

These π assignments are based on justifications provided in previous subsections (see also Curry, 2018b); however, this particular classification represents the judgment of one individual. One can envision an ensemble of curves, using different assumptions and judgments. The point is not so much the exact numerical judgments provided here, but rather to demonstrate a way of stratifying the current knowledge base that is consistent with deep uncertainty.
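
One minimal way to encode such a judgment is sketched below; the breakpoints come from the classification above, while linear interpolation between them is an assumption of this example, not of the paper.

```python
# Sketch: the possibility curve of Figure 1 as a piecewise-linear function.
# Breakpoints follow the classification above; interpolating linearly
# between them is an assumption of this example.
import numpy as np

U_PTS = [0.0, 0.3, 0.63, 1.6, 2.5]   # cumulative sea level rise (m)
PI_PTS = [1.0, 0.9, 0.5, 0.1, 0.0]   # pi(U) at each breakpoint

def possibility(u_m: float) -> float:
    """pi(U) for cumulative 21st century sea level rise u_m (metres)."""
    if u_m < 0.0 or u_m > 2.5:
        return 0.0  # impossible based on current background knowledge
    return float(np.interp(u_m, U_PTS, PI_PTS))

for u in (0.2, 0.63, 1.0, 2.0, 3.0):
    print(f"pi({u:.2f} m) = {possibility(u):.2f}")
```

An ensemble of such curves, one per analyst, would express the spread of judgments mentioned above.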

The possibility distribution in Figure 1 does not directly map to a PDF — the level of uncertainty is such that there is no particular basis for selecting a median or mean value for some hypothetical PDF of future sea level rise. LeCozannet et al. (2017) argue that no single PDF can represent the whole range of uncertainty sources related to future sea-level rise.

While there is a great deal of uncertainty surrounding the possible impact of marine ice cliff instability in the 21st century, rejecting such a scenario is a Type II error. Our background knowledge base will change in the future, and it is certainly possible that such a scenario will come to be regarded as having a greater likelihood. Based upon our current background knowledge, it is arguably more rational to reject the RCP8.5 concentration scenario than it is to reject the ice cliff instability scenario.

  5. Decision making under deep uncertainty about sea level rise

The concepts of the possibility distribution, worst case scenarios, scenario verification and partial positions are relevant to decision making under deep uncertainty, where precautionary and robust approaches are appropriate. A precautionary appraisal is initiated when there is uncertainty. A robust policy is defined as yielding outcomes that are deemed to be satisfactory across a wide range of plausible future outcomes (e.g. Walker et al. 2016). Robust policy making interfaces well with possibilistic approaches that generate a range of possible futures. Worst-case scenarios are an essential feature of precaution.

These concepts are applied in general terms to two decision making challenges related to sea level rise that have different sensitivities to Type I and II errors:

  • Infrastructure siting in coastal areas: Type II errors are of the greatest concern
  • Tort litigation: Type I errors are of the greatest concern

5.1 Infrastructure siting in coastal areas

Consider a hypothetical decision related to siting of major infrastructure near the coast, such as an international airport or a nuclear power plant. For infrastructure siting decisions having a multi-decade lifecycle, a Type II error (underestimation of the impact) would have the most adverse consequences. In this case, it is arguably more important to assess the worst-case scenario than to assess what is likely to happen.

NOAA (2017) provides the following advice about scenarios in the context of robust decision making. First, define a scientifically plausible worst-case scenario as a guide for overall system risk and long-term adaptation strategies. Then define a central estimate or mid-range scenario as a baseline for shorter-term planning. This strategy assumes that adaptive management, e.g. including additional flood defenses such as sea walls, is feasible.

Considering the worst-case scenario is consistent with the precautionary principle. The precautionary principle is best applied to situations where the potential harm can be controlled by the decision maker. In this case, the actual siting of the infrastructure is completely controllable.

So for purposes of decision making regarding infrastructure siting, which worst-case scenarios should be considered? Guided by the possibility diagram in Figure 1, a prudent strategy is to select a provisional worst-case scenario of 1 to 1.6 m (a partial position), with a contingent strategy for adding additional flood defenses if needed.
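
A sketch of how such a rule might be operationalized follows; the screening threshold, candidate allowances, and π values for intermediate outcomes are hypothetical choices for this example, not recommendations from the paper.

```python
# Sketch: a satisficing siting rule under the possibility framing. Choose the
# smallest design allowance that covers every scenario whose possibility
# exceeds a screening threshold; rarer scenarios are handled by contingent
# flood defenses. All numbers below are hypothetical illustrations.
SCENARIOS = {0.3: 0.9, 0.63: 0.5, 1.0: 0.3, 1.6: 0.1, 2.5: 0.0}  # SLR m: pi
SCREEN = 0.1                       # treat pi <= 0.1 as contingency, not design
CANDIDATES = [0.5, 1.0, 1.6, 2.0]  # design allowances under consideration (m)

screened_in = [slr for slr, pi in SCENARIOS.items() if pi > SCREEN]
robust = [c for c in CANDIDATES if all(c >= slr for slr in screened_in)]

print(f"Scenarios designed for: {screened_in}")       # [0.3, 0.63, 1.0]
print(f"Smallest robust allowance: {min(robust)} m")  # 1.0 m
```

Raising the screening threshold trades Type I against Type II errors: a stricter screen lowers upfront cost but leans harder on the feasibility of later adaptation.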

Actual siting decisions involve a large range of factors not considered here (e.g., Hall et al., 2016), but this example of decision making arguably benefits from explicit consideration of the worst-case scenario.

5.2 Climate change litigation

Kilinsky (2008) argues for the theory that plaintiffs can prevail on claims arising from the threat of potential injury attributable to a failure to adapt to or prevent climate change. Allen (2003) discusses the challenges from a climate science perspective related to demonstrating liability for climate change.

In evaluating litigation claims that rely on projections of future climate change and sea level rise, it is instructive to consider the role that the strength of the knowledge base of future climate change and sea level rise might have in these claims. The standard legal definitions for evidentiary standards and burden of proof can be mapped onto the possibility classifications:

  • Credible evidence: evidence that is not necessarily true but that is worthy of belief and worthy of consideration –> unverified possibilities
  • Preponderance of the evidence, or balance of probabilities: greater than fifty percent chance that the proposition is true; more likely than not to be true –> verified possibilities
  • Clear and convincing evidence: highly and substantially more probable to be true than not –> corroborated possibilities
  • Beyond reasonable doubt: there is no plausible reason to believe otherwise –> necessary.

Based upon this classification, unverified possibilities and worst-case scenarios of sea level rise would not support such a tort case. The range of verified possibilities arguably defines the maximum projected sea level rise that would meet the standard of preponderance of evidence. The challenge in developing evidence for such a case is to demonstrate that the projections based on climate model simulations of global temperature change meet the standards of verified possibilities, with a high degree of justification and relatively immune to falsification, even if the verified scenario represents only a partial position.

  6. Conclusions

The purpose of generating scenarios of future outcomes is that we should not be too surprised when the future actually arrives. Projections of 21st century sea level rise are associated with deep uncertainty and a rapidly advancing knowledge frontier. The dynamic nature of the knowledge frontier on worst-case sea level rise scenarios is highlighted by Kopp et al. (2017), who compared recent projections with past expert assessments. The objective of this paper has been to articulate a strategy for portraying scientific understanding of the full range of possible scenarios of 21st century sea level rise, with a focus on worst-case scenarios and the avoidance of Type II errors.

An argument for alternative scenario generation has been presented, to stimulate different views and perspectives. In particular, considering climate change to be solely driven by scenarios of future greenhouse gas emissions is arguably a framing error that neglects possible scenarios of future solar variability, volcanic eruptions, natural internal variability of the large-scale ocean circulations, and geothermal and other geologic processes.

A framework for verifying and falsifying future scenarios is presented, in the context of modal logic. A classification of future scenarios is presented, based on levels of robustness and relative immunity to falsification. The logic of partial positions allows for clarifying what we actually know with confidence, versus what is more speculative and uncertain.

A possibility diagram of scenarios of 21st century cumulative sea level rise that ranks the possibilities from necessary to impossible provides a better representation of the deeply uncertain knowledge base than a probability distribution, since no single PDF can represent the whole range of uncertainty sources related to future sea-level rise. Apart from the limits of necessary and impossible, the intermediate possibilities do not map to likelihood since they also include an assessment of the quality of the knowledge base.

Hence, the possibility diagram avoids classifying scenarios as extremely unlikely if they are driven by processes for which we have a low level of understanding.

The possibility diagram for sea level rise projections considers sea level rise outcomes as resulting from a cumulative process, whereby a higher sea level outcome must first pass through lower levels of sea level rise. Therefore, lower values of sea level rise represent a partial position for the higher scenario. Partial positions can discriminate between lower values for which we have greater confidence, and higher values that are more speculative.

The concepts of the possibility distribution, worst case scenarios, scenario verification and partial positions are applied here to two decision making challenges related to sea level rise that have different sensitivities to Type I and II errors. The possibility distribution interfaces well with robust decision making strategies, and the worst-case scenario with partial positions is an important factor in precautionary considerations.

The approach presented here is very different from the practice of the IPCC assessments and their focus on determining a likely range, and provides numerous new challenges to the scientific community. There are some efforts (e.g. Horton et al., 2018) to develop decision-relevant probabilities of future sea level rise as part of a science-based uncertainty quantification. The state of our current understanding of sea level rise is far from being able to support such probabilities. The possibility distribution provides a framework for better classifying our knowledge about sea level rise scenarios.

[1] https://www.scientificamerican.com/article/prepare-for-10-feet-of-sea-level-rise-california-commission-tells-coastal-cities/

[2] A fat-tailed distribution is a probability distribution that exhibits large skewness or kurtosis relative to a normal or exponential distribution.

References [References]

Postscript

In comments under this article, Dr Curry has this exchange with Steven Mosher.

Steven Mosher | November 29, 2018 at 7:51 pm | Reply

“Nevertheless, these extreme, barely possible values of sea level rise are now becoming anchored as outcomes that are driving local adaptation plans.[1]”

Not true.

Re-read the article. The coastal commission’s recommendation to consider 10 feet is not DRIVING local adaptation plans.

If I tell you to plan for 2 feet but think about 10 feet, the recommendation to think about 10 feet is not a driving factor.

 

I cover the Pacifica story in a subsequent post here. ~ctm

Toto
November 30, 2018 10:20 pm

Science! OMG!

Not so seriously, the worst case is if the C-SL-rise alarms keep going until 2100.

old white guy
Reply to  Toto
December 1, 2018 4:05 am

that was a long article and I still want to know where the water is going to come from, being as the planet has the same amount that it has always had.

David Middleton
Reply to old white guy
December 1, 2018 4:33 am

Ice and thermal expansion of water.

old white guy
Reply to  David Middleton
December 1, 2018 9:56 am

I do not think there is enough to get the levels that are being projected. The world also will never get warm enough to see such levels even if there was enough water.

Andrew Jenkinson
Reply to  Toto
December 1, 2018 9:54 pm

I would like to see the evidence that scientists can measure increases in the depth of the oceans of mm or cm when oceans are not static and scientists can only measure the deepest part of the ocean to within an accuracy of several metres. Projections made using calculations based on the size of glaciers can only guess how much will fall back on land as a result of higher temperatures and higher evaporation rates.
How does increased cloud cover affect global warming? I accept climate change is happening but I believe the general public doubt sea level rise estimates as they do not believe such small rises could possibly have been accurately measured in the past.

Mardler
Reply to  James D Russell
December 4, 2018 4:05 am

I am not sure how reliable this NOAA data is, given other data tampering; however, two things stand out here.

1. The trend is “relative”, so it is a combination of eustatic and isostatic effects.

2. The trend line back to 1800 is constant, with the post-1900 trend possibly being very slightly lower. There is no “human induced” uptick.

Andrew Jenkinson
Reply to  Mardler
December 4, 2018 4:23 am

“The plot shows the monthly mean sea level without the regular seasonal fluctuations due to coastal ocean temperatures, salinities, winds, atmospheric pressures, and ocean currents. The long-term linear trend is also shown, including its 95% confidence interval. ”
Does anyone seriously believe that in 1897 scientists were able not only to measure the sea level to an accuracy of a millimetre AND adjust it for seasonal fluctuations with any accuracy? I see no mention of the position or distance of the Moon affecting the readings.
Unbelievable.

John F. Hultquist
November 30, 2018 10:41 pm

There are places around the world that have dynamic barrier islands, developments very near or below adjacent water levels, and structures on eroding cliffs. Search phrase: washaway beach / cape shoalwater wa

Moving and adaptation are the solutions. Regardless of the probabilities discussed, there is a lot that can be done, and should be done, even if the world-ocean doesn’t rise another mm.

Robber
November 30, 2018 10:55 pm

An ensemble of models? But I thought we had been assured that the science was settled? Surely after 30 years there must be a model consensus? And now we have just been told that a 1.5 degree increase in temperature since pre-industrial times will be catastrophic? As I understand it, that means just 0.5 degrees increase from today’s temperatures must be stopped by giving $ billions to someone, increasing energy costs by slashing coal and oil usage, and changing our diets. Tell ’em they’re dreaming.

Reply to  Robber
December 1, 2018 3:26 am

There is a 97% consensus among CMIP5 ensemble models… The consensus is that thermometers run cold.

Nylo
November 30, 2018 11:07 pm

“So for purposes of decision making regarding infrastructure siting, which worst-case scenarios should be considered? Guided by the possibility diagram in Figure 1, a prudent strategy is to select a provisional worst-case scenario of 1 to 1.6 m (a partial position), with a contingent strategy for adding additional flood defenses if needed.”

I would say that this value would need to be adjusted for the existing relationship between the specific place’s rate of sea level rise and the global rate, since the land is generally sinking or rising at any given place. I would also consider that sea level rise would not be evenly distributed. Areas close to the ice melt may even see a sea level drop, while areas farther away see a stronger sea level rise, because of the gravity of the redistributing water.

I once read that the amount by which the Greenland Ice Sheet “pulls up” the water surrounding it due to its own gravity is higher than the expected global sea level rise if all of that ice were to melt. Which means that the global sea level may rise by that amount on average… but near Greenland in particular it would drop. Which also means that far from Greenland the rise would be higher. I can’t remember where I read it, but I do remember that it included numeric calculations that passed my sniff test at the time.

Nylo
Reply to  Nylo
November 30, 2018 11:19 pm

So basically, with a global SLR of 1 to 1.6 m caused mostly by Greenland Ice Sheet melt and WAIS melt, the area of the eastern coast of Asia at about the latitude of southern Japan should expect a significantly higher increase, whereas eastern Canada, Iceland and Chile-Argentina should expect a significantly smaller increase.
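Nylo’s gravitational point can be written down as a simple “fingerprint” adjustment. A minimal sketch in Python; the factor values are invented for illustration (real fingerprints come from gravitationally self-consistent ice-ocean models), but they carry the right signs: a drop near the melt source, amplification in the far field.

# Illustrative regional adjustment: local rise = global-mean rise times a
# source-dependent fingerprint factor, minus local land uplift.
# Factor values below are assumptions for illustration only.

def local_rise_m(global_rise_m, fingerprint_factor, land_uplift_m=0.0):
    """fingerprint_factor: ratio of local to global-mean rise for a given
    melt source (negative near the source, above 1 in the far field).
    land_uplift_m: local land uplift over the period (negative = subsidence)."""
    return global_rise_m * fingerprint_factor - land_uplift_m

# Greenland-sourced melt worth 1 m of global-mean rise (assumed factors):
print(local_rise_m(1.0, -0.2))   # near Greenland: sea level falls 0.2 m
print(local_rise_m(1.0, 1.25))   # far field, e.g. southern Japan: 1.25 m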

gringojay
November 30, 2018 11:37 pm

Quite accurate. People live in flood plains, tornado alleys, hurricane paths, fire risk areas, frigid zones & not just coastal fringes; either people move or accept their risks.

Asking poor nations’ populace to limit fossil fuel use that facilitates electricity & transportation, or even cut back on traditional livestock use to reduce methane emissions, seems to me indefensible. Modern capability can cope with disruptive sea level changes; human patterns of settlement have historically been able to deal with the sea’s moves.

True, a lot of commercial activity & thus population is now close to the seas. Many coastal populations are no longer primary producers of goods. So it seems 2 large factors behind sea level concern are socio-cultural nostalgia & preserving property value – food production inland will continue & most energy resources too.

Toto
Reply to  gringojay
December 1, 2018 9:50 am

not to mention earthquakes, volcanoes, lahars, …

LdB
Reply to  gringojay
December 2, 2018 5:38 am

Yes, and most coastal cities will have been rebuilt almost completely over a hundred-year period, remembering that many existing houses are probably already quite old. Go to any major coastal city and search for housing that is 200 years old; it is usually handfuls. You just let things play out, because you are talking about snail-pace, multi-lifespan effects.

The other problem is that even if you accept CAGW, it doesn’t matter where you move the coastal houses and people; they are still at substantial risk. If you forcibly move them and they then get hit by a natural disaster at the new location, it becomes your fault. Usually that means you are putting the taxpayers on the hook for all future natural disasters.

Neil Jordan
November 30, 2018 11:50 pm

California sea level rise analysis is included in the just-released California Coastal Analysis and Mapping Project/Open Pacific Coast (CCAMP/OPC), sponsored by FEMA with input from NOAA and Scripps Institution of Oceanography. There are four intermediate data submittals. Volumes 1 and 2 are germane to sea level rise. The highest stated sea level rise rate is 14 mm/year, based on a starting year of 2000 and an ending value of 1,400 mm in 2100. The measured sea level rises are mostly in the 1 to 3 mm/year range. The eyeball mode is 2 mm/year.
Excerpts:
“Intermediate Data Submittal Vol. 1
“PDF 164 1. Background
“1.1 Mapping California’s Land-Ocean Interface
Rising sea level will have significant impacts on California’s coastline with some estimates up to a 1.4 m sea level rise by 2100.(1) As stated in the California Climate Adaptation Strategy, “Much of the damage from this accelerated sea-level rise will likely be caused by an increase in the frequency and intensity of coastal flooding and erosion associated with extreme weather events and storm surges.”(2) While bays and estuaries are expected to experience the most dramatic modifications in the coming century, changes. . .”
Intermediate Data Submittal Vol. 2
“SUPPLEMENT 2 OFFSHORE WATER LEVELS
“PDF 96 Sea Level Rise
“Table 1. Summary of data for the tide stations used in the frequency analysis.”
(Full table not typed in. Three of 16 tide gages shown.)
9419750 Crescent City -0.65 mm/year lowest
9414290 San Francisco 2.01 mm/year middle and generally indicative of most
9418767 Humboldt 4.73 mm/year highest
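For scale, a straight-line extrapolation of the quoted trends (no acceleration assumed) reproduces the report’s 1,400 mm figure and shows how far below it the measured gauges sit. A quick Python sketch using only the rates listed above:

# Linear extrapolation of tide-gauge trends to 2100; rates (mm/yr) are
# those quoted in this comment, with no acceleration assumed.
def projected_rise_mm(rate_mm_per_yr, start_year=2000, end_year=2100):
    return rate_mm_per_yr * (end_year - start_year)

for station, rate in [("Crescent City", -0.65),
                      ("San Francisco", 2.01),
                      ("Humboldt", 4.73),
                      ("CCAMP/OPC high rate", 14.0)]:
    print(f"{station}: {projected_rise_mm(rate):+.0f} mm by 2100")
# -> -65, +201, +473 and +1400 mm respectively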

GoatGuy
November 30, 2018 11:53 pm

I just wish to point out that the following:

π(U) ≥ 0.9: sea level rise up to 0.3 m; corroborated possibilities
0.5 > π(U) > 0.9: sea level rise exceeding 0.3 m and up to 0.63 m; verified possibilities contingent on DT, based on IPCC AR5 likely range (but excluding RCP8.5).
0.5 ≥ π(U) > 0.1: sea level rise exceeding 0.63 m and up to 1.6 m; unverified possibilities
0.1 ≥ π(U) > 0: sea level rise between 1.6 and 2.5 m; borderline impossible
π(U) = 0: sea level rise exceeding 2.5 m; impossible based upon background knowledge
π(U) = 0: negative values of sea level change; impossible based on background knowledge

Is mathematically incorrect. It should have been written:

π(U) ≥ 0.9: sea level rise up to 0.3 m; corroborated possibilities
0.5 < π(U) < 0.9: sea level rise exceeding 0.3 m and up to 0.63 m; verified possibilities contingent on DT, based on IPCC AR5 likely range (but excluding RCP8.5).
0.1 < π(U) ≤ 0.5: sea level rise exceeding 0.63 m and up to 1.6 m; unverified possibilities
0 < π(U) ≤ 0.1: sea level rise between 1.6 and 2.5 m; borderline impossible
π(U) = 0: sea level rise exceeding 2.5 m; impossible based upon background knowledge
π(U) = 0: negative values of sea level change; impossible based on background knowledge

Note the use of LESS THAN signs. This (e.g. 0.1 < π(U) ≤ 0.5) is verbalized as “0.1 is less than π(U), and π(U) is less than or equal to 0.5”, meaning π(U) is strictly greater than 0.1 and at most 0.5. Mathematicians might not be such sticklers, but we computer scientists are. Many an algorithm has failed for want of proper greater-than and less-than signs.

Just saying
GoatGuy
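GoatGuy’s boundary point is easy to make concrete. A minimal Python sketch of the corrected half-open intervals, with the category labels taken from the quoted list:

# Classify a possibility value pi into the quoted categories using the
# corrected boundaries: [0.9, 1], (0.5, 0.9), (0.1, 0.5], (0, 0.1], {0}.
def classify(pi):
    if not 0.0 <= pi <= 1.0:
        raise ValueError("a possibility must lie in [0, 1]")
    if pi >= 0.9:
        return "corroborated possibilities"
    if pi > 0.5:          # (0.5, 0.9): strictly between
        return "verified possibilities (contingent)"
    if pi > 0.1:          # (0.1, 0.5]: 0.5 itself lands here
        return "unverified possibilities"
    if pi > 0.0:          # (0, 0.1]
        return "borderline impossible"
    return "impossible"

for pi in (0.9, 0.5, 0.1, 0.0):   # exactly where strict vs. non-strict matters
    print(pi, "->", classify(pi))

At the boundary values this prints corroborated, unverified, borderline impossible and impossible respectively, exactly as the corrected inequalities require.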

huemaurice5
December 1, 2018 12:09 am

For millions of years there are billions of km³ of fresh water (from rains, rivers & rivers) that have poured into the seas & oceans … WITHOUT WHERE THEY DO NOT UP !! ! That’s it! Quite simply because water continuously seeps into the ocean and sea floors to the magma where this poisonous soup (the fish shit in the sea!) Is heated / boiled and goes up (as in a coffee maker) to the sources (hot or cold depending on the altitude) and towards the water tables it fills.

Rod Evans
Reply to  huemaurice5
December 2, 2018 2:12 pm

Are you sure?

gbaikie
December 1, 2018 12:12 am

“In an effort to minimize Type II errors regarding projections of future sea level rise, there has been a recent focus on the possible worst-case scenario. The primary concern is related to the potential collapse of the West Antarctic Ice Sheet, which could cause global mean sea level to rise in the 21st century to be substantially above the IPCC AR5 (2013) likely range of 0.26 to 0.82 m. Recent estimates of the maximum possible global sea level rise by the end of the 21st century range from 1.5 to 6 meters (as summarized by LeCozannet et al, 2017; Horton et al., 2014).”

It seems if you put enough ice in the ocean to raise sea levels by 1 meter, it could freeze the ocean and cause global cooling.

If you put 1.1 meters of ice over the entire ocean, sea level rises 1 meter and Earth gets colder than it has ever been.
If you put 2.2 meters over the southern hemisphere’s ocean, the sea rises by 1 meter, and likewise Earth gets very cold.
If you put 4.4 meters over half the southern hemisphere’s ocean, sea level rises by 1 meter.
Where is half of the southern hemisphere’s ocean?
40% of the southern hemisphere lies equatorward of the Tropic of Capricorn, 23.5 degrees south, and half of it lies poleward of about 30 degrees south.
Now, instead of uniformly covering half the southern hemisphere with 4.4 meters of ice, let’s say it averages 4.4 meters in thickness, so open ocean with big icebergs within it. And let it average 4.4 meters but be thinner nearer the tropics and thicker near the polar region, or with fewer icebergs near 30 degrees and more at 50 degrees south.
It seems an abundance of icebergs at 50 degrees south and further south might cause the surrounding open ocean to freeze.

It seems that before one could get a 1 meter rise in sea level from a large amount of ice entering the ocean, one would significantly increase the polar sea ice surrounding Antarctica.
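gbaikie’s thickness figures hold up as Archimedean back-of-envelope arithmetic: floating ice displaces its own weight of water, so the sea level rise happens as the ice enters the ocean, before any melting. A sketch, with round-number densities as the only assumptions:

# Ice thickness over a fraction of the ocean whose displaced water equals
# 1 m of rise spread over the whole ocean. Densities are round numbers;
# using seawater at ~1025 kg/m^3 shifts the results by about 2.5%.
RHO_ICE = 917.0      # kg/m^3
RHO_WATER = 1000.0   # kg/m^3

def ice_thickness_m(target_rise_m, ocean_fraction_covered):
    return target_rise_m * (RHO_WATER / RHO_ICE) / ocean_fraction_covered

for frac in (1.0, 0.5, 0.25):
    print(f"{frac:4.2f} of ocean -> {ice_thickness_m(1.0, frac):.1f} m of ice")
# -> 1.1 m, 2.2 m and 4.4 m, matching the comment's figures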

December 1, 2018 1:54 am

Can we cease to worry about the effects of CO2? That has clearly been shown to be a non-factor, so let’s consider the far more obvious risk factors.

Regarding the West Antarctic ice sheet: we know that it has volcanoes under it, so let’s assume the worst. They all start up and the ice melts. So just how much of a rise worldwide are we looking at? The sea is vast and deep, so at the very worst I would not expect more than half a metre out of it.

The Dutch face far bigger amounts during stormy weather, and they survive and have done so for hundreds of years.

So it’s possible that, as on Canvey Island in England, houses cannot see the sea because they are way below sea level and there is a big wall keeping the sea away.

So some people, such as Al Gore, may lose their view of the sea if such a wall were to be built, but it is far from the end of the world.

MJE

Editor
December 1, 2018 3:21 am

The AAPG is acting like they want to re-engage constructively on climate change.

I recommend that Dr. Curry submit the full paper to the AAPG Bulletin and a summary, including Figure 1, to the Explorer.

Ian MacCulloch
December 1, 2018 3:56 am

Sea level changes are a function of geological events. There are two known peaks in the last 150,000 years, at 23 metres and 2 metres respectively, the latter as recently as 6,000 years ago. Therefore it is reasonable to plan for at least one of these two peaks. Leading up to the last and smaller peak, the rate of sea level rise was about 10 mm per annum over a 2,000-year period. This rate of change is significantly greater than it is today.
While there were meltwater contributions to sea level, the main bodies of ice remain intact to this very day. The intact core sections from both Greenland and Antarctica point to this little-understood aspect: the non-correlation of ice accumulation with sea level changes for in excess of 428,000 years at both locations.
While Professor Curry is an extraordinarily fine climate scientist, she, like all of her colleagues, fails to grasp the obvious discrepancy between the volumes of water alleged to have become frozen and the sea level changes that have taken place since time immemorial.

David Middleton
Reply to  Ian MacCulloch
December 1, 2018 4:01 am

The 2 m peak 6 ka has been erased by climate science.

The well-established Holocene Highstand has been explained away as “ocean siphoning” and isostatic changes to the ocean basins.

huemaurice5
Reply to  David Middleton
December 1, 2018 7:16 am

For millions of years there are billions of km³ of fresh water (from rains, rivers & rivers) that have poured into the seas & oceans … WITHOUT WHERE THEY DO NOT UP !! ! That’s it! Quite simply because water continuously seeps into the ocean and sea floors to the magma where this poisonous soup (the fish shit in the sea!) Is heated / boiled and goes up (as in a coffee maker) to the sources (hot or cold depending on the altitude) and towards the water tables it fills.

Rod Evans
Reply to  Huemaurice5
December 2, 2018 2:29 pm

I think you made that comment earlier.
Are you sure it says what you think it says? The line “WITHOUT WHERE THEY DO NOT UP” has me fooled.
I also get the impression you think water seeping into the sub-sea strata is a key component of the water cycle. While some of it undoubtedly will do that, along with all the detritus you mention, the key water-cycle balancing activity is evaporation, not volcanic release of subterranean water, though again that clearly does happen.
Sorry if I have misunderstood your comment.

Rod Evans
Reply to  Huemaurice5
December 2, 2018 2:34 pm

Have you come across evaporation of water, as postulated in the water cycle theory, Huemaurice5?

Bloke down the pub
December 1, 2018 5:53 am

With regard to Black Swan events, I get the feeling that if the oceans were hit by a comet that introduced enough water to raise sea level by two inches, the CAGW crew would still claim that it’d been made worse by global warming.

Nick Schroeder
December 1, 2018 7:13 am

Back in IPCC AR5 it took until 2500 for RCP8.5 to melt enough cryosphere to raise sea levels 6 to 20 feet.

Alasdair
December 1, 2018 7:15 am

Looming above all this is the spectre of Chaos Theory, where current science stumbles around with its linear equations in the darkness of a non-linear environment, chasing the strange attractors. Rich grounds for secular religious fanaticism.

Judith does well shedding some light on the problem, but I am not sure it will do much to curb the fanatics, who now have the bit between their teeth.

fretslider
December 1, 2018 7:54 am

Sea level rise is all the rage right now.

But whatever happened to ocean acidification? It seems to have died a [timely] death.

ggm
December 1, 2018 8:39 am

This is all irrelevant. All we know for a fact is that sea level rise is not accelerating and is rising at the same rate it has for many decades. Until that changes, it’s all disproved.

jim hogg
Reply to  ggm
December 1, 2018 12:04 pm

Nothing about the future disproves anything until it happens. We have no guarantees that recent trends will remain the same.

December 1, 2018 10:10 am

Worst case is, if I lived on the shore (I don’t), it would rise faster than I can run from.

Michael Moon
December 1, 2018 10:15 am

“Modal logic is used as a basis for describing construction of the scenario range, including modal inductivism and falsification.”

I am a native English speaker, former National Merit Scholar and degreed Mechanical Engineer from a good school, but I have no idea what language this is.

I have never seen any proof that satellites can measure sea level, and there has never been any calibration of these measurements. Hence, the tide gauges, which show no acceleration, are the only reliable measurement.

Sciencey-babble, that is what we will call this article.

Mike Lowe
Reply to  Michael Moon
December 1, 2018 11:36 am

From one Michael to another, I have to say I agree. Whilst I can see some point in trying to assess the probability of certain predictions actually happening, does that mean that we need to include the published prediction of a moon-affected Greenie that the SLR will be 50 meters by the year 2100? Whilst I admire Judith’s efforts to inject some logic into the CAGW argument, it seems that to most people (however well educated they are) the records of recent SLR figures will be a good indication of likely future rises. A few years ago, I found at the northern end of Bondi Beach a beautiful stainless steel plate with appropriate markings to show the amount of SLR in recent years. The only problem was that it was lying on its side and completely useless for its intended purpose. I suspect that many such well-intentioned indicators are similarly useless.

Reply to  Michael Moon
December 2, 2018 9:48 am

“I have no idea what language this is.”

Newspeak.

michael hart
December 1, 2018 12:24 pm

Gives too much credibility to the fantasies of the usual crowd, who long since threw away what little they had.

Just as warming of the oceans due to CO2 must lag any warming of the atmosphere, so too, it often seems, must the rhetoric of the global warm-suckers lag behind.

Sensible scientists pointed out that the oceans were probably a better ‘calorimeter’ than the atmosphere for the earth’s energy budget before the global warm-suckers decided that the oceans were a good place to look for their missing heat from the atmosphere. Ironically, both could still be wrong. The non-linearity of seawater expansion with temperature means that even if Trenberth’s missing heat is hiding in the deep ocean, its contribution to additional sea level rise is going to be tiny over the next few millennia. Which obviously means that the alarm over global warming is also buried for a few thousand years after Trenberth’s pension plan has to stop paying out.

Whatever, it should be obvious to all that sea level rise is a slow, second-order effect of any putative warming from atmospheric CO2. If the atmosphere is not warming as fast as the global warm-suckers predicted, then changes in sea level rise will be commensurately slower and less problematic, as the data bear out. Bjorn Lomborg, a “believer”, has sensibly pointed out that global sea levels may have risen about 12 inches in the last century and nobody wrote home to complain. So it will continue.

Practical advice follows: if your local sea level change looks unfavorable, then frickin vote for some frickin local politicians who might do something about it, rather than fund global warm-suckers and their models telling you how doomed you are going to be after they have retired. London started building the Thames Barrier long before global warm-sucking became fashionable in West Kensington.

Neville
December 1, 2018 1:45 pm

I’d like the CAGW crowd to tell us how to stop their so-called dangerous SLR. What mitigation process would make any measurable difference by 2100 or 2300?
Fair dinkum, China and India plus the non-OECD countries must be laughing all the way to their banks. We have to cop Dr Hansen’s BS and fra-d from the Paris COP 21 idiocy, and the non-OECD now have soaring CO2 emissions, which will continue into the foreseeable future. Will they ever wake up?

December 1, 2018 3:06 pm

Risk assessment is based on knowledge of loss scenarios that have the two dimensions of probability and consequence. In industry, loss of human life is rated as high consequence and, for evaluation purposes, assigned a very high monetary value.

Any loss scenario associated with sea level rise is inevitably low consequence. For example, if I am building a ship loading facility on a tropical coastline, I will be designing for a peak wave height that might be 10 m, a storm surge of 2 m, on a king tide of 6 m. That sets the air clearance above low tide level. I also need to consider under-keel clearance, but that can be remedied by dredging. Nothing to do with sea level rise is going to factor into the risk assessment over the 30-year economic life of the project. Adding an allowance of 0.2 m for sea level rise would be insignificant in that case. It would not make sense to build the ship loading facility in any location other than the existing shoreline. I live 30 m above sea level and may live another 30 years. I doubt I will ever enjoy a beach frontage, but I would not complain if I did. With sea level rise there are likely to be winners and losers financially. It is very unlikely to result in loss of life unless the emergency planning for specific storm surge events is incompetent.
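The stack-up in that example is worth making explicit; a small Python sketch using only the commenter’s hypothetical numbers:

# Air-clearance stack-up for the hypothetical tropical ship loader above;
# every input is the commenter's example figure (metres above low tide).
design_allowances_m = {
    "king tide": 6.0,
    "storm surge": 2.0,
    "peak wave height": 10.0,
    "sea level rise over 30-yr economic life": 0.2,
}
total = sum(design_allowances_m.values())
slr_share = design_allowances_m["sea level rise over 30-yr economic life"] / total
print(f"required air clearance: {total:.1f} m "
      f"(sea level rise is {slr_share:.1%} of the total)")
# -> 18.2 m, with the 0.2 m allowance about 1.1% of the stack-up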

On your point of where the IPCC completely fails, I believe we are now observing high-consequence events associated with rising CO2 where the IPCC has a blind spot. It is reasonably apparent that forest productivity has increased quite dramatically in the 21st century, likely caused by rising CO2. Forest management practices have not kept pace with the increased productivity. In 2018 there has been significant loss of life associated with wildfires, some regarded as unprecedented in recorded history. I regard forest productivity increasing fuel load as a certain risk. It demands immediate attention. It is arguable that one of the responses to Climate Change, sequestering carbon in forests, is a direct contributor to the mounting risk.

Another growing risk related to Climate Change is the lack of system planning with regard to permitting intermittent power generation onto large electrical grids. This should never have occurred without overall system planning. Since it has, the risk of prolonged grid collapse has increased dramatically. In South Australia, high consequences have already been demonstrated by loss of grid power due to intermittent power generators. Engineers in Scotland are waving a red flag regarding the mounting risk and high consequence of grid collapse. The consequence of such an event in Scotland would almost certainly be loss of life, as occurred in Canada during the 1998 ice storm.

Continuing with energy risks, the artificially high cost of carbon-based fuels through various taxes and transfer payments has created energy poverty for the financially disadvantaged in developed economies. Lack of heating in a cold climate or lack of cooling in a hot climate are high-consequence events for the vulnerable. Rioting against poorly justified taxes on carbon is also a high-consequence scenario.

December 2, 2018 4:00 am

Thanks, David, for your note. It did lead me to the source of your statement. It seems there are a few papers on this topic. It may be that the true interpretation of events is that we are dealing with an expanding AND contracting earth model. There is no doubt there were advances and retreats of the glaciers during the period from 12,000 years BP. The actual volume of ice involved is quite small relative to the volumes of the movements exemplified by the sea level changes. However, the rapid rate of change of the sea level associated with the still stands, the establishment of the still stands and the development of the lower three Great Barrier Reefs (drill proven) all point to something greater than a climatic event and convenient shoreline flexing.

Dixon
December 2, 2018 6:38 am

What is needed is a decision-rules strategy – like Fisheries Managers use to intervene in catch settings: if the stock index drops to this level, we will act to cut catch. We will close the fishery if the stock index drops to this level. If/when stock indexes climb above those reference points, we will relax the catch restrictions.

It’s not really all that hard. Just agree on how to define the reference points and what actions you will take when the reference points are met.
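A minimal sketch of what such fisheries-style decision rules might look like transposed to sea level; the reference points and actions below are invented for illustration only:

# Decision-rules sketch: pre-agreed reference points on an observed sea
# level trend trigger pre-agreed actions, and relax if the trend falls
# back below them. Thresholds and actions are illustrative assumptions.
RULES = [
    (5.0,  "begin staged upgrades to coastal defenses"),
    (10.0, "restrict new development in the lowest-lying zones"),
]

def triggered_actions(observed_trend_mm_per_yr):
    """Return every action whose reference point the observed trend meets."""
    return [action for ref_point, action in RULES
            if observed_trend_mm_per_yr >= ref_point]

print(triggered_actions(3.0))   # [] : no trigger near today's ~3 mm/yr
print(triggered_actions(7.0))   # first rule only
print(triggered_actions(12.0))  # both rules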

James D Russell
December 2, 2018 7:42 pm

“Solar activity is declining very fast at the moment,” Mike Lockwood, professor of space environmental physics at Reading University, UK, told New Scientist. “We estimate faster than at any time in the last 9300 years.”

We are already experiencing an empirical 2-degree decline in global temperature over the last two years. The late 17th-century Maunder Minimum, when there were no sunspots for 70 years, brought on the Little Ice Age. I would suspect that any increase in sea levels will slow and even out shortly.