My collaborators at Rutgers, Tufts, the Potsdam Institute for Climate Impact Research, York, Woods Hole, and Harvard and I have published a new paper, appearing online this week, describing global sea-level change over the last three thousand years.
- R. E. Kopp, A. C. Kemp, K. Bittermann, B. P. Horton, J. P. Donnelly, W. R. Gehrels, C. C. Hay, J. X. Mitrovica, E. D. Morrow, and S. Rahmstorf (2016). Temperature-driven global sea-level variability in the Common Era. Proceedings of the National Academy of Sciences. doi: 10.1073/pnas.1517056113.
In this paper, we use a new statistical framework (based on spatio-temporal empirical hierarchical modeling with Gaussian processes; code available on GitHub) to identify the common global signal in a new database of >1300 geological sea-level indicators from 24 localities around the world. To our knowledge, this paper represents the first attempt to combine statistically rigorous analysis methods with a global proxy database to reconstruct global sea-level change over this time period.
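The core idea of the signal-separation step can be sketched in a heavily simplified form: model each local observation as the sum of a shared global Gaussian process and a site-specific Gaussian process, then read off the posterior mean of the global component alone. This is an illustrative toy, not the paper's actual model; the squared-exponential kernels, hyperparameter values, and two-site data below are all assumptions for demonstration.

```python
import numpy as np

def sq_exp(t1, t2, amp, scale):
    """Squared-exponential covariance between two vectors of times."""
    d = t1[:, None] - t2[None, :]
    return amp**2 * np.exp(-0.5 * (d / scale)**2)

def build_cov(times, sites, amp_g, scale_g, amp_l, scale_l, noise):
    """Data covariance: shared global GP + independent site-level GPs + noise."""
    K = sq_exp(times, times, amp_g, scale_g)            # global signal: shared by all sites
    same_site = sites[:, None] == sites[None, :]
    K = K + same_site * sq_exp(times, times, amp_l, scale_l)  # local signal: same site only
    K = K + noise**2 * np.eye(len(times))               # observational noise
    return K

def posterior_global(t_pred, times, sites, y, amp_g=5.0, scale_g=500.0,
                     amp_l=3.0, scale_l=200.0, noise=1.0):
    """Posterior mean of the common global signal at prediction times t_pred."""
    K = build_cov(times, sites, amp_g, scale_g, amp_l, scale_l, noise)
    K_star = sq_exp(t_pred, times, amp_g, scale_g)      # cross-covariance via the global kernel only
    return K_star @ np.linalg.solve(K, y)

# Toy usage: two sites recording a shared rising trend plus site offsets (cm, illustrative)
times = np.array([0., 500., 1000., 1500., 0., 500., 1000., 1500.])
sites = np.array([0, 0, 0, 0, 1, 1, 1, 1])
y = np.array([0., 2., 4., 6., 1., 3., 5., 7.])
global_signal = posterior_global(np.array([250., 1250.]), times, sites, y)
```

Because the cross-covariance `K_star` is built from the global kernel alone, site-specific wiggles are attributed to the local terms and averaged out of the recovered global curve; the actual paper works with a far richer hierarchy of proxy, dating, and regional terms.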
What does the study find?
The paper has four key findings.
First, the rate of global sea-level change in the 20th century (1.4 ± 0.2 mm/yr) was, with 95% probability, faster than during any century since at least 800 BCE. (The 800 BCE cutoff does not mean the rate of global sea-level rise was probably faster before then; it simply means the reconstruction quality before that date isn’t good enough to support the same level of confidence.)
Second, the 20th century wasn’t the only time period when temperature and global sea level changed together. Global sea level underwent a statistically robust fall of 8 ± 8 cm (95% probability interval) over 1000-1400 CE, coincident with a decline in global temperature of ~0.2°C. Notably, both the decline in sea level and the decline in temperature occurred during the so-called European “Medieval Warm Period,” providing additional evidence that the “Medieval Warm Period” and “Little Ice Age” were not globally synchronous phenomena.
Third, using a ‘semi-empirical’ statistical model calibrated to the relationship between temperature and global sea-level change over the last 2000 years, we find that, in alternative histories in which 20th-century temperatures did not exceed the average over 500-1800 CE, global sea-level rise in the 20th century would (with >95% probability) have been less than 51% of its observed value. We suggest that this counterfactual is consistent with what the world might have been like in the absence of anthropogenic warming. (A separate study led by my collaborator Ben Strauss at Climate Central uses our results to examine the contribution of anthropogenic sea-level rise to nuisance flooding in the United States.)
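The semi-empirical approach can be illustrated with the classic rate law dS/dt = a·(T − T₀): the rate of global sea-level change is proportional to the departure of temperature from an equilibrium value, so integrating the law under an observed versus a counterfactual temperature history yields the comparison described above. The sketch below uses this simple one-parameter form with illustrative parameter values and a synthetic linear warming, not the paper’s fitted model or data.

```python
import numpy as np

def semi_empirical_rise(temps, years, a=3.0, t0=-0.2):
    """Integrate dS/dt = a * (T - T0) over a temperature-anomaly series.

    temps : temperature anomaly (degC) for each year
    a     : sensitivity (mm/yr per degC) -- illustrative value
    t0    : equilibrium temperature anomaly (degC) -- illustrative value
    Returns cumulative sea-level change (mm) relative to the first year.
    """
    rates = a * (np.asarray(temps, dtype=float) - t0)  # mm/yr in each year
    dt = np.diff(years, prepend=years[0])              # first step contributes zero
    return np.cumsum(rates * dt)

# Observed-style warming vs. a no-warming counterfactual, in the spirit of the paper
years = np.arange(1900, 2001)
warming = 0.008 * (years - 1900)            # ~0.8 degC of 20th-century warming, synthetic
flat = np.zeros_like(years, dtype=float)    # counterfactual: temperature held at baseline
rise_observed = semi_empirical_rise(warming, years)[-1]
rise_counterfactual = semi_empirical_rise(flat, years)[-1]
```

With these toy numbers the counterfactual rise is roughly a third of the warming-driven rise; the paper’s actual model allows the equilibrium temperature T₀ to adjust over time and is calibrated against the full 2000-year reconstruction.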
Fourth, the new semi-empirical model reconciles the remaining discrepancies between the physical process models preferred by the IPCC’s Fourth and Fifth Assessment Reports and semi-empirical models. The semi-empirical results are consistent with the localized projections that our team presented in a 2014 Earth’s Future paper. This agreement should lend greater confidence to these projections (a very likely 21st-century global sea-level rise of about 50–130 cm under the high-emissions RCP 8.5 pathway).
However, there is a caveat: semi-empirical models are inherently calibrated to the historical experience, and potentially biased if the processes that will dominate sea-level change in the future are qualitatively different from those that drove it in the past. In the Common Era before the 21st century, changes in ocean heat content and in mountain glaciers were likely the main drivers of global sea-level change. Ice sheets – and, in particular, ocean interactions with ice sheet margins – are playing an increasingly important role and dominate uncertainty in global sea-level rise projections in the second half of this century. Thus, the agreement between the new semi-empirical model and the physical models could be taken as suggesting that both share a common historical bias. For example, some exciting work being done by David Pollard and Rob DeConto suggests that processes such as ice-cliff collapse and ice-shelf hydrofracturing may play important roles in future ice sheet behavior that have not been well incorporated into most ice sheet models.
For the moment, though, the projections of global and local changes made by the new paper and by our 2014 paper remain among the best available probabilistic projections. Nonetheless, the ice-sheet response to global warming remains an area of what risk analysts call ‘deep uncertainty’. There is an indeterminate but non-zero probability that our estimated probability distributions are excessively conservative, particularly toward the end of the century and beyond. Thus, I would suggest that decision makers use these ‘best-available’ distributions but also consider the consequences for their decisions of ‘worst-case’ sea-level rise scenarios (e.g., about 2.5 m globally in the course of the century according to Kopp et al., 2014). If the consequences of such a worst-case scenario are unacceptable, then decision makers should adopt strategies that are robust to this possibility. Such robust strategies generally shouldn’t involve treating the worst case as a certainty; rather, in many cases, they will involve ‘adaptive’ strategies that allow for tightening of protection should sea-level rise prove to be toward the high end of projections.
What’s new relative to previous work?
The IPCC’s assessment of the literature, prior to our study, was that global sea-level fluctuations over the last 5 millennia were within ±25 cm, and that there was no clear evidence of whether specific fluctuations seen in some regional sea-level records reflected global changes. The new paper places tighter constraints on variability over the last 3 millennia and identifies global fluctuations.
Previous attempts to reconstruct global sea level over the last 2-3 thousand years have relied on one of three approaches.
Some studies have used records of local sea-level change, attempted to correct them for processes (such as the ongoing response to the end of the last ice age) that are approximately steady over this time period, and added some additional uncertainty to account for the fact that local sea-level variability can differ for a variety of reasons from global sea-level change. A good example of this approach is the study of Kemp et al. (2011), based on records from North Carolina, which involved a number of authors of the new study.
Some studies have attempted to manually tune the parameters of physical models that relate ice sheet changes to sea-level changes; the best recent example of this is the study of Lambeck et al. (2014), which covered the last 20,000 years. Such studies have not generally employed modern statistical methodologies, and have generally not focused on the last 2-3 millennia. Lambeck’s study, for instance, included only 31 geological data points from the last millennium (1000-2000 CE), compared to 790 such data points in our study.
Some studies have attempted to estimate the statistical relationship between temperature and global sea level seen in the period for which tide gauge records exist (the last 2-3 centuries) and then, using geological reconstructions of past temperature changes, extrapolate backward (‘hindcast’) past sea-level changes. The study of Grinsted et al. (2010), which significantly overestimated the responsiveness of sea level to temperature change, is a good example.
- My co-author Stefan Rahmstorf’s blog post at RealClimate
- Rutgers Today article
- News articles: Associated Press, Bloomberg, Climate Central, Gizmodo, Mashable, New York Times, Washington Post
UPDATE (March 3): Our paper has gotten stellar coverage, including leading the New York Times on February 23 and meriting a mention by @BarackObama on March 1: