PEN Feasibility Studies

In the summer of 2016, the PEN held a call for feasibility studies. Our members responded with an excellent range of proposals, and after a difficult decision process six were awarded funding. Each of these studies will take place during 2017 and address key scientific questions in climate modelling, statistical analysis of palaeoclimate data, and palaeoclimate reconstruction.

Moving forward with forward models – synthesising past water tables from multiple peat cores

(PI) Maarten Blaauw, Queen's University Belfast / Andrés Christen / Marco Aquino Lopez

Forward models are often promoted as enhancements over ‘reverse’ models traditionally used in palaeoecology. In reverse models, past environments are inferred from fossil ‘proxies’ through statistical correlations between the response (the proxies) and their supposed causal factors (climate, environment). This approach is sub-optimal for a number of reasons – poor quantification of uncertainties, the danger of reconstructing non-causal variables, and the difficulty of combining multiple reconstructions. Instead, forward models aim to model how causal processes (e.g. climate) determine the resulting fossil proxy records, via intermediate processes such as hydrology and ecology. In doing so, these models should enable the palaeo-community to obtain a much fuller understanding of how climate, environments and (multiple) fossil proxy records are linked, and also to better quantify the uncertainties involved in palaeoecological reconstructions.

Here we will build, and ‘sell’ to the palaeo-community, a simple, intuitive and educational forward model of how climate dynamics have forced hydrological changes in Holocene raised-bog peat deposits. It builds on existing basic models that employ statistical distributions to simulate how environmental factors could steer fossil proxy values. By combining multiple peat records into a single framework, we will take a significant step toward a fuller understanding of 1) how climate and other processes steer fossil proxy records, 2) the reproducibility and reliability of reconstructions from single cores, and 3) the possibilities and limitations of combining multiple records.
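To make the idea concrete, here is a minimal sketch of such a forward model in Python (NumPy), written for illustration rather than as the project's actual model: a hypothetical climate driver steers water-table depth through a simple hydrological relaxation, and fossil proxy counts are then drawn from Gaussian taxon response curves. All parameter values (optima, tolerances, noise levels) are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical climate driver: an effective-precipitation anomaly through time
n_years = 500
climate = np.cumsum(rng.normal(0, 0.1, n_years))  # random-walk climate signal

# Step 1 (hydrology): water-table depth (WTD, cm) relaxes toward a
# climate-driven equilibrium, with hydrological noise
wtd = np.empty(n_years)
wtd[0] = 10.0
for t in range(1, n_years):
    equilibrium = 10.0 - 5.0 * climate[t]
    wtd[t] = wtd[t - 1] + 0.3 * (equilibrium - wtd[t - 1]) + rng.normal(0, 0.5)

# Step 2 (ecology): each proxy taxon has a Gaussian response to WTD
optima = np.array([5.0, 12.0, 20.0])    # invented taxon optima (cm)
tolerances = np.array([3.0, 4.0, 5.0])  # invented niche breadths (cm)

def simulate_counts(wtd_t, total=100):
    """Draw one fossil sample's taxon counts from the response curves."""
    response = np.exp(-0.5 * ((wtd_t - optima) / tolerances) ** 2)
    return rng.multinomial(total, response / response.sum())

counts = np.array([simulate_counts(w) for w in wtd])
print(counts[:5])  # simulated proxy assemblages for the first five samples
```

Running the chain forwards like this makes explicit how climate, hydrology and ecology each contribute to the final proxy record, which is exactly the linkage the reverse approach leaves implicit.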

Emulating the Isotope Response to Changes in the Geometry of the Antarctic Ice Sheet

(PI) Louise Sime, BAS / Dario Domingo / Irene Malmierca / Jochen Voss

This project aims to revolutionise our approach to using ice core observations of water isotopes to constrain past environmental changes. The shape and size of the Antarctic ice sheet leave imprints on water isotopes in ice cores, allowing inferences about past ice-sheet geometry to be drawn from these isotopes. Isotope-enabled simulations allow exploration of past ice sheet and sea-ice influences on water isotopes. Results from data-model comparisons allow past ice sheet geometries to be reconstructed, giving us methods of evaluating future predictions of global and polar sea-level changes.

The complexity and computational cost of these simulations limit our progress in understanding the joint impact of the many possible ice sheet and sea-ice changes, and their variation through time. To address these issues we propose to use Gaussian process emulation (O'Hagan, 1992) to approximate the response of the isotope-enabled simulator over a wide range of ice geometries.
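As a rough illustration of the emulation idea (not the project's actual simulator or emulator), the sketch below fits a Gaussian process to a handful of ‘design’ runs of a stand-in simulator and then predicts, with uncertainty, across the whole input space. Here the ice geometry is collapsed to a single hypothetical scaling parameter, and scikit-learn's GaussianProcessRegressor stands in for a bespoke emulator.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Design: a few expensive simulator runs at different ice-geometry settings.
# In practice each row would parameterise an Antarctic geometry; here it is a
# single invented parameter (e.g. an elevation scaling factor).
X_design = np.linspace(0.5, 1.5, 8).reshape(-1, 1)

def toy_simulator(x):
    """Placeholder for the isotope-enabled GCM: delta-18O at a core site."""
    return -40.0 - 8.0 * (x - 1.0) + 0.5 * np.sin(6 * x)

y_design = toy_simulator(X_design).ravel()

# Fit the Gaussian process emulator to the design runs
kernel = ConstantKernel(1.0) * RBF(length_scale=0.3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_design, y_design)

# Cheap predictions (with uncertainty) anywhere in geometry space
X_new = np.linspace(0.5, 1.5, 100).reshape(-1, 1)
mean, sd = gp.predict(X_new, return_std=True)
print(mean[:3], sd[:3])
```

The payoff is that, once trained on a small design of expensive runs, the emulator can be queried thousands of times at negligible cost, with the predictive standard deviation flagging where more simulator runs are needed.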

A novel statistical approach for palaeoclimate spatial model-data comparison

(PI) Richard Wilkinson, University of Sheffield / Phillip Paine / Dan Lunt / Fran Bragg

One of the reasons for studying palaeoclimate is that it provides us with an independent set of data against which we can test the performance of climate simulators (GCMs). By simulating climate using palaeo boundary conditions (different CO2 levels, different ice-sheet configurations, and even different continental geography if we go far enough back into the past), we can test the physical understanding encoded in the simulators and compare the skill of different GCMs.

However, it is challenging to compare palaeo data with GCM output. The data and model predictions are often at different locations, record different quantities, and the palaeo data are often very noisy (uncertain). This project will develop statistical methods for model-data comparison. We will use spatial statistical models (known as kriging models) to represent the data, and then compare these to GCM output using proper scoring rules. By doing this, we hope to overcome several issues with current approaches: accounting for the uncertainty in the data, handling the clustering in the locations of the data points, and putting the data and GCMs on a comparable footing.
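A minimal sketch of this workflow, with invented data: a Gaussian process (kriging) model with a nugget term is fitted to noisy proxy ‘observations’ at scattered sites, its predictive distribution is evaluated at GCM grid points, and a proper scoring rule (here the closed-form CRPS for a Gaussian predictive distribution) scores the GCM output against that distribution. The project's actual models and scores may differ.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Noisy proxy "data": temperature anomalies at scattered sites (illustrative)
sites = rng.uniform(0, 10, size=(25, 2))          # (lon, lat)-like coordinates
true_field = lambda s: np.sin(s[:, 0]) + 0.05 * s[:, 1]
obs = true_field(sites) + rng.normal(0, 0.3, 25)  # proxy noise

# Kriging model: GP with a nugget (WhiteKernel) for proxy uncertainty
gp = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(0.1), normalize_y=True)
gp.fit(sites, obs)

# Predictive distribution at the GCM grid points
grid = np.array([[x, y] for x in np.arange(0, 10, 2) for y in np.arange(0, 10, 2)])
mu, sd = gp.predict(grid, return_std=True)

def crps_gaussian(mu, sd, x):
    """CRPS of a Gaussian predictive distribution N(mu, sd^2) against value x."""
    z = (x - mu) / sd
    return sd * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

gcm_output = true_field(grid) + rng.normal(0, 0.2, len(grid))  # stand-in GCM field
print(f"mean CRPS: {crps_gaussian(mu, sd, gcm_output).mean():.3f}")
```

Because the CRPS is a proper scoring rule, a GCM cannot improve its score by hedging: the best expected score is achieved by a field consistent with the data-constrained distribution, and the data uncertainty enters through the kriging predictive spread rather than being ignored.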

Searching for the deglaciation: spatio-temporal boundary condition uncertainty and its implications for understanding abrupt climate change

(PI) Daniel Williamson, University of Exeter / Lauren Gregoire / Tamsin Edwards

A chicken-and-egg problem exists in ice sheet modelling: the climate boundary conditions (BCs) that drive ice sheet models are derived from global climate simulations, yet those climate simulations themselves depend on ice sheet surface elevation and melting, so the uncertainties compound. We will develop a statistical method to explore the spatio-temporal uncertainty in climate BCs for ice sheet models simulating the last 21,000 years. This will help address both the BC uncertainties and the failure of current models to reproduce the reconstructed Northern Hemisphere ice sheet retreat.

Large ensembles of model scenarios are required to identify BCs and their parameter values, but the very high dimension of deglaciation BCs and the computational cost of simulating them limit ensemble sizes. Novel ‘rotation’ methods developed by the PI of this study will be used to improve the tractability and performance of principal component analysis. Using the Glimmer-CISM model, simulations based on low-order representations of the climate fields will be refined and optimised with these rotation methods to produce a more thorough uncertainty quantification.
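The rotation methods themselves are specific to the PI's work; as a generic illustration of the underlying low-order basis idea, the sketch below reduces a synthetic ensemble of climate fields to a handful of principal-component coefficients and reconstructs the fields from that basis. Everything here (grid, ensemble, component count) is invented.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Stand-in ensemble: 30 "climate fields" on a 50x50 grid, each built from a few
# latent spatial patterns plus noise. In the project these would be
# spatio-temporal boundary-condition fields.
n_runs, ny, nx = 30, 50, 50
y, x = np.mgrid[0:1:ny * 1j, 0:1:nx * 1j]
modes = np.stack([np.sin(np.pi * x), np.cos(np.pi * y), x * y])
weights = rng.normal(size=(n_runs, 3))
ensemble = weights @ modes.reshape(3, -1) + rng.normal(0, 0.05, (n_runs, ny * nx))

# Low-order representation: keep the leading principal components
pca = PCA(n_components=3)
coeffs = pca.fit_transform(ensemble)          # each run reduced to 3 numbers
reconstruction = pca.inverse_transform(coeffs)

err = np.abs(ensemble - reconstruction).max()
print(f"3-component basis, max reconstruction error: {err:.3f}")
print(f"variance explained: {pca.explained_variance_ratio_.sum():.1%}")
```

Once each high-dimensional field is summarised by a few coefficients, uncertainty exploration happens in that small coefficient space; rotating the basis (the project's contribution) aims to make those coefficients better aligned with the uncertainties that matter.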

Tackling uncertainty in rainfall projections with past changes

(PI) Peter Hopcroft, University of Bristol / Jonty Rougier / Paul Valdes

Rainfall is a key characteristic of climate, and reliable projections are important, particularly for vulnerable populations. Climate models show that the 21st century will bring variations more extreme than those observed over the late 20th century, but models disagree on the pattern, and in some cases the sign, of the change.

One approach to improving this is to turn to climate states from the past that fall outside the narrow envelope of observed variability; these can be used to test models. The mid-Holocene, 6,000 years before present, is among the best-suited past time periods for this, with a striking monsoon enhancement that caused a ‘greening’ of the Sahara. However, climate model simulations fail to reproduce the required rainfall increase.

Recently it has been shown that, at sufficiently high resolution (~25 km), one climate model is able to reproduce for the first time the observed 20th century decadal rainfall changes in North Africa. These results call for a new appraisal of what we can learn about rainfall and climate change from the mid-Holocene test case. We propose to combine a perturbed-parameter approach with simulations at very high spatial resolution. We aim to provide new insight into past monsoon change and, using robust statistical approaches, to find out whether the past can narrow the uncertainty on future rainfall projections.
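For illustration only, a perturbed-parameter design might be generated along the following lines: a Latin hypercube sample over two hypothetical model parameters, each row defining one model run. The parameter names and ranges are invented, and the sampling scheme is an assumption rather than the project's stated method.

```python
import numpy as np
from scipy.stats import qmc

# Minimal perturbed-parameter design: space-filling sample over two invented
# convection parameters; each design point would be one model simulation.
sampler = qmc.LatinHypercube(d=2, seed=3)
unit = sampler.random(n=20)
bounds_lo, bounds_hi = [0.5, 1e-4], [2.0, 1e-3]   # assumed parameter ranges
design = qmc.scale(unit, bounds_lo, bounds_hi)
for k, (entrain, rain_rate) in enumerate(design):
    print(f"run {k:02d}: entrainment={entrain:.2f}, rain_conversion={rain_rate:.2e}")
```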

Automated tuning of palaeoclimate records

(PI) Timothy Heaton, University of Sheffield / William Austin / Edouard Bard / Andrea Burke / Vasile Ersek / Luke Skinner / Ludger Evans / Phillip Paine

When studying past climate we often need to create timescales for the palaeo-records we study, e.g. speleothem records or deep-sea cores, which accumulate slowly over time. A common way of doing this is to tune multiple records to one another: matching together separate records (from cores with different and varying accumulation rates) based on correlating proxy information within each core. Such a process can be thought of as squashing or stretching one record onto another.

Tuning allows us to borrow strength from multiple records and to transfer timescale information between them. However, the accuracy of such tuning, and specifically the recognition of the various uncertainties involved, is crucial if we want high-quality past climate reconstructions.

Most current tuning (e.g. Darfeuil et al., 2016; Dickson et al., 2008) is performed by hand, whereby an expert identifies, by eye, individual sharp peaks in the signal that are thought to reflect global phenomena shared by all the records. This project aims to develop an automated alignment method that provides not just tuned estimates but also a measure of the uncertainty on the resultant match and the inferred chronologies.
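To illustrate what an automated, point-estimate alignment looks like, the sketch below implements plain dynamic time warping, one standard squash-and-stretch matching technique. It is not the project's method and, unlike the proposed approach, it gives no measure of uncertainty on the match.

```python
import numpy as np

def dtw_align(a, b):
    """Dynamic time warping: the optimal monotone matching of samples in
    record `a` onto record `b` (the squashing/stretching described above)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Trace back the cheapest warping path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

# Two synthetic proxy records sharing peaks but with different accumulation rates
t = np.linspace(0, 10, 200)
record_a = np.exp(-(t - 3) ** 2) + np.exp(-(t - 7) ** 2 / 0.5)
record_b = np.interp(t, t ** 1.15 / 10 ** 0.15, record_a)  # stretched copy
print(dtw_align(record_a, record_b)[:5])  # matched (depth_a, depth_b) pairs
```

A probabilistic version of this matching, which returns a distribution over warping paths rather than a single cheapest one, is the kind of extension needed to attach uncertainties to the resulting chronologies.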
