PEN Feasibility Studies

In the summer of 2016 the PEN held a call for feasibility studies. Our members responded with an excellent range of proposals, and after a difficult decision process six were awarded funding. Each of these studies will take place during 2017 and will address key scientific questions in climate modelling, the statistical analysis of palaeoclimate data, and palaeoclimate reconstruction.

Moving forward with forward models – synthesising past water tables from multiple peat cores

(PI) Maarten Blaauw, Queen's University Belfast / Andrés Christen / Marco Aquino Lopez

Forward models are often promoted as enhancements over ‘reverse’ models traditionally used in palaeoecology. In reverse models, past environments are inferred from fossil ‘proxies’ through statistical correlations between the response (proxies) and their supposed causal factors (climate, environment). This approach is sub-optimal for a number of reasons – poor quantification of uncertainties, dangers of reconstructing non-causal variables, and the difficulties of combining multiple reconstructions. Instead, forward models aim to model how causal processes (e.g., climate) determine the resulting fossil proxy records, via intermediate processes such as hydrology and ecology. In doing so, these models should enable the palaeo-community to obtain a much fuller understanding of how climate, environments and (multiple) fossil proxy records are linked, and also to better quantify the uncertainties involved in palaeoecological reconstructions.

Here we will build, and ‘sell’ to the palaeo-community, a simple, intuitive and educational forward model of how climate dynamics have forced hydrological changes in Holocene raised-bog peat deposits. It builds on existing basic models that employ statistical distributions to simulate how environmental factors could steer fossil proxy values. By combining multiple peat records into a single framework, we will take a significant step toward a fuller understanding of 1) how climate and other processes steer fossil proxy records, 2) the reproducibility and reliability of reconstructions from single cores, and 3) the possibilities and limitations of combining multiple records.
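To give a flavour of the approach, the sketch below (in Python) simulates a toy causal chain from climate through bog hydrology to a fossil proxy, for two cores sharing the same forcing. Every functional form, parameter value and name here is an invented illustration, not part of the project's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy causal chain: climate -> bog water table -> fossil proxy.
# All response forms and parameter values are illustrative assumptions.

def water_table(precip_anomaly, beta=2.0, noise_sd=1.0):
    """Water-table depth (cm) responding linearly to a precipitation anomaly."""
    return 10.0 - beta * precip_anomaly + rng.normal(0, noise_sd, precip_anomaly.shape)

def proxy(wtd, alpha=0.5, noise_sd=0.3):
    """A fossil proxy (e.g. a testate-amoeba index) generated from water-table depth."""
    return alpha * wtd + rng.normal(0, noise_sd, wtd.shape)

# Two cores driven by the same climate but with independent local noise,
# mimicking the multi-core setting the project targets.
climate = np.sin(np.linspace(0, 4 * np.pi, 200))  # shared Holocene-like forcing
core_a = proxy(water_table(climate))
core_b = proxy(water_table(climate))

print(f"inter-core proxy correlation: {np.corrcoef(core_a, core_b)[0, 1]:.2f}")
```

Because both simulated cores share the same causal driver, their proxies correlate; a forward model makes that shared structure explicit rather than leaving it to be inferred statistically after the fact.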

Emulating the Isotope Response to Changes in the Geometry of the Antarctic Ice Sheet

(PI) Louise Sime, BAS / Dario Domingo / Irene Malmierca / Jochen Voss

This project aims to revolutionise our approach to using ice-core observations of water isotopes to constrain past environmental changes. The shape and size of the Antarctic ice sheet leave imprints on the water isotopes in ice cores, allowing the sheet's past geometry to be inferred from isotope measurements. Isotope-enabled simulations allow exploration of past ice-sheet and sea-ice influences on water isotopes, and the resulting data-model comparisons allow past ice-sheet geometries to be reconstructed, giving us a means of evaluating future predictions of global and polar sea-level change.

However, the complexity and computational cost of these simulations limit our progress in understanding the joint impact of the variety of possible ice-sheet and sea-ice changes, and their variation through time. To address these issues we propose to use Gaussian process emulation (O’Hagan, 1992) to approximate the response of the isotope-enabled simulator over a wide range of ice geometries.
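As an indication of the workflow, the following sketch fits a Gaussian process emulator (here via scikit-learn) to a small design of runs from a stand-in 'simulator', then predicts the response, with uncertainty, at geometries that were never simulated. The two-parameter geometry and the stand-in simulator are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Stand-in for the expensive isotope-enabled simulator: two hypothetical
# ice-geometry parameters in, a site-mean isotope response out. The
# functional form is invented purely to demonstrate the workflow.
def simulator(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# A small design of simulator runs (in practice a space-filling design,
# e.g. a Latin hypercube, over plausible ice geometries).
X_design = rng.uniform(-1, 1, size=(30, 2))
y_design = simulator(X_design)

# Fit a Gaussian process emulator to the design runs.
gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0]), normalize_y=True)
gp.fit(X_design, y_design)

# The emulator predicts the isotope response, with uncertainty, for
# geometries that were never run through the simulator.
X_new = rng.uniform(-1, 1, size=(5, 2))
mean, sd = gp.predict(X_new, return_std=True)
for m, s, truth in zip(mean, sd, simulator(X_new)):
    print(f"emulated {m:+.3f} +/- {2 * s:.3f}   simulator {truth:+.3f}")
```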

A novel statistical approach for palaeoclimate spatial model-data comparison

(PI) Richard Wilkinson, University of Sheffield / Phillip Paine / Dan Lunt / Fran Bragg

One of the reasons for studying palaeoclimate is that it provides us with an independent set of data which we can use to test the performance of climate simulators (GCMs). By simulating climate under palaeo boundary conditions (different CO2 levels, different ice-sheet configurations, and even different continental geography if we go far enough back into the past), we can test the physical understanding encoded in the simulators and compare the skill of different GCMs.

However, it is challenging to compare palaeo data with GCM output: the data and model predictions are often at different locations, record different quantities, and the palaeo data are often very noisy (uncertain). This project will develop statistical methods for model-data comparison. We will use spatial statistical models (known as kriging models) to represent the data, and then compare these to GCM output using proper scoring rules. By doing so, we hope to overcome several issues with current approaches: accounting for the uncertainty in the data, handling the clustering of data-point locations, and putting the data and GCMs on a comparable footing.
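To make the pairing of kriging and proper scoring rules concrete, the sketch below fits a Gaussian process (kriging) model to noisy point 'reconstructions' and scores two invented 'GCM' fields against its predictive distribution using the closed-form CRPS for a Gaussian. The data, the fields and the exact arrangement of what is scored against what are illustrative assumptions rather than the project's method.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

def crps_gaussian(x, mu, sigma):
    """Closed-form CRPS of a Gaussian N(mu, sigma^2) at outcome x; lower is better."""
    z = (x - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

# Hypothetical proxy data: noisy reconstructions at scattered 1-D "sites".
sites = rng.uniform(0, 10, size=(25, 1))
obs = np.cos(sites[:, 0]) + rng.normal(0, 0.3, 25)

# Kriging model for the data field; the WhiteKernel absorbs proxy noise.
gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.1), normalize_y=True)
gp.fit(sites, obs)

# Score two invented "GCM" fields, one faithful and one biased, against the
# kriged predictive distribution at the GCM's own grid points.
grid = np.linspace(0, 10, 40)[:, None]
mu, sd = gp.predict(grid, return_std=True)
gcm_good = np.cos(grid[:, 0]) + rng.normal(0, 0.1, 40)
gcm_biased = np.cos(grid[:, 0]) + 1.0
print("mean CRPS, faithful model:", crps_gaussian(gcm_good, mu, sd).mean())
print("mean CRPS, biased model:  ", crps_gaussian(gcm_biased, mu, sd).mean())
```

Because CRPS is a proper scoring rule, a model cannot improve its expected score by hedging; the deliberately biased field duly receives the worse (higher) score.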

Searching for the deglaciation: spatio-temporal boundary condition uncertainty and its implications for understanding abrupt climate change

(PI) Daniel Williamson, University of Exeter / Lauren Gregoire / Tamsin Edwards

A chicken-and-egg problem exists in ice-sheet modelling: the boundary conditions (BCs) derived from global climate simulations contain large uncertainties, while the climate simulations themselves depend on ice-sheet surface elevation and melting, compounding those uncertainties. We will develop a statistical method to explore the spatio-temporal uncertainty in the climate boundary conditions supplied to ice-sheet models simulating the last 21,000 years. This will help to address both the BC uncertainties and the failure of current models to reproduce the reconstructed Northern Hemisphere ice-sheet retreat.

Large ensembles of model scenarios are required to identify BCs and their parameter values, but the very high dimension of deglaciation BCs and the computing cost of simulating them limit ensemble sizes. Novel ‘rotation’ methods developed by the PI of this study will be used to improve the tractability and performance of principal component analysis: low-order representations of the climate fields driving the Glimmer-CISM ice-sheet model will be refined and optimised with these rotations to produce more thorough uncertainty quantification. A generic example of rotating a principal-component basis is sketched below.
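The sketch applies an off-the-shelf varimax rotation to the leading EOFs of a toy ensemble. It illustrates only the general idea of rotating a PCA basis; the project's own rotation methods are tailored to the ice-sheet forcing problem, and all numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def varimax(L, n_iter=100, tol=1e-6):
    """Varimax-rotate loadings L (grid points x components), concentrating
    each component's weight on fewer grid points for interpretability."""
    p, k = L.shape
    R, var = np.eye(k), 0.0
    for _ in range(n_iter):
        LR = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (LR ** 3 - LR @ np.diag((LR ** 2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() < var * (1 + tol):
            break
        var = s.sum()
    return L @ R

# Toy ensemble: 50 fields on a 200-point grid built from two true modes
# plus noise, standing in for high-dimensional climate boundary conditions.
grid = np.linspace(0, 2 * np.pi, 200)
ensemble = (rng.normal(size=(50, 1)) * np.sin(grid)
            + rng.normal(size=(50, 1)) * np.cos(2 * grid)
            + rng.normal(0, 0.1, (50, 200)))
anomalies = ensemble - ensemble.mean(axis=0)

# Leading EOFs via SVD give a low-order basis; rotation then reshapes it.
_, _, vt = np.linalg.svd(anomalies, full_matrices=False)
rotated = varimax(vt[:3].T)  # grid points x 3 rotated components
print("rotated basis shape:", rotated.shape)
```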

Tackling uncertainty in rainfall projections with past changes

(PI) Peter Hopcroft, University of Bristol / Jonty Rougier / Paul Valdes

Projections show that precipitation variability is likely to be greater in the future. Given the widespread effects on society, it is crucial to understand future changes in regional precipitation more accurately. Because of the dynamic response of rainfall to increased temperatures, current models struggle to reproduce regional changes in precipitation; a notable example is the ‘Greening of the Sahara’.

The discrepancies between models and palaeo data cannot be accounted for by uncertainties in reconstructions or local processes, highlighting a possible systematic bias in current climate models. The emerging consensus is that models do not display the correct sensitivity to external perturbations. This study will use a perturbed-parameter approach with the HadCM3 model, which is roughly 1000 times less computationally expensive than the current Met Office model. We will simulate the pre-industrial period, a 4xCO2 scenario, and the mid-Holocene, aiming for a better understanding of convection and to quantify, using robust statistical approaches, whether the past is able to constrain future precipitation projections. This will inform future study with the HadGEM3-GA7 climate model, which benefits from improved atmospheric components. Further outcomes are uncertainty distributions quantifying how much the mid-Holocene case study has contributed to constraining the future.
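One simple way a palaeo constraint of this kind can operate is sketched below: ensemble members are weighted by their likelihood against a mid-Holocene reconstruction, and the weighted spread of their future responses is compared with the raw spread. The numbers, the assumed member-to-member relationship and the weighting scheme are invented for illustration and may differ from the project's statistical approach.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Illustrative constraint calculation with invented numbers: weight
# perturbed-parameter ensemble members by their fit to a mid-Holocene
# reconstruction, then compare raw and weighted spreads of the members'
# future (4xCO2) rainfall responses.
n = 100
mh = rng.normal(1.0, 0.6, n)                      # simulated mid-Holocene anomaly
future = 2.0 + 1.5 * mh + rng.normal(0, 0.3, n)   # correlated future response

recon, recon_sd = 1.4, 0.2                        # hypothetical reconstruction
weights = norm.pdf(mh, recon, recon_sd)           # likelihood of each member
weights /= weights.sum()

w_mean = np.sum(weights * future)
w_sd = np.sqrt(np.sum(weights * (future - w_mean) ** 2))
print(f"unconstrained future change:  {future.mean():.2f} +/- {future.std():.2f}")
print(f"MH-constrained future change: {w_mean:.2f} +/- {w_sd:.2f}")
```

The gap between the unconstrained and constrained spreads is, in toy form, exactly the 'how much has the mid-Holocene contributed' quantity described above.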

Automated tuning of palaeoclimate records

(PI) Timothy Heaton, University of Sheffield / William Austin / Edouard Bard / Andrea Burke / Vasile Ersek / Luke Skinner / Ludger Evans / Phillip Paine

A common way to place past climate records onto a timescale is to tune multiple palaeo-records: multiple separate records are matched together based on correlating proxy information within each core. Such a process can be thought of as squashing one record onto another, so the accuracy of the tuning, and the various uncertainties involved, are critical to the quality of the resulting climate reconstruction. Most current tuning takes place manually, whereby an expert identifies sharp peaks in the signal shared by all the records.

Existing automated techniques have proven extremely useful but only provide a ‘best estimate’. Furthermore, these methods employ simple accumulation models compared with recent advances such as Bchron, Bacon and OxCal. We propose to incorporate these more recent models into a Bayesian MCMC approach, providing tuning uncertainty at any depth of interest; a minimal sketch of the idea follows the project aims below.

This project aims to:

  • Provide an initial investigation into automated tuning techniques that offer robust uncertainty quantification;
  • Develop user-friendly software to aid tuning for a wide range of palaeoclimate researchers;
  • Initiate collaboration and promote discourse on both tuning and proxy-data issues.
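Here is that sketch: record B is aligned onto record A's depth scale through a linear warp whose parameters are sampled by Metropolis-Hastings, yielding a posterior distribution rather than a single 'best estimate'. The linear warp and all settings are deliberate simplifications; the project will use flexible monotone warps tied to modern accumulation models.

```python
import numpy as np

rng = np.random.default_rng(4)

# Align record B onto record A's depth scale via a linear warp
# depth_A = a + b * depth_B, sampling (a, b) with Metropolis-Hastings.
# Real tuning uses flexible monotone warps tied to an accumulation model
# (as in Bchron, Bacon or OxCal); the linear warp is a simplification.

depth_a = np.linspace(0, 10, 200)
signal = lambda d: np.sin(1.5 * d) + 0.3 * np.sin(4.0 * d)
rec_a = signal(depth_a) + rng.normal(0, 0.1, depth_a.size)

true_a, true_b = 1.0, 0.8                     # warp used to generate record B
depth_b = np.linspace(0, 10, 150)
rec_b = signal(true_a + true_b * depth_b) + rng.normal(0, 0.1, depth_b.size)

def log_lik(a, b, sd=0.15):
    """Gaussian misfit between record B and record A read off at the warped depths."""
    pred = np.interp(a + b * depth_b, depth_a, rec_a)
    return -0.5 * np.sum((rec_b - pred) ** 2) / sd ** 2

# Random-walk Metropolis over (a, b), initialised near a coarse manual
# alignment, as is common in practice.
samples = []
cur = np.array([0.8, 0.9])
cur_ll = log_lik(*cur)
for _ in range(5000):
    prop = cur + rng.normal(0, 0.02, 2)
    prop_ll = log_lik(*prop)
    if np.log(rng.uniform()) < prop_ll - cur_ll:
        cur, cur_ll = prop, prop_ll
    samples.append(cur.copy())

post = np.array(samples[1000:])
print("posterior mean (a, b):", post.mean(axis=0).round(2), " truth:", (true_a, true_b))
```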
