Edinburgh Research Explorer

Information Theory for Climate Change and Prediction

Research output: Chapter in Book/Report/Conference proceeding › Chapter (peer-reviewed)


Original language: English
Title of host publication: Encyclopedia of Applied and Computational Mathematics
Editors: B. Engquist
Publisher: Springer
Pages: 682-686
Number of pages: 5
ISBN (Electronic): 978-3-540-70529-1
ISBN (Print): 978-3-540-70528-4
DOIs
Publication status: Published - 21 Nov 2015

Abstract

The Earth’s climate is an extremely complex system coupling physical processes for the atmosphere, ocean, and land over a wide range of spatial and temporal scales (e.g., [5]). In contrast to predicting the small-scale, short-term behavior of the atmosphere (i.e., the “weather”), climate change science aims to predict the planetary-scale, long-time response in the “climate system” induced either by changes in external forcing or by internal variability such as the impact of increased greenhouse gases or massive volcanic eruptions [14]. Climate change predictions pose a formidable challenge for a number of intertwined reasons. First, while the dynamical equations for the actual climate system are unknown, one might reasonably assume that the dynamics are nonlinear and turbulent with, at best, intermittent energy fluxes from small scales to much larger and longer spatiotemporal scales. Moreover, all that is available from the true climate dynamics are coarse, empirical estimates of low-order statistics (e.g., mean and variance) of the large-scale horizontal winds, temperature, concentration of greenhouse gases, etc., obtained from sparse observations. Thus, a fundamental difficulty in estimating sensitivity of the climate system to perturbations lies in predicting the coarse-grained response of an extremely complex system from sparse observations of its past and present dynamics combined with a suite of imperfect, reduced-order models.

For several decades, weather forecasts and climate change predictions have been carried out with comprehensive numerical models [5, 14]. However, such models contain various errors, introduced by limited resolution and by the myriad parameterizations that aim to compensate for the effects of unresolved dynamical features such as clouds, ocean eddies, sea ice cover, etc. Due to the highly nonlinear, multi-scale nature of this extremely high-dimensional problem, it is quite clear that, despite ever-increasing computer power, no model of the climate system will be able to resolve all the dynamically important and interacting scales.

Recently, a stochastic-statistical framework rooted in information theory was developed in [1, 10–12] for a systematic mitigation of error in reduced-order models and improvement of imperfect coarse-grained predictions. This newly emerging approach blends physics-constrained dynamical modeling, stochastic parameterization, and linear response theory, and it has at least two mathematically desirable features: (i) The approach is based on a skill measure given by the relative entropy which, unlike other metrics for uncertainty quantification in atmospheric sciences, is invariant under a general change of variables [9, 13]; this property is very important for unbiased model calibration, especially in high-dimensional problems. (ii) Minimizing the loss of information in the imperfect predictions via the relative entropy implies simultaneous tuning of all considered statistical moments; this is particularly important for improving predictions of nonlinear, non-Gaussian dynamics, where the statistical moments are interdependent.
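Both features can be illustrated in the Gaussian setting, where the relative entropy has a well-known closed form that splits into a "signal" part (mean mismatch) and a "dispersion" part (covariance mismatch), so minimizing it tunes both moments at once; it is also unchanged by any invertible linear change of variables. The following is a minimal numerical sketch of these two facts (the function name and the test values are illustrative, not from the chapter):

```python
import numpy as np

def gaussian_relative_entropy(mu_p, cov_p, mu_q, cov_q):
    """Relative entropy P(p, q) of Gaussian q relative to Gaussian p.

    Closed form: 0.5 * [ (mu_q - mu_p)^T cov_q^{-1} (mu_q - mu_p)      (signal)
                       + tr(cov_q^{-1} cov_p) - d
                       + ln(det cov_q / det cov_p) ]                   (dispersion)
    """
    d = len(mu_p)
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    signal = diff @ cov_q_inv @ diff
    dispersion = (np.trace(cov_q_inv @ cov_p) - d
                  + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p)))
    return 0.5 * (signal + dispersion)

# Illustrative "truth" p and imperfect model q with mismatched
# means AND covariances: the skill measure penalizes both at once.
mu_p, mu_q = np.array([1.0, 0.0]), np.array([0.5, -0.2])
cov_p = np.array([[1.0, 0.3], [0.3, 2.0]])
cov_q = np.array([[1.5, 0.0], [0.0, 1.0]])

kl = gaussian_relative_entropy(mu_p, cov_p, mu_q, cov_q)

# Invertible linear change of variables x -> A x maps N(mu, cov) to
# N(A mu, A cov A^T); the relative entropy is unchanged.
A = np.array([[3.0, 1.0], [0.5, 2.0]])  # any invertible matrix
kl_transformed = gaussian_relative_entropy(
    A @ mu_p, A @ cov_p @ A.T, A @ mu_q, A @ cov_q @ A.T)

assert kl > 0.0
assert np.isclose(kl, kl_transformed)
```

This invariance is what makes the measure coordinate-free: rescaling or rotating the state variables (e.g., changing physical units) cannot artificially improve or degrade a model's apparent skill, which is the calibration property highlighted in (i).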

ID: 17860481