James Jackaman

Project Title:

Mimetic discontinuous Galerkin methods on surfaces

Project Details:

The goal of this project is to design and analyse numerical schemes that preserve certain quantities of interest. Our schemes of choice are a family of discontinuous Galerkin (dG) methods, which are well suited to this class of problems because they allow flexibility in the choice of flux over element edges. The question we want to address is the following: is it possible to preserve discrete symplectic forms together with other conserved quantities, such as mass and momentum, to some prescribed tolerance? If so, how many can be preserved, and what effect does this have on the accuracy of the (modelled) dynamics of the problem?

The three main difficulties in the design of such schemes that will be tackled in this project are:

1. Design of appropriate timestepping schemes (see the sketch after this list).

2. Design of appropriate spatial discretisations on planar domains.

3. Extension of these schemes to physically relevant models.
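
As a minimal illustration of point 1, the sketch below applies the implicit midpoint rule, a symplectic one-step method that conserves quadratic invariants such as the energy of a harmonic oscillator exactly. This is a toy example of the conservation behaviour sought, not the project's actual dG discretisation.

```python
import numpy as np

# dz/dt = A z with A skew-symmetric: the harmonic oscillator q' = p, p' = -q.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
dt = 0.1
I = np.eye(2)
# One implicit midpoint step for a linear system is the Cayley map
#   z_{k+1} = (I - dt/2 A)^{-1} (I + dt/2 A) z_k,
# which is orthogonal for skew-symmetric A, so the energy is conserved exactly.
step = np.linalg.solve(I - 0.5 * dt * A, I + 0.5 * dt * A)

z = np.array([1.0, 0.0])                 # energy H = (q^2 + p^2)/2 = 0.5
drift = 0.0
for _ in range(1000):
    z = step @ z
    drift = max(drift, abs(0.5 * (z[0] ** 2 + z[1] ** 2) - 0.5))

print("max energy drift over 1000 steps:", drift)   # roundoff level, ~1e-16
```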

Supervisor:

Dr Tristan Pryer

Carlo Cafaro

Project Title:

Extracting slow modes of atmospheric variability using conservation properties of fluid dynamics

Project Details:

Assessment of the quality of numerical models of the atmospheric flow takes several forms. One way is to compare their behaviour with respect to coherent structures in the atmospheric flow. One such set of structures arises from partitioning the full flow into a background state and perturbations which evolve around it. Over the last decade the co-Is (Methven and Berrisford) have developed a new method to extract a slowly evolving background state from atmospheric analysis data. This background is defined to be a zonally symmetric adiabatic rearrangement of the full flow that preserves two of its key integral properties: mass and circulation integrals that are functionals of potential temperature and potential vorticity (PV). Since potential temperature and PV are materially conserved for adiabatic and frictionless flow, the functionals, and therefore the background state, would be time-invariant in the absence of non-conservative processes. Such non-conservative processes are weak in the atmosphere, so the background state evolves slowly (without time-filtering). Furthermore, all the adiabatic motions are reflected only in the perturbations, which can be described by two wave activity conservation laws valid at finite amplitude (for pseudomomentum and pseudoenergy; e.g., Methven, 2013).

This project will examine the perturbations, defined in this way, in both atmospheric analysis data and extended-range ensemble forecasts from the ECMWF (out to 30 days). Slow modes of variability in the perturbation field will be extracted using the empirical normal mode (ENM) technique pioneered by Brunet (1994). The technique combines the well-known EOF technique used to examine the statistics of large datasets with dynamical insight, extracting structures that explain the majority of the variance in the data but also have a dynamical interpretation. Brunet (1994) showed that the discrete spectrum of slow modes obtained from data exhibits very few modes dominating the variance.
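
As a point of reference for the methodology, the EOF core of the ENM calculation is a singular value decomposition of the anomaly data matrix; ENMs refine this by using a wave-activity (pseudomomentum/pseudoenergy) inner product rather than the plain Euclidean one. The sketch below runs on synthetic data, with that inner product replaced by a placeholder diagonal weight W, so it is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 40))     # 500 times x 40 grid points (synthetic)
X -= X.mean(axis=0)                    # anomalies about the time mean

W = np.ones(X.shape[1])                # placeholder weights (metric/mass terms)
Xw = X * np.sqrt(W)                    # anomalies in the weighted inner product

# Columns of Vt are the (weighted) spatial modes; s**2 gives their variance.
U, s, Vt = np.linalg.svd(Xw, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by leading 3 modes:", explained[:3].sum())
```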

Supervisor:

Dr Thomas Frame, Dr Jochen Broecker

Joshua Prettyman

Project Title:

Tipping point analysis of geophysical variables

Project Details:

Understanding components of a dynamical system, such as fluctuations and natural/anthropogenic changes in the Earth system, is a challenging scientific task. The scale of geophysical monitoring required, the changes due to the development of instrumentation, and the most appropriate analysis methods are all currently under debate. The challenges in studying such time series include handling a large dynamic range, non-Gaussian distributions at various frequency bands, the presence of high-amplitude transients, and the processing of large data sets. The uncertainties associated with the trend estimates must account for the nature of the fluctuations as well as the choice of variables, model, or metrics. In order to understand natural- and anthropogenic-driven climate change, we plan to study Essential Climate Variables (ECVs), in particular land surface humidity and sea-surface temperature (SST). The goals of this project are to: 1) develop a methodology for studying gradual and abrupt changes in geophysical data, by testing and refining previously developed techniques; 2) compare these techniques with classical parametric inference approaches for stochastic processes, and evaluate the model error by considering higher-dimensional models; and 3) determine whether such abrupt changes in the variables under consideration are underway.
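
By way of illustration of goal 1, a standard early-warning diagnostic in tipping-point analysis is rising lag-1 autocorrelation in a sliding window, a signature of critical slowing down before an abrupt transition. The sketch below applies it to a synthetic AR(1) series with slowly increasing persistence; the toy model is an assumption for illustration, not ECV data.

```python
import numpy as np

def rolling_lag1_autocorr(x, window):
    """Lag-1 autocorrelation of x over a trailing sliding window."""
    out = np.full(len(x), np.nan)
    for i in range(window, len(x)):
        w = x[i - window:i] - x[i - window:i].mean()
        denom = np.sum(w * w)
        if denom > 0:
            out[i] = np.sum(w[1:] * w[:-1]) / denom
    return out

# Synthetic series: AR(1) whose memory increases, mimicking critical slowing down.
rng = np.random.default_rng(1)
n = 4000
phi = np.linspace(0.2, 0.95, n)        # slowly increasing persistence
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.standard_normal()

ac1 = rolling_lag1_autocorr(x, window=500)
print("indicator early vs late:", np.nanmean(ac1[500:1000]), np.nanmean(ac1[-500:]))
```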

Supervisor:

Dr Tobias Kuna

Hinesh Chotai

Project Title:

Forward-backward stochastic differential equations and applications to carbon emissions markets

Project Details:

As a response to the risk of climate change, carbon markets are currently being implemented in several regions worldwide. Since 2005, the European Union (EU) has had its own Emissions Trading System (ETS), which today is the largest such market. In September 2015, it was announced that China (whose carbon emissions make up approximately one quarter of the global total) will introduce a national emissions trading scheme in 2017. When it comes into effect, China’s market will be the largest of its kind, overtaking the EU ETS. At that point, some 40% of global emissions will be covered by cap-and-trade schemes. According to the World Bank, the world’s emissions trading schemes are currently valued at about $30 billion. However, scientific, and particularly mathematical, studies of these carbon markets are needed in order to expose their advantages and shortcomings, as well as to allow their most efficient implementation.

In this project, we will consider a mathematical model for the pricing of emissions permits, formulated as a forward-backward stochastic differential equation (FBSDE). The model has particular applicability to the trading of European Union Allowances (EUAs) in the EU ETS, but could also be used to model other cap-and-trade schemes. We will investigate mathematical properties such as the existence and uniqueness of solutions to the pricing problem, the stability of solutions (e.g. the sensitivity of prices under small perturbations to the market) and their behaviour, as well as computational algorithms for solving pricing problems. The model can be extended to account for multiple trading periods.

The project will involve three work packages. Firstly, the project will consider a qualitative analysis of the class of FBSDEs introduced above. As already demonstrated in the literature, this may require one to relax the notion of solution of an FBSDE. Secondly, numerical schemes applicable to the pricing problem will be investigated. The pricing problem is high-dimensional (4-8 dimensions), and a robust numerical scheme will be required to produce satisfactory numerical results, particularly if the multi-period model is to be considered. Thirdly, a case study of the UK energy market will be considered; this will involve calibrating the model’s parameters and processes to real data.
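
To indicate the flavour of the second work package, one standard numerical approach to such problems is least-squares Monte Carlo: simulate the forward process, then approximate the conditional expectations in the backward recursion by regression. The sketch below does this for a one-dimensional decoupled toy FBSDE; the coefficients, driver and terminal condition are placeholders, not the carbon-market model.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, T = 20_000, 50, 1.0
dt = T / n_steps

# Forward SDE (toy choice): dX = sigma dW.
sigma = 0.3
dW = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
X = np.concatenate([np.zeros((n_paths, 1)), sigma * np.cumsum(dW, axis=1)], axis=1)

g = lambda x: np.maximum(x, 0.0)       # placeholder terminal condition
f = lambda x, y: -0.05 * y             # placeholder driver (discounting)

# Backward recursion: Y_k ~ E[Y_{k+1} | X_k] + f(X_k, Y_k) dt, with the
# conditional expectation approximated by cubic polynomial regression on X_k.
Y = g(X[:, -1])
for k in range(n_steps - 1, -1, -1):
    basis = np.vander(X[:, k], 4)
    coef, *_ = np.linalg.lstsq(basis, Y, rcond=None)
    cond_exp = basis @ coef
    Y = cond_exp + f(X[:, k], cond_exp) * dt

print("Y_0 estimate:", Y.mean())
```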

With action on climate change being strongly needed, and the implementation of carbon markets a promising policy option, this research will not only innovate on the theoretical side, but also has a high policy and economic impact potential.

Supervisor:

Professor Dan Crisan

Thomas Leahy

Project Title:

Stochastic and Statistical Modelling of Extreme Meteorological Events: Tropical Cyclones  

Project Details:

Catastrophe models are essential tools for insurance and reinsurance companies. These models support (re)insurers in calculating an adequate level of capital to cover an extreme meteorological event, a seismic event, etc. They are invaluable tools in actuarial science when conducting risk analysis for insurance purposes and are crucial to managing catastrophe risk exposure. The output from these models is used by insurers and reinsurers to make premium and reserve calculations, and the reserves set aside are imperative for the reconstruction of housing and infrastructure damaged or destroyed by such extreme events. Due to the power and sophistication of these models, they are very expensive for insurers to lease.

One of the inputs to such models is the set of historical observations of the event in question, for example historical tropical cyclones. However, since we only have one observable Earth, we only have a finite set of reliable observations, which directly limits the confidence we can place in decisions based on this data. It is therefore necessary to supplement the historical observations with synthetic data to improve the level of confidence in the output of catastrophe models. There are models in operation that can achieve this task already, namely Global Climate Models (GCMs). However, depending on their temporal and spatial resolution, GCMs are computationally expensive and time-consuming to run, and there is no guarantee that a GCM run will produce the desired event.

The primary focus of this project is to develop a stochastic model for the genesis, propagation and termination of tropical cyclones. Stochastic models can be run with basic computing power and can produce significantly more events on a much shorter time scale. Stochastic models are good at replicating summary statistics of historical tropical cyclones [Leahy, 2015]; however, in order to quantify possible future scenarios, the stochastic model will have to depend on some physical covariates, for example the sea surface temperature. Reanalysis data on many physical variables is readily available through the ECMWF. One of the aims of this project is to predict the genesis (frequency and location) of tropical cyclones in the future.
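
A skeleton of the genesis/propagation/termination decomposition might look as follows. Every distribution and parameter here is an invented placeholder for illustration; in the project these would be fitted to historical track data and modulated by covariates such as SST.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_season(genesis_rate=12.0, p_term=0.02):
    """One synthetic season: Poisson genesis count, AR(1) steering,
    geometric (fixed-hazard) termination."""
    tracks = []
    for _ in range(rng.poisson(genesis_rate)):                 # genesis frequency
        pos = np.array([rng.uniform(-60, -20), rng.uniform(8, 20)])  # lon, lat
        vel = np.array([-0.5, 0.2])                            # initial drift, deg/6h
        track = [pos.copy()]
        while rng.random() > p_term:                           # termination hazard
            vel = 0.9 * vel + 0.1 * rng.standard_normal(2)     # AR(1) propagation
            pos = pos + vel
            track.append(pos.copy())
        tracks.append(np.array(track))
    return tracks

tracks = simulate_season()
print(len(tracks), "synthetic storms; mean lifetime:",
      np.mean([len(t) for t in tracks]), "steps")
```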

Supervisor:

Professor Axel Gandy

Francesco Pons Llopis

Project Title:

Particle Filtering for applications in Data Assimilation

Project Details:

Modelling climate dynamics, such as ocean circulation or wind turbulence, poses very difficult research questions with large economic and societal impacts. Quantifying uncertainty in climate prediction and estimating the potential strength of extreme meteorological events is of great importance, all the more so in the face of global warming. The objective is to provide estimates for spatially and temporally varying phenomena based on a sequence of observations. This task is commonly referred to as data assimilation and is of great importance for numerical weather prediction. So far, most methods used for this task are based on heuristics, and although they can be tuned to provide fairly accurate estimates, it has not been possible to provide any characterisation of the underlying uncertainty of these estimates.

Within the scientific communities working in probability and statistics, problems like data assimilation are usually formulated as high-dimensional non-linear filtering problems. Thereby, one aims to compute the probability distribution of a latent Markov process conditional on the collected observations. These so-called filtering distributions can provide detailed information about the uncertainty of estimates of the latent process, but they are typically intractable, and one needs to rely on numerical approximations.

The aim is to develop advanced particle filters which efficiently utilise information from the observations and the model structure. In addition, emphasis will be given to highly parallelised implementations on appropriate hardware platforms such as Graphics Processing Units (GPUs) and computing clusters, which could provide massive computational speed-ups. For the modelling, there are many choices available, and one could include various spatially and temporally varying processes from geophysical fluid dynamics. The data sets containing observations of these phenomena can either be synthesised from more sophisticated models or be real observations provided by weather forecasting centres.
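
For concreteness, the baseline on which such advanced particle filters build is the bootstrap particle filter: propagate particles through the model, weight them by the observation likelihood, and resample. The sketch below applies it to a generic linear-Gaussian toy state-space model, not a geophysical one.

```python
import numpy as np

rng = np.random.default_rng(4)

def bootstrap_filter(obs, n_particles, transition, loglik, init):
    """Bootstrap particle filter returning posterior-mean estimates."""
    x = init(n_particles)
    means = []
    for y in obs:
        x = transition(x)                              # propagate through the model
        logw = loglik(y, x)                            # weight by the observation
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))
        x = x[rng.choice(n_particles, n_particles, p=w)]   # multinomial resampling
    return np.array(means)

# Toy model: x_t = 0.9 x_{t-1} + noise, observed with additive noise.
T = 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + 0.5 * rng.standard_normal()
obs = x_true + 0.5 * rng.standard_normal(T)

means = bootstrap_filter(
    obs, 1000,
    transition=lambda x: 0.9 * x + 0.5 * rng.standard_normal(x.size),
    loglik=lambda y, x: -0.5 * ((y - x) / 0.5) ** 2,
    init=lambda n: rng.standard_normal(n),
)
print("filter RMSE:", np.sqrt(np.mean((means - x_true) ** 2)))
```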

Supervisor:

Dr Nikolaos Kantas

Tobias Schwedes

Project Title:

Uncertainty quantification in systems of partial differential equations with an application in tidal turbine array layouts

Project Details:

The regularity of the tides makes them an attractive prospect for clean and secure renewable power production. At multiple locations around the UK, the local acceleration of tidal currents makes them potentially suitable sites for the installation of large arrays of tidal turbines. However, these arrays will only be developed if they can be shown to be economically viable. Optimisation of the array size, location and precise configuration is essential to achieve this, and sophisticated numerical models are required to assist developers. Work in this area to date has focussed primarily on the solution of the shallow water equations and relatively simple optimisation algorithms.

This PhD project will seek to extend the state of the art in optimal turbine array design by using advanced gradient-based optimisation methods, with key novelties coming from the presence of three-dimensional (turbulent) dynamics, the rigorous treatment of system uncertainties, and subgrid-scale turbulence closures. Three-dimensional effects will require the use of scalable, efficient solution procedures for the Navier-Stokes equations; these will be based upon code generation techniques developed at Imperial, which are designed to take advantage of the high aspect ratio of our target application to deliver highly efficient discretisation techniques. Part of the project will involve the investigation of parameterisation methods to represent tidal turbines within the model (e.g. as momentum sinks and turbulence sources).

Further to this, both RANS- and LES-based turbulence closures can and will be considered. Given the scales of the problem and the fact that an inverse energy cascade applies to the depth-averaged dynamics, the use of LES techniques in the horizontal, supplemented with RANS models for fully three-dimensional turbulent dynamics (e.g. providing back-scattering effects), is an attractive and viable large-scale approach that could be considered. The adjoint capability which is key to the optimisation component of this work can also be used to help calibrate these turbulence models to best represent turbine wakes, and thus accurately capture the interactions between turbines within an array. It is clearly crucial to get this right when attempting to optimise the relative locations of turbines.
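
As a toy analogue of the optimisation component (in the project the gradient would come from an adjoint PDE solve rather than an analytic formula), the sketch below maximises a stand-in power functional with Gaussian wake-interaction losses over one-dimensional turbine positions. The functional and its parameters are invented placeholders for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def neg_power(x, spacing=2.0):
    """Negative stand-in power: each turbine contributes 1, minus Gaussian
    wake-interaction losses between pairs that are too close."""
    d2 = (x[:, None] - x[None, :]) ** 2
    loss = np.exp(-d2 / spacing**2)
    np.fill_diagonal(loss, 0.0)
    return -(len(x) - 0.5 * loss.sum())

def neg_power_grad(x, spacing=2.0):
    """Analytic gradient of neg_power (standing in for an adjoint computation)."""
    diff = x[:, None] - x[None, :]
    loss = np.exp(-diff**2 / spacing**2)
    np.fill_diagonal(loss, 0.0)
    return -(2.0 / spacing**2) * (diff * loss).sum(axis=1)

x0 = np.linspace(0.0, 2.0, 6)                  # badly clustered initial layout
res = minimize(neg_power, x0, jac=neg_power_grad,
               method="L-BFGS-B", bounds=[(0.0, 20.0)] * 6)
print("optimised positions:", np.round(np.sort(res.x), 2))
print("power gained:", neg_power(x0) - neg_power(res.x))
```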

Supervisor:

Dr Ben Calderhead

Francesco Ferrulli

Project Title:

Spectral properties of one- and two-dimensional Dirac-type operators describing the single- or double-layer graphene Hamiltonian

Project Details:

The goal of this project is to study different mathematical aspects of partial differential equations describing the Hamiltonian of a single or double layer of graphene near the Fermi energy level. Mono- and bilayer graphene are of particular interest because of the remarkable physical and chemical properties they exhibit. For instance, graphene is incredibly light, yet also one of the strongest materials ever tested, with a break strength 200 times that of steel. Mono- and bilayer graphene are both zero-gap semiconductors with one type of electron and one type of hole. The charge carriers have high mobility, giving rise to extraordinary electronic properties. As a result, these materials have a wealth of potential applications, for instance in industry, electronics, quantum computing, composite materials, hydrogen storage and research, to name just a few. The aim of this project is to study the spectral properties of the operator describing bilayer graphene with a non-Hermitian matrix-valued external potential, and to relate existing knowledge about the single-layer case to physically meaningful applications.
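
For reference, the standard effective low-energy Hamiltonians in question take the textbook forms below, where $p_j = -i\partial_j$ and $V$ denotes the (possibly non-Hermitian, matrix-valued) external potential:

```latex
% Monolayer graphene: massless two-dimensional Dirac operator
H_{\mathrm{mono}}
  = v_F\,\boldsymbol{\sigma}\cdot\mathbf{p} + V
  = v_F \begin{pmatrix} 0 & p_1 - i p_2 \\ p_1 + i p_2 & 0 \end{pmatrix} + V ,

% Bilayer graphene: two-band effective Hamiltonian, quadratic in momentum
H_{\mathrm{bi}}
  = \frac{1}{2m} \begin{pmatrix} 0 & (p_1 - i p_2)^2 \\ (p_1 + i p_2)^2 & 0 \end{pmatrix} + V .
```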

Supervisor:

Prof Ari Laptev