Seminars

Seminars / Informal seminars / Lectures by ECMWF Staff and Invited Lecturers

Seminars contribute to our ongoing educational programme and are tailored to the interests of the ECMWF scientific community.

Informal seminars are held throughout the year on a range of topics. Seminars vary in their duration, depending on the area covered, and are given by subject specialists. As with the annual seminar, this may be an ECMWF staff member or an invited lecturer.

The following is a listing of seminars/lectures that have been given this year on topics of interest to the ECMWF scientific community. See also our past informal seminars.

2019

30 May
at 10:30

The atmospheric response to increased ocean model resolution in the ECMWF Integrated Forecasting System: a seamless approach

Speaker: Chris Roberts

Abstract

This study uses initialized forecasts and climate integrations to evaluate the wintertime North Atlantic response to an increase of ocean model resolution from ∼100 km (LRO) to ∼25 km (HRO) in the European Centre for Medium-Range Weather Forecasts Integrated Forecasting System (ECMWF-IFS). The presented analysis considers the atmospheric response at lead times of weeks to decades and assesses mean biases, variability, and subseasonal predictability. Importantly, the simulated impacts are highly dependent on lead time such that impacts seen at climate timescales cannot be generalized to initialized forecasts. At subseasonal lead times (weeks 1-4), sea surface temperature (SST) biases in LRO and HRO configurations are similar and partly inherited from ocean initial conditions. At multidecadal timescales, mean differences are dominated by biases in the LRO configuration, which include a North Atlantic cold bias and a breakdown in the characteristic patterns of ∇²SST and surface wind convergence. Some elements of variability, such as blocking and the intensity of the storm track, respond to mean biases and are more sensitive to ocean resolution at longer lead times. Other aspects of variability, such as the intensity of air-sea interaction over the Gulf Stream, are highly sensitive to ocean resolution at all lead times. Finally, an increase of ocean model resolution from 100 km to 25 km improves subseasonal predictability over Europe. These improvements are associated with increased ensemble spread in forecast SSTs and significant improvements to the predicted amplitude and phase of the Madden-Julian Oscillation.

28 May
at 10:30

Room: LT

Improving atmospheric reanalyses for historical extreme events by rescuing lost weather observations

Speaker: Ed Hawkins (University of Reading)

Abstract

Our understanding of past changes in weather and climate relies on the availability of observations made over many decades. However, billions of historical weather observations are effectively lost to science as they are still only available in their original paper form in various archives around the world. The large-scale digitisation of these observations would substantially improve atmospheric reanalyses back to the 1850s. Recently, volunteer citizen scientists have been assisting with the rescue of millions of these lost observations taken across western Europe over a hundred years ago. The value of these data for understanding many notable and extreme weather events will be demonstrated.

16 May
at 11:15

Room: Council

Are seasonal forecasts useful to improve operational decisions for water supply in the UK?

Speakers: Francesca Pianosi and Andres Peñuela (Bristol University)

Abstract

Improved skill of seasonal predictions for the North Atlantic circulation and Northern Europe is motivating an increasing effort towards developing seasonal hydrological forecasting systems, such as the Copernicus Climate Change Service (C3S). Among other purposes, such forecasting systems are expected to deliver better-informed water management decisions. Using a pumped-storage reservoir system in the UK as a pilot application, we investigate the potential for using seasonal weather forecasts to simultaneously increase supply reliability and reduce pumping costs. To this end, we develop a Real-Time Optimisation System (RTOS) that uses C3S seasonal weather forecasts to generate hydrological forecasts, and combines them with a reservoir simulation model and an evolutionary optimisation algorithm to generate release and pumping decisions.

We evaluate the performance of the RTOS over historical periods and compare it to several benchmarks, including a simplified operation scheme that mimics the current operational procedures, and an RTOS that uses Ensemble Streamflow Predictions (ESP) in place of C3S seasonal forecasts. We also attempt to link the improvement in system performance to the characteristics of the hydrological conditions and the properties of the forecasts. Ultimately, we aim to address key questions such as 'To what extent does improved forecast skill translate into increased forecast value for water supply decisions?' and 'Does accounting for forecast uncertainty in optimization improve decisions?'.

16 May
at 10:15

Room: LT

Understanding the intraseasonal variability over Indian region and development of an operational extended range prediction system

Speaker: Dr Sahai (IITM, India)

Abstract

Extended-range forecasting of sub-seasonal variability beyond the weather scale is a critical component of climate forecast applications over the Indian region. The sub-seasonal to seasonal (S2S) project, undertaken by WCRP, started in 2013 to improve forecasts beyond the weather scale, a challenging gap area in both research and operational forecasting. The primary objective of the S2S project is to provide sub-seasonal to seasonal forecasts at various lead times.

The prediction of weather and climate in the extended range (i.e. 2-3 weeks in advance) is much in demand in sectors that depend on water resources, city planning, dam management, health management (e.g. protection against heat death), etc. This demand has grown manifold in the last five years with the experimental implementation of a dynamical extended-range prediction system (ERPS) by the Indian Institute of Tropical Meteorology (IITM), Pune. At the heart of ERPS is a forecast system based on the NCEP-CFSv2 ocean-atmosphere coupled dynamical model (hereafter CFS), which is configured to run at two resolutions (T382 and T126), and an atmosphere-only version (hereafter GFS) configured to run with CFS sea surface temperature (SST) boundary conditions that are bias-corrected against observations. The initial conditions to run the model are generated through an in-house perturbation technique using the NCMRWF (atmospheric) and INCOIS (ocean) data assimilation systems. From every initial condition the model is run for the next 28 days, creating a multi-member ensemble forecast. Forecast product variables are then separated for sector-specific applications with suitable post-processing and downscaling based on advanced statistical techniques. The forecasts can be applied in several allied fields such as agro-meteorology, hydrometeorology and the health sector. My talk will provide a brief overview of ERPS, focusing on a few sector-specific applications.

15 May
at 10:30

Room: LT

Parallel in Time Integration Using PFASST

Speaker: Michael Minion (Lawrence Berkeley National Laboratory)

Abstract

The Parallel Full Approximation Scheme in Space and Time (PFASST) is an iterative approach to parallelization for the integration of ODEs and PDEs in the time direction. I will give an overview of the PFASST algorithm, discuss the advantages and disadvantages of PFASST compared to other popular parallel-in-time (PinT) approaches, and show some examples of PFASST in applications. I will also explain the construction of a new class of PinT integrators that combine properties of exponential integrators and PFASST, including some preliminary results on the accuracy and parallel performance of the algorithm.
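One of the simpler PinT approaches that PFASST is often compared against is Parareal, which shares the same predictor-corrector structure that PFASST extends with multi-level SDC sweeps. The following minimal sketch (illustrative code, not the speaker's implementation; the coarse/fine propagator names and the test problem are assumptions for the example) shows one way to write the Parareal iteration:

```python
import numpy as np

def parareal(f_coarse, f_fine, y0, t0, t1, n_slices, n_iters):
    """Minimal Parareal iteration (a simpler PinT relative of PFASST).

    f_coarse, f_fine : propagators mapping (y, ta, tb) -> approximation
                       of y(tb), with f_fine accurate and f_coarse cheap.
    """
    ts = np.linspace(t0, t1, n_slices + 1)
    # Initial guess from a serial sweep of the cheap coarse propagator.
    y = [y0]
    for k in range(n_slices):
        y.append(f_coarse(y[k], ts[k], ts[k + 1]))
    for _ in range(n_iters):
        # Fine propagations over each slice (these run in parallel
        # across slices in a real implementation; serial here).
        fine = [f_fine(y[k], ts[k], ts[k + 1]) for k in range(n_slices)]
        y_new = [y0]
        for k in range(n_slices):
            # Parareal correction: new coarse + (fine - old coarse).
            gc_new = f_coarse(y_new[k], ts[k], ts[k + 1])
            gc_old = f_coarse(y[k], ts[k], ts[k + 1])
            y_new.append(gc_new + fine[k] - gc_old)
        y = y_new
    return np.array(y)
```

After K iterations the first K time slices match the serial fine solution exactly, which is why the method pays off only when the coarse propagator is much cheaper than the fine one.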

13 May
at 11:00

Room: LT

Flood Forecasting and Inundation Mapping using the US National Water Model

Speaker: David R Maidment (University of Texas at Austin)

Abstract

The US National Water Model forecasts flows on 5.2 million km of streams and rivers in the continental United States, divided into 2.7 million forecast reaches. A Medium Range Forecast from this model for Hurricane Harvey, prepared 3 days before the hurricane made landfall, successfully predicted the spatial pattern of inland flooding in Texas. A continental-scale inundation map has been developed using the Height Above Nearest Drainage (HAND) method, and an associated cell-phone app called Pin2Flood has been built that enables flood emergency responders to create their own flood inundation maps using the same HAND map base, thus connecting predictive and observational flood inundation mapping.
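The core HAND mapping step can be illustrated with a toy sketch (hypothetical code, not the National Water Model software): a cell is inundated when the forecast stage at its nearest drainage exceeds the cell's height above that drainage, and the excess is the inundation depth:

```python
import numpy as np

def hand_inundation(hand_grid, flood_stage):
    """HAND-style inundation sketch.

    hand_grid   : 2-D array of Height Above Nearest Drainage (m)
    flood_stage : forecast water stage above the drainage (m)
    Returns the inundation depth, zero where the cell stays dry.
    """
    depth = flood_stage - hand_grid
    return np.where(depth > 0.0, depth, 0.0)
```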

About the Presenter: David R Maidment is the Hussein M Alharthy Centennial Chair in Civil Engineering at the University of Texas at Austin, where he has served on the faculty since 1981.  He received his BE degree from the University of Canterbury in Christchurch, New Zealand, and his MS and PhD degrees from the University of Illinois.  In 2016, he was elected to the US National Academy of Engineering for application of geographic information systems to hydrologic processes.

21 March
at 10:30

Room: LT

Constraining Stochastic Parametrization Schemes using High-Resolution Model Simulations

Speaker: Hannah Christensen (Oxford University)

Abstract

Stochastic parametrizations are used in weather and climate models to represent model error. Designing new stochastic schemes has been the target of much innovative research over the last decade, with a focus on developing physically motivated approaches. We present a technique for systematically deriving new stochastic parametrizations or for constraining existing stochastic parametrizations. We take a high-resolution model simulation and coarse-grain it to the desired forecast model resolution. This provides the initial conditions and forcing data needed to drive a Single Column Model (SCM). By comparing the SCM parametrized tendencies with the evolution of the high-resolution model, we can measure the ‘error’ in the SCM tendencies. As a case study, we use this approach to assess the physical basis of the widely used ‘Stochastically Perturbed Parametrization Tendencies’ (SPPT) scheme using the IFS SCM. We provide justification for the multiplicative nature of SPPT, and for the large temporal and spatial scales used in the stochastic perturbations. However, we also identify issues with the SPPT scheme and motivate improvements. In particular, our results indicate that independently perturbing the tendencies associated with different parametrization schemes is justifiable, but that an alternative approach is needed to represent uncertainty in the convection scheme. It is hoped this new coarse-graining technique will improve both holistic and process-based approaches to stochastic parametrization.
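The multiplicative structure examined in the study can be illustrated with a toy sketch (hypothetical code, not the IFS implementation; the AR(1) parameters are assumptions for the example): parametrized tendencies are scaled by (1 + r), where r is a bounded random pattern with a prescribed decorrelation time:

```python
import numpy as np

def ar1_pattern(nsteps, n, sigma=0.5, tau=10.0, seed=0):
    """Temporally correlated (AR(1)) perturbation field r(t, x).

    sigma : target standard deviation of r
    tau   : decorrelation time in steps
    """
    rng = np.random.default_rng(seed)
    phi = np.exp(-1.0 / tau)                   # AR(1) autocorrelation
    noise_std = sigma * np.sqrt(1.0 - phi**2)  # keeps Var(r) = sigma**2
    r = np.zeros((nsteps, n))
    for t in range(1, nsteps):
        r[t] = phi * r[t - 1] + noise_std * rng.standard_normal(n)
    return r

def sppt_perturb(tendency, r, clip=1.0):
    """SPPT-style multiplicative perturbation: T' = (1 + r) * T.

    r is clipped to [-clip, clip] so the factor (1 + r) stays
    non-negative and the perturbed tendency keeps its sign.
    """
    r = np.clip(r, -clip, clip)
    return (1.0 + r) * tendency
```

In the operational scheme the pattern is also correlated in space across several scales; the coarse-graining approach in the talk measures how well such a multiplicative, large-scale pattern matches the actual tendency errors.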

20 March
at 10:30

Room: LT

About novel time integration methods for weather and climate simulations

Speaker: Martin Schreiber (Tech University of Munich)

Abstract

Weather and climate simulations face new challenges due to changes in computer architectures caused by physical limitations. From a pure computing perspective, algorithms are required to cope with stagnating or even decreasing per-core speed and increasing on-chip parallelism. Although this leads to an increase in overall on-chip compute performance, data movement is increasingly becoming the most critical limiting factor. All in all, these trends will continue and have already led to research on partly disruptive mathematical and algorithmic reformulations of dynamical cores, e.g. using (additional) parallelism in the time dimension.

This presentation provides an overview and introduction to the variety of newly developed and evaluated time integration methods for dynamical cores, all aimed at improving the ratio of wall clock time to error:

First, I will begin with rational approximations of exponential integrator methods in their various forms: Terry Haut's rational approach of exponential integrators (T-REXI), Cauchy contour integral methods (CI-REXI) on the complex plane and their relationship to Laplace transformations, and diagonalized Butcher's Tableau (B-REXI).

Second, semi-Lagrangian (SL) methods are often used to overcome limitations on stable time step sizes induced by nonlinear advection. These methods show superior properties in terms of dispersion accuracy, and we have exploited this property with the Parareal parallel-in-time algorithm. In addition, a combination of SL with REXI is discussed, including the challenges such a combination poses due to the Lagrangian formulation.

Third, the multi-level time integration of spectral deferred correction (ML-SDC) will be discussed, focusing on the multi-level induced truncation of nonlinear interactions and the importance of viscosity in this context. Based on this, the "Parallel Full Approximation Scheme in Space and Time" (PFASST) adds a time parallelism that allows even higher accelerations on the time-to-solution compared to ML-SDC and traditional time integration methods.

All studies were mainly conducted based on the shallow water equations (SWE) on the f-plane and the rotating sphere to investigate horizontal aspects of dynamical cores for weather and climate simulation. Overall, our results motivate further investigation and combination of these methods for operational weather/climate systems.

(With contributions and more from Jed Brown, Francois Hamon, Richard Loft, Michael Minion, Matthew Normile, Nathanaël Schaeffer, Andreas Schmitt, Pedro S Peixoto).

12 March
at 11:15

Room: CC  

Running serverless HPC workloads on top of Kubernetes and Jupyter notebooks

Speaker: Christopher Woods (University of Bristol)

6 March
at 10:30

Room: LT

Trends in data technology: opportunities and challenges for Earth system simulation and analysis

Speaker: V Balaji (Princeton Uni and NOAA/GFDL)

Abstract

Earth system modeling, since its origin at the dawn of modern computing, has operated at the very limits of technological possibility. This has led to tremendous advances in weather forecasting, and the use of models to project climate change both for understanding the Earth system, and in service of downstream science and policy. In this talk, we examine changes in underlying technology, including the physical limits of miniaturization and the emergence of a deep memory-storage hierarchy, which make "business as usual" approaches to simulation and analysis appear somewhat risky. We look simultaneously at trends in Earth system modeling, in terms of the evolution of globally coordinated climate science experiments (CMIP-IPCC) and the emergence of "seamless prediction", blurring the boundaries between weather and climate. Together, these point to new directions for research and development in data software and data science. Innovative and nimble approaches to analysis will be needed. Yesterday's talk examines this in the context of computational science and software, but it seems apparent that computation and data are inseparable problems, and a unified approach is indicated.

6 March
at 14:00

Room: LT

Statistics for Natural Science in the Age of Supercomputers

Speaker: Ritabrata Dutta (Warwick University)

Abstract

To explain the fascinating phenomena of nature, natural scientists develop complex models that can simulate these phenomena almost to the point of reality. The hard question, however, is how to calibrate these models given real-world observations. Traditional statistical methods are handicapped in this setup, as we cannot evaluate the likelihood functions of the parameters of these models. In the last decade or so, statisticians' answer to this question has been approximate Bayesian computation (ABC), in which the parameters are calibrated by comparing simulated and observed data sets in a rigorous manner. However, it only became possible to apply ABC to realistic, and hence complex, models when it was efficiently combined with High Performance Computing (HPC). In this work, we focus on this aspect of ABC, showing how it was able to calibrate expensive models of epidemics on networks, of molecular dynamics, of platelet deposition in blood vessels, of passenger queues in airports and of volcanic eruptions. This was achieved using standard MPI parallelization, nested MPI parallelization or nested CUDA parallelization inside MPI. Finally, we raise and discuss open questions about how to evolve and strengthen these inferential methods when each model simulation takes a full day or a resource equivalent to the best supercomputers of today.
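The simplest, rejection form of ABC can be sketched in a few lines (an illustrative toy, not the HPC-parallelized implementation discussed in the talk; the function names are assumptions for the example):

```python
import numpy as np

def rejection_abc(observed, simulate, prior_sample, distance,
                  eps, n_draws, seed=0):
    """Basic rejection ABC: draw parameters from the prior, simulate
    data, and keep draws whose simulation falls within eps of the
    observation under the chosen distance (likelihood-free)."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        sim = simulate(theta, rng)
        if distance(sim, observed) < eps:
            accepted.append(theta)
    return np.array(accepted)
```

In practice the simulation loop is the expensive part, which is why the talk's MPI/CUDA parallelization across draws is essential for realistic models.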

5 March
at 10:30

Room: LT

Machine learning and the post-Dennard era of climate simulation

Speaker: V Balaji (Princeton Uni and NOAA/GFDL)

Abstract

In this talk, we examine approaches to Earth system modeling in the post-Dennard era, inspired by the industry trend toward machine learning (ML). ML presents a number of promising pathways, but challenges remain that are specific to introducing ML into multi-scale multi-physics modeling. A particular under-appreciated aspect of such 'multi-scale multi-physics' models is that they are built using a combination of local process-level and global system-level observational constraints, for which the calibration process itself remains a substantial computational challenge. The challenges include, among others: the non-stationary and chaotic nature of climate time series; the presence of climate subsystems where the underlying physical laws are not completely known; and the imperfect calibration process alluded to above. The talk will present ideas and challenges for the future of Earth system models as we prepare for a post-Dennard future, where learning methods are poised to play an increasingly important role.

21 January
at 11:00

Room: LT

ESSPE: Ensemble-based Simultaneous State and Parameter Estimation for Earth System Data-Model Integration and Uncertainty Quantification

Speaker: Fuqing Zhang (Pennsylvania State University)

Abstract

Building on advanced data assimilation techniques, we advocate developing and applying a generalized data assimilation software framework for Ensemble-based Simultaneous State and Parameter Estimation (ESSPE) that will facilitate data-model integration and uncertainty quantification for the broad earth and environmental science communities. This includes, but is not limited to, atmospheric composition and chemistry, land surface, hydrology, and biogeochemistry, for which many of the physical and chemical processes in the respective dynamic system models rely heavily on parametrizations. By augmenting uncertain model parameters as part of the state vector, the ESSPE framework will allow for simultaneous state and parameter estimation through assimilating in-situ measurements, such as those from critical-zone ground-based observational networks, and/or remotely sensed observations, such as those from radars and satellites. Beyond data-model integration and uncertainty quantification, examples will be given of applying the ESSPE framework, through systematically designed ensemble sensitivity analysis, to: (1) identify key physical processes and their significance/impacts, and better represent and parameterize these processes in dynamical models of various earth systems; (2) design better observation strategies by locating the optimum sensitive regions, periods and variables to be measured, and the minimum accuracies and frequencies of these measurements required to quantify the physical processes of interest, and explore the impacts of heterogeneity and equifinality; (3) understand the predictability and nonlinearity of these processes, and parameter identifiability; and (4) facilitate upscale cascading of knowledge from smaller-scale process understanding to larger-scale simplified representation and parametrization.
I will end the presentation with an introduction to the preliminary findings from our ongoing collaborations with ECMWF on using the data assimilation methodology to identify potential deficiencies in the convective gravity wave drag parametrization that lead to stratospheric temperature biases in the operational model, and on potential pathways for using ESSPE to improve model physics in the future.
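The core ESSPE idea of treating parameters as extra state-vector components can be illustrated with a toy stochastic ensemble Kalman update (a hypothetical sketch, not the ESSPE software; the linear toy setup is an assumption for the example). Parameters acquire increments through their ensemble correlation with observed state variables:

```python
import numpy as np

def enkf_augmented_update(states, params, obs, obs_op, obs_err, rng):
    """Stochastic EnKF analysis on an augmented vector z = [state; params].

    states  : (n_ens, n_x) ensemble of model states
    params  : (n_ens, n_p) ensemble of uncertain parameters
    obs     : (n_y,) observation vector
    obs_op  : function mapping one state vector to observation space
    obs_err : observation error standard deviation
    """
    z = np.hstack([states, params])              # state augmentation
    hz = np.array([obs_op(s) for s in states])   # H(x) per member
    za = z - z.mean(axis=0)
    ha = hz - hz.mean(axis=0)
    n_ens = z.shape[0]
    p_zh = za.T @ ha / (n_ens - 1)               # cross-covariance
    p_hh = ha.T @ ha / (n_ens - 1) + obs_err**2 * np.eye(len(obs))
    gain = p_zh @ np.linalg.inv(p_hh)            # Kalman gain
    # Perturbed-observation update applied to each member; the same
    # gain updates both the state and the parameter components.
    for i in range(n_ens):
        y_pert = obs + obs_err * rng.standard_normal(len(obs))
        z[i] = z[i] + gain @ (y_pert - hz[i])
    n_x = states.shape[1]
    return z[:, :n_x], z[:, n_x:]
```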

25 January
at 10:30

Room: LT

Windstorm and Cyclone Events: Atmospheric Drivers, Long-term Variability and Skill of Current Seasonal Forecasts

Speaker: Daniel J Befort (University of Oxford)

Abstract

In this study, observed long-term variability of windstorm events is analysed using two state-of-the-art 20th-century reanalyses, ERA-20C and NOAA-20CR. Long-term trends partly differ drastically between the two datasets. These differences are largest for the early 20th century, with higher agreement over the past 30 years. However, short-term variability on sub-decadal time-scales is in much better agreement, especially over parts of the Northern Hemisphere. This suggests that these datasets are useful for analysing drivers of interannual variability of windstorm events, as they extend the time period covered by common reanalyses such as ERA-Interim.

ERA-20C is used to analyse the relationship between atmospheric and oceanic conditions and windstorm frequency over the European continent. It is found that a large part of the interannual variability can be explained by a few atmospheric patterns, including the North Atlantic Oscillation (NAO) and the Scandinavian pattern. This suggests that it is crucial to capture these atmospheric modes of variability, e.g. in seasonal forecast systems, in order to reasonably represent windstorm variability over Europe.

The skill in predicting windstorms and cyclones is analysed for three modern seasonal forecast systems: ECMWF-S3, ECMWF-S4 and GloSea5. Whereas skill for cyclones is generally small, significant positive skill of ECMWF-S4 and GloSea5 is found for windstorms over the eastern North Atlantic/western Europe. In addition to analysing windstorm skill using a dedicated tracking algorithm, it is also tested to what extent the NAO can be used as a predictor of windstorm variability. Results suggest that using the NAO adds some skill over northern Europe; however, using the full model information by tracking windstorm events is superior over large parts of the eastern Atlantic and western Europe.

7 January
at 10:30

Room: LT

When fossil fuel emissions are no longer perfect in atmospheric inversion systems

Speaker: Thomas Lauvaux (LSCE, Saclay, France)

Abstract

The biogenic component of greenhouse gas fluxes remains the primary source of uncertainty in global and regional inversion systems. However, recent results suggest that anthropogenic greenhouse gas emissions from fossil fuel use, so far assumed perfect at all scales, represent a larger fraction of the uncertainties in these systems and can no longer be ignored. Inversion systems capable of reducing fossil fuel uncertainties are discussed in parallel with planned observing systems deployed across the world and in space. The remaining challenges and recent advances are presented, not only to infer fossil fuel emissions but also to provide support to climate policy makers at national and local scales.

 

Uncertainty quantification of pollutant source retrieval: comparison of Bayesian methods with application to the Chernobyl and Fukushima Daiichi accidental releases of radionuclides

Speaker: M Bocquet (CEREA, France)

Abstract

Inverse modeling of the emissions of atmospheric species and pollutants has progressed significantly over the past fifteen years. However, in spite of seemingly reliable estimates, the retrievals are rarely accompanied by an objective estimate of their uncertainty, except when Gaussian statistics are assumed for the errors, which is often unrealistic. I will describe rigorous techniques for computing this uncertainty in the context of the inverse modeling of the time emission rates -- the so-called source term -- of a point-wise atmospheric tracer. Lognormal statistics are used for the positive source term prior and possibly the observation errors, which precludes simple solutions based on Gaussian statistics.

Firstly, through the so-called empirical Bayesian approach, parameters of the error statistics -- the hyperparameters -- are estimated by maximizing the observation likelihood via an expectation-maximization algorithm. This enables the estimation of an objective source term.  Then, the uncertainties attached to the total mass estimate and the source rates are estimated using four Monte Carlo techniques: (i) an importance sampling based on a Laplace proposal, (ii) a naive randomize-then-optimize (RTO) sampling approach, (iii) an unbiased RTO sampling approach, (iv) a basic Markov chain Monte Carlo (MCMC) simulation. Secondly, these methods are compared to a full Bayesian hierarchical approach, using an MCMC based on a transdimensional representation of the source term to reduce the computational cost.
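The last of these Monte Carlo techniques, a basic MCMC with a lognormal prior on the positive source term, can be sketched as follows (an illustrative toy with a generic linear observation operator G, not the operational inverse-modeling code; all names and parameter values are assumptions for the example). Sampling in log-space makes the positivity constraint automatic:

```python
import numpy as np

def mcmc_lognormal_source(obs, G, sigma_obs, mu0, sigma0,
                          n_iter, step, seed=0):
    """Random-walk Metropolis sampling of a positive source term x,
    with a lognormal prior and Gaussian observation errors, y = G x + e.
    The walk is performed in log(x) so x > 0 is automatic."""
    rng = np.random.default_rng(seed)
    n = G.shape[1]
    log_x = np.full(n, mu0)                       # start at prior median

    def log_post(lx):
        # Unnormalized log-posterior: Gaussian likelihood in x,
        # Gaussian prior in log(x) (i.e. lognormal prior on x).
        x = np.exp(lx)
        misfit = obs - G @ x
        return (-0.5 * np.sum(misfit**2) / sigma_obs**2
                - 0.5 * np.sum((lx - mu0)**2) / sigma0**2)

    samples = np.empty((n_iter, n))
    lp = log_post(log_x)
    for i in range(n_iter):
        prop = log_x + step * rng.standard_normal(n)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept
            log_x, lp = prop, lp_prop
        samples[i] = np.exp(log_x)
    return samples
```

The retained chain (after discarding burn-in) then yields uncertainty estimates for quantities such as the total released mass, which is the kind of objective uncertainty the abstract argues retrievals should carry.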

I will apply those methods, and improvements thereof, to the estimation of the atmospheric cesium-137 source terms from the Chernobyl nuclear power plant accident in April/May 1986 and Fukushima Daiichi nuclear power plant accident in March 2011.

LT = Lecture Theatre, LCR = Large Committee Room, MZR = Mezzanine Committee Room,
CC = Council Chamber