ECMWF Fellow explores new ideas to boost accuracy in NWP


Professor Tim Palmer says work on numerical weather prediction (NWP) carried out by his group at the University of Oxford can help ECMWF achieve “more accuracy with less precision”.

The eminent scientist in the field of atmospheric predictability and ensemble forecasting systems was appointed as an ECMWF Fellow in July 2014.

Halfway through his Fellowship, he welcomes the close co-operation between the Centre and scientists in his group on the methods used to simulate model uncertainty in NWP.

The stochastic nature of these methods has led him to ask a new set of questions: What is the real information content in the variables used in weather and climate models? Are we wasting valuable computing resources by representing those variables overly precisely? Could we increase the resolution of models, and thus the accuracy of forecasts, if we reduced the precision in our computations?

Professor Palmer was at the forefront of research on predictability in NWP at ECMWF before he became a Royal Society Research Professor in Climate Physics at the University of Oxford.

His initial training was as a physicist specialising in the general theory of relativity. This gave him the background to tackle mathematical problems in weather science. The latter – chaos theory in particular – has in turn helped him develop his ideas on fundamental problems in physics, such as the unification of gravity with quantum mechanics.

You have a doctorate in physics, more precisely in the general theory of relativity. How did you come to move into the field of weather and climate dynamics and prediction?

Having come to the end of my PhD at the University of Oxford, I had achieved my ambition of making some contribution to the theory of relativity, but I was a bit concerned that what I was doing was very theoretical and not really of much use to society.

By chance I met a well-known meteorologist at the Met Office, Raymond Hide, who convinced me that climate was an interesting problem for a physicist to think about, and that the Met Office was a good place to work. I had a few moments when I was not sure whether I was doing the right thing, but certainly after a couple of years I realised that I had made the right move. I liked the fact that research in meteorology is very collaborative and people are very open to sharing their ideas.

How relevant are fundamental concepts in mathematics and physics to weather forecasting?

One area which I looked into for a while as a physicist was applying non-equilibrium thermodynamics and the principle of maximum entropy production to try to understand Hawking radiation – the black body radiation released by black holes. And when I had my first discussion with Raymond Hide from the Met Office, almost the first thing he said to me was that there is a lot of interesting work at the moment using the principle of maximum entropy production to try to understand Earth’s climate. This was the first example I encountered of how ideas that were relevant in fundamental physics could be relevant for meteorology.

The application of concepts in instability theory is another example where ideas from different areas of physics are relevant for meteorology. We know that the atmosphere exhibits modes of instability, in other words small perturbations can grow into large weather systems. But it turns out that the atmosphere does not behave like the classic textbook systems one learns about in undergraduate maths or physics courses. When I first came to ECMWF in the 1980s, we did some work trying to understand how to generalise this idea of instability to properly describe how initial forecast errors will grow during a forecast. And that involved a mathematical theory, called singular vector theory, which turned out to be the crucial element needed to make our ensemble forecasting system work.
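The singular-vector idea can be sketched with a toy example: if a matrix stands in for the tangent-linear propagator that maps initial errors to forecast errors, its leading singular vector is the initial perturbation that grows fastest over the forecast interval. A minimal illustration (the matrix here is arbitrary, not derived from any forecast model):

```python
import numpy as np

# Toy linear propagator: maps an initial-error vector to a forecast-error
# vector. In NWP this would come from the tangent-linear model; here it is
# just a small non-normal matrix chosen for illustration.
M = np.array([[1.0, 4.0],
              [0.0, 0.5]])

# SVD: M = U @ diag(s) @ Vt. The leading right singular vector (first row
# of Vt) is the initial perturbation amplified most, by the factor s[0].
U, s, Vt = np.linalg.svd(M)
v1 = Vt[0]        # fastest-growing initial perturbation
growth = s[0]     # its amplification over the forecast interval

# Compare with the eigenvalues: for a non-normal matrix, singular-vector
# growth can exceed what the eigenvalues alone would suggest.
eigs = np.linalg.eigvals(M)
print(growth, np.abs(eigs).max())
```

Because the matrix is non-normal, the leading singular value exceeds the largest eigenvalue modulus: perturbations can grow transiently even when every eigenmode is neutral or damped, which is precisely the behaviour that makes singular vectors, rather than classic normal modes, the right tool for initial-error growth.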

What are the main objectives of the Predictability of Weather and Climate Group that you lead at the University of Oxford?

We have a lot of objectives but let me focus on just two. The first relates to what we call ‘stochastic parametrization’.

Whilst I was here at ECMWF, we developed what was a new idea in weather and climate modelling. This is to treat the equations that you solve on the computer as partially stochastic, so that they have some inherent randomness in them. Why is this a good idea? It is because the whole point of ensemble forecasting is to determine the forecast with all its uncertainty. We argue that this randomness occurs in the subgrid parametrization: the representation of processes such as thunderstorms that you can’t resolve explicitly in the model because your computer is not big enough.
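As a loose illustration of the idea, and not of the actual IFS scheme, one can perturb the parametrized tendency of a toy one-variable model with multiplicative noise, in the spirit of stochastically perturbed parametrization tendencies; all terms and numbers here are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x, dt=0.01, sigma=0.3):
    """One time step of a toy model: a resolved tendency plus a parametrized
    subgrid tendency whose amplitude is perturbed by multiplicative noise
    (SPPT-flavoured, with invented coefficients)."""
    resolved = -0.5 * x                      # stand-in for resolved dynamics
    subgrid = 0.1 * np.sin(3.0 * x)          # stand-in for a parametrized process
    r = sigma * rng.standard_normal(np.shape(x))  # zero-mean perturbation
    return x + dt * (resolved + (1.0 + r) * subgrid)

# An ensemble started from identical initial conditions spreads purely
# because each member draws different subgrid perturbations.
ensemble = np.full(50, 1.0)
for _ in range(1000):
    ensemble = step(ensemble)
print(ensemble.mean(), ensemble.std())
```

The point of the sketch is that the ensemble acquires spread even with perfect initial conditions: the stochastic term is itself a source of forecast uncertainty, representing what the model cannot resolve.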

If the atmosphere is in a predictable state, all members of the ensemble forecast will say more or less the same thing, and that is when you can be confident that this is what is going to happen. If, on the other hand, the atmosphere is in a more chaotic state, then the ensemble enables you to characterise that, too. The initial conditions are obviously one source of uncertainty, but the model itself is another important source of uncertainty, especially in the tropics.

Stochastic parametrization is a way to represent that model uncertainty in an ensemble forecasting system. One of my great ambitions is for this type of stochastic work to become not only part of weather forecasting models but also of the climate models that are used in, for example, IPCC assessment reports. And that is actually starting to happen, for example at the Met Office.

What is the other objective?

The equations that describe the turbulent atmosphere and oceans are computationally too difficult to solve exactly because we do not have computers big enough to represent all the scales of motion, from jet streams down to turbulence around the ECMWF building. And yet we know that the higher the resolution, the better the forecasts are going to be. The question is how to square that circle: the need for higher resolution on the one hand and affordable supercomputers on the other.

We have been doing some research to try to understand how much real information there is in all the variables in a model. If you take the ECMWF model for example, the way we solve the equations is to project all the fluid mechanical variables – velocity, temperature, pressure, humidity and so on – onto wave motions in the atmosphere, called spherical harmonics. Traditionally every variable, every spherical harmonic coefficient in the model, is represented very precisely with 64 bits of data.

Moving all this data around in the computer consumes a lot of energy, and this puts a limit on the kind of resolution that can be achieved while keeping power consumption down. Moving towards a more stochastic representation in the model makes one realise that you may only need a small number of bits to represent some variables, especially at small wavelengths. That is to say, you are not going to see any benefits if you use more bits because the effects of doing so are going to be hidden by the inherent noise from the stochastic parametrization.
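One way to picture this, purely as a toy sketch and not ECMWF's method, is to store hypothetical spectral coefficients with fewer mantissa bits at small scales and check that the resulting rounding error stays below a nominal noise floor; the spectrum, the per-scale bit counts and the 1% floor are all invented for the illustration:

```python
import numpy as np

def truncate_mantissa(x, keep_bits):
    """Zero out all but `keep_bits` of the 52-bit float64 mantissa,
    emulating reduced-precision storage of a coefficient."""
    bits = np.float64(x).view(np.uint64)
    mask = np.uint64(0xFFFFFFFFFFFFFFFF) << np.uint64(52 - keep_bits)
    return (bits & mask).view(np.float64)

# Hypothetical spectral coefficients whose amplitude falls off with
# wavenumber n, as atmospheric energy spectra roughly do.
n = np.arange(1, 200)
coeffs = n.astype(np.float64) ** -1.5

# Keep fewer mantissa bits at small scales (large n); the toy per-scale
# choice below keeps the relative rounding error under a 1% noise floor.
keep = np.where(n < 50, 23, 10)
truncated = np.array([truncate_mantissa(c, k) for c, k in zip(coeffs, keep)])
rel_err = np.abs(truncated - coeffs) / coeffs
print(rel_err.max())
```

Ten mantissa bits bound the relative error near 2⁻¹⁰ ≈ 0.1%, an order of magnitude below the assumed noise floor: the bits beyond that carry no information the stochastic scheme would not drown out anyway.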

So, are we wasting computing resources by representing all variables across the board with these 64 bits of data? Getting that down significantly could substantially reduce wasted energy consumption in computations. Freeing up this energy might in turn allow us to produce models which have much higher resolution and therefore better accuracy, both for weather forecasting and climate. This is linked to the Scalability Programme at ECMWF: how to get maximum efficiency from a next-generation supercomputer.

In collaboration with ECMWF scientists, we have already shown that reducing the precision of all IFS variables to 32 bits has no notable effect on ensemble performance or on the model climatology.
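The flavour of that result can be conveyed with a toy experiment, not the IFS itself: integrate the chaotic Lorenz-63 system at two precisions. Individual trajectories diverge, as any tiny perturbation guarantees in a chaotic system, yet the long-run statistics, the "climatology", remain close:

```python
import numpy as np

def lorenz_climate(dtype, steps=100000, dt=0.005):
    """Integrate Lorenz-63 with forward Euler at the given float precision
    and return the long-run mean of z as a crude 'climatology'."""
    s, r, b = dtype(10.0), dtype(28.0), dtype(8.0 / 3.0)
    x, y, z = dtype(1.0), dtype(1.0), dtype(20.0)
    dt = dtype(dt)
    zs = []
    for _ in range(steps):
        dx = s * (y - x)
        dy = x * (r - z) - y
        dz = x * y - b * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        zs.append(float(z))
    return np.mean(zs[steps // 2:])  # discard spin-up before averaging

z64 = lorenz_climate(np.float64)
z32 = lorenz_climate(np.float32)
print(z64, z32)
```

Pointwise the two runs end up on completely different parts of the attractor, but the climatological mean of z is nearly identical, which is the sense in which precision can be traded away without losing the statistics that matter.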

The title of talks I often give is ‘More accuracy with less precision’. It is a kind of oxymoron. It sounds contradictory but at a deeper level it is not.

What does being an ECMWF Fellow involve?

At Oxford, I have a number of people working on different aspects of stochastic parametrization: the atmosphere, oceans, land surface, sea ice, all the different parts of the Earth system. We are working very closely with ECMWF in these areas. We try to organise a workshop every year, either in Oxford, at ECMWF, or at the Met Office.

I regularly visit the Centre to discuss at a more strategic level how we will interact, but the Fellowship has also provided an opportunity for my group in Oxford to interact with the scientists here. Over the last year there has been somebody from my group down here at least every other week. This is a two-way process: a lot of the research that we do uses the ECMWF model, and we have, and are extremely grateful for, special project time on the computers. We have made a lot of use of OpenIFS, as well. None of this work would be possible without interaction and collaboration with ECMWF scientists.

Your training in physics has served you well in weather science. Has your experience in the field of weather science also given you a different perspective on some fundamental problems in physics?

It definitely has. At the time of my PhD I had no idea how to approach the holy-grail problem in the field of gravitational research, which is how to unify general relativity theory and quantum mechanics. My hunch was that quantum mechanics, although it has been incredibly successful at the laboratory scale, might not actually be correct on the scale where gravity starts to become important. This remains very much a minority view, but I felt, and continue to feel, that we need to explore possible alternatives to quantum mechanics.

At the end of my PhD work I had no idea how to pursue this hunch, which was one reason why I moved fields. However, my immersion in chaos theory and non-linear dynamics has given me some quite concrete ideas on how to make progress. One of the pioneers of chaos theory, Edward Lorenz, developed a new and beautiful type of geometry, fractal geometry in state space, which links to some deep areas of number theory (e.g. the theory of p-adic numbers). I strongly believe that one can formulate an alternative to quantum theory using these types of fractal geometries. This alternative theory has none of the properties of quantum mechanics to which Einstein objected, such as so-called spooky action-at-a-distance, or indeed stochasticity. Being geometric, it couples directly to general relativity theory.

This is obviously speculative at the moment. But you are right to say that I would not have had these ideas were it not for being very heavily involved in non-linear dynamical systems theory and chaos theory. Incidentally, Ed Lorenz was a long-term visitor at ECMWF. He was a kind of early Fellow if you like. We did not use that word then but I learnt a lot from him during his visits here.