Why the Met Office needs a £97m supercomputer

Marianne Freiberger

Last week the Met Office announced a new £97 million supercomputer to help with its weather and climate predictions. When fully installed, it will be one of the fastest supercomputers in the world. It will be able to perform more than 16,000 trillion calculations per second and, at 140 tonnes, will weigh as much as eleven double-decker buses.

But why does the Met Office need this miracle of computation?


In 2009 the predicted "BBQ summer" turned out to be a wash-out.

The Earth's weather and climate are the result of complex interactions between many factors — the atmosphere, the oceans, radiation from the Sun, to name just a few. Mathematically these factors are described by the equations of thermodynamics and the Navier-Stokes equations, which describe the behaviour of fluids. A weather or climate model divides the Earth and its atmosphere into a three-dimensional grid. Given the weather conditions right now at each of the grid points, you can use the equations to calculate the weather a time step into the future (say in a few hours or, for a climate model, next month), then another step into the future, and another, and so on.
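The stepping idea can be sketched in a toy one-dimensional setting (purely illustrative; a real model like the Met Office's is vastly more complicated). Here a quantity is carried along a ring of grid points by a constant "wind", and each time step computes every point's new value from the current values at its neighbours:

```python
# Toy grid-based time stepping (an illustration only, NOT a real weather
# model): the 1D advection equation du/dt = -c du/dx on a periodic grid
# of n points, solved with a simple upwind finite-difference scheme.
c, dx, dt, n = 1.0, 0.1, 0.05, 100   # c*dt/dx = 0.5 keeps the scheme stable

# initial conditions: a "weather front" in the middle of the domain
u = [1.0 if 40 <= i < 60 else 0.0 for i in range(n)]

def step(u):
    """Compute the field one time step into the future from current values."""
    nu = c * dt / dx
    return [u[i] - nu * (u[i] - u[i - 1]) for i in range(n)]

for _ in range(200):   # step into the future again and again
    u = step(u)
```

Halving the grid spacing here doubles the number of points to update, and stability forces the time step to shrink too — a first hint of why finer grids cost so much more.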

The trouble is that the equations involved are incredibly difficult to solve. The Navier-Stokes equations are actually the subject of one of the biggest open problems in mathematics. Nobody knows if physically meaningful solutions exist for the most general form of the equations. If you prove or disprove their existence, you're set to win $1 million from the Clay Mathematics Institute (see the Plus article How maths can make you rich and famous).

Weather and climate models therefore approximate the solutions of the equations, but this takes a huge amount of computing power. The finer the grid used in the model, the more calculations have to be made and the more computing power you need.

But a fine grid is important. Partly because it pinpoints the weather in specific locations, which is useful if you are planning a barbecue, or if you are an airport needing to know whether it's going to be foggy tomorrow morning. But it is also important because weather phenomena that happen on very small scales, say localised thunderstorms, can develop into large-scale weather events that disrupt much larger areas. That's the famous butterfly effect, summed up in the 1970s by the meteorologist Edward Lorenz: the flap of a butterfly's wings in Brazil can set off a tornado in Texas. If your grid spacing is too coarse, you miss those localised weather patterns and your predictions are less accurate. The butterfly effect also means that you can't be sloppy when approximating the solutions to the equations: a small inaccuracy in the numbers, for example the numbers that represent the initial conditions, can quickly snowball into a large error in your predictions.
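Lorenz's point can be seen in miniature using his own 1963 three-variable convection model (a drastic simplification that is a standard example in chaos theory, not a weather model). The sketch below, with step size and parameters chosen purely for illustration, runs the model twice from starting points that differ in the eighth decimal place:

```python
# Sensitive dependence on initial conditions in Lorenz's 1963 model
# (an illustration of the butterfly effect, not a weather forecast).
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)     # identical except in the 8th decimal place

max_sep = 0.0
for _ in range(5000):          # 50 model time units
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    max_sep = max(max_sep, abs(a[0] - b[0]))

# The 0.00000001 difference has grown to the size of the whole attractor.
print(max_sep)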

When we talked to the Met Office in 2004 (see And now, the weather) the grid points in their global model were spaced 60 km apart, those for the European model 20 km apart and those for the UK model 12 km apart. According to the BBC, the new supercomputer will be able to run UK-wide models with a 1.5 km resolution and localised models with a 300 m resolution — that's a massive improvement!
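A back-of-the-envelope estimate shows why such a jump in resolution demands a new machine. Assuming, roughly, that halving the horizontal spacing quadruples the number of grid columns and forces the time step to halve for numerical stability, the total work grows with the cube of the refinement factor. This is a crude sketch that ignores vertical levels, physics schemes and everything else a real model does:

```python
# Crude cost scaling for a finer model grid (illustrative assumption:
# work ~ (number of horizontal grid points) x (number of time steps);
# this is NOT the Met Office's actual accounting).
def relative_cost(old_spacing_km, new_spacing_km):
    ratio = old_spacing_km / new_spacing_km
    return ratio**2 * ratio   # (more grid columns) x (shorter time steps)

print(relative_cost(60, 1.5))   # 2004 global grid -> new UK grid: 64000.0
print(relative_cost(12, 1.5))   # 2004 UK grid -> new UK grid: 512.0
```

Even under these very rough assumptions, matching the 2004 global model's coverage at 1.5 km resolution means tens of thousands of times more arithmetic.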

But still, do we really need such accuracy? After all, it's only the weather. The Met Office points out that better forecasts are "anticipated to deliver £2bn of socio-economic benefits to the UK by enabling better advance preparation and contingency plans to protect people's homes and businesses." But high-performance computing is also essential in climate change research — so perhaps the new computer will not just tell us if it's safe to slap the sausages on the barbie, but also help us deal with the biggest challenge facing humanity today.

Further reading

To find out more about weather and climate modelling see these Plus articles: