The Business Times

Climate Science: A tough call to make?

Accurate predictions of Earth's warming require computers too expensive for any single country or institution to afford; a global initiative is needed.

Published Thu, Jun 13, 2019 · 09:50 PM

EARTH is warming, and we know why. Light is reflected and absorbed by clouds, air, oceans, ice and land. Greenhouse gases are released and absorbed by organic and inorganic sources. Both exchanges depend on a variety of factors such as temperature, ocean acidity, the amount of vegetation and - yes - the burning of fossil fuels.

What's less clear is what climate change means for our future. "It's not like this is string theory," said Timothy Palmer, professor of climate physics at the University of Oxford. "We know the equations." But we don't know how to solve them. The many factors that affect the climate interact with one another and give rise to interconnected feedback cycles. The mathematics is so complex that the only way scientists know to handle it is to feed the problem into computers, which then solve the equations approximately.

The Intergovernmental Panel on Climate Change based its latest full report, in 2014, on predictions from about two dozen such computer models. These models were independently developed by institutions in multiple countries. While similar in methodology, the models arrive at somewhat different long-term predictions. They all agree that Earth will continue to warm, but disagree on how much and how quickly.

In a paper that recently appeared in Nature Reviews Physics, Prof Palmer summarised a controversy that has smouldered in the climate community for 20 years. The claim: the current method neglects an important source of uncertainty.

The root of the problem is one of the most basic assumptions of the computer simulations: that the atmosphere and oceans can be divided into a "grid" of small pieces. Computers then calculate how the pieces interact with one another in small time increments. While doing so, processes that happen at scales smaller than the pieces - so-called sub-grid information about clouds, ocean eddies and the capacity of soil to retain water - must be approximated.
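To give a flavour of what this means in practice, here is a deliberately simplified sketch (our illustration, not any real model's code): a one-dimensional "atmosphere" chopped into cells and stepped forward in time, with a made-up sub-grid term standing in for everything the grid cannot resolve.

```python
import numpy as np

# Toy illustration of grid-based climate simulation (not a real model):
# a 1-D field of temperatures divided into cells, stepped forward in
# small time increments. Everything smaller than one cell must be
# "parameterised" - here an invented mixing term stands in for clouds,
# eddies and other sub-grid processes.

N_CELLS = 100          # number of grid pieces
DX = 100.0             # cell size in km (coarse, like current models)
DT = 0.1               # time increment
DIFFUSIVITY = 50.0     # resolved large-scale mixing

def subgrid_tendency(temperature):
    """Hypothetical stand-in for sub-grid processes: relax each cell
    toward its local neighbourhood mean, since what happens inside a
    cell cannot be resolved explicitly."""
    local_mean = (np.roll(temperature, 1) + temperature
                  + np.roll(temperature, -1)) / 3.0
    return 0.01 * (local_mean - temperature)

def step(temperature):
    """Advance the grid one increment: resolved diffusion between
    neighbouring cells plus the sub-grid approximation."""
    laplacian = (np.roll(temperature, 1) - 2 * temperature
                 + np.roll(temperature, -1)) / DX**2
    return temperature + DT * (DIFFUSIVITY * laplacian
                               + subgrid_tendency(temperature))

temperature = 15.0 + np.random.randn(N_CELLS)   # initial field, deg C
for _ in range(1000):
    temperature = step(temperature)
print(f"mean temperature after 1000 steps: {temperature.mean():.2f} C")
```

Everything in the `subgrid_tendency` function is an assumption; that is precisely the point of the controversy described below.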

According to Prof Palmer, this method of calculation may be overly simplistic and suffers from severe shortcomings. The formula used to calculate changes in the atmosphere and oceans - the Navier-Stokes equation - has what physicists call "scale symmetry", meaning it works the same at all distances. However, as Prof Palmer points out, this symmetry is violated when the calculations approximate the sub-grid information. The consequences for climate predictions are serious: we underestimate how long extreme weather situations persist and, at the same time, overestimate how likely our predictions are to be correct.
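For readers who want the symmetry spelled out, a standard textbook statement for the incompressible form of the equation runs as follows (our sketch; the article itself gives no formulas):

```latex
% Incompressible Navier-Stokes: velocity u, pressure p, density rho,
% kinematic viscosity nu.
\[
  \frac{\partial \mathbf{u}}{\partial t}
    + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
    = -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u},
  \qquad
  \nabla\cdot\mathbf{u} = 0.
\]
% Scale symmetry: if u(x, t) is a solution, then so is the rescaled
% field below, for every lambda > 0 (with pressure rescaled by lambda^2).
\[
  \mathbf{u}_{\lambda}(\mathbf{x},t)
    = \lambda\,\mathbf{u}(\lambda\mathbf{x},\,\lambda^{2}t).
\]
```

A grid with a fixed cell size singles out one particular length scale, so any sub-grid approximation tied to that size breaks this invariance.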

The Navier-Stokes equation, central to predicting Earth's climate, is famously difficult to solve and has given mathematicians and physicists headaches for 200 years. To this day, turbulence and eddies have remained challenging to understand. The Clay Mathematics Institute has named the Navier-Stokes equation one of its Millennium Problems and offers a US$1 million prize for its solution.

In this situation, the best we can do is improve computer models to obtain more accurate approximate solutions. It is knowledge we urgently need. As Earth continues to warm, we face a future of drought, rising seas and extreme weather events. But for all we currently know, this situation could be anywhere between a mere annoyance and an existential threat.

There are two possible ways to arrive at better climate predictions. The best way would be to use a higher resolution for the models - to divide up the land and oceans into much smaller pieces. But doing this with existing computing facilities would take too long to be of any use.

The second-best option, according to Prof Palmer, is to randomise the sub-grid processes. Counter-intuitively, this additional randomness has the effect of stabilising extreme weather conditions. Weather forecasts that take into account random (or "stochastic") processes make more accurate predictions for the frequency of tropical cyclones, the duration of droughts and other weather phenomena, such as the long-lasting heat spell over Europe in the summer of 2018. It seems only reasonable, then, that long-term climate predictions should use this method too.
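As a rough illustration of the idea (ours; the scheme and numbers are invented for the example, in the spirit of the stochastic schemes used in weather forecasting), the sketch below compares a fixed sub-grid tendency with one that receives a random kick at each step, and shows how the randomness keeps an ensemble of forecasts from collapsing to a single answer:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy comparison (our invention, not any real model): an ensemble of
# forecasts for one variable relaxing toward a climatological value.
# A conventional scheme treats unresolved sub-grid processes as a
# fixed average; a stochastic scheme lets them fluctuate randomly.

DT = 0.1
RELAXATION = 0.5       # pull toward the climatological value
CLIMATOLOGY = 15.0     # degrees C, illustrative
NOISE_STD = 0.5        # invented amplitude of sub-grid variability

def step_deterministic(states):
    return states + DT * (-RELAXATION) * (states - CLIMATOLOGY)

def step_stochastic(states):
    # Euler-Maruyama step: same drift plus a random sub-grid kick.
    kick = NOISE_STD * np.sqrt(DT) * rng.standard_normal(states.shape)
    return step_deterministic(states) + kick

def run(step, n_members=500, n_steps=200):
    states = CLIMATOLOGY + rng.standard_normal(n_members)
    for _ in range(n_steps):
        states = step(states)
    return states

det, sto = run(step_deterministic), run(step_stochastic)
print(f"ensemble spread without noise: {det.std():.3f}")  # collapses
print(f"ensemble spread with noise:    {sto.std():.3f}")  # stays finite
```

The deterministic ensemble collapses to a single answer and so understates its own uncertainty; the stochastic one retains a realistic spread - in miniature, the effect Prof Palmer describes for extreme weather.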

Climate scientists have begun to take note of Prof Palmer's argument. The new British climate model, known as UKESM1, in use since 2018, incorporates this randomness, and others are sure to follow. Björn Stevens, director of the Max Planck Institute for Meteorology in Hamburg, Germany, agrees with Prof Palmer's assessment. For the next generation of models, he said, his institution "will be interested in exploring the role of stochastic treatments". But Prof Palmer does not want to settle for second best, and still hopes to bring the grid size of climate models down. A horizontal grid of cells about one square kilometre, or 0.4 square miles, in size would, he believes, significantly improve the accuracy of our climate models and give us the information we need to accurately gauge the risks posed by climate change.

To do this, we need supercomputers capable of performing these calculations. Exascale supercomputers - machines able to perform at least a billion billion calculations per second - would be up to the task. But such computing resources are more than any one institution or country can afford. Getting more accurate predictions would require an international initiative and an estimated US$1.1 billion in funding.
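A back-of-envelope calculation (our own rough numbers, not from the article) suggests why a one-kilometre grid pushes into exascale territory:

```python
# Rough estimate: why a 1-km global grid needs exascale computing.
# Every constant below is an assumption of ours, for illustration only.

EARTH_SURFACE_KM2 = 5.1e8    # Earth's surface area in square kilometres
VERTICAL_LEVELS = 100        # assumed number of atmospheric layers
FLOPS_PER_CELL_STEP = 1e4    # assumed arithmetic cost per cell per step
STEPS_PER_SIM_DAY = 1e4      # assumed time steps per simulated day

cells = EARTH_SURFACE_KM2 * VERTICAL_LEVELS          # ~5e10 grid cells
flops_per_day = cells * FLOPS_PER_CELL_STEP * STEPS_PER_SIM_DAY

EXAFLOP = 1e18   # "a billion billion" operations per second
print(f"grid cells: {cells:.1e}")
print(f"operations per simulated day: {flops_per_day:.1e}")
print(f"seconds per simulated day at 1 exaflop: "
      f"{flops_per_day / EXAFLOP:.1f}")
# A simulated century (~36,500 days) then takes on the order of two
# days of machine time - feasible at exascale, but orders of magnitude
# slower on today's machines.
```

Under these assumptions a century-long simulation becomes a matter of days rather than years, which is why the grid size and the computing budget are two sides of the same question.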

We need international collaboration, what Prof Palmer calls "A CERN for climate modelling". It's an apt comparison: CERN - the European Organization for Nuclear Research - was founded to pool resources into a jointly used facility, thereby enabling mega projects like the Large Hadron Collider that are beyond the budget of any one country. Importantly, this joint effort does not compete with research at national institutions, but instead builds on it. And if that worked for particle physics, Prof Palmer thinks, it can work for climate science too.

In 2018, together with climate scientists from 18 European institutions, Prof Palmer proposed such a computing initiative (called Extreme Earth) as a flagship project to the European Research Council. The proposal passed from the first to the second stage of evaluation. But this year, the ERC cancelled the 2020 flagship initiatives altogether. No other funding body has stepped up to fund the climate initiative.

But hesitating to fund better climate models makes no sense, scientifically or economically. Climate change puts us all at risk. To decide which course of action to take, we need to know just what the risks are and how likely they are to come to pass. Increasing the resolution of current models from 100 kilometres to one kilometre would not be an incremental improvement but would make the predictions significantly more reliable.

The benefits of an international initiative for climate science would far outweigh the costs. We created this problem together; now we must solve it together. NYTIMES


