
Research Consortium Sets Out to Build a New Climate Model from Scratch

With climate change all but certain, and with significant uncertainty about exactly how the climate will change, engineers and researchers from across the United States are working together to develop an innovative climate model designed to deliver precise and actionable predictions.

MIT professors Raffaele Ferrari and John Marshall, along with colleagues from Caltech, NASA’s Jet Propulsion Lab, and the Naval Postgraduate School, envision a revolution in climate modeling using data assimilation and machine learning. (Image credit: Caltech and Tapio Schneider)

Capitalizing on recent advances in the computational and data sciences, the comprehensive effort harnesses the vast amounts of Earth data now available and increasingly powerful computing capabilities, both for processing data and for simulating the Earth system.

The new model will be developed by a consortium of scientists led by Caltech, in collaboration with MIT, the Naval Postgraduate School (NPS), and the Jet Propulsion Laboratory (JPL), which is managed by Caltech for NASA. The goal of the consortium, known as the Climate Modeling Alliance (CliMA), is to fuse Earth observations and high-resolution simulations into a model that represents important small-scale features, such as clouds and turbulence, more reliably than existing climate models do. The aim is a climate model that predicts future changes in critical variables such as cloud cover, rainfall, and sea ice extent more accurately—with uncertainties at least half the size of those in existing models.

“Projections with current climate models—for example, of how features such as rainfall extremes will change—still have large uncertainties, and the uncertainties are poorly quantified,” stated Tapio Schneider, Caltech’s Theodore Y. Wu Professor of Environmental Science and Engineering, senior research scientist at JPL, and principal investigator of CliMA. “For cities planning their stormwater management infrastructure to withstand the next 100 years’ worth of floods, this is a serious issue; concrete answers about the likely range of climate outcomes are key for planning.”

The consortium will operate in a fast-paced, start-up-like atmosphere, and believes it can have the new model up and running within the next five years—an ambitious timeline for building a climate model from scratch.

A fresh start gives us an opportunity to design the model from the outset to run effectively on modern and rapidly evolving computing hardware, and for the atmospheric and ocean models to be close cousins of each other, sharing the same numerical algorithms.

Frank Giraldo, Professor of Applied Mathematics, NPS.

Existing climate models work by dividing the globe into a grid and then computing what is happening in each sector of the grid, as well as how the sectors interact with one another. A model’s accuracy depends in part on the resolution of that grid—that is, the size of its sectors. Limits on available computing power mean that, in general, those sectors can be no smaller than tens of kilometers per side. But in climate modeling, the devil is in the details—details that get missed in a grid that coarse.
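By way of illustration only—this is not CliMA code—here is a minimal Python sketch of the grid idea, with an assumed roughly 50-kilometer cell size and a toy temperature field evolved by simple diffusion between neighboring cells:

```python
import numpy as np

# Purely illustrative toy: a coarse latitude-longitude grid where each cell
# holds one temperature value, stepped forward by diffusion between neighbors.
# Real climate models solve far richer physics; this only shows why cell size
# limits what a gridded model can represent.

n_lat, n_lon = 360, 720             # a 0.5-degree grid, cells roughly 50 km per side
temperature = 288.0 + np.random.randn(n_lat, n_lon)  # kelvin, toy initial state

def step(T, dt=1.0, kappa=0.1):
    """Advance the toy field: each cell exchanges heat with its four neighbors."""
    # np.roll wraps longitude around the globe; latitude edges are crudely clamped.
    north = np.vstack([T[:1], T[:-1]])
    south = np.vstack([T[1:], T[-1:]])
    east = np.roll(T, -1, axis=1)
    west = np.roll(T, 1, axis=1)
    return T + dt * kappa * (north + south + east + west - 4 * T)

for _ in range(10):
    temperature = step(temperature)

# Anything smaller than one cell (~50 km here), such as the turbulent plume
# sustaining a low-lying cloud, has no grid point of its own to live on.
```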

For instance, low-lying clouds have a significant impact on climate because they reflect sunlight; however, the turbulent plumes that sustain them are so small that they fall through the cracks of existing models. Likewise, variations in Arctic sea ice have been linked to wide-ranging effects, from polar climate to drought in California; yet it is difficult to predict how that ice will change in the future, because it is sensitive to both the density of the cloud cover above it and the temperature of the ocean currents below—neither of which can be resolved by existing models.

To capture the large-scale impact of these small-scale features, the researchers will develop high-resolution simulations that model the features in detail in selected regions of the globe. These simulations will be nested within the larger climate model. The result will be a model with the ability to “zoom in” on selected regions, providing detailed local climate information about those areas and informing the modeling of small-scale processes everywhere else.
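As a rough sketch of how such nesting might work—with the refinement factor, function names, and feedback statistics all illustrative assumptions rather than CliMA’s actual design—one coarse cell could host a fine-grained patch whose statistics feed back into the global grid:

```python
import numpy as np

# Hypothetical sketch of the nesting idea: run a fine-grained simulation over
# one selected cell of a coarse global grid, then feed summary statistics from
# the resolved patch back to the coarse model.

REFINE = 10  # assumed refinement: each coarse cell becomes a 10 x 10 fine patch

def run_high_res_patch(coarse_value, refine=REFINE):
    """Stand-in for a detailed local simulation inside one coarse cell."""
    patch = np.full((refine, refine), coarse_value)
    patch += np.random.randn(refine, refine)  # placeholder for resolved small-scale motion
    return patch

def nested_step(coarse_grid, region):
    """Advance one coarse cell using statistics from its nested fine patch."""
    i, j = region
    patch = run_high_res_patch(coarse_grid[i, j])
    # Upscale: the coarse cell adopts the mean of the resolved fine-scale field;
    # the patch variance could inform small-scale models elsewhere on the globe.
    coarse_grid[i, j] = patch.mean()
    return patch.var()

coarse = 288.0 + np.random.randn(36, 72)       # toy coarse global temperature grid
small_scale_signal = nested_step(coarse, (10, 30))
```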

The ocean soaks up much of the heat and carbon accumulating in the climate system. However, just how much it takes up depends on turbulent eddies in the upper ocean, which are too small to be resolved in climate models. Fusing nested high-resolution simulations with newly available measurements from, for example, a fleet of thousands of autonomous floats could enable a leap in the accuracy of ocean predictions.

Raffaele Ferrari, Cecil and Ida Green Professor of Oceanography, MIT.

Although current models are typically tested by checking predictions against observations, the new model will take ground-truthing a step further, employing data-assimilation and machine-learning tools to “teach” the model to improve itself in real time, harnessing both Earth observations and the nested high-resolution simulations.
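One family of data-assimilation methods in this spirit is ensemble Kalman updating. The toy sketch below calibrates a single model parameter against an observation; the forward model and observed value are invented purely for illustration and do not represent CliMA’s equations:

```python
import numpy as np

# Illustrative sketch: calibrate one uncertain model parameter so that model
# output matches an observation, using an ensemble Kalman-style update.

rng = np.random.default_rng(0)

def G(theta):
    """Toy forward model: maps a parameter (e.g., a turbulence constant)
    to a predicted observable (e.g., mean cloud cover)."""
    return 0.3 * theta + 0.1 * theta**2

y_obs = 1.4          # assumed observation of the quantity G predicts
noise_var = 0.01     # assumed observation-noise variance

theta = rng.normal(1.0, 0.5, size=100)   # ensemble of candidate parameter values

for _ in range(20):
    g = G(theta)
    # Ensemble covariance between parameters and predictions drives the update.
    C = np.cov(theta, g)
    gain = C[0, 1] / (C[1, 1] + noise_var)
    # Nudge each ensemble member toward agreement with (perturbed) observations.
    theta += gain * (y_obs + rng.normal(0, noise_var**0.5, theta.size) - g)

print(f"calibrated parameter: {theta.mean():.3f}")
```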

The success of computational weather forecasting demonstrates the power of using data to improve the accuracy of computer models; we aim to bring the same successes to climate prediction.

Andrew Stuart, Bren Professor of Computing and Mathematical Sciences, Caltech.

Each collaborating institution brings distinctive strengths and research expertise to the project. Schneider and Stuart of Caltech will focus on developing the data-assimilation and machine-learning algorithms, as well as models for clouds, turbulence, and other atmospheric features. Ferrari and John Marshall, also a Cecil and Ida Green Professor of Oceanography, both of MIT, will lead a team modeling the ocean, including its large-scale circulation and turbulent mixing. Giraldo of NPS will lead the development of the computational core of the new atmosphere model, in partnership with Jeremy Kozdon and Lucas Wilcox. A team of researchers at JPL will collaborate with the group at Caltech’s campus to develop process models for the atmosphere, biosphere, and cryosphere.

Funding for this project was provided through the generosity of Eric and Wendy Schmidt (by recommendation of the Schmidt Futures program); Mission Control Earth, an initiative of Mountain Philanthropies; Paul G. Allen Philanthropies; Caltech trustee Charles Trimble; and the National Science Foundation.
