BERKELEY, CA, Sept. 4, 2002 — Scientists from the National Center for Atmospheric Research (NCAR) have just completed a 1,000-year run of a powerful new climate system model on a supercomputer at the U.S. Department of Energy’s National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory.
Accurately predicting global climate change demands complex and comprehensive computer simulation codes, the fastest supercomputers available, and the ability to run those simulations long enough to model century after century of the global climate. Scientists at NCAR in Boulder, Colorado, ran the millennium-long simulation of their new Community Climate System Model (CCSM2) for more than 200 uninterrupted days on the IBM SP supercomputer at NERSC.
According to Warren Washington, a senior scientist at NCAR and recently elected chair of the National Science Board, the ability to model climate change accurately over very long time scales is of great societal importance. The scientific problem of climate prediction continues to require modeling the complex interactions among the atmosphere, ocean, land and vegetation, and the cryosphere.
State-of-the-art climate models such as the CCSM represent these interactions much more accurately over time scales ranging from days to centuries, though a number of uncertain aspects can still be improved.
“One reason we need a long control simulation is that it gives the climate modeling community a very good idea of the ‘natural’ model variability on annual, decadal, and century time scales, so that as we perform climate change simulations, we can separate the natural forcing from the anthropogenic changes caused by increasing greenhouse gases, aerosols and land surface changes,” said Washington, an internationally recognized expert in computer modeling of the Earth’s climate.
The CCSM2 effort is headed by Jeff Kiehl at NCAR. CCSM2 tightly couples four complex models, including atmosphere and land modeling codes developed at NCAR and ocean and sea ice models developed at Los Alamos National Laboratory.
Because of its comprehensive integration of four complex component models, CCSM2 has emerged as one of the United States’ flagship computer codes for studying climate change. The CCSM2 simulations being run at NERSC are part of the Climate Change Prediction Program in the Office of Science of the Department of Energy. Data from CCSM2 simulations run at NERSC and NCAR will be made freely available to the nation’s climate research community.
“As the Department of Energy’s flagship facility for unclassified supercomputing, NERSC is able to provide both the uninterrupted computing resources and the staff expertise to enable this important simulation to run, as well as the data storage facility and network connectivity necessary to ensure that the resulting data can be easily accessed and analyzed,” said Horst Simon, director of the NERSC Center.
NCAR scientist Tony Craig began the CCSM2 millennium-long run at NERSC last January. The lengthy run served as a kind of "shakedown cruise" for the new version of the climate model and demonstrated that its variability remains stable even when the simulation runs century after century.
“This simulation will enable climate scientists to study the variability of the climate system on decade to century time scales, which is an important aspect of climate change detection and attribution studies,” said Jeff Kiehl, a climate scientist at NCAR and chair of the scientific steering committee for CCSM2.
"The computational resource provided by NERSC was essential for accomplishing this important simulation." Previous climate models have suffered in accuracy from excessive "drift" — a spurious long-term trend in the simulated climate that can swamp the temperature changes scientists are trying to detect. The 1,000-year CCSM2 run had a total drift of one-half of one degree Celsius, compared with two to three times that in older versions of the model.
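A common way to quantify drift of this kind is to fit a linear trend to a control run's annual global-mean temperature and express the slope as a total change over the run. The sketch below illustrates the idea on synthetic data; it is not NCAR's actual analysis, and the function name and numbers are hypothetical.

```python
# Illustrative sketch (not NCAR's analysis): estimating model "drift" as the
# linear trend in a control run's annual global-mean surface temperature.
import numpy as np

def drift_over_run(annual_temps_c):
    """Least-squares linear trend, expressed as total change over the run (deg C)."""
    years = np.arange(len(annual_temps_c))
    slope, _intercept = np.polyfit(years, annual_temps_c, 1)  # deg C per year
    return slope * (len(annual_temps_c) - 1)

# Synthetic 1,000-year series: natural variability plus a small imposed trend
# of 0.0005 deg C/yr, i.e. about 0.5 deg C of drift over the millennium.
rng = np.random.default_rng(0)
temps = 14.0 + 0.0005 * np.arange(1000) + rng.normal(0.0, 0.1, 1000)
print(f"estimated drift: {drift_over_run(temps):.2f} deg C over 1,000 years")
```

On this synthetic series the estimate recovers roughly the imposed half-degree trend, the same order as the drift reported for the CCSM2 run.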
"The 1,000-year simulation is the first ever fully coupled climate simulation at such high spatial resolution," Kiehl said.
The CCSM model was developed by a community of climate researchers that includes scientists and software engineers at the National Center for Atmospheric Research, many universities and DOE laboratories. The CCSM simulates a number of natural variability signals such as El Niño, the Pacific Decadal Oscillation and the North Atlantic Oscillation.
“This is a significant accomplishment, and results from the improved representation of physics of the atmosphere, land, ocean and ice,” said Inez Fung, director of the Berkeley Atmospheric Sciences Center at UC Berkeley.
"Climate variability on interannual and interdecadal time scales reveals the dynamic non-linear internal interactions within the climate system. For example, there are 'active' periods when El Niños are strong and frequent, as well as 'quiescent' periods for the El Niño/Southern Oscillation. The results establish the 'naturally varying' baseline against which externally forced climate change, such as from increasing CO2 in the atmosphere, can be compared."
In addition to Washington, Kiehl and Craig, other scientists contributing to the successful 1,000-year run include Gerald Meehl, Jim Hack and Peter Gent of NCAR, Burt Semtner of the Naval Postgraduate School and John Weatherly of the U.S. Army Cold Regions Research and Engineering Laboratory.
For more information about NERSC, visit http://www.nersc.gov. Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California. Visit our website at http://www.lbl.gov.