You can’t just walk into the data center in the College of Earth, Ocean, and Atmospheric Sciences (CEOAS). The sign on the door says you need a pass card. There should be another sign too: Caution, planetary experiments in progress. Inside, computer clusters churn 24/7, spinning out information about ocean currents, winds, air temperatures, ice sheets and flows of energy. Lights blink and fans drone as they cool the machines that run calculations on command from scientists who may be just down the hall or on another continent. In this case, proximity doesn’t matter.
Andreas Schmittner’s office is a 30-second walk from the data center, but the CEOAS assistant professor doesn’t have to go there to check on his experiments. From his desk, he logs on to his Linux computer cluster at the center and reviews the status of 20 or more projects that he may have running simultaneously.
Schmittner is an oceanographer who devotes himself to climate models, those mathematical descriptions of the real world that allow scientists to envision possible sea levels, ice sheets and temperature and precipitation patterns on a warmer planet. Grounded in physics and tested against real data from the past, climate models range from the simple to the complex. Think of them as alternative futures.
“Models should be regarded as tools to understand the climate system better and to address research questions,” says Schmittner. “Depending on the research question you have, you use different tools. Just like in your workshop, if you need to screw something down, you don’t need a wrench. You use a screwdriver.”
In short, models have become the high-tech workhorses of climate science. Scientists rely on them to consider how coastal communities, food and water supplies, forests and weather would fare on a changing Earth.
More than 20 years ago, OSU researchers created models to study global atmospheric circulation and the Pacific Ocean system known as the El Niño–Southern Oscillation. Today’s models are more sophisticated and the goals more ambitious: to make them more realistic (aligned with actual climate data), to incorporate all significant processes and to identify the uncertainties that inevitably affect modeling outcomes.
With better models come results that illuminate how the world may change in coming decades. In a report published in the journal Global Biogeochemical Cycles that generated headlines in 2008, Schmittner showed that even if greenhouse gas emissions increase gradually until 2100 and are then virtually eliminated by 2300, the planet would continue to warm for the next 200 years or more.
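The lag behind that result comes from the ocean’s thermal inertia, an effect that a toy zero-dimensional energy-balance model can illustrate. This is a back-of-envelope sketch with textbook-style parameter values, not the model Schmittner used:

```python
# Toy zero-dimensional energy-balance model of committed warming:
# the deep ocean's large heat capacity delays the response, so the
# planet keeps warming for centuries after forcing stops increasing.
# All parameter values are illustrative, not taken from the study.

SECONDS_PER_YEAR = 3.15e7
C = 8.0e9       # effective heat capacity, J/(m^2 K) (~2,000 m of ocean)
LAMBDA = 1.2    # climate feedback parameter, W/(m^2 K)
F_MAX = 3.7     # radiative forcing for doubled CO2, W/m^2

T = 0.0         # warming above the starting climate, K
history = []
for year in range(600):
    # Forcing ramps up for 100 years, then is held constant
    F = F_MAX * min(year / 100.0, 1.0)
    dTdt = (F - LAMBDA * T) / C       # K per second
    T += dTdt * SECONDS_PER_YEAR      # one-year Euler step
    history.append(T)

# Warming continues long after the forcing levels off at year 100,
# slowly approaching the equilibrium value F_MAX / LAMBDA (~3 K)
print(history[100] < history[300] < history[599] < F_MAX / LAMBDA)
```

The point of the sketch is qualitative: as long as the ocean is still absorbing heat, surface temperature lags the forcing, so even sharply curtailed emissions leave centuries of warming in the pipeline.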
In 2005, he and colleagues in Europe and North America reported that doubling the amount of carbon dioxide in the atmosphere (now about 35 percent higher than before the Industrial Revolution) could affect the North Atlantic with steep plankton declines and a 25 percent slowdown in currents that carry heat toward Europe. Observations of water temperature and salinity suggest that the currents may indeed be slowing, but scientists are still debating what the data mean. “We have to get more observational data and improve our models,” Schmittner told the BBC.
An Uncertain Future
Future scenarios amount to potential conditions in a changing world, not to firm predictions. “We can’t say exactly how much warmer the climate is going to be in 50 years,” says Karen Shell, an assistant professor in CEOAS. “Part of that is uncertainty in the science and how we translate the science into the models. You can’t take every single cloud and put it into a model. We don’t have the computational resources to do that.”
Shell came to OSU in 2008 from the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. She studies variations among the two dozen or so global circulation models used by the international climate science community. In the course of her work, she downloads so much data that she has generated calls from OSU network technicians. “They were concerned that my computer had been infected by a virus,” she says.
Data from modeling runs and from the field (including satellites, ocean buoys and monitoring stations on the polar ice sheets) are a modeler’s bread and butter. They contain clues about what drives the climate system over long periods of time. Shell and her colleagues analyze how models treat factors such as solar energy flows at the top of the atmosphere (how energy is absorbed and reflected) and the distribution of atmospheric water vapor from the equator to the poles.
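The top-of-atmosphere balance in those analyses can be sketched with standard textbook figures: at equilibrium, the sunlight a planet absorbs (what isn’t reflected) must match the thermal radiation it emits. The numbers below are textbook values, not results from Shell’s work:

```python
# Back-of-envelope top-of-atmosphere energy budget: absorbed sunlight
# balances outgoing thermal radiation. Standard textbook values.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S0 = 1361.0       # solar constant, W/m^2
ALBEDO = 0.3      # fraction of sunlight reflected back to space

absorbed = S0 / 4 * (1 - ALBEDO)      # averaged over the whole sphere
T_eff = (absorbed / SIGMA) ** 0.25    # effective emission temperature

print(round(absorbed))  # ~238 W/m^2
print(round(T_eff))     # ~255 K, about -18 C; greenhouse gases supply the rest
```

Small shifts in how models handle the reflected fraction (clouds, ice, aerosols) move this budget by a few watts per square meter, which is exactly the kind of spread Shell traces among the global circulation models.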
“If you can figure out what’s causing the spread (among model results) and link that to satellite data, you can get clues about cause and effect,” says Shell. “That’s how you make progress. It’s slow progress, but it has to be done.
“I love what I do,” she adds, noting that model results provide important information for responding to the likely consequences of climate change.
Bringing It Home
Over the past two decades, models have improved in both scope (how many physical and biological processes they incorporate) and resolution (the spatial detail of the model grid over a region). They enable researchers to look at what might be in store for Klamath Basin water supplies or for forest fire risks in the western United States. Hydrologist Steve Hostetler has worked on such regional issues for about 20 years for the U.S. Geological Survey. The courtesy professor in the OSU Department of Geosciences continues to work on current and past climate conditions with colleagues at the USGS, OSU and the University of Oregon.
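The price of that finer resolution can be seen with a rough calculation (the grid spacings below are illustrative, not those of any particular OSU model): halving the horizontal spacing quadruples the number of cells, and numerical stability (the CFL condition) also roughly halves the allowable time step, so compute cost grows about eightfold.

```python
# Rough cost scaling of a global model grid as horizontal resolution
# improves. Spacings are illustrative round numbers.

EARTH_SURFACE_KM2 = 510e6  # Earth's surface area, km^2

def relative_cost(spacing_km, baseline_km=200.0):
    """Compute cost relative to a coarse baseline grid."""
    cells = EARTH_SURFACE_KM2 / spacing_km**2
    baseline_cells = EARTH_SURFACE_KM2 / baseline_km**2
    # cost ~ cells x time steps, and steps ~ 1/spacing (CFL condition)
    return (cells / baseline_cells) * (baseline_km / spacing_km)

print(relative_cost(100))  # 8x the cost of a 200 km grid
print(relative_cost(25))   # 512x
```

That cubic growth in cost is why regional studies like Hostetler’s often nest a fine grid over one area inside a coarser global model rather than running the whole planet at high resolution.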
“It’s very collaborative with lots of different ways of looking at things, lots of different types of expertise. I seldom do things on my own,” he says.
In 2006, the National Science Foundation’s Paleoclimate Program supported this network with five-year grants totaling $3.3 million to OSU and partners at UO and the University of Minnesota. The goal is to develop a detailed picture of climate change from ocean records, ice core samples, terrestrial cave formations and global climate models.
In the late 1980s, Hostetler was doing fieldwork for the USGS when he became interested in paleoclimate, focusing on trends over the last 50,000 years. Since then, he has used the results of global and regional atmospheric models to estimate how climate influences water balances and fire frequency in the West.
For the Klamath Basin, modeling can improve the accuracy of multi-year evaporation estimates, Hostetler has reported. Evaporation is critical for determining how much water is available from year to year. Under a changing climate, accurate predictions will be necessary for resolving the region’s legendary water disputes.
In 2006, Hostetler and two USGS scientists co-authored the Atlas of Climatic Controls of Wildfire in the Western United States. For the period 1980-2000, their maps show how fires were closely linked with monthly water and energy balances in eight ecoregions, including the coastal and interior Pacific Northwest. Their report could lead to better predictions of wildfire risk.
“A lot of modeling is really mundane, boring stuff. But when you complete something and can look at the results and interpret what’s going on, that’s the payoff. These maps are the payoff,” Hostetler says.
Mining the Data
Behind the doors at the CEOAS data center are the information systems that make such results possible. “We have the networking, computational and storage infrastructure to move large amounts of data,” says manager Chuck Sears, who salts conversation with talk of “terabytes” (one terabyte equals a million million bytes) and “arrays” (large tables of data).
Models aren’t the center’s only source of data. Continuous streams of information from satellites, ocean buoys and other monitoring systems flow into the center’s databanks, enabling scientists to test and to refine their models. And since maps and other visual displays enhance communication among scientific teams and with the public, the center offers state-of-the-art visualization systems as well.
“We’ve created a production studio,” says Sears, “and we’ve enabled 2,000 different devices to be connected outside the center, as if they were in the center. These devices range from desktop computers to handheld devices such as iPhones.”
Increasingly, collaborative climate science is being done in remote offices and at meetings and other locations, not on the premises of computing centers. “Ultimately you have to get all of those data out for real work,” says Mark Abbott, dean of CEOAS and member of the National Science Board. “It’s going to be personalized and local. You’ll be able to get to it everywhere. The key is the balance between what’s in the center and what’s out on your desktop, your PDA (personal digital assistant) or what you have in your home.”
Access to a variety of such devices allows scientists at CEOAS to act like symphony conductors, Abbott adds, orchestrating the different tools they need. “If you’re a real woodwinds expert, you just use that, but if you really want to use some other instruments, you can do that too.
“Supercomputer centers do great things,” he adds, “but the excitement is out on the edges,” where scientific teams are sharpening our views of a changing planet.
For more about climate modeling at OSU:
Philip Mote to Lead Oregon’s New Climate Research Institute, January 6, 2009
New Study: Long-Term Global Warming May be Tough to Reverse, February 25, 2008
Research Team to Explore Past Climate by Looking for Triggers to Rapid Change, June 28, 2006
Atlantic Current Shutdown Could Disrupt Global Ocean Food Chain, April 5, 2005
To support research in the College of Earth, Ocean, and Atmospheric Sciences, contact the OSU Foundation