With a few keystrokes, climate scientist Shian-Jiann Lin *89 conjures up his life’s work. Line after line of computer code pops up on his screen — inscrutable logic loops and parenthetical statements written in the old Fortran language. “I worked on this code for more than 20 years,” says Lin. “Every single line, I’ve stared at it over and over again. And there are something like 20,000 lines of code. Not a single statement is wasted.”
Lin works at the Geophysical Fluid Dynamics Laboratory (GFDL), a federal research center, affiliated with Princeton, that has developed one of the world’s best climate models. Before attending Princeton for his Ph.D., Lin earned a master’s degree in aerospace engineering at the University of Oklahoma, and his code draws heavily on his expertise in how rockets and jet planes fly. Now he’s working to perfect a model that not only can describe what will happen to Earth’s climate in the coming decades, but also can help predict the weather just a few days out.
In the blink of an eye, Lin’s program cuts Earth’s atmosphere into thousands of boxes, spanning the globe and stacked from surface to stratosphere. Conditions in each box — temperature, pressure, humidity, and wind speeds, for example — are set, based on the weather or climate at that particular moment. Then a supercomputer puts the weather in motion by calculating the way conditions change from box to box. Unlike most climate models, Lin’s program carves Earth into nested grids of different sizes. It might create tiny boxes in regions with complicated weather, like a hurricane, and apply more widely spaced grids in areas where things are less exciting. And it produces results with a speed and accuracy that are the envy of the world.
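The box-by-box bookkeeping that Lin’s program performs can be illustrated with a toy sketch. What follows is hypothetical illustration code, not GFDL’s actual model: it reduces the atmosphere to a single row of boxes holding temperatures and moves a warm air mass along with a uniform wind, using a simple first-order upwind update.

```python
import numpy as np

# Toy 1-D "atmosphere": a row of boxes, each holding one temperature.
# Real models track many variables in 3-D boxes; this sketch only
# shows the core idea of updating conditions from box to box.
n_boxes = 100
dx = 100_000.0    # box width: 100 km
dt = 600.0        # time step: 10 minutes
wind = 10.0       # uniform eastward wind, m/s

temp = np.full(n_boxes, 280.0)   # background temperature, kelvin
temp[40:60] = 290.0              # a warm air mass

def step(temp, wind, dx, dt):
    """Advance one time step with a first-order upwind scheme."""
    inflow = wind * np.roll(temp, 1)          # air entering from the west
    return temp + dt / dx * (inflow - wind * temp)

for _ in range(144):                          # simulate one day
    temp = step(temp, wind, dx, dt)

# The warm anomaly has drifted east by roughly wind * time / dx boxes.
```

A real model does this in three dimensions for temperature, pressure, humidity, and winds at once, and Lin’s nested grids would use smaller values of `dx` where the weather is most active.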
Lin’s program is the guts of the climate model at GFDL, the world’s first climate laboratory. The model has a special knack for underscoring the sobering outcomes expected to befall the planet in the coming decades: how carbon dioxide and other greenhouse gases, injected into the atmosphere under different fossil-fuel-burning scenarios, will raise temperatures, boost extreme weather, and cause droughts.
But Lin’s program will soon be helping with short-term forecasts, too. Last July, the National Weather Service (NWS) selected it to be the engine for the agency’s own weather models. The NWS is keen to catch up to competitors at the European forecasting agency, which in 2012 — to the embarrassment of the home team — predicted Hurricane Sandy’s destructive turn into New Jersey days before the Americans did. But the selection of Lin’s program could mean more than just an uptick in local pride. With the Trump administration threatening to cut funding for research on climate change, becoming involved in the less controversial business of weather forecasts could help GFDL thrive.
Weather forecasters and climate scientists typically live in separate worlds. The meteorologists concentrate on what will happen tomorrow and next week, while the climate scientists gaze decades into the future. But the boundaries have started to blur. Weather forecasters’ seven-day forecasts are as skillful now as their five-day forecasts were 20 years ago. They’d like to predict weather beyond two weeks, into the realm of seasonal forecasts, but their models — focused more on speed than accuracy — tend to break down.
Climate scientists, meanwhile, have started to care more about shorter timescales. They have realized that phenomena that occur seasonally or every few years — such as El Niño — have impacts on both weather and climate. Yet in general, their models have been too complicated and slow to work as weather-forecasting machines. Lin’s program anticipates the day when all models will be driven by the same engine, whether it’s a forecast for a hurricane’s path or a 100-year simulation of the global climate. “The two worlds are now coming together,” says Leo Donner, a GFDL climate scientist.
Climate modeling has Princeton roots that predate GFDL. After World War II, the eminent mathematician John von Neumann was on the faculty at the Institute for Advanced Study. A pioneer in game theory and quantum mechanics, von Neumann was also a visionary for computer science. Working with the military, he was using early computers to research the hydrogen bomb. He realized that the same physics that governed thermonuclear explosions — nonlinear fluid dynamics — also applied to the chaotic weather in Earth’s atmosphere, and that the number-crunching power of computers, then in their infancy, would have a huge role in forecasts. “The quest was to do a 24-hour forecast,” says GFDL director Venkatachalam “Ram” Ramaswamy.
In 1950, von Neumann’s team used an early computer powered by punch cards and cathode-ray tubes to perform the first computerized 24-hour weather simulation. It was crude — just a small two-dimensional grid focused on North America, with points separated by 700 kilometers. But it was a start.
With government support, von Neumann recruited Joseph Smagorinsky, a young meteorologist, to head up an arm of what was then the U.S. Weather Bureau. Smagorinsky’s goal was to move past the early simulations and create a global, three-dimensional “general circulation model” of Earth’s atmosphere that could be used for weather forecasting. GFDL was founded in 1955 in Suitland, Md., as a research division of the weather agency; in 1968, it moved to a low-slung building on Princeton’s Forrestal Campus, where it remains a star in the National Oceanic and Atmospheric Administration (NOAA) laboratory system. Today, many GFDL researchers hold joint appointments with the University’s Program in Atmospheric and Oceanic Sciences. GFDL also pays Princeton several million dollars to support graduate students and postdoctoral researchers.
By the 1960s, GFDL was moving beyond weather and beginning to touch on what might be called climate science. A seminal moment came in 1967, when Syukuro Manabe, whom Smagorinsky had recruited from Japan, published a study on the effects of carbon dioxide in the atmosphere. Less than a decade earlier, scientists had begun to notice a rising pattern in daily carbon-dioxide measurements taken on top of Hawaii’s Big Island. Researchers at GFDL were just beginning to face the idea that this carbon dioxide might be causing a greenhouse effect.
Manabe’s approach was simple. He analyzed a vertical column through the atmosphere, and looked at how carbon dioxide would affect the balance between incoming sunlight and the outgoing energy reflected and radiated into space. Manabe calculated how much average surface temperatures would rise if carbon-dioxide levels, then 300 parts per million, were to double. (The world passed 400 parts per million in 2016 and is well on its way to reaching that ominous doubling.) His answer: 2.3 degrees Celsius (or 4.1 degrees Fahrenheit) — a finding, now a half-century old, that’s remarkably similar to the 3-degree Celsius rise predicted by many of the most recent models. “The basic answer really hasn’t changed,” says Gavin Schmidt, the director of NASA’s Goddard Institute for Space Studies in New York, another major climate-modeling center.
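A back-of-envelope version of that calculation can be sketched in a few lines. This is not Manabe’s radiative-convective model; it uses the standard logarithmic approximation for CO2 forcing (Myhre et al., 1998) together with an assumed climate-sensitivity parameter, chosen here to land near the modern 3-degree estimate.

```python
import math

def warming_from_co2(c_new, c_old, sensitivity=0.8):
    """Rough estimate of surface warming, in degrees Celsius.

    sensitivity is in K per (W/m^2); a value of about 0.8 yields
    roughly 3 C per doubling, close to what modern models report.
    """
    forcing = 5.35 * math.log(c_new / c_old)   # radiative forcing, W/m^2
    return sensitivity * forcing

# Doubling from the 300 ppm of Manabe's era:
print(round(warming_from_co2(600, 300), 1))    # about 3.0 C
```

Because the forcing is logarithmic, it is the doubling itself that matters, not the starting concentration, which is why a half-century-old answer and a modern one can line up so closely.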
GFDL scientists next turned to the ocean. In 1975, Manabe published the first three-dimensional model linking the atmosphere and oceans. The oceans absorb heat and carbon dioxide from the atmosphere, and can store them for long periods of time — centuries, even — which means that any long-term climate model needs to represent this interchange. “Everyone has always said that the ocean is the ‘memory’ of the climate system,” says Alistair Adcroft, who is helping to develop GFDL’s Modular Ocean Model, which hooks into the lab’s overall climate model. Oceans are currently sucking up large chunks of both carbon-dioxide emissions and rising atmospheric heat. A pressing question is: How long can that trend continue? Will the oceans reach a saturation point, causing atmospheric temperatures to rise even faster?
Nowadays, the main GFDL model boasts all sorts of subcomponents. For instance, there is an ice model, needed not only to understand how a warming ocean and atmosphere will melt ice at the poles, but also to understand the reverse: how melting ice will affect the ocean and atmosphere. There is a land-use module, to explore the different ways that forests and agricultural crops soak up carbon, and there is an atmospheric chemistry module, to capture the flows of different pollutants and the chemical reactions that lead to smog and holes in the ozone layer.
There are even efforts to use the model for what Ramaswamy calls “ecosystem services” — for example, by testing the effects of climate change on ocean-nutrient flows and phytoplankton to understand the future of fisheries dependent on these microscopic organisms. In January, a study led by GFDL scientist Charles Stock ’97 showed how climate change’s effect on phytoplankton would be amplified up the food chain to fish stocks: Fisheries would be decimated in regions such as East Asia, and stocks would grow in warming areas like an increasingly ice-free Arctic Ocean.
With weather forecasts, the proof is in the pudding: If the sky opens up after a sunny-day forecast, you know the model needs work. It’s harder with climate models, because you don’t have 50 years to wait to see if you were right. So climate scientists often test their models against the known data of the past, to see how faithfully the models re-create the climate. Another tactic is to test models against each other, in so-called ensemble runs. The average result in the ensemble is often seen as the “best” answer, although Ramaswamy points out that there’s no single measure of what is best: One model might predict temperatures well, but do poorly at simulating hurricanes or El Niño events.
Weather agencies have different demands — they need their models to run fast and often. Today’s NWS model, for instance, runs four times a day. Each time, it must incorporate as much real-time weather data as possible, from satellites, weather balloons, and ocean buoys. The model is speedy, but it strays from reality over long time periods. Current weather models, it seems, have reached their limits.
That’s where Lin’s program comes in. In addition to helping the NWS challenge European forecasters, his program will break down barriers between weather and climate researchers. “It has opened the door for true unification,” he says. “It’s everything in one single package.” In 2018, the agency plans to switch to Lin’s program — which would make his 20,000 lines of code the foundation of everything from the weather alerts on your phone to the reports warning about the planet’s imperiled future.
And in blurring the lines between weather and climate, Lin might have stumbled on a way for climate science to survive — and perhaps even thrive — under Trump’s administration. The president scrapped U.S. involvement in the Paris accord, which set voluntary country-by-country emissions reductions, and has appointed climate-change skeptic Scott Pruitt to head the Environmental Protection Agency. Trump’s budget proposal for the 2018 fiscal year that begins Oct. 1 would cut funding by 26 percent for NOAA’s research division, which oversees the $21 million GFDL budget and other laboratory research.
If approved by Congress, that budget would cut $5 million from GFDL’s next-generation weather model and would save $5 million by terminating the development of models that would extend weather outlooks to 30 days. But so far, the Republican majority in Congress has shown little inclination to follow Trump’s lead: In the recent budget deal for the remainder of fiscal-year 2017, Congress actually boosted NOAA’s research budget by 3.5 percent.
That’s because good weather forecasting, with all of its applications to the business world, has always enjoyed bipartisan support. In April, Congress passed the Weather Research and Forecasting Innovation Act, which calls for NOAA to boost its research into seasonal forecasting — one of GFDL’s specialties, and one of the areas that Trump is attempting to cut. A co-sponsor of that bill was Lamar Smith, the Republican head of the House science committee, who has subpoenaed NOAA records and launched investigations into what he believes is fraudulent climate science.
When PAW visited GFDL recently, scientists were in a wait-and-see mode. Many were unwilling to talk about how the new administration might affect their work. But Ramaswamy, who has seen many presidents come and go in his 30 years at GFDL, is unruffled. “The campaign rhetoric, if that’s to be believed — it’s a very pessimistic situation,” he says. “But I think there are opportunities.
“There’s a benefit toward a more improved understanding, leading to more improved predictions,” he continues. “I have a feeling that message will catch on.”
Eric Hand ’97 is a deputy news editor at Science magazine, responsible for the physical sciences.