
Marlene Tyner, M.E.S.M. 

The New York Times recently urged everyone to pull up their New York, San Diego, and Miami roots and move on over to Portland, Seattle, and Detroit. Why the incitement to mass-migration? Climate change.

In an article published in the journal Nature, climate change researchers concluded that in a warmer future, several American cities will be losers (higher temperatures, drought, sea level rise) and some will be winners (moderate climate, reasonable precipitation, consistent shorelines). Because climate models contain both spatial and temporal variables, their output can be mapped, and the resulting maps show that some parts of the United States will feel the impacts of climate change more than others.

Sea level rise can really lengthen your morning commute. Photo courtesy of Marvin Nauman.

But, before you sell your beachfront Miami home and move to Milwaukee, it might be useful to ask: How accurate are these predictions anyway? To answer that question we need to understand how these predictions were generated and how to interpret them.

General Circulation Models
General Circulation Models (GCMs) are the models climate scientists use to generate global climate change predictions. There are a ton of these models out there, developed and run at institutions as wide-ranging as the U.S. National Center for Atmospheric Research, the Russian Institute for Numerical Mathematics, and the Korea Meteorological Administration. In total, approximately 80 GCMs from across the globe were evaluated for the Intergovernmental Panel on Climate Change’s (IPCC) Climate Change 2013 Report.

GCMs attempt to mimic the global climate, a complex system influenced by the way the atmosphere, land, sea-ice, and ocean behave and interrelate. Climate modelers apply basic laws of physics to each individual component of the earth system, and then use those same laws to link all the systems.

Each GCM differs in both the equations used to approximate physical processes in these systems and how the equations are parameterized, or what values can be plugged into the model. You may see model results reported under different ‘scenarios’ – that just means the models have been run with varying levels of carbon dioxide (CO2) emissions to see how the climate will behave in a world filled with the same or more CO2.


Climate models are systems of differential equations based on the basic laws of physics, fluid motion, and chemistry. To “run” a model, scientists divide the planet into a 3-dimensional grid, apply the basic equations, and evaluate the results. Graphic & caption courtesy of NOAA.

GCMs are run on computers, which read and process data in globe-sized grids. Think of the pixels in photos or on your TV screen, but bigger (~40 square miles per cell) and covering the entire world. The various component models are layered on top of each other on this grid, and the computer calculates a target value for each grid cell. If scientists were interested in temperature, for example, they would output the average temperature in each cell across the globe. You can run GCMs multiple times in succession to approximate how temperature might change over time, with each run representing a month, a year, etc. This works because the models reference back to the previous time step for many of their parameters. The temporal aspect of climate models allows scientists to predict temperature and other variables from various climate starting points.
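For intuition, here is a heavily simplified sketch of that grid-and-time-step idea in Python. It is a toy, not a real GCM: the grid dimensions, the relaxation “physics,” and the forcing value are all invented for illustration.

```python
import numpy as np

# Toy illustration of the grid-and-time-step idea (NOT a real GCM):
# divide the globe into a lat x lon x altitude grid, hold a temperature
# value in each cell, and step forward in time, with each step reading
# the state left behind by the previous one.
n_lat, n_lon, n_levels = 45, 90, 10                     # coarse 3-D grid
temperature = np.full((n_lat, n_lon, n_levels), 288.0)  # initial state, in kelvin

def step(temp, forcing):
    """Advance one time step. A real GCM solves coupled differential
    equations here; this toy just nudges each cell toward the mean of
    its neighbors (a crude stand-in for heat transport) plus a forcing."""
    neighbor_mean = (np.roll(temp, 1, axis=0) + np.roll(temp, -1, axis=0) +
                     np.roll(temp, 1, axis=1) + np.roll(temp, -1, axis=1)) / 4.0
    return temp + 0.1 * (neighbor_mean - temp) + forcing

for year in range(100):          # each pass through the loop = one time step
    temperature = step(temperature, forcing=0.01)

print("global mean temperature:", temperature.mean())
```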

That’s partly why GCMs are run under different climate change scenarios. These scenarios approximate which environmental variables will be different in the future and by how much. The standard today is to use Representative Concentration Pathways (RCPs), which describe greenhouse gas concentration trajectories and their effect on how much of the sun’s energy gets trapped between the Earth’s surface and the atmosphere. A higher RCP number indicates a warmer future – the number refers to the radiative forcing, in watts per square meter, reached by 2100 – with the hottest scenario denoted RCP8.5 by the IPCC. Running different scenarios allows us to understand what the Earth will look like in the future based on the actions we take today, such as reducing greenhouse gas emissions.
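Continuing the toy sketch above, a “scenario” is simply a different forcing fed to the same model. The per-step forcing values below are invented for illustration; they are not actual RCP trajectories.

```python
# Reusing the toy model above: run the same physics from the same starting
# state under several assumed forcings and compare the outcomes. The
# forcing values are made up for illustration, not real RCP numbers.
for name, forcing in {"RCP2.6": 0.005, "RCP4.5": 0.010, "RCP8.5": 0.020}.items():
    temp = np.full((n_lat, n_lon, n_levels), 288.0)   # same starting point
    for year in range(100):
        temp = step(temp, forcing)                    # same model, new forcing
    print(f"{name}: global mean after 100 steps = {temp.mean():.2f} K")
```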

So how robust are climate change model predictions?

Satellite observations align with the most extreme IPCC predictions of sea level rise, which were generated by running several GCMs using historical data. Graphic courtesy of The Copenhagen Diagnosis (http://www.copenhagendiagnosis.com).

One way is to use a data comparison method known as hindcasting. When you hindcast, you match model projections of past climate variables, such as sea level rise, against observed data (see the sea level rise graph above). This shows how well your model reproduces the climate record we have already measured.
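As a minimal sketch of what that comparison looks like in practice, the snippet below scores a pretend hindcast against pretend observations. Both arrays are synthetic placeholders, and the error metrics (RMSE and mean bias) are standard choices rather than anything prescribed by the IPCC.

```python
import numpy as np

# Hindcast check (illustrative): feed the model historical inputs, then
# compare its output for past years against what was actually observed.
# Both series below are synthetic stand-ins; real observations would come
# from tide gauges, satellites, weather stations, etc.
years = np.arange(1950, 2001)
observed = 0.18 * (years - 1950) + np.random.default_rng(0).normal(0, 0.5, years.size)
hindcast = 0.17 * (years - 1950)            # pretend model output for the same years

rmse = np.sqrt(np.mean((hindcast - observed) ** 2))   # typical size of the misfit
bias = np.mean(hindcast - observed)                   # systematic over/under-prediction
print(f"RMSE: {rmse:.2f} cm, mean bias: {bias:.2f} cm")
```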

Another way is to run the model multiple times under the same climate scenario(s) to account for any variation in predictions. The figure below shows temperature observations in black, model temperature predictions in yellow, and the average of the model predictions in red. See how closely the red average tracks the observed values in black?

Global mean near-surface temperatures over the 20th century from observations (black) and as obtained from 58 simulations produced by 14 different climate models driven by both natural and human-caused factors that influence climate (yellow). The mean of all these runs is also shown (thick red line). Temperature anomalies are shown relative to the 1901 to 1950 mean. Vertical grey lines indicate the timing of major volcanic eruptions. Graphic & caption courtesy of IPCC 2013.
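For intuition, the ensemble averaging behind that “thick red line” amounts to a simple mean across runs. The sketch below uses synthetic numbers shaped like the figure’s ensemble (58 runs over 100 years); nothing here is real model output.

```python
import numpy as np

# Ensemble averaging (illustrative): stack many runs and average across
# them, as in the figure above (yellow = individual runs, red = mean).
# 58 runs x 100 years of synthetic anomalies around a shared trend.
runs = np.random.default_rng(1).normal(0.0, 0.3, size=(58, 100))
runs += np.linspace(-0.3, 0.6, 100)          # shared underlying warming trend

ensemble_mean = runs.mean(axis=0)            # the "thick red line"
spread = runs.std(axis=0)                    # run-to-run variability per year
print("last 5 ensemble-mean values:", ensemble_mean[-5:])
print("average run-to-run spread:", spread.mean())
```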

However, while GCMs are great at predicting long-term changes across large areas, they are limited in a few key ways. First, they don’t predict the weather; instead, they show long-term trends. Like the U.S. stock market, some days might be up and some might be down, but the overall trajectory over the past 100 years has been an increase. The same is true of CO2 concentrations and temperature. Second, there is a lot of debate over the utility of global climate models for accurately predicting regional climate change due to their large spatial scale. When one grid cell covers ~40 square miles, you miss a lot of detail. However, there are different models which can facilitate regional climate analysis.
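To see the trend-versus-weather distinction concretely, here is a small sketch with synthetic data: noisy day-to-day values hide a slow climb that a long running mean recovers. The drift, noise level, and window size are arbitrary choices for illustration.

```python
import numpy as np

# Trend vs. noise (illustrative): daily values bounce around like stock
# prices, but averaging over a long window reveals the underlying climb.
rng = np.random.default_rng(2)
daily = np.cumsum(rng.normal(0.01, 1.0, 36500))   # ~100 years of noisy "days"

window = 3650                                      # ten-year running mean
trend = np.convolve(daily, np.ones(window) / window, mode="valid")
print("first decade mean:", trend[0], "| last decade mean:", trend[-1])
```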

Downscaling to Enhance Regional Climate Models
Regional Climate Models allow scientists to understand the impacts of climate change at higher resolutions over specific regions, and are developed through either dynamical or statistical downscaling. Dynamical downscaling involves running a climate model using a smaller grid size, zooming in on a specific part of the Earth, and interpolating the input data to fit the new grid size. Statistical downscaling uses advanced statistics to predict smaller-scale variation within each larger grid cell using predictor variables (IPCC 2013). However, the researchers behind the previously mentioned Nature article used a novel data aggregation technique that gives a much more local understanding of climate change in a statistically robust manner. This technique could be used to generate city-specific guidance for land-use planning and real estate development, and to identify which natural resources to prioritize for conservation.
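As a minimal sketch of the statistical flavor, the snippet below fits a simple linear transfer function from coarse grid-cell values to a weather station inside that cell, then applies it to projected values. Real statistical downscaling uses far richer predictors and methods; all numbers here are synthetic.

```python
import numpy as np

# Statistical downscaling (illustrative): learn a mapping from coarse
# GCM grid-cell averages (predictor) to observations at one station
# inside that cell (predictand), then apply it to future GCM output.
rng = np.random.default_rng(3)
coarse_temp = rng.normal(15.0, 2.0, 200)                          # GCM cell averages
station_temp = 1.1 * coarse_temp - 2.0 + rng.normal(0, 0.4, 200)  # local observations

slope, intercept = np.polyfit(coarse_temp, station_temp, 1)  # fit the transfer function

future_coarse = np.array([16.5, 17.2, 18.0])           # projected cell averages
future_local = slope * future_coarse + intercept       # downscaled local projections
print(future_local)
```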

So say goodbye to the beach, San Diegans, and head to the Midwest. At least the real estate prices are reasonable.

About the Author:

Marlene Tyner has over 6 years of experience in ecological research and environmental analysis. Her past work includes coastal and near-shore sedimentation modeling, responses of ecosystem processes and communities to climate change, and global habitat modeling under various climate change predictions.