The Cornell Journal of Architecture
15
Predictability of the Atmosphere



Gang Chen is an assistant professor in the Department of Earth and Atmospheric Sciences at Cornell University. He teaches atmospheric dynamics and climate dynamics. He researches global atmospheric circulation and climate change by developing and analyzing simple and comprehensive computer models from the laws of mathematics, physics, and chemistry.
Despite the chaotic nature of the atmosphere and the complexity involved in reconciling the motion of water, air masses, the spinning Earth, and solar radiation, atmospheric forecasts are much more accurate than predictions of many other chaotic systems, such as elections or the stock market. The predictability of the atmosphere is special in that it relies on both observed (instrument-derived) and mathematical (equation-derived) inputs to yield results. Both inputs have been crucial to the dramatic improvement of daily and weekly weather forecasts over the past few decades.

While it remains impossible to predict an individual weather event beyond a month—owing to the sensitivities of a chaotic system to small errors in the initial conditions—it is possible to predict the changes of a collection of weather events over decades with climate warming. This long-term prediction is possible because these collective changes of weather events are fundamentally dominated by the changes in the energy balance of the Earth’s climate system, rather than the fluctuations induced by chaos.

Much of meteorology was developed as an application of mathematics and physics to the motion of the atmosphere. Using Sir Isaac Newton’s laws of motion, one might expect the motion of the atmosphere to be predictable with the same certainty as the motion of macroscopic objects, such as stars, planets, or aircraft. For example, by interpreting and synthesizing the information contained in thousands of weather maps, an experienced meteorologist could develop an empirical picture of the initial direction and speed of various weather patterns. The weather was predicted by looking for similar weather patterns in past records and then extrapolating forward using basic arithmetic: the known distance to travel (d) and the speed of the weather system (c) yield the time of arrival at the destination (t = d/c).
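The arithmetic of this traditional forecast can be sketched in a few lines of Python; the distance and speed below are invented for illustration, not taken from any real storm track.

```python
# Sketch of the classical extrapolation forecast: a weather system moving
# at a steady speed c covers the remaining distance d in time t = d / c.

def arrival_time_hours(distance_km: float, speed_kmh: float) -> float:
    """Time for a weather system to reach a location, assuming steady motion."""
    if speed_kmh <= 0:
        raise ValueError("speed must be positive")
    return distance_km / speed_kmh

# A storm 900 km away moving at 60 km/h arrives in about 15 hours.
print(arrival_time_hours(900.0, 60.0))  # 15.0
```

The method works exactly as well as the steady-motion assumption holds; the rest of the essay is about why that assumption eventually fails.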

In the 1920s, an English meteorologist, Lewis Fry Richardson, pioneered numerical weather prediction by attempting to forecast local weather through direct computation. Using a mathematical model and inputting data for the principal features of the atmosphere, Richardson attempted to predict the local weather over the course of the following six hours. For example, one can predict the change of surface pressure (ps) from a discretized representation of the rate of change of the mass in a column of air: the surface pressure rises where column-integrated air mass converges, and that convergence can be determined from the current state of the atmosphere. Richardson’s calculation produced an erroneous result: by failing to filter out unphysical pressure surges, it predicted a huge rise in pressure (larger than the pressure increase from the center of Hurricane Katrina to its edge) while the actual pressure remained nearly static. Despite this failure, Richardson’s forecast method later proved to be essentially sound.[1] His remarkable achievement opened the door to the era of numerical weather forecasting by discretizing the differential equations of atmospheric motion needed for weather prediction.
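Richardson’s pressure-tendency idea can be caricatured in one dimension: surface pressure rises where the column-integrated mass flux converges. This is only a toy sketch; the grid spacing, units, and flux values are invented, and the real calculation is three-dimensional.

```python
# Toy 1-D version of the pressure-tendency calculation: the rate of change
# of surface pressure at a point is minus the divergence of the mass flux,
# approximated here with centered finite differences (arbitrary units).

def pressure_tendency(mass_flux, dx):
    """Return dps/dt at interior grid points as minus the flux divergence."""
    return [-(mass_flux[i + 1] - mass_flux[i - 1]) / (2.0 * dx)
            for i in range(1, len(mass_flux) - 1)]

flux = [0.0, 1.0, 2.0, 1.0, 0.0]   # invented mass-flux profile
print(pressure_tendency(flux, dx=1.0))  # [-1.0, 0.0, 1.0]
```

Richardson’s famous error came not from this bookkeeping but from feeding it unfiltered initial data, so the computed convergence contained spurious fast oscillations.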


The Earth showing sea-level pressure or Rossby wave depressions (blue) and crests (orange), mapped using the grids of longitude and latitude on December 27, 2010. The blue color (low pressure) indicates the locations of storms, and the fluctuations from blue to red are atmospheric waves. The red solid lines highlight North America. Courtesy of the author.


Since the 1950s, computer models have been widely used in meteorology and climate science. There are generally two types of computer models: those that focus on short-term (e.g., a week) weather prediction in operational numerical weather forecasting, and those that concentrate on long-term (e.g., a century) climate prediction in the models of the Intergovernmental Panel on Climate Change (IPCC) Assessment Reports. Both weather forecast models and climate models numerically apply the basic laws of mathematics, physics, and chemistry to describe the atmosphere and the other components of the Earth’s climate system.

The atmosphere fluctuates from day to day, and this fluctuation imprints on the pressure at sea level, which in turn reflects the fluctuation of the weight of air masses above. Meteorologists generally map the global distribution of this pressure by longitude and latitude.

The roughly daily fluctuation between the high (red) and low (blue) pressure is analogous to the crests and troughs of traveling waves after a stone is thrown into a lake: the low-pressure center indicates the location of a storm. These Rossby waves are key components of the atmosphere; their behavior provides the primary source of predictability for weather patterns.


The atmosphere is divided into finite grid boxes by longitude, latitude, and height; the discretized differential equations predict the changes of energy, mass, momentum, water, and chemical species within each grid box and the exchanges among neighboring boxes.

Inside each grid box, radiation and chemical reactions are calculated from the laws of physics and chemistry. For example, the emission, transfer, scattering, and absorption of the solar radiation and terrestrial infrared emission are calculated from the physical laws of electromagnetic waves traveling through the atmosphere. Meanwhile, the absorption of radiation by greenhouse gases such as water vapor or carbon dioxide is determined by their molecular structures and chemical characteristics, which have been established in chemistry.

Additionally, some subgrid scale processes are empirically parameterized (i.e., subgrid parameterization) on the basis of the notion that a collection of random events can produce a predictable average effect from their common characteristics. While a circulation larger than the size of a model grid box can be directly calculated by the governing equations, a circulation smaller than the size of a model grid box has to be empirically calculated based on the general characteristics of the specific grid box. An example from electoral prediction can illustrate this point: while it is difficult to predict which party any individual citizen will vote for in a presidential election, it is possible to predict the winning party for a given state, based on its demography.
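The election analogy can be made concrete with a toy Monte Carlo sketch: each simulated vote is random, yet the statewide share converges to a predictable average, just as a parameterization predicts the average effect of many unresolved events. The 55 percent lean assumed below is an invented number.

```python
# Individual random events (votes) are unpredictable, but their average
# over a large collection is not: it converges to the underlying rate.
import random

random.seed(0)

p_party_a = 0.55                      # assumed lean of the state's demography
votes = [random.random() < p_party_a for _ in range(100_000)]
share = sum(votes) / len(votes)
print(round(share, 2))                # close to 0.55: party A predicted to win
```

A subgrid parameterization plays the same role: it does not track each unresolved eddy or cloud, only their statistically predictable collective effect on the grid box.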


Waves in a Large Free Sphere of Water, Don Pettit. These images of a sphere of water in zero gravity show the behavior of a single surface wave when a puff of air is blown across its surface. Multiple wave occurrences happen on the Earth simultaneously, resulting in a more dynamic and chaotic system than this demonstration. Courtesy of NASA.


Analogously, clouds, whose sizes are often smaller than a model grid box, can be represented in computer models by parameters such as size and height, the number of water droplets in the cloud, and other characteristics. The relationships between clouds and these cloud parameters are determined by field campaigns using aircraft, by satellite observations, or by cloud-resolving numerical models, and are then used in calculating the radiative budget of the atmosphere and the microphysical processes behind rainfall or snowfall. Because it is impossible to describe these cloud characteristics mathematically with precision (their sizes range from a single water droplet to a cloud system several hundred kilometers across), subgrid-scale parameterizations are a major source of uncertainty in these models.

The use of computer modeling for prediction prompted the intriguing realization that a more nuanced understanding of the interrelatedness of atmospheric inputs was necessary for the accuracy of the results. Analysis of computer modeling results showed that a forecast outcome using deterministic physical laws is very sensitive to the initial conditions of a numerical calculation. With slightly different initial conditions or a small error in monitoring the current state of the atmosphere, the forecast results for a future weather pattern may be entirely different due to the chaotic nature of the atmosphere. Consequently, weather prediction capacity shifted from being deterministic to being probabilistic, by allowing a small range of errors in monitoring the initial state of the atmosphere. Deterministic methods relied on the accuracy of the input in the algorithm to produce the prediction: mathematically speaking, given a specific input, the same output will always result. Probabilistic methods attempt to incorporate uncertainty into the algorithm by using logical probabilities instead of crisp true-false values, and through the inclusion of degrees and ranges of interpretation based on previous occurrences.
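A probabilistic forecast of this kind can be sketched as an ensemble: the same toy “model” is run many times with random perturbations, and the forecast is reported as the fraction of runs producing the event. The model, its instability parameter, and the threshold below are invented stand-ins, not any operational system.

```python
# Ensemble sketch: perturb the inputs, run many members, report the
# fraction of members that produce the event as a probability.
import random

random.seed(1)

def toy_member(base_instability):
    """Hypothetical stand-in for one perturbed model run:
    returns True if this member develops a storm."""
    return base_instability + random.gauss(0.0, 1.0) > 1.5

members = [toy_member(1.0) for _ in range(1000)]
storm_probability = sum(members) / len(members)
print(f"chance of storm: {storm_probability:.0%}")
```

No single member is trusted as the truth; the spread of the ensemble is itself the forecast of how uncertain the future state is.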

Finite grid box divisions are based on latitude, longitude, and height. Weather forecast models and climate models are the numerical discretization of the basic laws of mathematics, physics, and chemistry for the atmosphere and other components of the Earth’s climate system. Courtesy of Alison Nash.



From Newton’s second law of motion, the acceleration of an air parcel (a) is determined by the total force (F) exerted on its mass (m): a = F/m. If we were certain of the initial conditions and knew the exact forces acting on the atmosphere, the changes in air motion could be predicted without error. However, errors in the initial conditions are inevitable, an inherent part of using observational instruments. In a chaotic system like the atmosphere, these errors compound exponentially, which is why individual weather systems cannot be accurately predicted beyond a couple of weeks.
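The compounding of initial errors can be illustrated with a rough back-of-the-envelope model: assume the error doubles at a fixed rate until it saturates at the size of the weather fluctuations themselves. Both the two-day doubling time and the saturation level below are illustrative assumptions, not measured values.

```python
# Rough model of chaotic error growth: exponential doubling up to saturation.

def error_after(days, initial_error, doubling_days=2.0, saturation=1.0):
    """Error amplitude after `days`, capped at saturation (arbitrary units)."""
    return min(initial_error * 2.0 ** (days / doubling_days), saturation)

# A 0.1% initial error stays small for a week, then overwhelms the forecast.
for d in (0, 7, 14, 21):
    print(d, error_after(d, initial_error=0.001))
```

Under these assumptions the error grows a thousandfold in about three weeks, matching the essay’s point that individual weather systems are unpredictable beyond a couple of weeks.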

The atmosphere can be predicted, to some extent, by a traditional method: extrapolating traveling Rossby waves forward in time. The holiday blizzard of December 27–29, 2010 illustrates this method.

December 27–29, 2010 sea-level pressure in the North America region highlighted in red previously. Note a major storm in blue on the East Coast of the United States. The arrows describe the magnitude and direction of surface winds. Courtesy of the author.

A notable storm (blue) is seen to move northeastward along the East Coast of the United States. A jet stream can be identified along the eastern coastline, guiding the motion of the storm. The arrival of a storm may be predictable by an extrapolation of past motion forward in time, similar to the way the arrival of a train can be calculated by a departure-time train schedule if the distance and speed are known.

However, the accuracy of this prediction is degraded by the chaos inherent in the atmosphere. Weather systems build from the potential energy supplied by differential solar heating between the equator and the poles, and atmospheric waves occasionally break down nonlinearly when their amplitude saturates, in a process similar to waves breaking at the shore, rendering the atmosphere chaotic.

A further challenge to prediction is the simultaneously linear and nonlinear nature of the atmosphere, a characteristic that makes weather prediction possible but inherently uncertain. As in the previous example, the speed of the traveling Rossby waves provides a source of predictability; however, the waves occasionally break down when their amplitude saturates, and the source of predictability is lost. The conditions for these wave breakings are highly nonlinear and unpredictable, and are evidence of the chaos inherent in the atmosphere.


Edward Norton Lorenz (1917–2008) at MIT first pointed out the sensitivity of deterministic numerical weather forecast methods to initial conditions. In a forecast experiment in the 1960s, he found that a tiny change of a non-dimensionalized initial condition from the full .506127 to .506 resulted in a completely different weather scenario. Commenting on this observation, he wrote that “one meteorologist remarked that if the theory were correct, one flap of a seagull’s wings could change the course of weather forever.”[2] Known popularly as the “butterfly effect,” this result can be illustrated in two computer simulations. The computer models are identical, except for a slight difference in the initial conditions.
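Lorenz’s experiment is easy to reproduce with the three-variable system from his 1963 paper. The sketch below uses a simple forward-Euler integration (the step size and the other initial values are chosen for illustration) and shows that truncating one initial value from .506127 to .506 eventually produces an order-one divergence between two otherwise identical runs.

```python
# Two runs of the same deterministic Lorenz (1963) equations, differing
# only in the truncation of one initial value, diverge completely.

def lorenz_step(x, y, z, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz 1963 system."""
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

def max_separation(steps=3000):
    p = (0.506127, 1.0, 1.05)   # full initial condition
    q = (0.506, 1.0, 1.05)      # truncated, as in Lorenz's account
    sep = 0.0
    for _ in range(steps):
        p = lorenz_step(*p)
        q = lorenz_step(*q)
        sep = max(sep, abs(p[0] - q[0]))
    return sep

print(max_separation() > 1.0)   # True: a 1e-4 difference grows to order one
```

The equations themselves are perfectly deterministic; it is the exponential amplification of the 0.000127 truncation that destroys the forecast.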

Two computer simulations show the sensitivity of the evolution of storms to initial states of the atmosphere. The figure shows the sea-level pressure (color) and surface winds (arrows) on days 1, 10, and 20 from very similar storm-free quiescent initial states. The two computer models are identical, except for a slight difference in initial conditions; the resulting duration of the storms in the bottom panel is longer. Courtesy of the author.

Starting from very similar storm-free quiescent states of the atmosphere, the storms observed on day 10 are much alike. The weather patterns are completely different on day 20: the overall duration of the storms in one simulation is much longer. The chaotic nature of the atmosphere presents challenges to accurate prediction of weather patterns on day 20, only using methods of extrapolation from the weather data on day 10.

As a result, while the atmosphere is fundamentally governed by deterministic Newtonian laws, the forecast of the atmosphere is generally expressed in terms of probability or chance. For instance, the National Weather Service may announce that the probability of a thunderstorm tomorrow is 10 percent, or the IPCC Fourth Assessment Report concludes that global warming in the past few decades is “very likely” caused by humans.


In practice, multiple weather forecast experiments are carried out to estimate the effect of small errors in the initial conditions (e.g., the flapping of a butterfly’s wing) on the forecast results, and the predicted weather is expressed as a probability of future weather patterns. On a larger scale, a mean climate is calculated from a collection of chaotic weather events. We know with certainty, for instance, that the summer of 2050 will be warmer than the winter of that year, because the summer hemisphere receives more solar radiation than the winter hemisphere. Likewise, a change in atmospheric radiation due to increased greenhouse gases will alter the energy balance of the Earth’s climate, allowing atmospheric scientists to forecast future climate change in spite of the chaotic nature of the atmosphere. Errors in the parameterized processes, such as the geographic distribution of clouds, nevertheless affect the accuracy of a model’s simulated future climate. For example, a bias in global cloud coverage can affect the albedo (i.e., the fraction of incident radiation reflected by a surface) of the Earth. The resulting change in the radiative balance affects the weather systems that transport energy from the low latitudes to the polar regions, balancing the differential radiative heating from the sun.

Improved resolution of computer models shows more detailed information for small-scale atmospheric features. Courtesy of the author.

The primary challenge in predicting weather and climate lies in the uncertainties in the parameterized processes and the propagation of the uncertainties in a chaotic atmosphere over time. These uncertainties arise not only because we cannot observe the atmospheric conditions exactly due to lack of instrumental precision, but also because the atmospheric phenomena of interest range from the planetary scale to the size of a cloud water-droplet, which is impossible to resolve exactly in a computer model. These inaccuracies may compound in a nonlinear system, leading to different states in a few weeks. However, a mean climate can be obtained during a time average due to the dominant control of the radiative balance of the climate system rather than individual random events.

The accuracy of the computer model has improved dramatically over the past few decades, thanks to the exponential increase of computing power. One immediate consequence of increased realism is that more small-scale atmospheric features can be resolved in a higher-resolution model with a smaller size of grid box.

It is expected that extreme weather events, such as winter blizzards, hurricanes, or heat waves, will be better represented in computer models with finer resolution. Representing these events is a major challenge to the current generation of computer models. Since meteorologists generally agree that extreme weather will become more likely as a consequence of climate warming, enhanced computer models must continue to improve our ability to predict weather patterns in order to keep pace with the growing uncertainty of the weather.


Endnotes

1. Peter Lynch, “Margules’ Tendency Equation and Richardson’s Forecast,” Weather 58 (May 2003): 186–193.

2. Edward N. Lorenz, “Deterministic Nonperiodic Flow,” Journal of the Atmospheric Sciences 20, no. 2 (March 1963): 130–141.



