Last time, we saw that some mathematical systems are so sensitive to initial conditions that even very small uncertainties in their initial state can snowball, causing very similar states to evolve very differently. The equations describing fluid turbulence are an example of such a system; Lorenz’s discovery of this extreme sensitivity to initial conditions ended hopes for long-term weather forecasting. Because the state of the weather can only be known so precisely, small errors and uncertainties quickly build up, rendering weather simulations useless for looking more than a few days ahead.

But Lorenz’s discovery doesn’t have much impact on climate modelling, contrary to the claims of some climate skeptics. Climate is not weather, and modelling is not forecasting.

Weather refers to the state of the atmosphere at a particular time and place: What temperature is it? Is it raining? How hard is the wind blowing, and in which direction? Climate, on the other hand, is defined in terms of the statistical behavior of these quantities:

“Climate in a narrow sense is usually defined as the average weather, or more rigorously, as the statistical description in terms of the mean and variability of relevant quantities over a period of time ranging from months to thousands or millions of years. […] Climate change refers to a change in the state of the climate that can be identified (e.g., by using statistical tests) by changes in the mean and/or the variability of its properties, and that persists for an extended period, typically decades or longer.” – IPCC

Many climate skeptic talking points derive from confusing these two concepts, in much the same way that a gambler might win a few hands of poker and decide that they are on a roll, mistaking a short-run fluctuation for the long-run odds.

Although it is generally not possible to predict a specific future state of the weather (there is no telling what temperature it will be in Oregon on December 21, 2012), it is still possible to make statistical claims about the climate (it is very likely that Oregon’s December 2012 temperatures will be colder than its July 2012 temperatures, and very likely that the reverse will be true in New Zealand). It is safe to conclude that precipitation will be more frequent in the Amazon than in the Sahara, even if you can’t tell exactly when and where that rain will fall.

In fact, Lorenz’s groundbreaking paper, ‘Deterministic Nonperiodic Flow’, would seem to endorse this sort of statistical approach to understanding fluid dynamics:

“Because instantaneous turbulent flow patterns are so irregular, attention is often confined to the statistics of turbulence, which, in contrast to the details of turbulence, often behave in a regular well-organized manner.” (Lorenz 1963)

Let’s take a closer look.

Fig. 1. Three solutions of the Lorenz equations, starting at virtually identical points. Although the solutions are similar at first, they rapidly decouple around T=12.

The Lorenz equations consist of three variables describing turbulent fluid flow (X, Y, and Z) and three controlling parameters (r, b, and s). They are differential equations, meaning that each variable is described in terms of how it changes over time: ‘Johnny is driving west at 60 miles per hour’ is a simple differential equation. To solve a differential equation, you also need an initial condition. ‘Johnny started in Chicago’ is an initial condition; without knowing that, you can’t say where Johnny will be after driving for three hours.
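As a toy illustration of that idea, here is a hypothetical snippet in Python; the function and its numbers are just the Johnny example restated, not anything from the original analysis:

    # Johnny's trip as a differential equation: position' = 60 (miles per hour, heading west).
    # The equation alone only says how the position changes; the initial condition
    # ("Johnny started in Chicago", taken here as position 0) pins down where Johnny actually is.

    def miles_west_of_chicago(hours, start=0.0, speed_mph=60.0):
        """Solve position' = speed_mph with position(0) = start."""
        return start + speed_mph * hours

    print(miles_west_of_chicago(3.0))  # 180.0: three hours of driving puts Johnny 180 miles west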

Here are the equations:

 X’ = s * (Y – X)

 Y’ = X * (r – Z) – Y

 Z’ = X * Y – b * Z

… where V’ is the time derivative of variable V.
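For concreteness, here is a minimal sketch of how these equations can be integrated numerically in Python. This is not the code behind the figures (that came from TitanLab-derived software); the solver choice, time span, and tolerances are just illustrative assumptions.

    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, state, s=10.0, r=28.0, b=8.0 / 3.0):
        """Right-hand side of the Lorenz equations: returns (X', Y', Z')."""
        x, y, z = state
        return [s * (y - x), x * (r - z) - y, x * y - b * z]

    # Integrate one trajectory from t = 0 to t = 50 with the canonical parameters.
    t_eval = np.linspace(0.0, 50.0, 5001)
    sol = solve_ivp(lorenz, (0.0, 50.0), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
    X, Y, Z = sol.y  # three time series, one per variable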

It’s ironic that the same people who criticize climate models for their supposed lack of realism are relying on results derived from such an austere model, described by James Gleick:

‘The sun beat down through a sky that had never seen clouds. The winds swept across an earth as smooth as glass. Night never came, and autumn never gave way to winter. It never rained.’ – (Gleick 1987 p. 11)

Fig. 1, above, demonstrates this model’s sensitivity to initial conditions. Even though the three simulation runs start off almost exactly the same, they slowly drift, and then spontaneously fly apart, quickly losing any similarity to each other. This quick catastrophic divergence might seem like it would doom a climate model, but remember that climate is defined statistically, and even very dissimilar runs can have consistent statistical behavior.
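That divergence is easy to reproduce numerically. Here is a rough sketch, reusing the integration above; the 1e-8 perturbation, the output times, and the tolerances are arbitrary choices of mine, not the settings behind Fig. 1:

    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, state, s=10.0, r=28.0, b=8.0 / 3.0):
        x, y, z = state
        return [s * (y - x), x * (r - z) - y, x * y - b * z]

    t_eval = np.linspace(0.0, 40.0, 4001)
    # Two starting points that differ by one part in a hundred million.
    run_a = solve_ivp(lorenz, (0.0, 40.0), [1.0, 1.0, 1.0],
                      t_eval=t_eval, rtol=1e-10, atol=1e-10)
    run_b = solve_ivp(lorenz, (0.0, 40.0), [1.0 + 1e-8, 1.0, 1.0],
                      t_eval=t_eval, rtol=1e-10, atol=1e-10)

    # Distance between the two trajectories in (X, Y, Z) space at each output time.
    separation = np.linalg.norm(run_a.y - run_b.y, axis=0)
    for t in (5.0, 15.0, 25.0, 35.0):
        i = np.searchsorted(t_eval, t)
        print(f"t = {t:4.1f}   separation = {separation[i]:.3g}")
    # The separation grows roughly exponentially until it saturates at the size of
    # the attractor, after which the two runs bear no resemblance to one another.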

Fig. 2. The mean value of the variable Z in the Lorenz equations. At each value of the Rayleigh number, the Lorenz system is solved with 10 different starting conditions, with s = 10 and b = 8/3.

Here’s an example. In Fig. 2, I’ve varied the parameter r (more on that in a second – for now, concentrate on the canonical Lorenz value of r = 28) and solved the Lorenz equations with a number of different initial conditions. Then I found the time-averaged value of the variable Z (interpreted as a measure of the heat profile in the turbulent fluid). The values of mean(Z) are very tightly clustered, meaning that the initial conditions didn’t influence the average very much.
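Here is a sketch of that experiment; the random initial-condition range, integration length, and spin-up cutoff below are my own guesses at reasonable values, not the settings used for Fig. 2.

    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, state, s=10.0, r=28.0, b=8.0 / 3.0):
        x, y, z = state
        return [s * (y - x), x * (r - z) - y, x * y - b * z]

    rng = np.random.default_rng(0)
    t_eval = np.linspace(0.0, 100.0, 10001)

    means = []
    for _ in range(10):
        start = rng.uniform(-10.0, 10.0, size=3)       # a random initial condition
        sol = solve_ivp(lorenz, (0.0, 100.0), start, t_eval=t_eval, rtol=1e-8)
        means.append(sol.y[2, 1000:].mean())           # mean Z, skipping t < 10 as spin-up

    print(np.round(means, 2))  # ten very similar numbers: mean(Z) barely depends on the start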

On the other hand, the mean is strongly determined by the parameter r. r is called the Rayleigh number (Strogatz 1994), and it describes how easily a fluid will convect. Similarly, climatic quantities such as globally averaged temperature are determined by system parameters such as greenhouse gas concentrations much more than they are by initial conditions.

Fig. 3. The standard deviations of the Lorenz variable X across an ensemble of simulations, as a function of the Rayleigh number. They're not as tightly clustered as the means were, but they still appear well-behaved - the 'climate' of the Lorenz model can be described in spite of chaos!

Here’s another example. I’ve run the simulations again and taken the variable X (interpreted as the intensity of convection), then calculated the standard deviation of each time series. At a given Rayleigh number (e.g., 28), std(X) is not as tightly clustered as the average was in Fig. 2, so initial conditions play a larger role here. Still, the statistics of X are clustered fairly tightly, and the parameter r exerts a strong control on them. There is a clear increase in std(X) over 20 < r < 45, and a tight linear increase over 35 < r < 45. The dependence on initial conditions also appears weaker for 35 < r < 45, and there seems to be a significant dip in 30 < r < 35. We can also see a sharp transition around r = 25; this corresponds to a critical value of the Rayleigh number, ~24.74, the point at which the stability of steady convection changes (Lorenz 1963).
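A sketch of the same kind of calculation behind Fig. 3; once again the ensemble size, r values, and spin-up cutoff are illustrative choices rather than the original settings.

    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, state, s, r, b):
        x, y, z = state
        return [s * (y - x), x * (r - z) - y, x * y - b * z]

    rng = np.random.default_rng(0)
    t_eval = np.linspace(0.0, 100.0, 10001)

    for r in (20, 25, 28, 35, 40, 45):
        stds = []
        for _ in range(10):
            start = rng.uniform(-10.0, 10.0, size=3)
            sol = solve_ivp(lorenz, (0.0, 100.0), start, t_eval=t_eval,
                            args=(10.0, r, 8.0 / 3.0), rtol=1e-8)
            stds.append(sol.y[0, 1000:].std())         # std of X, skipping spin-up
        print(f"r = {r:2d}   std(X) across the ensemble: {np.round(stds, 2)}")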

I took a chaotic model, ran several simulations with different starting conditions, and considered the statistical properties of the ‘ensemble’ of all the simulations. This procedure is standard in climate modelling. My favorite example is in “Is the climate warming or cooling?” (Easterling & Wehner 2009). This paper explores the question: does a decade-long downturn in global temperatures indicate that global warming has stopped? To answer it, the authors looked at a 20th century temperature record, observing that it contains several ten-year intervals with no trend, or even a slight cooling trend, despite the long-term warming. (Fig. 4; this might sound familiar 😀)

Fig. 4. The probability that a given decade will have a particular trend. These probabilities are calculated from ensembles of model runs and 20th century temperature reconstructions. As you'd expect, the preindustrial simulations, lacking today's greenhouse forcing, show equal probabilities for warming and cooling trends on the decadal scale. Observed and predicted climate change shifts the probabilities towards warming, but the odds of a decade showing no trend, or even a cooling trend, remain significant. Notice also the good agreement between the statistics derived from the model runs for the 20th century and those derived from the instrumental record. Source: Easterling & Wehner (2009).

They also took an ensemble of climate simulations – some hindcasts which replayed variations of the preindustrial era or the 20th century, some forecasts of the 21st century – and they looked at the probability of a decade having a particular trend (a statistical property of the collection). They found (Fig. 4):

“The observed record shows a very similar distribution to the 20th century simulations, especially considering that only one version of the observed record was used in this analysis adding credence to the conclusions in the IPCC AR4 that the observed warming since 1950 is very likely due to increasing greenhouse gases. Finally for the simulations of the entire 21st century there is still about a 5% chance of a negative decadal trend, even in the absence of any simulated volcanic eruptions. If we restrict the period to the first half of the 21st century the probability increases to about 10% revealing that the trend in surface air temperature has its own positive trend in the A2 emissions scenario.”
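The flavor of that trend-counting calculation is easy to sketch with synthetic data. Everything below (the ensemble size, the assumed warming rate, the white-noise variability) is made up for illustration; the actual analysis uses real climate model ensembles and the observed record.

    import numpy as np

    rng = np.random.default_rng(0)

    # A hypothetical stand-in for an ensemble: 100 runs of annual global-mean
    # temperature anomalies with a built-in warming trend plus year-to-year noise.
    years = np.arange(2001, 2101)
    warming_per_year = 0.02                            # assumed trend, deg C per year
    ensemble = (warming_per_year * (years - years[0])
                + rng.normal(0.0, 0.1, size=(100, years.size)))

    # Fit a linear trend to every decade-long window in every run, and count how
    # often the fitted slope is negative despite the warming built into the data.
    negative, total = 0, 0
    for run in ensemble:
        for start in range(years.size - 9):
            slope = np.polyfit(years[start:start + 10], run[start:start + 10], 1)[0]
            negative += slope < 0
            total += 1

    print(f"Decade-long windows showing a cooling trend: {negative / total:.1%}")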

The Lorenz equations are a canonical example of extreme sensitivity to initial conditions. However, their chaotic nature is not an obstacle to making statistical claims about their behavior. Climate is defined statistically, so the fact that weather is chaotic is not a barrier to climate modelling.

~~~
there will be a Lorenz attractor next time, I promise…
~~~

Easterling, D., & Wehner, M. (2009). Is the climate warming or cooling? Geophysical Research Letters, 36(8). DOI: 10.1029/2009GL037810

Gleick, J. (1987). Chaos: Making a New Science.

Lorenz, E. N. (1963). Deterministic Nonperiodic Flow. Journal of the Atmospheric Sciences, 20(2).

Strogatz, S. H. (1994). Nonlinear Dynamics and Chaos.
Lorenz equations modeled with software adapted from that available from TitanLab.