Tag Archive: climate change

There is a companion article exploring the issue from the perspective of environmental monitoring over at ArkFab.

Human influence on the environment has increased dramatically over the last 10,000 years, to the point that some geologists have argued that human reworking of the earth defines a new geologic age, The Anthropocene. (Zalasiewicz et al, 2008) Much of the focus has been on relatively robust, tangible changes in biogeochemistry. Examples include:

  • megafaunal extinction, accelerated erosion (Zalasiewicz et al, 2008) and nitrogen fixation resulting from the spread of intensive subsistence patterns
  • the loss of stratospheric ozone resulting from the release of novel chlorofluorocarbons

However, fleeting and less tangible effects are also important. Two examples are:

  • the light pollution resulting from urbanization and transportation infrastructure
  • changes in the acoustic environment resulting from direct addition of sonic energy and memes, as well as indirect sources.

A year-long composite view of the earth at night, showing human light generation. White lights are cities; blue lights are fishing boats; green lights are natural gas flares, and red lights are ‘ephemeral light sources’, interpreted as fires. Image from NOAA National Geophysical Data Center.

Light pollution, the scourge of urban astronomers, is a well-accepted phenomenon with serious consequences. A 2004 review begins:

In the past century, the extent and intensity of artificial night lighting has increased such that it has substantial effects on the biology and ecology of species in the wild. We distinguish “astronomical light pollution”, which obscures the view of the night sky, from “ecological light pollution”, which alters natural light regimes in terrestrial and aquatic ecosystems. Some of the catastrophic consequences of light for certain taxonomic groups are well known, such as the deaths of migratory birds around tall lighted structures, and those of hatchling sea turtles disoriented by lights on their natal beaches. The more subtle influences of artificial night lighting on the behavior and community ecology of species are less well recognized, and constitute a new focus for research in ecology and a pressing conservation challenge. (Longcore & Rich 2004)

The amount of sonic energy released by human activity is recognized as an urban nuisance as well as an occupational safety concern. It also has recognized ecological effects: urban European robins have begun singing at night, when they face less acoustic competition (Fuller et al 2007), and frogs have raised the pitch of their croaks in order to talk over traffic noise (Parris et al 2009). In addition to sonic energy, human activity has released sonic memes into the environment. A meme is a self-replicating information pattern; jokes and computer viruses are two examples of memes. A person or computer acquires a meme and then spreads it, through retelling or infected emails. Sonic memes, such as ambulance sirens and cellphone ringtones, have been picked up and repeated by songbirds. (Stover 2009) This is very interesting: human memes, a concept introduced by Richard Dawkins, have organically extended into other animals’ ‘extended phenotype’, another Dawkins coinage. (Recent reports of dolphins mimicking human speech are also very interesting in this context. The reverse flow also occurs, as animal communications are repackaged as ringtones or ambient music.)

Continue reading

I had thought that once I graduated college, annoying student publications would quit being so… annoying. Alas, this isn’t the case. A previous article examined the quality of analysis at the Carolina Review, UNC’s ‘journal of conservative thought and opinion’; let’s see if things have improved any in the handful of years that I’ve been away.

Okay, checking their blog… mhmm… skim the headlines, clickety clicky….

… oh sweet Cthulhu, rise from your watery slumber and please make it stop.

The linked article describes environmentalism as factually challenged and lacking a vision of “the overall big picture”; let us categorically examine the main evidence presented in support of this thesis:

  1. “global warming, or climate change, or whatever they feel like calling it now” has been grossly exaggerated.
  2. Lighter cars are inherently more dangerous than gas-guzzlers.
  3. Recycling is bad.
  4. Fossil fuels can be greenwashed.

Ready? Let’s go.

“Why is [head of NASA's GISS program and accomplished geophysicist Dr. James] Hanson [sic] so important?” – Carolina Review columnist Alex Thomas

I was disappointed by the coverage of climate change. I expected it to be lousy, and it was, but I didn’t expect it to be so… unsatisfying. The only evidence presented is the claim that Dr. Hansen’s 1988 congressional testimony was critically flawed, greatly overestimating the amount of temperature change to come. This is a PRATT, a Point Refuted A Thousand Times, so my treatment will be a bit superficial. (For more detail, read this.)

Some of Hansen’s scenarios gave realistic predictions, and some didn’t. The real question is why.

A climate simulation isn’t a magickal box that spits out numbers. In order to run it, you have to input certain parameters, like how bright the sun is, the greenhouse gas concentrations, and so on. For the past you might have direct measurements or proxy records; the future is not only unwritten, but contingent upon human agency. So you have to come up with plausible scenarios for what’s coming. Maybe we cut down on fossil fuel usage; maybe we ramp it up; maybe we relax clean air standards; maybe we have a nuclear war. You run the scenarios you’re interested in on climate models, and you compare, contrast, and interpret the output. One of the scenarios that Dr. Hansen used (“Scenario A”) overestimated greenhouse gas emissions – but not carbon dioxide. Scenario A assumed that we would continue to emit CFCs, which are potent greenhouse gases. Because they threatened the ozone layer, CFCs were phased out under the Montreal Protocol, which went into effect in 1989 – the year after Hansen’s testimony. Nowhere in the Carolina Review article do we hear about such confounding factors, nor about the general success of government regulation in cutting down on ozone depletors. Nor is there mention that Scenarios B and C match observations well (see above), or that Hansen’s 1981 predictions were freakishly accurate. Also, why is Dr. Hansen important? Because he was an adviser to Al Gore, of course!

“Usually investigators only present and discuss the risk to occupants of the car or truck in question—as if society at large has no stake in the mayhem caused by some vehicles as long as those riding in them aren’t themselves killed.” – Wenzel and Ross 2008

Continue reading

I sit at the Carrboro Really Free Market, on the first caturday in July. I sit in the shade and the banners are blowing lazily in the breeze; still it’s nearly 100 degrees; the humidity jacks it up to 103, and the breeze is welcome but ineffectual. Air quality is ‘Orange’: ozone levels ‘may approach or exceed unhealthy standards.’ A parade is planned but only a handful want to move; I’m definitely not going back out. I keep a cold pack in my bag to refrigerate my computer, but I worry that the condensation from the humid air will offset the benefits of a cool processor. Whatever; I need chill tunes if I’m going to bike around in this weather.

A constant source of frustration for me is communicating the local importance of global problems. Climate change is real, and it’s serious – but at the same time it can be intangible and diffuse. I live in the North Carolina piedmont, hours away from the beach. I can explain to my neighbors that ocean acidification is a serious problem, that the demise of coral reefs would mean the loss of food and resources for the third world. But even if they believe me, even if they agree that it’s bad news, it can still be hard to see how global warming affects them personally, as a homeowner, a farmer, a pet owner or the parent of a young child, a worker with a daily commute. How does carbon dioxide pollution impact North Carolina and beyond?

rock me momma like the wind and the rain//rock me momma like a hurricane

Let’s start at the beach. An obvious problem here is rising sea levels. As the ocean heats up, it expands; as ice heats up, it melts and drains into the sea (or it calves, falls into the sea, and then melts). This causes a slow but steady rise in sea level. Sea level is predicted to rise by a meter (maybe more) over the 21st century, and 4-6 m over the next few centuries. This is bad news bears – in many coastal counties, more than 10% of the population lives within a meter of high tide. The threat to homes and businesses is worsened by storm surges, which will also be higher as the seas rise [Strauss 2012]. North Carolina has a unique relationship with sea level rise. The coastal salt marshes have recorded 2,100 years of sea level history in their smelly mucky sediments; the ocean stayed relatively stable up until about 1880, when it began to creep upwards. The average rate of sea level rise for the NC coast over the 20th century was ‘greater than any other persistent, century-scale trend’ in the marsh’s memory. During this time period, the seas rose 3.5 times faster than they did even during the Medieval Warm Period, and regional sea level rose faster than model predictions over the 20th century (though the uncertainties involved overlap). [Kemp et al. 2011]

Sea level rise at the North Carolina coast over the past two millennia. Things are pretty stable, even during climatic episodes like the MWP – until we get to the late 19th century. Then the hockey stick gets hockey stuck. GIA is glacial isostatic adjustment, an additional factor which must be considered. It deals with the fact that the North American landmass is still rebounding from the weight of Ice Age glaciers. Image from Figure 2 of Kemp et al. 2011

But what’s really special is the state legislature’s reaction to the rising tide. This June, the NC Senate infamously outlawed the use of accelerating sea level scenarios in planning urban development. The usual astroturfing seems to be at play: the money trail for this legislation leads back to the Locke Foundation; spokespeople and nonprofits proliferate to establish a consent factory. These hijinks are as cynical as they are asinine: not only is global sea level rise accelerating [Church & White 2006], but North Carolina is at the southern end of a ‘hotspot’ where the sea is rising 3-4 times as fast as the global average [Sallenger et al. 2012], putting its coastline at exceptional risk. The legislation is also a lovely inversion of a popular skuptik trope, that of an authoritarian scientific Orthodoxy dictating Truth and squelching dissidents. In this case, it’s the state government which has declared which climatic scenarios are kosher and which are thought crimes, favoring the least alarming. The proposed law would not merely declare what course sea level rise will take in the years to come, but also prohibit state planning agencies from considering alternatives. Not content to legislate straight marriage as the only valid relationship, the Old North State is considering straight lines as the only acceptable graph.
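The difference is not academic. Here is a minimal sketch, using invented numbers rather than the actual tide-gauge or marsh record, of how far apart a straight-line extrapolation and an accelerating fit can land by 2100 even when both describe the past reasonably well:

```python
import numpy as np

# Illustrative sea-level series (cm above a 1900 baseline): a modest
# linear trend plus a slight acceleration. These numbers are invented
# for demonstration, not taken from the NC data.
t = np.arange(0, 121)                 # years since 1900
level = 0.17 * t + 0.004 * t ** 2

lin = np.polyfit(t, level, 1)         # straight-line fit
quad = np.polyfit(t, level, 2)        # accelerating (quadratic) fit

proj_lin = np.polyval(lin, 200)       # extrapolate to the year 2100
proj_quad = np.polyval(quad, 200)
print(f"2100 projection, linear: {proj_lin:.0f} cm, accelerating: {proj_quad:.0f} cm")
```

Both curves track the historical portion closely; it is the extrapolation that diverges, which is exactly why mandating the straight line is a policy choice dressed up as mathematics.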

“You need to move indoors right now.”

Meteorologist Dr. Forbes, on Philadelphian extreme weather.

It’s Friday night, 29 June, and forecasts of a sweltering weekend have already started to come true. I am sifting through hardware at work when the power goes out. Continue reading

A while back, we started looking at a poorly thought-out article from the website C3Headlines. C3 is starting to make a name for itself as a goldmine of climate comedy; their claims have recently been addressed at Tamino and SkepticalScience.

We’re going to keep digging into C3’s claim that carbon dioxide concentrations have been increasing linearly over the 20th century. They seem to have drawn this claim from eyeballing the graph of CO2 concentrations and qualitatively describing it as linear, apparently using the inset in their first figure to compare linear, quadratic, and exponential trends. This is a faulty method: it’s an elementary fact of calculus that ANY smooth curve, when viewed at an appropriate scale, will appear linear. The point has already been made, but it’s worth looking further, because there are some interesting graphical follies at play; examining them might help us understand how and why graphs are misunderstood.
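That calculus point is easy to demonstrate numerically. The sketch below (my own illustration, not C3’s method) measures how far an exponential curve strays from a straight chord as the viewing window shrinks:

```python
import numpy as np

# How far does a smooth curve deviate from a straight line drawn
# across a window? Measured relative to the data range in that window.
def max_relative_deviation(f, center, half_width, n=1001):
    x = np.linspace(center - half_width, center + half_width, n)
    y = f(x)
    # Chord (secant line) connecting the window's endpoints:
    chord = y[0] + (y[-1] - y[0]) * (x - x[0]) / (x[-1] - x[0])
    return np.max(np.abs(y - chord)) / (np.max(y) - np.min(y))

for half_width in (2.0, 0.5, 0.1):
    dev = max_relative_deviation(np.exp, 1.0, half_width)
    print(f"window half-width {half_width}: deviation {dev:.3f} of range")
```

Zoom in far enough and even an exponential is indistinguishable from a line; whether a trend is linear is a question for statistics (fit comparisons, residuals), not for eyeballs.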

Figure 1: From C3Headlines’ article on “The Left/Liberal Bizarro Anti-Science Hyperbole”, which claims that CO2 concentrations are increasing linearly. Click to read it, if you dare…

C3‘s second graph in this article measures the change in atmospheric CO2 by calculating a month-to-month percentage change. It’s not entirely clear why they are using a percent change, rather than the standard practice of expressing rate of change as concentration change per year (like the source of their data uses). Whereas ppm/year is an absolute measure, each datum generated by the percentage-change method depends strongly upon the value of the previous month. As a measure of long-term rate of change, it is a bit questionable.
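A toy series makes the distinction concrete. Assuming, purely for illustration, CO2 that climbs at a perfectly steady 2.4 ppm/year, the absolute rate is constant while the month-over-month percentage steadily shrinks, simply because the denominator grows:

```python
import numpy as np

# Synthetic CO2: a perfectly steady rise of 2.4 ppm/year from 340 ppm
# (invented numbers, for illustration only).
months = np.arange(241)
co2 = 340.0 + 2.4 * months / 12.0

abs_rate = np.diff(co2) * 12                 # ppm/year (absolute)
pct_rate = np.diff(co2) / co2[:-1] * 100.0   # percent per month (relative)

print(abs_rate[0], abs_rate[-1])    # constant by construction
print(pct_rate[0], pct_rate[-1])    # drifts down as the baseline rises
```

The same physical growth rate thus reads as a ‘declining’ trend under the percentage metric, which is one reason ppm/year is the standard choice.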

My primary concern, though, is with their use of monthly data in the first place. In my last article, we noted that, without explanation, C3 confined their focus to January CO2 concentrations. Were they consistent, they’d also look at January rates of change – of course, doing so might lead to unacceptable conclusions.

 Figure 2. Rates of CO2 accumulation have been calculated for the month of January, consistent with earlier investigation of January CO2 concentration. Over the period of observation, rates have increased at a significant (P~0.0005) acceleration of 0.11 ppm/year^2. Monthly rates throughout this article have been calculated by considering the change in CO2 between adjacent months, and assuming that a month is 1/12 of a year. Interpolated values of CO2 were used to avoid annoying data holes early in the record.
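For transparency, the recipe from the caption can be sketched on synthetic data with a known, built-in acceleration (invented numbers, not the actual Mauna Loa record), to check that the method recovers it:

```python
import numpy as np

# Monthly CO2 with a built-in acceleration: the second derivative of
# 0.05 * t^2 is exactly 0.1 ppm/year^2.
years = np.arange(1959, 2011, 1 / 12)
t = years - 1959
co2 = 315.0 + 0.8 * t + 0.05 * t ** 2

# Rate from adjacent months, treating a month as 1/12 of a year:
rate = np.diff(co2) / (1 / 12)
jan = np.arange(len(rate)) % 12 == 0          # January-to-February steps
slope, intercept = np.polyfit(years[:-1][jan], rate[jan], 1)
print(f"recovered acceleration: {slope:.3f} ppm/year^2")
```

A linear fit to the January rates recovers the built-in acceleration, so the method itself is sound; the question is only what the real data say.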

Instead, they look at the rate of change for every single month on record. Why do I find that problematic? Well, let’s look at the full record, with monthly resolution: Continue reading

dry ice in occupied durham

And what,

you might be asking yourselves,

have they been doing all these recent months instead of writing high-octane science friction and science fact here on the intarwubs?

Frozen carbon dioxide turns directly into a gas. How sublime! The dry ice is so cold that it causes water vapor in the air to condense, forming a fog.

Answer: All sorts of zany things! During a recent Really Free Market hosted by Occupy Durham, I had the opportunity to do another chemistry show.  Like the demonstration in my CO2 Problems video, I used soapy water and phenol red pH indicator to help illustrate the properties of frozen carbon dioxide. The color change is particularly dramatic, and is a good tie-in to the environmental effects of CO2. The greenhouse effect seems harder to demonstrate effectively – if anyone has a good way of demonstrating the idea, let me know!


dry ice and phenol red, bubblin' away... { pix courtesy of Specious }

One thing I showed in this demo which wasn’t in CO2 Problems is the strange noises that dry ice makes in response to metal. If you try to cut a piece of dry ice with a knife, or press a paperclip into it, the ice will make a horrible screeching shriek. It’s most dramatic if you put a larger chunk of dry ice into a metal pot – it will scream and skitter around! My explanation? The warm, thermally conductive metal speeds up the sublimation of CO2 near its edge; the expanding gas pushes the metal away briefly and then the pressure buildup dissipates, bringing the metal back in contact with the ice. This oscillation makes the screeching noise. Try it out yourself and see if you think I’m right!

temperature aNOMalies

If you are new to climate science, you might be wondering what, exactly, this ‘temperature anomaly’ thing is that you keep hearing about. I know I was a bit confused at first! This post explains the concept, using a real-world example.

Absolute temperatures (yearly averaged) from two sites in the UK: one urban (St. James Park, green) and one rural (Rothamsted, red). Although the urban site is consistently warmer, the two sites show the same warming trend. But is there a way to compare them directly? Data from Jones et al. 2008, kindly provided by Dr. Jones.

Cities tend to be warmer than their surrounding countrysides, a fact known as the urban heat island effect (UHI). This is occasionally offered as an alternative explanation for greenhouse warming, but it fails on closer inspection. We can use data from Jones et al. (2008) [PDF] to see one reason UHI can’t explain observed warming. One time series is from St. James Park, in the city of London; the other is from Rothamsted, a rural site some tens of miles away. As you can see, the urban location is consistently about 2 C warmer; however, the warming is nearly identical at both sites (a strongly significant 0.03 deg C/year). Jones et al. note:

“… the evolution of the time series is almost identical. As for trends since 1961 all sites give similar values …  in terms of anomalies from a common base period, all sites would give similar values.”
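The quoted point is easy to verify in code. In this minimal sketch (invented temperatures with the same 2 C offset and 0.03 C/year trend, not the actual Jones et al. series), subtracting each site’s own mean over a common base period makes the offset vanish:

```python
import numpy as np

# Synthetic yearly temperatures: identical trend, urban site 2 C warmer.
years = np.arange(1961, 2007)
trend = 0.03 * (years - 1961)
rural = 9.0 + trend
urban = 11.0 + trend

# Anomaly = departure from that site's own 1961-1990 mean:
base = (years >= 1961) & (years <= 1990)
rural_anom = rural - rural[base].mean()
urban_anom = urban - urban[base].mean()

print(np.max(np.abs(urban_anom - rural_anom)))   # the 2 C offset is gone
```

In anomaly space the two records lie on top of each other, which is why anomalies are the natural currency for comparing stations with different absolute climates.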

This gives us a hint about what a temperature anomaly is: Continue reading

cnfusin rained and chas

Last time, we looked at a very simple atmospheric model known as the Lorenz equations, and saw it exhibit the ‘Butterfly Effect,’ in which even very small changes in initial conditions can dramatically affect which path the system takes. However, we also saw that the initial condition had a relatively small impact on the statistical properties of the system. Because climate is a statistical property of the earth system, asking
“How can we expect to predict future climate when we can’t predict the weather?”
is a lot like asking

“How can we claim to know the half-life of a radioactive element when we can’t predict when a given atom will decay?”

To those familiar with chaos, this shouldn’t come as a surprise. Lorenz didn’t just discover apparent disorder in his model, but a deeper, eerie structure lurking in the noise.

The Lorenz Attractor: wibbly-wobbly mess of the millennium. Three simulation runs (red, green, blue) are shown; they start close together but quickly spin off on different trajectories, demonstrating sensitivity to initial conditions. Nonetheless, the trajectories quickly converge on an intricate structure in the phase space, called an 'attractor'. The attractor doesn't vary with initial conditions, but is instead a feature of the Lorenz equations themselves. Image generated with code from TitanLab - click to check them out :)

You may remember that the Lorenz equations relate three variables (X, Y, Z), which vary over time. In the above image, I’ve plotted the evolution of three runs of the Lorenz model by putting a dot at each (X(t), Y(t), Z(t)) coordinate, at every time t in the given interval. The three runs start very close together in this three-dimensional ‘phase space’, but quickly diverge.

However, despite their different individual behaviors, these runs are confined to a structure in phase space, known as the Lorenz attractor – an attractor, because all trajectories converge on it, regardless of their initial conditions. If you perturb the system by bouncing it off the attractor, it quickly settles back into the same loops through phase space. Lorenz (1963) described it: Continue reading
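You can reproduce both behaviors with a few lines of numerical integration. This is my own minimal sketch (simple Euler steps at the classic parameter values s=10, r=28, b=8/3), not the TitanLab code used for the figure:

```python
import numpy as np

def lorenz_run(x, y, z, s=10.0, r=28.0, b=8.0 / 3.0, dt=0.001, steps=50000):
    """Integrate the Lorenz equations with simple Euler steps."""
    traj = np.empty((steps, 3))
    for i in range(steps):
        dx = s * (y - x)
        dy = x * (r - z) - y
        dz = x * y - b * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        traj[i] = (x, y, z)
    return traj

run1 = lorenz_run(1.0, 1.0, 1.0)
run2 = lorenz_run(1.0, 1.0, 1.000001)   # perturbed in the sixth decimal

# Butterfly Effect: the trajectories end up far apart...
print(np.max(np.linalg.norm(run1 - run2, axis=1)))
# ...yet both settle onto the same attractor, so long-run statistics
# (here, time-averaged Z after a spin-up period) nearly agree:
print(run1[25000:, 2].mean(), run2[25000:, 2].mean())
```

The individual paths are unpredictable; the statistics of the attractor are not. That asymmetry is the whole point of the weather/climate distinction.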

A companion article at ArkFab shares my thoughts on peer review in regards to this project and DIY/community/citizen science in general. 

At long last, the much-anticipated booklet, “CO2 Trouble: Ocean Acidification, Dr. Everett, and Congressional Science Standards” is available and approved for human consumption! Download and share HERE (or at Scribd HERE).

In this document, I have bundled, updated, and expanded my series of essays debunking the congressional testimony of Dr. John Everett regarding the environmental chemistry of carbon dioxide.

It has been designed to be a fairly short (less than 30 pages, including images, appendices, etc.) and accessible read. It has been challenging but fun to write; I have had to learn a lot about GIMP, Python, Scribus, social networking, and of course ocean acidification to get to this point.

It was also very useful for me as an opportunity to go back through my earlier remarks and double-check my work. For example, I later realized that the documentation which Dr. Everett provides for his CO2 data in part two is ambiguous: Although the citation for the rate data is referred to as “Recent Global CO2”, the URL provided links to the longer record as measured at Mauna Loa Observatory. This confusion had led me in the past to make incorrect claims about some of the figures he presents. Ultimately it was inconsequential to my argument, but it was frustrating to have to deal with such ambiguities. On the other hand, this led me into comparing the Mauna Loa record with the global record (Appendix B) which was an interesting exercise.

In researching this project, I also came across new phenomena I wasn’t previously aware of. For example, while I was calculating historical rates of CO2 change, I ran through the 1000-year Law Dome record and saw this:

Continue reading

Last time, we saw that some mathematical systems are so sensitive to initial conditions that even very small uncertainties in their initial state can snowball, causing even very similar states to evolve very differently. The equations describing fluid turbulence are one example of such a system; Lorenz’s discovery of extreme sensitivity to initial conditions ended hopes for long-term weather forecasting. Because the state of the weather can only be known so well, the small errors and uncertainties will quickly build up, rendering weather simulations useless for looking more than a few days ahead of time.

But Lorenz’s discovery doesn’t have much impact on climate modelling, contrary to the claims of some climate skuptix. Climate is not weather, and modelling is not forecasting.

Weather refers to the state of the atmosphere at a particular time and place: What temperature is it? Is it raining? How hard is the wind blowing, and in which direction? Climate, on the other hand, is defined in terms of the statistical behavior of these quantities:

“Climate in a narrow sense is usually defined as the average weather, or more rigorously, as the statistical description in terms of the mean and variability of relevant quantities over a period of time ranging from months to thousands or millions of years. [...] Climate change refers to a change in the state of the climate that can be identified (e.g., by using statistical tests) by changes in the mean and/or the variability of its properties, and that persists for an extended period, typically decades or longer.” (IPCC)

Many climate skuptik talking points derive from confusing these two quantities, in much the same way that a gambler might win a few hands of poker and decide that they are on a roll.

Although it is generally not possible to predict a specific future state of the weather (there is no telling what temperature it will be in Oregon on December 21 2012), it is still possible to make statistical claims about the climate (it is very likely that Oregon’s December 2012 temperatures will be colder than its July 2012 temperatures). It is very likely that the reverse will be true in New Zealand. It is safe to conclude that precipitation will be more frequent in the Amazon than in the Sahara, even if you can’t tell exactly when and where that rain will fall.
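The poker analogy can be made quantitative with a toy example. Here the ‘weather’ is a seasonal cycle plus large random noise (entirely synthetic numbers shaped like a Northern Hemisphere year); no single day is predictable, but the monthly statistics are:

```python
import numpy as np

rng = np.random.default_rng(0)
day = np.arange(365)
# Seasonal cycle: coldest in mid-January, warmest in mid-July.
seasonal = 15.0 - 12.0 * np.cos(2 * np.pi * (day - 15) / 365)
weather_noise = rng.normal(0.0, 5.0, size=365)   # unpredictable day to day
temps = seasonal + weather_noise

july = temps[181:212].mean()       # days falling in July
december = temps[334:365].mean()   # days falling in December
print(f"July mean: {july:.1f} C, December mean: {december:.1f} C")
```

An individual entry of temps tells you almost nothing, but July reliably beats December by a wide margin: weather is the draw, climate is the distribution.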

In fact, Lorenz’s groundbreaking paper, ‘Deterministic Nonperiodic Flow’, would seem to endorse this sort of statistical approach to understanding fluid dynamics:

“Because instantaneous turbulent flow patterns are so irregular, attention is often confined to the statistics of turbulence, which, in contrast to the details of turbulence, often behave in a regular well-organized manner.” (Lorenz 1963)

Let’s take a closer look.

Fig. 1. Three solutions of the Lorenz equations, starting at virtually identical points. Although the solutions are similar at first, they rapidly decouple around T=12.

The Lorenz equations consist of three variables describing turbulent fluid flow (X, Y, and Z), and three controlling parameters (r, b, and s). The equations are differential equations, meaning that a variable is described in terms of how it changes over time- saying ‘Johnny is driving west at 60 miles per hour’ is a simple differential equation. In order to solve a DiffEq, you need an initial condition – “Johnny started in Chicago” is an initial condition; without knowing that, you can’t say where he will be after driving for three hours. Continue reading
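The Johnny example is small enough to solve by stepping it forward in time, which is essentially what a numerical ODE solver does. A minimal sketch (hypothetical numbers; west counts as negative miles from Chicago):

```python
# dx/dt = rate: position changes at a constant -60 miles/hour (westward).
# The initial condition x0 pins down which solution we're on.
def drive(x0, rate=-60.0, hours=3.0, dt=0.01):
    x = x0
    steps = int(round(hours / dt))
    for _ in range(steps):
        x += rate * dt   # one Euler step
    return x

print(drive(0.0))   # starting in Chicago (mile 0)
print(drive(90.0))  # same equation, different initial condition
```

Same differential equation, different initial conditions, different answers: without the initial condition, the equation alone can’t tell you where Johnny ends up.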

Regarding climate models, physician and science fiction writer Michael Crichton had this to say:

“Since climate may be a chaotic system—no one is sure—these predictions are inherently doubtful, to be polite.” (Aliens Cause Global Warming)

What does he mean when he says that climate may be chaotic, and what impact does this have on climate modelling?

Flash back to the early 1960s. Meteorologist Edward Lorenz was studying a bare-bones weather model, consisting of three differential equations. Give the model an initial state and the differential equations would describe how the state changes over time, in much the same way that you can predict where Johnny will be in three hours’ time, given that he starts in Chicago and is driving west at 60 miles per hour. The hope was that with a big enough computer, a powerful enough model, and an accurately measured state of the atmosphere, the weather could one day be predicted far in advance.

Lorenz, the story goes, found a run of the model which interested him, and sat down to replay the simulation. He entered the initial conditions and set the model in motion, only to watch in bewilderment as the replay rapidly diverged from the original simulation.

"From nearly the same starting point, Edward Lorenz saw his computer weather produce patterns that grew farther and farther apart until all resemblance disappeared" (Image and caption from Chaos: Making a New Science, by James Gleick, 1987, p.17)

Lorenz tore his code apart looking for the error, only to realise that the error had been in his assumptions. In a distinctly Crichtonesque twist, the computer worked with numbers to six decimal places (0.123456) but only printed out values to three decimal places (0.123) in order to save space. It was these shortened numbers which Lorenz entered as the initial conditions for his model. Surely those last digits were inconsequential; after all, they were but a few hundred parts per million, comparable to the atmospheric concentration of the trace gas carbon dioxide.

Oh, but the consequences! Chaos theory’s roots stretch back to earlier anomalies, and the term ‘chaos’ would not be introduced for another decade, but it was Lorenz’s observation which heralded the field’s beginning.

Lorenz had discovered that even very small changes in the state of a chaotic system can quickly and radically change the way that the system develops over time. This property is known as extreme sensitivity to initial conditions, also called the ‘Butterfly Effect’ because it suggested that neglecting an event as small as the flapping of a butterfly’s wings could be enough to derail a weather forecast. There is more to chaotic systems than the Butterfly Effect, but this characteristic is one of their best-known properties. Lorenz’s work put an end to hopes of long-term weather forecasting. The state of the atmosphere could only be known so well, and even the smallest of imprecisions would lead the simulations to catastrophic failure.

‘Nobody believes a weather prediction twelve hours ahead. Now we’re asked to believe a prediction that goes out 100 years into the future? And make financial investments based on that prediction? Has everybody lost their minds?’ – Crichton

But does chaos theory signal doom for climate modelling? Stay tuned for part II….

