Category: mathemagick



One of my hobbies is sacred geometry - loosely defined, it’s the study and use of mathematical archetypes in nature and culture, often with a focus on traditional compass and straightedge constructions.

Don’t worry; I’m not about to go off into numerology, telling you that you can derive the groovy cosmic secrets of the ancients by studying numeric coincidences. I actually take a rather dim view of reading spurious meaning into special cases of the Interesting Number Theorem.* The worst-case scenario might resemble the Aronofsky film “Pi”, except without the sweet soundtrack.

Nonetheless, exploring the traditions behind sacred geometry can give insight into art and design. The traditional compass and straightedge drove the development of mathematics until just a century or two ago, and modern algebra has its roots in questions about why it is, exactly, that you can’t construct a perfect heptagon. The practice is also fun and relaxing, and can even give occasional insights into physical reality.


One subject that comes up a lot in these discussions is a number called the golden ratio, sometimes abbreviated phi. Phi is defined in terms of the relationship between a line segment and its parts.
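The post goes on from there, but for reference, the standard statement is easy to give: split a segment into a longer piece a and a shorter piece b so that the whole is to the longer piece as the longer piece is to the shorter. In symbols:

```latex
% The defining proportion of the golden ratio, phi:
\[
\frac{a+b}{a} \;=\; \frac{a}{b} \;=\; \varphi
\qquad\Longrightarrow\qquad
\varphi^{2} = \varphi + 1
\qquad\Longrightarrow\qquad
\varphi = \frac{1+\sqrt{5}}{2} \approx 1.618
\]
```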

In case you haven’t heard, the North Carolina General Assembly has run amok.

It’s hard to believe that things could get worse: the last NCGA approved Amendment One, which declared that straight marriage was the only recognized family. And they tried to outlaw accelerating sea level rise by declaring that straight lines were the only recognized graph.

And yet after the 2012 election, things turned upside down.

  • Senator Tom Tucker displayed amazing arrogance and unfamiliarity with his job description when he told a reporter: “I am the senator, you are the citizen. You need to be quiet” (Huffington Post)
  • A House resolution was proposed which would allow the establishment of a state religion, as well as incorporating prayer as a public institution (WRAL)
  • Another bill was proposed to criminalize women’s nipples. (DTH)
  • The budget committee has considered making ends meet by closing NC’s public universities (a tactic known as ‘eating your seed corn’) (N&O)
  • The Senate has passed a bill rolling back 40 years of environmental protections in order to make way for fracking, in defiance of the state Department of Environment’s recommendations. (McClatchyDC)

That’s just some of the more bizarre social experimentation going on; there have been plenty of garden-variety attacks on voting rights, public education, and social services for the poor.

The point of all this is, a lot of people are justifiably annoyed. So much so, that weekly protests at the state capitol have broken out, headed by the state NAACP and dubbed ‘Moral Mondays’. Peaceful protesters have been arrested by the score, then the hundred, for voicing their disgust with a runaway legislature.

Conservatives have fought back, and some have fought dirty. In one especially skeezy move, the right-wing Civitas Institute has published a public database of information on the protesters, including their photographs and cities of residence. It’s creepy, but now that it exists, it’s a window into what is happening on Moral Mondays.

The Civitas data record a total of 457 arrests. Of these, all but 8 listed a residence in North Carolina. That is to say, 98% of those arrested are clearly locals. These data reinforce an earlier survey, which found the same proportion among the protesters as a whole. This matters because some, including Governor Pat McCrory, have tried to dismiss the protests as the work of outside agitators.

Something disappointing about the Civitas effort is that the infographics provided are drab and at times completely inappropriate. (I mean, really?)

To show them how it’s done, let’s map out some information. Here are the absolute numbers of arrests, categorized by county and by city. Unfortunately, the city data available from the NC DOT did not include all of the cities in the arrest data, so only 65 of 85 cities are represented in the map; this explains why some counties (e.g., Cherokee) report arrests but contain no cities reporting arrests. This may introduce a bias in which smaller cities and towns are underrepresented when city-based data are used.
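For anyone who wants to try this at home, the workflow behind maps like these boils down to counting and joining. Here is a rough sketch in Python; the file and column names are placeholders rather than the actual Civitas export or NC DOT boundary files, and geopandas is just one convenient toolchain for the mapping step.

```python
# Sketch: tally Moral Monday arrests by county and shade a county map accordingly.
# File and column names are placeholders, not the actual Civitas or NC DOT files.
import pandas as pd
import geopandas as gpd
import matplotlib.pyplot as plt

arrests = pd.read_csv("civitas_arrests.csv")      # one row per arrest, with a 'county' column
counties = gpd.read_file("nc_counties.shp")       # county boundary polygons, with a 'NAME' column

# Count arrests per county
counts = arrests.groupby("county").size().rename("arrests")

# Join the counts onto the county polygons; counties with no arrests get zero
counties = counties.merge(counts, left_on="NAME", right_index=True, how="left")
counties["arrests"] = counties["arrests"].fillna(0)

# Choropleth: darker counties saw more arrests
counties.plot(column="arrests", cmap="Reds", legend=True, edgecolor="black")
plt.title("Moral Monday arrests by county")
plt.show()
```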


Fig 1a. Moral Monday arrests, binned by county.


Fig 1b. Moral Monday arrests, binned by city


Fig 1c. Composite map of 1a and 1b.

A few things seem to pop out: arrests are geographically centered around the Triangle (Chapel Hill, Durham, and Raleigh), with other major centers around cities (e.g., Charlotte, Wilmington, and Asheville). Compared to other political maps (such as Amendment One or the 2012 presidential election), this pattern is not surprising; why it is happening, however, is less clear.


Today in LabLulz, I’m going to walk through a recent preparation I did in my chemistry lab: increasing and measuring the concentration of hydrogen peroxide.

WARNING: This procedure involves heat and the end product is a powerful oxidizer. Don’t get burned and don’t get it on yourself – wear gloves, splash-resistant goggles, and an apron. I had a spill of ~15%, all over everything, including myself. It was okay, but only because I followed safety protocols. I didn’t have the apron though, and I had to get pantsless.

Hydrogen peroxide is an interesting substance; its formula is H2O2, meaning that it is composed of two hydrogen atoms bonded to two oxygen atoms.


Figure 1. Behold, the hydrogen peroxide molecule!

It is a powerful oxidizer, decaying into water and free oxygen. This is because the bond between the two oxygen atoms, called the peroxide bond, is unstable. Some substances which contain the peroxide bond are even explosive, like triacetone triperoxide. Because it’s an explosive precursor, and somewhat dangerous on its own, concentrated hydrogen peroxide can be difficult to come by. The weak 3% solution found in drugstores is all that is available to DIYers, hobbyists, and other scientists outside of the mainstream chemical supply chain.
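For reference, the decomposition reaction is:

2 H2O2 → 2 H2O + O2

i.e., two molecules of peroxide break down into two of water and one of oxygen gas.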

Fortunately, it is relatively trivial to increase the concentration from 3% to around 30%. There are several tutorials on the subject on YouTube (TheChemLife, zhmapper, nerdalert226), so I’m going to focus on measuring the concentration of the end product, a procedure which the videos tend to treat very qualitatively. I hope this tutorial will be informative and useful, even outside of punklabs; the process is easily generalized, and density is important in many fields, including medicine and winemaking.

The concentrating procedure is pretty simple: pour about 500 mL of the 3% solution into a beaker and heat it, forcing the excess water to evaporate until there is a tenth as much liquid left (peroxide boils at 150 C, compared to 100 C for water). There are only a couple of tricky points: the liquid must NOT boil, only steam; if it starts boiling, the peroxide will decay. Bits of dust and dirt will also trigger decomposition, so the equipment must be kept very clean and free from scratches.
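As a rough sanity check on what to expect (and assuming, optimistically, that no peroxide decomposes along the way): 500 mL of 3% solution contains about 0.03 × 500 ≈ 15 g of H2O2, treating the dilute solution as having a density near 1 g/mL. If that 15 g ends up in roughly 50 mL of remaining liquid, the result is on the order of 15 g / 50 mL ≈ 30% (w/v). Any boiling or catalyzed decomposition during heating will pull the real figure below that ceiling.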

Okay, so after a few hours, I have about 50 mL of liquid. I drop a bit into a solution of corn starch and potassium iodide, and the mixture turns black, a positive test for oxidizers. I add a squirt to some sulfuric acid and copper wire, and the metal wire begins bubbling and the solution begins to turn blue with copper sulfate*. This reaction is faster and more vigorous than when I try it with the 3% solution, so I’ve clearly succeeded in increasing the concentration, but to what level? To answer that question, I’m going to measure the density of the solution.
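The full walkthrough follows, but the skeleton of the calculation is simple enough to sketch in a few lines of Python: weigh a known volume of the product, divide to get the density, and interpolate against a table of densities for H2O2 solutions. The table values below are approximate and only for illustration; use a proper reference table (and a temperature correction) for real work.

```python
# Sketch: estimate H2O2 concentration (% w/w) from the measured density of the product.
# The density table is approximate and for illustration only.
import numpy as np

# Approximate densities (g/mL, near 20 C) of H2O2 solutions at the given % w/w
percent_ww = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
density    = np.array([0.998, 1.035, 1.073, 1.112, 1.153, 1.196])

def estimate_concentration(mass_g, volume_mL):
    """Estimate % w/w H2O2 from the mass of a carefully measured volume of solution."""
    rho = mass_g / volume_mL                      # measured density, g/mL
    return np.interp(rho, density, percent_ww)    # linear interpolation on the table

# Example: 10.00 mL of product weighing 11.10 g -> density 1.110 g/mL -> roughly 30% w/w
print(estimate_concentration(11.10, 10.00))
```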

I’ve been working through Michael Barnsley’s book, “Fractals Everywhere”; it’s a relatively advanced textbook on fractal geometry. The first chapter is a survey of analysis and topology, which has been a nice opportunity to refresh my math skills, as well as a more thorough exploration of metric spaces than I’d gotten before. I was double checking one of the problems and wrote it out all organized, and then I decided to tell you about it. So I scanned it in, started cleaning it up in GIMP, one thing led to another…

Learnin me some GIMP skillz

I later realized that I could actually generalize the bulk of the proof into a lemma: Any subset of a totally bounded set is itself totally bounded.
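For anyone who wants to check the lemma, the proof is short: take a finite (ε/2)-net for the big set and nudge each useful center into the subset.

```latex
% Lemma: if $B$ is totally bounded and $A \subseteq B$, then $A$ is totally bounded.
% Proof sketch.
Fix $\varepsilon > 0$. Since $B$ is totally bounded, choose $x_1, \dots, x_n$ with
$B \subseteq \bigcup_{i=1}^{n} B(x_i, \varepsilon/2)$.
For each $i$ such that $B(x_i, \varepsilon/2) \cap A \neq \emptyset$, pick a point
$a_i \in B(x_i, \varepsilon/2) \cap A$.
Now take any $a \in A \subseteq B$: it lies in some $B(x_i, \varepsilon/2)$ with
$a_i$ defined, so
$d(a, a_i) \le d(a, x_i) + d(x_i, a_i) < \varepsilon/2 + \varepsilon/2 = \varepsilon$.
Hence the finitely many $a_i$ form a finite $\varepsilon$-net for $A$ whose centers
lie in $A$, and $A$ is totally bounded. \qed
```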


A while back, we started looking at a poorly thought-out article from the website C3Headlines. C3 is starting to make a name for itself as a goldmine of climate comedy; their claims have recently been addressed at Tamino and SkepticalScience.

We’re going to keep digging into C3’s claim that carbon dioxide concentrations have been increasing linearly over the 20th century. They seem to have drawn this claim from eyeballing the graph of CO2 concentrations and qualitatively describing them as linear, apparently using the inset in their first figure to compare linear, quadratic, and exponential trends. This is a faulty method: it’s an elementary fact of calculus that ANY smooth curve, when viewed over a sufficiently small range, will appear linear. The point has already been made, but it’s worthwhile to keep looking, because there are some interesting graphical follies at play; examining them further might help us understand how and why graphs are misunderstood.
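To make that concrete, here is a quick numerical illustration (made-up numbers, not their data): sample an exponential over a window that is short compared to its e-folding time, fit a straight line, and watch the fit come out looking nearly perfect.

```python
# Sketch: a smooth, clearly nonlinear curve looks linear over a short enough window.
import numpy as np

t = np.linspace(0.0, 10.0, 200)   # a window short compared to the e-folding time below
y = np.exp(t / 50.0)              # exponential growth with an e-folding time of 50

slope, intercept = np.polyfit(t, y, 1)            # least-squares straight line
residuals = y - (slope * t + intercept)
r_squared = 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)

print(f"R^2 of a straight line fit to an exponential: {r_squared:.6f}")   # very close to 1
```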

Figure 1: From C3Headlines’ article on “The Left/Liberal Bizarro Anti-Science Hyperbole”, which claims that CO2 concentrations are increasing linearly. Click to read it, if you dare…

C3‘s second graph in this article measures the change in atmospheric CO2 by calculating a month-to-month percentage change. It’s not entirely clear why they are using a percent change, rather than the standard practice of expressing rate of change as concentration change per year (like the source of their data uses). Whereas ppm/year is an absolute measure, each datum generated by the percentage-change method depends strongly upon the value of the previous month. As a measure of long-term rate of change, it is a bit questionable.

My primary concern, though, is with their use of monthly data in the first place. In my last article, we noted that, without explanation, C3 confined their focus to January CO2 concentrations. Were they consistent, they’d also look at January rates of change – of course, doing so might lead to unacceptable conclusions.

 Figure 2. Rates of CO2 accumulation have been calculated for the month of January, consistent with earlier investigation of January CO2 concentration. Over the period of observation, rates have increased at a significant (P~0.0005) acceleration of 0.11 ppm/year^2. Monthly rates throughout this article have been calculated by considering the change in CO2 between adjacent months, and assuming that a month is 1/12 of a year. Interpolated values of CO2 were used to avoid annoying data holes early in the record.
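For anyone who wants to reproduce this kind of calculation, the recipe in the caption looks roughly like the sketch below. The input file is a placeholder for a single column of monthly, interpolated Mauna Loa values (January first), not something bundled with this post.

```python
# Sketch: annualized month-to-month rates of CO2 change, then the January-only trend.
# The input file is a placeholder for a single column of monthly, interpolated ppm values.
import numpy as np
from scipy import stats

co2 = np.loadtxt("co2_monthly_interpolated.txt")   # monthly CO2 (ppm), starting in January

# Change between adjacent months, annualized by treating a month as 1/12 of a year
rate_ppm_per_year = np.diff(co2) * 12.0

# The month-to-month percentage change C3 favors, for comparison
rate_percent = 100.0 * np.diff(co2) / co2[:-1]
print(f"mean rate: {rate_ppm_per_year.mean():.2f} ppm/year "
      f"({rate_percent.mean():.3f} % per month)")

# Keep one rate per year (the first of each calendar year, here taken as 'January')
# and fit a linear trend to see whether the rate itself is increasing
years = np.arange(rate_ppm_per_year.size) / 12.0
january = slice(0, None, 12)
fit = stats.linregress(years[january], rate_ppm_per_year[january])
print(f"acceleration ~ {fit.slope:.2f} ppm/year^2, p = {fit.pvalue:.4g}")
```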

Instead, they look at the rate of change for every single month on record. Why do I find that problematic? Well, let’s look at the full record, with monthly resolution:

I love graphs – my eyes quickly glaze over at a table of numeric data, but a graph, used correctly, can quickly and easily tell the whole story.

‘Used correctly’ is the key phrase – for all their power, graphs are infamously easy to bungle, and when used incorrectly they can misinform – or lie outright.

I’m going to look at an example that touches on a few graphical and statistical concepts near and dear to my heart, as well as carbon geochemistry.

Fig. 1: An image from C3Headlines; the 3 C's are "Climate, Conservative, Consumer". Oh, and the article is titled "The Left/Liberal Bizarro, Anti-Science Hyperbole Continues". It sure would be tragic if they made obvious n00b mistakes after using such language. Click for link!

Coming from an article on the website C3Headlines, this image claims that carbon dioxide concentrations show ‘Linear, Not Exponential Growth’, thereby ‘expos[ing] the lunacy of typical left/liberal/progressive/Democrat anti-science’. The author has reached this conclusion by graphing January CO2 levels* and fitting a linear trendline to them.

Already this is a warning sign: the comparisons the author makes are entirely qualitative, apparently based upon eyeballing the graph. However, trend lines are created by a statistical process called linear regression, which comes with a caveat: it will fit a trend line to ANY data given to it, linear or nonlinear. Fortunately, there are also ways of evaluating how good a trend line is.
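As a preview of the sort of check I mean (with made-up numbers, not the Mauna Loa record): fit the line, then look at the residuals. A good linear fit leaves residuals scattered randomly around zero; a line forced onto accelerating data leaves a tell-tale U-shaped bow.

```python
# Sketch: a regression will fit a line to anything; the residuals reveal whether it fit well.
import numpy as np
from scipy import stats

x = np.arange(50, dtype=float)
y = 300.0 + 0.5 * x + 0.01 * x**2        # gently accelerating, made-up "data"

fit = stats.linregress(x, y)
residuals = y - (fit.intercept + fit.slope * x)

print(f"R^2 = {fit.rvalue**2:.4f}")       # high, despite the curvature
# The residuals start positive, dip negative in the middle, and end positive:
# a U-shaped bow that says the straight line is systematically wrong.
print(residuals[[0, 25, 49]])
```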

temperature aNOMalies

If you are new to climate science, you might be wondering what, exactly, this ‘temperature anomaly’ thing is that you keep hearing about. I know I was a bit confused at first! This post explains the concept, using a real-world example.

Absolute temperatures (yearly averaged) from two sites in the UK: one urban (St. James Park, green) and one rural (Rothamsted, red). Although the urban site is consistently warmer, the two sites show the same warming trend. But is there a way to compare them directly? Data from Jones et al. 2008, kindly provided by Dr. Jones.

Cities tend to be warmer than their surrounding countrysides, a fact known as the urban heat island effect (UHI). This occasionally is offered as an alternative explanation for greenhouse warming, but it fails on closer inspection. We can use data from Jones et al. (2008) [PDF] to see one reason UHI can’t explain observed warming. One time series is from St. James Park, in the city of London; the other is from nearby Rothamsted, a rural site some tens of miles away. As you can see, the urban location is consistently about 2 C warmer; however, the warming is nearly identical at both sites (a strongly significant 0.03 deg C/year). Jones et al. note:

“… the evolution of the time series is almost identical. As for trends since 1961 all sites give similar values …  in terms of anomalies from a common base period, all sites would give similar values.”

This gives us a hint about what a temperature anomaly is:
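The short version, as a sketch (the numbers below are placeholders, not the Jones et al. data): pick a common base period, subtract each site’s own average over that period from its series, and the constant urban offset drops out, leaving the shared trend.

```python
# Sketch: converting absolute temperatures to anomalies relative to a common base period.
# The series below are placeholders, not the Jones et al. (2008) data.
import numpy as np

years = np.arange(1961, 2007)
rng = np.random.default_rng(0)
rural = 9.0 + 0.03 * (years - 1961) + rng.normal(0.0, 0.3, years.size)   # rural site
urban = rural + 2.0                                                      # urban site, ~2 C warmer

def anomaly(series, yrs, base=(1961, 1990)):
    """Subtract the site's own mean over the base period."""
    in_base = (yrs >= base[0]) & (yrs <= base[1])
    return series - series[in_base].mean()

# The constant urban offset cancels, so the two anomaly series sit right on top of each other
print(np.allclose(anomaly(urban, years), anomaly(rural, years)))   # True
```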

cnfusin rained and chas

Last time, we looked at a very simple atmospheric model known as the Lorenz equations, and saw it exhibit the ‘Butterfly Effect,’ in which even very small changes in initial conditions can dramatically affect which path the system takes. However, we also saw that the initial condition had a relatively small impact on the statistical properties of the system. Because climate is a statistical property of the earth system, asking
“How can we expect to predict future climate when we can’t predict the weather?”
is a lot like asking

“How can we claim to know the half-life of a radioactive element when we can’t predict when a given atom will decay?”

To those familiar with chaos, this shouldn’t come as a surprise. Lorenz didn’t just discover apparent disorder in his model, but a deeper, eerie structure lurking in the noise.

The Lorenz Attractor: wibbly-wobbly mess of the millennium. Three simulation runs (red, green, blue) are shown; they start close together but quickly spin off on different trajectories, demonstrating sensitivity to initial conditions. Nonetheless, the trajectories quickly converge on an intricate structure in the phase space, called an 'attractor'. The attractor doesn't vary with initial conditions, but is instead a feature of the Lorenz equations themselves. Image generated with code from TitanLab - click to check them out 🙂

You may remember that the Lorenz equations relate three variables (X, Y, Z), which vary over time. In the above image, I’ve plotted the evolution of three runs of the Lorenz model by putting a dot at each (X(t), Y(t), Z(t)) coordinate, at every time t in the given interval. The three runs start very close together in this three-dimensional ‘phase space’, but quickly diverge.
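If you’d like to generate a picture along these lines yourself, something like the sketch below works; this is my own minimal version, not the TitanLab code, and the starting points and integration settings are just illustrative.

```python
# Sketch: integrate the Lorenz equations from three nearby starting points
# and plot the trajectories in phase space.
import numpy as np
from scipy.integrate import solve_ivp
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (needed only on older matplotlib)

def lorenz(t, state, s=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = state
    return [s * (y - x), x * (r - z) - y, x * y - b * z]

t_span = (0.0, 40.0)
t_eval = np.linspace(*t_span, 8000)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for offset, color in zip((0.0, 1e-3, 2e-3), ("red", "green", "blue")):
    sol = solve_ivp(lorenz, t_span, [1.0 + offset, 1.0, 1.0],
                    t_eval=t_eval, rtol=1e-9, atol=1e-9)
    ax.plot(sol.y[0], sol.y[1], sol.y[2], color=color, linewidth=0.5)

ax.set_xlabel("X"); ax.set_ylabel("Y"); ax.set_zlabel("Z")
plt.show()
```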

However, despite their different individual behaviors, these runs are confined to a structure in phase space, known as the Lorenz attractor – an attractor, because all trajectories converge on it, regardless of their initial conditions. If you perturb the system by bouncing it off the attractor, it quickly settles back into the same loops through phase space. Lorenz (1963) described it:

A companion article at ArkFab shares my thoughts on peer review with regard to this project and DIY/community/citizen science in general.

At long last, the much-anticipated booklet, “CO2 Trouble: Ocean Acidification, Dr. Everett, and Congressional Science Standards” is available and approved for human consumption! Download and share HERE (or at Scribd HERE).

In this document, I have bundled, updated, and expanded my series of essays debunking the congressional testimony of Dr. John Everett regarding the environmental chemistry of carbon dioxide.

It has been designed to be a fairly short (less than 30 pages, including images, appendices, etc.) and accessible read. It has been challenging but fun to write; I have had to learn a lot about GIMP, Python, Scribus, social networking, and of course ocean acidification to get to this point.

It was also very useful for me as an opportunity to go back through my earlier remarks and double-check my work. For example, I later realized that the documentation which Dr. Everett provides for his CO2 data in part two is ambiguous: Although the citation for the rate data is referred to as “Recent Global CO2”, the URL provided links to the longer record as measured at Mauna Loa Observatory. This confusion had led me in the past to make incorrect claims about some of the figures he presents. Ultimately it was inconsequential to my argument, but it was frustrating to have to deal with such ambiguities. On the other hand, this led me into comparing the Mauna Loa record with the global record (Appendix B) which was an interesting exercise.

In researching this project, I also came across new phenomena I wasn’t previously aware of. For example, while I was calculating historical rates of CO2 change, I ran through the 1000-year Law Dome record and saw this:


Last time, we saw that some mathematical systems are so sensitive to initial conditions that even very small uncertainties in their initial state can snowball, causing even very similar states to evolve very differently. The equations describing fluid turbulence are examples of such systems; Lorenz’s discovery of extreme sensitivity to initial conditions ended hopes for long-term weather forecasting. Because the state of the weather can only be known so well, small errors and uncertainties will quickly build up, rendering weather simulations useless for looking more than a few days ahead of time.

But Lorenz’s discovery doesn’t have much impact on climate modelling, contrary to the claims of some climate skuptix. Climate is not weather, and modelling is not forecasting.

Weather refers to the state of the atmosphere at a particular time and place: What temperature is it? Is it raining? How hard is the wind blowing, and in which direction? Climate, on the other hand, is defined in terms of the statistical behavior of these quantities:

“Climate in a narrow sense is usually defined as the average weather, or more rigorously, as the statistical description in terms of the mean and variability of relevant quantities over a period of time ranging from months to thousands or millions of years. […] Climate change refers to a change in the state of the climate that can be identified (e.g., by using statistical tests) by changes in the mean and/or the variability of its properties, and that persists for an extended period, typically decades or longer.” (IPCC)

Many climate skuptik talking points derive from confusing these two quantities, in much the same way that a gambler might win a few hands of poker and decide that they are on a roll.

Although it is generally not possible to predict a specific future state of the weather (there is no telling what temperature it will be in Oregon on December 21 2012), it is still possible to make statistical claims about the climate (it is very likely that Oregon’s December 2012 temperatures will be colder than its July 2012 temperatures). It is very likely that the reverse will be true in New Zealand. It is safe to conclude that precipitation will be more frequent in the Amazon than in the Sahara, even if you can’t tell exactly when and where that rain will fall.

In fact, Lorenz’s groundbreaking paper, ‘Deterministic Nonperiodic Flow’, would seem to endorse this sort of statistical approach to understanding fluid dynamics:

“Because instantaneous turbulent flow patterns are so irregular, attention is often confined to the statistics of turbulence, which, in contrast to the details of turbulence, often behave in a regular well-organized manner.” (Lorenz 1963)

Let’s take a closer look.

Fig. 1. Three solutions of the Lorenz equations, starting at virtually identical points. Although the solutions are similar at first, they rapidly decouple around T=12.

The Lorenz equations consist of three variables describing turbulent fluid flow (X, Y, and Z), and three controlling parameters (r, b, and s). The equations are differential equations, meaning that each variable is described in terms of how it changes over time; saying ‘Johnny is driving west at 60 miles per hour’ is a simple differential equation. In order to solve a DiffEq, you need an initial condition: ‘Johnny started in Chicago’ is an initial condition; without knowing that, you can’t say where Johnny will be after driving for three hours.
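For the curious, here is a minimal sketch of those equations in code, with two runs started a hair apart; the parameters are the classic values Lorenz used, and everything else is just illustration. The separation between the runs starts out microscopic and ends up comparable to the size of the attractor.

```python
# Sketch: the Lorenz system, integrated from two nearly identical initial conditions.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, s=10.0, r=28.0, b=8.0 / 3.0):
    """dX/dt, dY/dt, dZ/dt for the Lorenz equations."""
    x, y, z = state
    return [s * (y - x), x * (r - z) - y, x * y - b * z]

t_eval = np.linspace(0.0, 25.0, 2500)
run_a = solve_ivp(lorenz, (0.0, 25.0), [1.0, 1.0, 1.0],
                  t_eval=t_eval, rtol=1e-9, atol=1e-9)
run_b = solve_ivp(lorenz, (0.0, 25.0), [1.0 + 1e-6, 1.0, 1.0],   # nudged by one part in a million
                  t_eval=t_eval, rtol=1e-9, atol=1e-9)

# Distance between the two trajectories: microscopic at first, then it blows up
separation = np.linalg.norm(run_a.y - run_b.y, axis=0)
for t in (0, 5, 10, 15, 20):
    i = np.searchsorted(t_eval, t)
    print(f"T={t:2d}  separation ~ {separation[i]:.2e}")
```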