Category: DIY


If you’ve ever taken an organic chemistry lab class, you’ve probably done a melting point determination. That’s when you take a small sample of a solid, heat it up, and make note of the temperature at which it melts. This can be used to identify an unknown, but it is often used to assay purity. This is because impurities tend to make solids melt over a range of temperatures rather than at a single point, and because they tend to lower the melting point overall. There are fancy instruments you can buy which will measure melting points, but they’re so simple that I decided to make my own.

One way to do it is to use a Thiele tube, but I didn’t have one of those lying around. So I reached for my volumetric flask, filled it with mineral oil, and set it on a hot plate. Then, I put a tiny bit of the chemical vanillin into a capillary tube; this is my sample to test. I rubber-banded the capillary tube to a thermometer, such that the sample was next to the bulb. I set up a stand and clamped the thermometer in place, suspended in the mineral oil.

This would have worked, except that the samples used are typically so small that they are difficult to see with the naked eye. So I grabbed my USB microscope and clamped it in place, focused on the sample.


With my apparatus assembled, I turned up the heat and sat back to watch. Sure enough, the vanillin melted between 80 and 82 degrees C. My copy of the Merck Index actually gives two melting point ranges for this compound, 80-81 and 81-83 degrees, which is a little confusing but seems to confirm that my melting point apparatus works as expected. Sweet!



Figure A: The knee of a bee

I recently bought an 80-800x USB microscope. The image quality is really good for something under $40, and I’ve been using it to get up close and personal with crystals, dead bugs, and gross parts of my anatomy. Feast your eyes!


A wasp, with a bizarre, insectile tonguemouththing. Proboscis. Weird.


One of my moles. Maybe it’s time I get it looked at, hm?


I wasn’t too thrilled to find out that I had warts, but they grow on you.


Iridescence in a bead of thermite residue.


Copper sulfate blocks


Feathers of copper acetate


Copper acetate closeup

I hope to get some polarizing filters and videotape the growth of chiral crystals, like sugar or vitamin C. Stay tuned…

Today in LabLulz, I’m going to walk through a recent preparation I did in my chemistry lab: increasing and measuring the concentration of hydrogen peroxide.

WARNING: This procedure involves heat and the end product is a powerful oxidizer. Don’t get burned and don’t get it on yourself – wear gloves, splash-resistant goggles, and an apron. I had a spill of ~15%, all over everything, including myself. It was okay, but only because I followed safety protocols. I didn’t have the apron though, and I had to get pantsless.

Hydrogen peroxide is an interesting substance; its formula is H2O2, meaning that it is composed of two hydrogen atoms bonded to two oxygen atoms.


Figure 1. Behold, the hydrogen peroxide molecule!

It is a powerful oxidizer, decomposing into water and free oxygen. This is because the bond between the two oxygen atoms, called the peroxide bond, is unstable. Some substances which contain the peroxide bond are even explosive, like triacetone triperoxide. Because it’s an explosive precursor, and somewhat dangerous on its own, concentrated hydrogen peroxide can be difficult to come by. The weak 3% solution found in drugstores is all that is available to DIYers, hobbyists, and other scientists outside of the mainstream chemical supply chain.

Fortunately, it is relatively trivial to increase the concentration from 3% to around 30%. There are several tutorials on the subject on YouTube (TheChemLife, zhmapper, nerdalert226) so I’m going to focus on measuring the concentration of the end product, a procedure which the videos tend to treat very qualitatively. I hope this tutorial will be informative and useful, even outside of punklabs; the process is easily generalized and density is important in many fields, including medicine and winemaking.

The concentrating procedure is pretty simple: pour about 500 mL of the 3% solution into a beaker and heat it, forcing the excess water to evaporate until there is a tenth as much liquid left (peroxide boils at 150 C, compared to 100 C for water). There are only a couple of tricky points: the liquid must NOT boil, only steam – if it starts boiling, the peroxide will decompose. Bits of dust and dirt will also trigger decomposition, so the equipment must be kept very clean and free from scratches.
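Assuming no peroxide is lost along the way, a quick solute balance tells you what to expect: the amount of H2O2 stays fixed while the volume shrinks, so C1·V1 = C2·V2. A sketch in Python (the function name is mine):

```python
# Solute balance: if no peroxide decomposes, C1 * V1 = C2 * V2.
# Volumes in mL, concentrations as fractions.
def final_concentration(c_start, v_start, v_end):
    """Concentration after evaporating from v_start down to v_end."""
    return c_start * v_start / v_end

# 500 mL of 3% steamed down to 50 mL:
print(final_concentration(0.03, 500, 50))  # about 0.30, i.e. ~30%
```

In practice you’ll land a bit below that, since some peroxide inevitably decomposes during heating.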

Okay, so after a few hours, I have about 50 mL of liquid. I drop a bit into a solution of corn starch and potassium iodide, and the mixture turns black, a positive test for oxidizers. I add a squirt to some sulfuric acid and copper wire, and the metal wire begins bubbling and the solution begins to turn blue with copper sulfate*. This reaction is faster and more vigorous than when I try it with the 3% solution, so I’ve clearly succeeded in increasing the concentration, but to what level? To answer that question, I’m going to measure the density of the solution. Continue reading
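Density maps back to concentration through a calibration curve. Here’s a sketch of the interpolation step in Python; the table entries are rough handbook-style densities for H2O2 solutions at room temperature (check a real reference before trusting them), and all the names are mine:

```python
# Estimate mass fraction of H2O2 from a measured density by linear
# interpolation against a small calibration table.  The densities are
# rough values at ~20 C -- verify against a real handbook before use.
CALIBRATION = [  # (density in g/mL, mass fraction)
    (0.998, 0.00),
    (1.010, 0.03),
    (1.112, 0.30),
    (1.450, 1.00),
]

def concentration_from_density(rho):
    """Linearly interpolate mass fraction from density (g/mL)."""
    pts = sorted(CALIBRATION)
    if rho <= pts[0][0]:
        return pts[0][1]
    for (d0, w0), (d1, w1) in zip(pts, pts[1:]):
        if rho <= d1:
            return w0 + (w1 - w0) * (rho - d0) / (d1 - d0)
    return pts[-1][1]

# Density itself is just mass over volume: weigh a known volume
# measured in a graduated cylinder or, better, a volumetric flask.
print(concentration_from_density(1.112))  # ~0.30
```

The same interpolate-against-a-table trick works for any density-vs-concentration problem, which is why hydrometers show up in winemaking and medicine.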

I’m revisiting some older research of mine, so that I can talk a little bit about some data visualization I did along the way. If you frequent TriZPUG or the SplatSpace, you might have seen my original presentation, but In Case You Missed It…

You might remember a while back I got interested in researching the statistical thermodynamics of poker tournaments. To briefly recap, I was treating the distribution of chips amongst players as a probability distribution, which meant that I could use the concept of entropy to describe the distribution. Entropy in thermodynamic systems is associated with how ‘spread out’ the energy is in that system: A hot cup of coffee in a cold room has low entropy while warm coffee in a warm room has high entropy. In a statistical system like a poker table, entropy measures how evenly distributed the chips are between the players. When the players start the tournament with equal amounts, the entropy is at a maximum. When one player wins all the chips, the entropy is at a minimum. Already things are interesting – entropy in this statistical system must decrease with time, in stark contrast with the second law of thermodynamics. And we haven’t even looked at what happens between those two points!
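For the curious, the entropy measure here is just Shannon’s formula applied to each player’s share of the total chips. A minimal sketch (not my analysis code verbatim):

```python
from math import log

def chip_entropy(stacks):
    """Shannon entropy of the chip distribution (natural log).

    Each player's share of the total chips is treated as a probability."""
    total = sum(stacks)
    probs = [s / total for s in stacks if s > 0]
    return -sum(p * log(p) for p in probs)

# Four players with equal stacks: entropy at its maximum, log(4)
print(chip_entropy([1000, 1000, 1000, 1000]))
# One player holds everything: entropy at its minimum, 0
print(chip_entropy([4000, 0, 0, 0]))
```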

Poker entropy


To better understand the behavior of tournaments, I needed a way to play them and replay them, to turn them into something other than tables of names and numbers. The first representation worked well at illustrating the distribution, but failed to capture the dynamics; except in catastrophic rearrangements, it was not always obvious how the chips moved around from hand to hand.

[link]

Here’s what’s going on: I’ve whimsically renamed the players for anonymity, and then represented the size of each stack with a circle. Each hand is then represented by a transaction in which chips flow from one or more players to a single winner, with chip flow drawn as black lines whose thickness represents the magnitude of the flow. I find this hypnotic.
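The geometry behind it is simple. Here’s a sketch of the layout step with hypothetical names; I make circle area proportional to stack size (one reasonable choice), so radius goes as the square root of the chips. The actual drawing is then a matter of feeding these coordinates to pygame’s circle and line calls:

```python
from math import sin, cos, pi, sqrt

def layout(stacks, table_radius=200, scale=0.5):
    """Place one circle per player, evenly spaced around the table.

    Circle *area* is proportional to stack size, so the radius goes as
    sqrt(chips).  Returns (x, y, r) tuples in screen coordinates,
    ready to hand off to pygame.draw.circle."""
    n = len(stacks)
    out = []
    for i, chips in enumerate(stacks):
        theta = 2 * pi * i / n          # evenly spaced angles
        x = table_radius * cos(theta)
        y = table_radius * sin(theta)
        out.append((x, y, scale * sqrt(chips)))
    return out
```

Lines for chip flow work the same way: connect the two centers, with line width scaled to the amount transferred.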

If you don’t care about coding, feel free to skip down….

How exactly did I put this together?

Zeroeth, we have to get our tools together.

import pickle, sys    #file IO utilities
import pygame    #pygame library
from pygame.locals import *    #more pygame stuff
from math import sin, cos, pi, sqrt    #math tools

First, there is a great deal of tedious regular expression slicing and dicing that you have to do to convert a tournament history file into usable data; I’ll be merciful and skip that. So I’ve finally bundled up the data in a couple of files.
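Getting the bundled data back out is one line of pickle apiece. The filenames and structure here are placeholders, not the real bundles:

```python
import pickle

def load_tournament(stacks_file="stacks.pkl", hands_file="hands.pkl"):
    """Read back the pickled tournament data (hypothetical filenames)."""
    with open(stacks_file, "rb") as f:
        stacks = pickle.load(f)    # e.g. per-hand lists of chip counts
    with open(hands_file, "rb") as f:
        hands = pickle.load(f)     # e.g. (losers, winner, amounts) records
    return stacks, hands
```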

Continue reading

There is a companion article exploring the issue from the perspective of environmental monitoring over at ArkFab.

Human influence on the environment has increased dramatically over the last 10,000 years, to the point that some geologists have argued that human reworking of the earth defines a new geologic age, The Anthropocene. (Zalasiewicz et al, 2008) Much of the focus has been on relatively robust, tangible changes in biogeochemistry. Examples include:

  • megafaunal extinction, accelerated erosion (Zalasiewicz et al, 2008) and nitrogen fixation resulting from the spread of intensive subsistence patterns
  • the loss of stratospheric ozone resulting from the release of novel chlorofluorocarbons

However, fleeting and less tangible effects are also important. Two examples are:

  • the light pollution resulting from urbanization and transportation infrastructure
  • changes in the acoustic environment resulting from direct addition of sonic energy and memes, as well as indirect sources.

A year-long composite view of the earth at night, showing human light generation. White lights are cities; blue lights are fishing boats; green lights are natural gas flares, and red lights are ‘ephemeral light sources’, interpreted as fires. Image from NOAA National Geophysical Data Center – click for source + discussion.

Light pollution, the scourge of urban astronomers, is a well-accepted phenomenon with serious consequences. A 2004 review begins:

In the past century, the extent and intensity of artificial night lighting has increased such that it has substantial effects on the biology and ecology of species in the wild. We distinguish “astronomical light pollution”, which obscures the view of the night sky, from “ecological light pollution”, which alters natural light regimes in terrestrial and aquatic ecosystems. Some of the catastrophic consequences of light for certain taxonomic groups are well known, such as the deaths of migratory birds around tall lighted structures, and those of hatchling sea turtles disoriented by lights on their natal beaches. The more subtle influences of artificial night lighting on the behavior and community ecology of species are less well recognized, and constitute a new focus for research in ecology and a pressing conservation challenge. (Longcore & Rich 2004)

The amount of sonic energy released by human activity is recognized as an urban nuisance as well as an occupational safety concern. It also has recognized ecological effects: urban European robins have begun singing at night, when they have less acoustic competition. (Fuller et al 2007) Frogs have begun changing the pitch of their croaks in order to talk over traffic noise. (Parris et al 2009) In addition to sonic energy, human activity has released sonic memes into the environment. A meme is a self-replicating information pattern; jokes and computer viruses are two examples of memes. A person or computer acquires a meme and then spreads it, through retelling or infected emails. Sonic memes, such as ambulance sirens and cellphone ringtones, have been picked up and repeated by songbirds. (Stover 2009) This is very interesting: human memes have organically become part of other animals’ extended phenotype, in Richard Dawkins’ sense of the term. (Recent reports of dolphins mimicking human speech are also very interesting in this context. The reverse flow also occurs, as animal communications are repackaged as ringtones or ambient music.)

Continue reading

don’t forget the crystals

Magnesium sulfate crystals, clingin’ to a petri dish. Chillin’.

Another quick lab snap. These are some nice crystals I grew. I was washing an earlier, less photogenic crystal garden with alcohol, and catching the runoff in a petri dish. I let it evaporate and was greeted with this happy little accident! The crystals are magnesium sulfate, available as Epsom salt at most pharmacies.

haxor hijinx: a DIY hotplate

I have, once again, found myself at the helm of a DIY lab, this one with a chemical wetlab focus. I’m sure this will provide lots of material in the future; right now, I want to share a protip I came up with the other night. I have been using soda can alcohol stoves for heat, but this isn’t always appropriate. You can’t heat flammable mixtures, and they leave soot on my glassware. But I don’t have a hotplate yet! What’s a gutterpunk labnerd to do?

Don’t forget the boiling chips!

It won’t spin a stir bar, but a clothes iron will do fine as a hotplate! You can see that I’ve secured this one to the lab bench with wood and a clamp for extra stability.


Thermodynamics and Poker

There is a companion article which discusses this project’s role in decentralized community and citizen science at ArkFab.

You can find the current paper at Scribd here or download it here.

A while back, I got the idea to investigate how the entropy of a poker tournament evolves with time. In thermodynamics, entropy is a measure of how ‘spread out’ energy is amongst the states available to it. When the energy in a system is concentrated in one place (like a hot cup of coffee in a cold room), the entropy of the system is low. When the energy is spread out (a few hours later, both the room and the coffee are the same temperature) the entropy of the system is high.  Although originally defined for distributions of physical energy, entropy can be defined more generally to study arbitrary distributions – for example the distribution of capital, in the form of chips, between players in a poker tournament.

Just by looking at the formal structure of the game, you can tell some things about how entropy behaves. For example, it is formally required that entropy falls to zero with time. On the one hand, this is a fancy way of saying, ‘one person will eventually win the tournament’; on the other hand, it is interesting to consider that this is the exact opposite of what happens in the physical, thermodynamic world. The entropy of a closed thermodynamic system necessarily increases with time: hot coffee in a cold room will cool down, but warm coffee in a warm room will never heat up. However, the entropy of a closed poker table necessarily decreases. It has a second law of thermodynamics that runs in the opposite direction from ours.
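You can watch this reversed second law in a toy model: shuffle chips randomly between players and the entropy still grinds down to nothing, because the all-in-one-stack state is absorbing. To be clear, this is neither real poker nor Sire’s model (see below), just random transfers, but the endpoint is forced:

```python
import random
from math import log

def entropy(stacks):
    """Shannon entropy of the chip distribution (natural log)."""
    total = sum(stacks)
    return -sum(s / total * log(s / total) for s in stacks if s > 0)

def toy_hand(stacks, bet=10):
    """One 'hand': a random player with chips pays a random other player."""
    alive = [i for i, s in enumerate(stacks) if s > 0]
    loser, winner = random.sample(alive, 2)
    amount = min(bet, stacks[loser])
    stacks[loser] -= amount
    stacks[winner] += amount

random.seed(1)
stacks = [100] * 6
start = entropy(stacks)          # equal stacks: the maximum, log(6)
while sum(1 for s in stacks if s > 0) > 1:
    toy_hand(stacks)
# One player now holds every chip; entropy has collapsed to zero.
assert abs(entropy(stacks)) < 1e-12
```

Busted players never re-enter (a zero stack stays zero), which is exactly why the minimum-entropy state is the inevitable end of the run.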

Beyond a bottom-up analytical approach, I wanted to see how real-life tournaments behave. Although online gaming has generated a wealth of data, accessing the data is difficult, and I could find only one other paper which investigated the phenomenon. This was ‘Universal statistical properties of poker tournaments’ by Clement Sire (Sire 2007). The author notes that most of the game-theoretic work on poker has been largely restricted to optimal betting strategies in head-to-head tournaments. Sire builds a relatively simple model of player behavior: a player bets according to a simple evaluation of their hand and the table, and goes all-in if their hand is evaluated to be above a certain quality. This model predicts that tournaments will have certain statistical properties; this prediction is borne out in real-life tournaments.

I was only able to get two suitable datasets, so it’s hard to draw solid conclusions about what is going on. However,  there are interesting observations to be made. Here’s a visualisation of one tournament:

Tournament fortunes and entropy for one set of data. The top graph shows the holdings of each player; the bottom graph shows the entropy of the tournament as a function of time (green). The red lines are the upper and lower bounds for the tournament; this is a function of the number of players, which is in turn a function of time.

For one thing, the entropy remains close to its theoretical maximum value, generally ~90% of the absolute maximum. In the tournament pictured, entropy appears to increase to a maximum, and then slowly decline, before the loss of a player abruptly changes the distribution of chips (the sudden jumps in the stair-step of the max/min entropy). Furthermore, when the tournament entropy is normalized by its maximum entropy, there is a significant upwards trend (p = 0.012). Over the course of the tournament, the entropy increases towards its theoretical maximum. Additionally, it is interesting to me that, in between the losses of players, entropy appears to increase, reach a maximum, and then decrease again before collapsing. (It’s clearer in this image.) I interpret this as the redistribution of the winnings of the leaving player (e.g., of cyan to black and then to the rest of the table in hands 1-25) followed by a concentration of chips which eventually pushes a player out (yellow vs. the rest, hands 25-40).
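The trend itself is just the least-squares slope of normalized entropy against hand number (the p-value then comes from a standard regression test on that slope). A stdlib-only sketch, with made-up numbers:

```python
def ls_slope(ys):
    """Least-squares slope of a series regressed against its index."""
    n = len(ys)
    xbar = (n - 1) / 2
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# e.g. a normalized-entropy series creeping up toward 1.0:
print(ls_slope([0.88, 0.90, 0.89, 0.93, 0.95]))  # positive slope
```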

However, none of these observations held in the second tournament. One possibility is that, because the second tournament was faster paced, players were eliminated much faster, and these frequent perturbations are obscuring the pattern. On the other hand, it’s entirely possible that the first tournament was a fluke. The only way to resolve this question is with more data!

One reason I am interested in this question has to do with a series of papers written by Arto Annila from the University of Helsinki. He’s shown that protein folding, genomics, abiogenesis, ecological succession – pretty much every aspect of nature – is not merely constrained by the second law of thermodynamics, but a direct consequence of it. Most relevant to this project, I think, is his analysis of economies and ecosystems. The ultimate goal of each, he argues, is not only to increase entropy to a maximum, but to do so as fast as possible. To the extent that poker tournaments can be thought of as a toy model of an economy, they may provide empirical insights into thermoeconomics. Of course, we’ve already seen that tournament entropy is formally constrained to decrease with time, though it would be interesting to see the behavior of a tournament which is not driven by a rising minimum bet, as these are. The first tournament may show an upward trend after perturbation from quasiequilibrium conditions, and the relative entropy may show a tournament-scale increase. (Or, it may not. Argh! Why oh why must n=2?!) Continue reading

DIY Spectro II

There is a more philosophically focussed companion article over at ArkFab.

At long last, second generation DIY spectro has arrived!

The spectrophotometer. Yes, that is an invisibility cloak. You can't see the stuff that's under it can you? Then that stuff is invisible!

If you recall, when last we left our humble spectrophotometer, it was a shambling mess of stone-age technology. Now, it’s a shambling mess of information-age technology!

Let’s take a closer look… Continue reading

A Detective Story

About to run the Final Qualifying Round for some second generation DIY Spectro, I placed the first blank into the cuvette holder, and pressed start (or rather ran python tryna.py; I’m tryna measure a spectrum, gosh!). The machine hummed into action, now that the motor control wire was plugged into slot 9, which the computer was communicating with rather than slot 2, which the computer was not.

A few minutes later I got the results: Nothing. The machine was not seeing any light. At all.

See, normally when I scan a blank sample, the detector (which has different sensitivities at different colors) shows a characteristic hill shape:

Some typical blank sample runs. Horizontal axis is motor position in degrees (0-180) and vertical axis is detector response. Black is the mean of the time series coming from the detector at a given motor position (1 sec data per degree); green is the standard deviation of the time series, and red denotes the maxima/minima. Time series were preprocessed to remove annoying serial communications glitches.
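For the curious, those summary curves boil down to per-position statistics over each one-second time series. A sketch, assuming the scan is stored as a dict mapping motor position to a list of (already de-glitched) readings; the names are mine:

```python
from statistics import mean, stdev

def summarize_scan(scan):
    """Collapse per-position detector time series into summary curves.

    `scan` maps motor position (degrees) to a list of detector readings
    (~1 s of samples per degree).  Returns parallel lists: positions,
    means, standard deviations, minima, maxima -- the black, green, and
    red curves in the plots."""
    positions = sorted(scan)
    means = [mean(scan[p]) for p in positions]
    stds  = [stdev(scan[p]) for p in positions]
    mins  = [min(scan[p]) for p in positions]
    maxs  = [max(scan[p]) for p in positions]
    return positions, means, stds, mins, maxs
```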

But when I ran this blank: Continue reading