Temperature is typically thought of as the average kinetic energy of the individual atoms or molecules in a given collection. For atoms of similar mass, this “kinetic temperature” essentially tracks how fast they move at equilibrium, or more precisely their mean squared speed. For a group of molecules, we have just a little extra accounting to do. Their total energy is also partitioned into the relative motions of their constituent atoms oscillating about their bonds, typically either stretching or bending motions.
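As a small sketch of this kinetic picture, the snippet below recovers a gas temperature from nothing but a sample of particle speeds, using the ideal-gas relation (3/2)·k_B·T = (1/2)·m·⟨v²⟩. The choice of helium and a 300 K target are illustrative assumptions, not details from the experiments discussed here.

```python
# Sketch: "kinetic temperature" of an ideal monatomic gas, estimated
# from the mean squared speed of its atoms via (3/2) k_B T = (1/2) m <v^2>.
import math
import random

K_B = 1.380649e-23   # Boltzmann constant, J/K
M_HE = 6.6464e-27    # mass of a helium atom, kg (illustrative choice)

def kinetic_temperature(speeds, mass):
    """Estimate temperature from a list of particle speeds (m/s)."""
    mean_v2 = sum(v * v for v in speeds) / len(speeds)
    return mass * mean_v2 / (3.0 * K_B)

# Draw velocity components for atoms at a nominal 300 K (each Cartesian
# component is Gaussian with variance k_B T / m) and check that the
# speeds alone give back roughly that temperature.
T_TRUE = 300.0
sigma = math.sqrt(K_B * T_TRUE / M_HE)
rng = random.Random(0)
speeds = [math.sqrt(sum(rng.gauss(0.0, sigma) ** 2 for _ in range(3)))
          for _ in range(20000)]
print(kinetic_temperature(speeds, M_HE))  # close to 300
```

With twenty thousand atoms the statistical scatter is well under a percent, which is why a single number like “temperature” is a useful summary of so many individual motions.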
These familiar ideas of temperature work pretty well for most solids, liquids, and gases, and conform to the general expectation that it should always be greater than absolute zero. What are we to make of a recent claim by a group of German researchers that they have created an experimental system where negative (as in below absolute zero) temperatures can actually be observed and measured?
Despite the near-universal desire to find the other-worldly in the everyday, there is unfortunately nothing really new or bizarre about the idea of negative temperature. Negative temperatures were first created back in 1951 by Ed Purcell, who won the Nobel Prize the next year. Among other related pursuits, he had previously been among the first to observe nuclear magnetic resonance (NMR) in bulk matter (the heart of the modern MRI scanner), which uses a large magnetic field to polarize nuclear spins. In fact, the negative temperature systems Purcell created were nuclear spins in a crystal of lithium fluoride that was itself at room temperature. The novelty of the negative temperature system created by the German group is that instead of nuclear spins, they used ultracold atoms. They describe their system as having “motional degrees of freedom,” in contrast to nuclear spins, which do not move in any conventional sense.
Describing negative temperature as “hotter than infinity,” or simply appealing to more nebulous definitions via entropy and the second law of thermodynamics, as is often done, is not going to cut it for our purposes. It is not that we lack the sophistication to discuss entropy; rather, an understanding in more familiar terms will give greater satisfaction.
Entropy is a convenient mathematical construct which indicates that when heat is added to a system, its atoms ordinarily become less ordered. In other words, they have more states, or shall we say, options, available to them. All a negative absolute temperature really means is that with the addition of heat, instead of becoming more random, the atoms become more ordered. This can occur, for example, when the number of high-energy states available is limited, and therefore likely to be quickly filled.
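One way to see this in miniature is to actually count states for a toy system whose energy is capped: N two-level “spins,” each either in its ground state or an excited state. The toy numbers below are illustrative assumptions, not figures from the experiments described here.

```python
# Sketch: why a bounded energy ladder permits negative temperature.
# Take N two-level systems; with n of them excited (energy n, in units
# where the level spacing is 1), the number of microstates is C(N, n)
# and the entropy is S = ln C(N, n) (with k_B set to 1).
from math import comb, log

N = 100  # number of two-level systems (illustrative choice)

def entropy(n):
    return log(comb(N, n))

def inv_temperature(n):
    # Discrete slope dS/dE ~ S(n + 1) - S(n), which is 1/T.
    return entropy(n + 1) - entropy(n)

print(inv_temperature(10))  # positive: ordinary behavior below half filling
print(inv_temperature(90))  # negative: past half filling, adding energy
                            # crowds everything into the few excited slots,
                            # so entropy *falls* and T is negative
```

Below half filling, adding energy opens up more arrangements and the temperature is positive; past half filling the high-energy slots are nearly full, adding energy reduces the number of arrangements, and the slope dS/dE, hence 1/T, goes negative.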
If, for example, we had a bunch of numbered lottery balls blowing around inside a big chamber and turned up the blower speed so that they could reach the whole upper extent of the chamber, their entropy and temperature would be observed to increase. If, however, we had also secretly applied some sticky silicone rubber to the underside of the chamber's roof, balls with enough energy would reach the silicone layer and become immobilized, thereby effectively lowering this measure of temperature.
Could creating negative temperatures really be as simple as this?
The problem with invoking entropy and trying to actually count all the states available to a system is just that: counting all those states. That is easier said than done, and rarely even possible to do exactly. To illustrate the confusion, consider Claude Shannon, the Bell Labs employee who founded information theory. Shannon developed a formula to quantify the information content, or unpredictability, of a message. He initially chose to call his measure “uncertainty,” but changed it to “entropy” after a meeting with John Von Neumann. Von Neumann, himself a founder of modern computing, had observed that “no one really understands entropy anyway, so you will always have the advantage in a debate.”
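Shannon's measure itself is simple to state, whatever we call it. A minimal sketch (the four-outcome distributions below are made-up examples):

```python
# Shannon entropy: H = -sum p_i * log2(p_i), the average "surprise"
# of a source. It is largest when all outcomes are equally likely
# and zero when one outcome is certain.
from math import log2

def shannon_entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: four equal options
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0 bits: no uncertainty
```

The parallel with thermodynamic entropy is exactly the counting problem above: both formulas reward a system for having many equally likely states available to it.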