
Maths in a minute: Entropy

Entropy is a strange thing. Some people say it measures the amount of disorder in a physical system. Others say that it's a measure of information. And yet others talk about it in the context of steam engines. So what is it and how are these different contexts linked?

Steam engines

Classical definition of entropy

For a reversible process that involves a heat transfer of size $Q$ at a temperature $T$ the change in entropy $\Delta S$ is measured by

  \[ \Delta S = Q/T. \]    

A reversible process is one in which no energy is dissipated (through friction etc). To see how this formula can be applied to more realistic irreversible processes and for an example calculation see here.
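
As a rough worked example (the numbers below are illustrative and are not taken from the article): melting one gram of ice takes a heat transfer of roughly $Q = 334$ Joules, and it happens at a constant temperature of about $T = 273$ Kelvin, so the entropy of the melting ice increases by roughly

  \[ \Delta S = Q/T \approx 334/273 \approx 1.2 \; J/K. \]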

Let's start at the beginning. The nineteenth century saw the rise of the steam engine, but also pressed home an inconvenient fact: those engines were horribly inefficient. This inspired a young French engineer, Sadi Carnot, to work out theoretical limits on the efficiency of heat engines (which function by converting into work the heat that flows from a hot to a cold reservoir). In 1824 Carnot published a book, beautifully titled Reflections on the motive power of fire, in which he showed that no heat engine, no matter how perfect, is ever going to be 100% efficient. Some of the heat that is being transferred in the engine will always go to waste.

Some forty years after Carnot, Rudolf Clausius put his mind to getting to grips with this inherent inability to do work that is present in any engine. He found a mathematical expression (see the box) to quantify the amount of energy in a physical system that is unavailable to do work. And he called this quantity the entropy of the system.

Disorder

Carnot had come up with his thoughts about heat engines even though he thought that heat was a fluid, which it isn't. Thanks to Clausius, Lord Kelvin and James Clerk Maxwell (among others) we now know that heat is a form of energy that comes from the molecules and atoms that make up a material. These vibrate, rotate or, in a liquid or gas, move around randomly, bouncing off each other as they go. The more vigorously they do this, the higher their average kinetic energy, and the hotter the material they're part of. You can see this whenever something melts. In an ice cube, for example, individual molecules are locked into a rigid lattice, but once you heat it up they start jiggling around and eventually break their chains, making the water warmer and also liquid.

A Christmas pyramid: the fan blade at the top is powered from the heat of the candles.

Maxwell, Ludwig Boltzmann and others went on to realise that entropy can be regarded as a measure of disorder in a system. To get an idea of how this might work, imagine a room with a burning candle in it. The heat of the candle can be converted into work. For example, you could use the hot air rising from it to power one of those Christmas toys that have a fan blade at the top. Now imagine the same room after the candle has burnt out and the temperature is uniform throughout. You can't get any work out of this situation, so if you think of entropy as measuring the inability to do work, then it's clear that the room has a higher entropy when the candle has burnt out than when it is still burning.

At the molecular level that second situation, with the candle burnt out, is also much less orderly. The fact that the air has a uniform temperature throughout means that fast and slow moving molecules are thoroughly mixed up: if they were separated out in some way, then you'd have a temperature gradient in the room. In fact, the thermal equilibrium the room finds itself in is also the state of maximal disorder. When the candle is still burning, by contrast, fast molecules are concentrated around the flame, making for a much more orderly situation.

Microscopic definition of entropy

Suppose that a gas is in a particular macrostate: for example, it has a particular temperature or pressure. Write $W$ for the number of configurations its individual molecules can be in while preserving that macrostate. Then the entropy $S$ is

  \[ S = k\ln {W}, \]    

where $k$ is Boltzmann’s constant

  \[ k = 1.380649 \times 10^{-23} J/K. \]    

Here $J$ stands for Joules, the unit of energy, and $K$ for Kelvin, the unit of temperature.

The formula is engraved on Boltzmann's tombstone in Vienna.

This formula works when all the configurations of the molecules are equally probable. There is a generalisation which works when they are not all equally probable. It's

  \[ S = -k \sum_i p_i \ln{p_i}, \]    

where $p_i$ is the probability of configuration $i$.

Maxwell and Boltzmann came up with a formula which quantifies the amount of disorder in a system made up of many components, such as a gas. It's based on the idea that, the less ordered the system, the more ways there are of rearranging its small components without making a difference to what the system looks like as a whole (see the box). It turns out that this definition of entropy in terms of disorder is equivalent to Clausius' original definition in terms of temperature and energy.
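
To see how the two formulas in the box hang together, here is a minimal Python sketch (the function names and the toy numbers are illustrative assumptions, not part of the article). When all $W$ configurations are equally likely, so that each $p_i = 1/W$, the general formula gives back $k\ln{W}$, while a distribution that strongly favours one configuration, a more ordered situation, comes out with a lower entropy.

  import math

  k = 1.380649e-23  # Boltzmann's constant in J/K

  def boltzmann_entropy(W):
      # Entropy of W equally likely configurations: S = k ln W
      return k * math.log(W)

  def gibbs_entropy(probabilities):
      # General entropy S = -k * sum_i p_i ln p_i (terms with p_i = 0 contribute nothing)
      return -k * sum(p * math.log(p) for p in probabilities if p > 0)

  W = 1000
  print(boltzmann_entropy(W))        # k ln 1000, about 9.5e-23 J/K
  print(gibbs_entropy([1 / W] * W))  # the same value, as expected

  # One dominant configuration (a more "ordered" situation) gives a lower entropy:
  skewed = [0.9] + [0.1 / (W - 1)] * (W - 1)
  print(gibbs_entropy(skewed))       # about 1.4e-23 J/K

The agreement of the first two printed values is just the algebraic fact that $-k\sum_i (1/W)\ln(1/W) = k\ln{W}$.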

Information

So what's the link to information? If a system is very ordered, then you don't need much information to describe it. For example, you can describe the regular arrangement of the molecules in a frozen ice cube in a sentence, but to give an exact description of a gas, which has molecules buzzing around randomly, you need to know the precise location and velocity of each individual molecule, and that's a lot of information. The more disorder there is, the higher the entropy and the more information you need to describe the system.
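
To make this link a little more concrete, here is another small illustrative sketch (the configuration counts are made up purely for the example): if a system can be in any one of $W$ equally likely configurations, then singling out the one it is actually in takes about $\log_2{W}$ binary digits, so the entropy $k\ln{W}$ grows in step with the amount of information needed to describe the system exactly.

  import math

  # Made-up configuration counts for a "tidy" and a "messy" system.
  W_ordered = 8
  W_disordered = 10**6

  # Bits needed to single out one configuration: log2(W)
  print(math.log2(W_ordered))      # 3.0 bits
  print(math.log2(W_disordered))   # roughly 19.9 bits

  # The entropy S = k ln W grows in proportion to those bit counts.
  k = 1.380649e-23  # Boltzmann's constant in J/K
  print(k * math.log(W_ordered), k * math.log(W_disordered))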

This is how the concept of entropy links the (in)efficiency of engines to disorder and information. Entropy is also implicated in a fundamental law of nature: the second law of thermodynamics says that the entropy of an isolated system can never, ever decrease. It can only stay the same or increase. In terms of engines this means that no engine will ever become more efficient of its own accord, which chimes with intuition. In terms of disorder, it means that any system left to its own devices will only ever get messier, which also chimes with intuition (think of your kitchen or your desk). And as we have seen, messy things are harder to describe than orderly ones, which gives the second law of thermodynamics its information angle.

You can find out more about the second law of thermodynamics in this Maths in a Minute piece, about the history of entropy in Satanic Science (on which this article is based), and about entropy more generally in these articles.


This article now forms part of our coverage of the cutting-edge research done at the Isaac Newton Institute for Mathematical Sciences (INI) in Cambridge. The INI is an international research centre and our neighbour here on the University of Cambridge's maths campus. It attracts leading mathematical scientists from all over the world, and is open to all. Visit www.newton.ac.uk to find out more.

