
Maths in a minute: Entropy


Entropy is a strange thing. Some people say it measures the amount of disorder in a physical system. Others say that it's a measure of information. And yet others talk about it in the context of steam engines. So what is it and how are these different contexts linked?

Steam engines

Classical definition of entropy

For a reversible process that involves a heat transfer of size $Q$ at a temperature $T$, the change in entropy $\Delta S$ is given by $$\Delta S = Q/T.$$

A reversible process is one in which no energy is dissipated (through friction, for example). To see how this formula can be applied to more realistic, irreversible processes, and for an example calculation, see here.
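As a quick worked example (using the standard value of roughly $334$ joules per gram for the heat needed to melt ice), melting $100$ grams of ice at $0^\circ C$, which is about $273$ K, requires a heat transfer of about $$Q \approx 100 \times 334 = 33,400 \; J,$$ so the entropy of the melting ice increases by roughly $$\Delta S = Q/T \approx 33,400/273 \approx 122 \; J/K.$$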

Let's start at the beginning. The nineteenth century saw the rise of the steam engine, but also pressed home an inconvenient fact: those engines were horribly inefficient. This inspired a young French engineer, Sadi Carnot, to work out theoretical limits on the efficiency of heat engines (which function by converting into work the heat that flows from a hot to a cold reservoir). In 1824 Carnot published a book, beautifully titled Reflections on the motive power of fire, in which he showed that no heat engine, no matter how perfect, is ever going to be 100% efficient. Some of the heat that is being transferred in the engine will always go to waste.

Some forty years after Carnot, Rudolf Clausius put his mind to getting to grips with this inherent inability to do work that is present in any engine. He found a mathematical expression (see the box) to quantify the amount of energy in a physical system that is unavailable to do work. And he called this quantity the entropy of the system.

Disorder

Carnot had come up with his thoughts about heat engines even though he thought that heat was a fluid, which it isn't. Thanks to Clausius, Lord Kelvin and James Clerk Maxwell (among others) we now know that heat is a form of energy that comes from the molecules and atoms that make up a material. These vibrate, rotate or, in a liquid or gas, move around randomly, bouncing off each other as they go. The more vigorously they do this, the higher their average kinetic energy, and the hotter the material they're part of. You can see this whenever something melts. In an ice cube, for example, individual molecules are locked into a rigid lattice, but once you heat it up they start jiggling around and eventually break their chains, making the water warmer and also liquid.


A Christmas pyramid: the fan blade at the top is powered from the heat of the candles.

Maxwell, Ludwig Boltzmann and others went on to realise that entropy can be regarded as a measure of disorder in a system. To get an idea of how this might work, imagine a room with a burning candle in it. The heat of the candle can be converted into work. For example, you could use the hot air rising from it to power one of those Christmas toys that have a fan blade at the top. Now imagine the same room after the candle has burnt out and the temperature is uniform throughout. You can't get any work out of this situation, so if you think of entropy as measuring the inability to do work, then it's clear that the room has a higher entropy when the candle has burnt out than when it is still burning.

At the molecular level that second situation, with the candle burnt out, is also much less orderly. The fact that the air has a uniform temperature throughout means that fast and slow moving molecules are thoroughly mixed up: if they were separated out in some way, then you'd have a temperature gradient in the room. In fact, the thermal equilibrium the room finds itself in is also the state of maximal disorder. When the candle is still burning, by contrast, fast molecules are concentrated around the flame, making for a much more orderly situation.

Microscopic definition of entropy

Suppose that a gas is in a particular macrostate: for example, it has a particular temperature and pressure. Write $W$ for the number of configurations its individual molecules can be in while preserving that macrostate. Then the entropy $S$ is $$S = k\ln{W},$$ where $k$ is Boltzmann's constant $$k = 1.38062 \times 10^{-23} J/K.$$ Here $J$ stands for joules, the unit of energy, and $K$ for temperature in kelvin. The formula is engraved on Boltzmann's tombstone in Vienna. It applies when all configurations of the molecules are equally probable. There is a generalisation of the formula which works when they are not: $$S = -k \sum_i p_i\ln{p_i},$$ where $p_i$ is the probability of configuration $i$.
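To check that the generalisation is consistent with the first formula, suppose all $W$ configurations are equally probable, so each $p_i = 1/W$. Then $$S = -k \sum_{i=1}^{W} \frac{1}{W}\ln{\frac{1}{W}} = -k\ln{\frac{1}{W}} = k\ln{W},$$ which is exactly the expression above.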

Maxwell and Boltzmann came up with a formula which quantifies the amount of disorder in a system made up of many components, such as a gas. It's based on the idea that the less ordered the system, the more ways there are of rearranging its small components without making a difference to what the system looks like as a whole (see the box). It turns out that this definition of entropy in terms of disorder is equivalent to Clausius' original definition in terms of temperature and energy.
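To get a feel for this counting, here is a toy illustration: imagine four gas molecules, each of which can sit in either the left or the right half of a box. The macrostate "all four molecules on the left" can be realised in only one way, so $W = 1$ and the entropy is $S = k\ln{1} = 0$. The evenly mixed macrostate "two molecules on each side" can be realised in six ways (there are six ways of choosing which two molecules sit on the left), so $W = 6$ and $S = k\ln{6} > 0$. The more mixed up, disordered macrostate is the one with more microscopic arrangements, and therefore the higher entropy.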

Information

So what's the link to information? If a system is very ordered, then you don't need much information to describe it. For example, you can describe the regular arrangement of the molecules in a frozen ice cube in a sentence, but to give an exact description of a gas, which has molecules buzzing around randomly, you need to know the precise location and velocity of each individual molecule, and that's a lot of information. The more disorder there is, the higher the entropy and the more information you need to describe the system.
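This link can be made more precise. If a system can be in any one of $W$ equally likely configurations, then pinning down exactly which one it is in takes about $\log_2{W}$ bits of information: each yes/no question you ask can at best halve the number of remaining possibilities. Since $\log_2{W}$ is just $\ln{W}$ divided by the constant $\ln{2}$, the number of bits you need grows in step with the entropy $S = k\ln{W}$: the higher the entropy, the longer the description.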

This is how the concept of entropy links the (in)efficiency of engines to disorder and information. Entropy is also implicated in a fundamental law of nature: the second law of thermodynamics says that the entropy of an isolated system can never, ever decrease. It can only stay the same or increase. In terms of engines this means that no engine will ever become more efficient of its own accord, which chimes with intuition. In terms of disorder, it means that any system left to its own devices will only ever get messier, which also chimes with intuition (think of your kitchen or your desk). And as we have seen, messy things are harder to describe than orderly ones, which gives the information angle on the second law of thermodynamics.

You can find out more about the second law of thermodynamics in this Maths in a Minute piece, about the history of entropy in Satanic Science (on which this article is based), and about entropy more generally in these articles.


This article now forms part of our coverage of the cutting-edge research done at the Isaac Newton Institute for Mathematical Sciences (INI) in Cambridge. The INI is an international research centre and our neighbour here on the University of Cambridge's maths campus. It attracts leading mathematical scientists from all over the world, and is open to all. Visit www.newton.ac.uk to find out more.


Comments


The remark about the second law of thermodynamics is wrong. The entropy of a closed system can decrease, provided that this is compensated for by a larger (in absolute terms) increase in the entropy of the surroundings. The entropy of an isolated system can never decrease. The second law says that the change in the entropy of the system AND surroundings TOGETHER can never decrease.


Since entropy is the amount of disorder in a closed system, a region in which order is established and maintained must be surrounded by a region where the amount of disorder is greater than it would be if the whole volume of both regions had stayed in its initial, more regular state. In economic terms: as work is performed and goods (which have greater order) are produced, the amount of disorder outside the production part of the system grows and its entropy increases, even though within our (limited) experience our civilisation seems to be gaining ever more wealth and ordered goods. Does this mean that eventually (after many thousands of years) the quantity of pollution and scrapped goods will become so great that our very existence will become doubtful?