Most definitely not a photo of the Plus desk...
Occasionally our colleague Owen, who we share the Plus office with, despairs of our messy desk and tidies it up. Our newly tidied desk is very ordered and hence, in the language of physics, has low entropy. Entropy is a measure of disorder of a physical system. And, as Owen knows from personal experience, the entropy of our desk is certain to increase as it will become messier and messier each time we appear in the office. Essentially this comes down to a probabilistic argument — there are so many more ways for our desk to be messy and just a few limited ways for it to be tidy. So unless someone intervenes and tidies it up (which we must admit isn't our strong point) the entropy is certain to increase.
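The counting argument behind all this can be made concrete with a toy calculation. The numbers below are invented purely for illustration — no desk was actually measured:

```python
# Toy version of the counting argument, with invented numbers: suppose
# the desk holds 10 items and each item can land in any of 20 spots,
# while "tidy" means every item sits in its one designated spot.
tidy_arrangements = 1
messy_arrangements = 20 ** 10   # any item anywhere (collisions ignored)

print(messy_arrangements)       # 10240000000000 messy states vs 1 tidy one
```

With ten trillion messy arrangements to one tidy one, a randomly jostled desk is, for all practical purposes, certain to end up messy.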
Really it isn't our fault — you can't fight the laws of physics, and this is one of the most fundamental ones: the second law of thermodynamics. The entropy of an isolated system never decreases. The law explains not only why desks never tidy themselves when left alone, but also why ice melts in your drink. Left to themselves, isolated systems evolve towards maximal entropy: the highly structured ice-cubes floating in the warmer liquid form an inherently more ordered system than one in which the ice has melted and all the particles of the ex-cubes and the drink have mingled together. The highest entropy state of a system is also its equilibrium.
The second law of thermodynamics comes from statistical mechanics, the area of physics that describes the behaviour of large numbers of objects using statistical principles. One obvious place this is useful is in the behaviour of gases and liquids. We could try to write down (or simulate on a computer) the Newtonian equations that describe each and every gas particle and all possible interactions between them, but that would just be silly: there are around 3×10²² molecules in a litre of air, so we would need a huge number of equations just to describe the behaviour of each of these individually, let alone their interactions. Instead you can predict the bulk behaviour of the whole system using statistics.
For example, if you take the lid off a jar of gas sitting in an empty box, you intuitively know that the gas won't stay in the jar: it will gradually spread until it evenly fills all the space available. Out of all the possible arrangements of the gas particles in the box, only a tiny fraction correspond to the gas remaining inside the now open jar. These are far outnumbered by the arrangements in which the gas molecules are spread through the whole box. That the gas molecules invariably spread out, and don't move back into the jar, is not a certainty; it's just overwhelmingly more likely than the alternative.
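Just how overwhelming can be sketched in a few lines. The jar's share of the box volume below is an assumption made up for the example; the molecule count is the litre-of-air figure from the article:

```python
# Chance that all the gas molecules happen, by chance, to be back inside
# the open jar at some moment. The jar's share of the box volume is an
# assumption; the molecule count is the litre-of-air figure above.
from math import log10

jar_fraction = 0.1        # jar volume / box volume (assumed)
n_molecules = 3e22        # molecules in a litre of air

# Each molecule sits in the jar with probability jar_fraction, independently,
# so P(all in jar) = jar_fraction ** n_molecules -- far too small for a
# float, so we report its base-10 logarithm instead.
log_p = n_molecules * log10(jar_fraction)
print(f"P(all molecules back in the jar) = 10^{log_p:.3g}")
```

A probability of one in ten to the thirty thousand billion billion: not strictly zero, but as close to impossible as anything in physics.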
It may seem strange at first that a law of nature, such as the second law of thermodynamics, is based on statistical likelihood — after all, laws are about certainties, while likelihoods admit the possibility, however small, that things turn out otherwise. To illustrate just how unlikely a violation of this law is, the French mathematician Émile Borel used an intriguing metaphor: he said that if a million monkeys typed for ten hours a day for a year, it would be unlikely that their combined writings would exactly equal the content of the world's richest libraries — and that a violation of the laws of statistical mechanics would be even more unlikely than that. The British physicist Arthur Eddington captured the strange link between chance and certainty beautifully when he wrote, "When numbers are large, chance is the best warrant for certainty. Happily in the study of molecules and energy and radiation in bulk we have to deal with a vast population, and we reach a certainty which does not always reward the expectations of those who court the fickle goddess."
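The spirit of Borel's estimate can be recreated with rough numbers. Everything specific below — the alphabet size, the length of the target text, the typing speed — is an illustrative assumption, not Borel's own figures:

```python
# Rough re-creation of the spirit of Borel's estimate: how likely is it
# that random typing reproduces a given text exactly? All specific
# figures below are illustrative assumptions, not Borel's own numbers.
from math import log10

alphabet_size = 27                   # 26 letters plus a space (assumed)
text_length = 10**6                  # characters in the target text (assumed)
monkeys = 10**6                      # a million monkeys, as in the metaphor
keystrokes_each = 10 * 3600 * 365    # 10 hours/day for a year, 1 key/second (assumed)

# Chance that one random string of the right length matches the text:
log_p_match = -text_length * log10(alphabet_size)

# Even letting every keystroke start a fresh attempt (a generous
# union bound), the chance of any match barely moves:
log_p_any = log_p_match + log10(monkeys * keystrokes_each)

print(f"P(single attempt matches) ~ 10^{log_p_match:.0f}")
print(f"P(any attempt matches)    < 10^{log_p_any:.0f}")
```

Even with a generous over-count of the attempts, the exponent shifts by a mere dozen or so out of more than a million — which is Borel's point: some probabilities are so small that treating them as zero is the only sensible physics.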
This article now forms part of our coverage of the cutting-edge research done at the Isaac Newton Institute for Mathematical Sciences (INI) in Cambridge. The INI is an international research centre and our neighbour here on the University of Cambridge's maths campus. It attracts leading mathematical scientists from all over the world, and is open to all. Visit www.newton.ac.uk to find out more.
"let alone when their interaction."???
Oops, thanks for spotting that typo. We've corrected it.
It now reads: '...let alone their interactions...'.
As every good rule has an exception, let's wait for an exception to the second observation of thermodynamics (I prefer this term to 'law') to change it.
How does this really equate to crystals forming on cooling? Surely the formation of a crystal structure is a lower energy state than that of the disordered atoms. Therefore, instead of entropy increasing, it decreases as the atoms take up their places in a lattice. I know I am missing something.
On cooling, the crystal gives off heat through the release of its binding energy, and that heat, being random kinetic energy, raises the entropy of the surroundings by more than the formation of the ordered lattice lowers the entropy of the crystal; so the total entropy still increases.