plus.maths.org
    • Maths in a minute: The second law of thermodynamics

      2 June, 2016

      Most definitely not a photo of the Plus desk...

      Occasionally our colleague Owen, who we share the Plus office with, despairs of our messy desk and tidies it up. Our newly tidied desk is very ordered and hence, in the language of physics, has low entropy. Entropy is a measure of disorder of a physical system. And, as Owen knows from personal experience, the entropy of our desk is certain to increase as it will become messier and messier each time we appear in the office. Essentially this comes down to a probabilistic argument — there are so many more ways for our desk to be messy and just a few limited ways for it to be tidy. So unless someone intervenes and tidies it up (which we must admit isn't our strong point) the entropy is certain to increase.
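The counting argument can be made concrete with a toy model of our own (the numbers are entirely made up for illustration): if each of n papers on the desk can sit either in its one proper place or in one of k wrong places, there is exactly one tidy arrangement and a vast number of messy ones.

```python
# Toy desk model (hypothetical numbers): n papers, each either in its one
# proper place or in one of k wrong places. "Tidy" means every paper is
# in its proper place; every other arrangement counts as "messy".
n, k = 10, 5

total_states = (k + 1) ** n   # each paper independently: 1 right spot + k wrong ones
tidy_states = 1               # only one arrangement is fully tidy
messy_states = total_states - tidy_states

print(total_states, messy_states / total_states)
```

Even with just ten papers, over 99.99999% of the arrangements are messy, so a randomly jostled desk is all but guaranteed to end up in one of them.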

Really it isn't our fault — you can't fight the laws of physics and this is one of the most fundamental ones: the second law of thermodynamics. The entropy of an isolated system never decreases. The law explains not only why desks never tidy themselves when left alone, but also why ice melts in your drink. Isolated systems evolve towards maximal entropy: the highly structured ice-cubes in the warmer liquid form an inherently more ordered system than one where the ice has melted and all the particles of the ex-cubes and drink have mingled together. The highest entropy state of a system is also its equilibrium.

The second law of thermodynamics comes from the area of statistical mechanics, which describes the behaviour of large numbers of objects using statistical principles. One obvious place this is useful is in the behaviour of gases or liquids. We could try to write down (or simulate in a computer) the Newtonian equations that describe each and every gas particle and all possible interactions between them, but that would just be silly: there are around 3×10^22 molecules in a litre of air, so we would need a huge number of equations just to describe the behaviour of each of these individually, let alone their interaction. Instead you can predict the bulk behaviour of the whole system using statistics.
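As a rough sanity check on that figure (our own back-of-envelope calculation, not from the article), the ideal gas law n = PV/RT gives the number of moles in a litre of air at atmospheric pressure and room temperature:

```python
# Rough check of the "3x10^22 molecules per litre" figure via the ideal
# gas law n = PV/RT, at atmospheric pressure and room temperature.
P = 101_325      # pressure in Pa (one standard atmosphere)
V = 1e-3         # volume in m^3 (one litre)
R = 8.314        # molar gas constant in J/(mol K)
T = 293          # temperature in K (about 20 degrees C)
N_A = 6.022e23   # Avogadro's number, molecules per mole

molecules = (P * V / (R * T)) * N_A
print(f"{molecules:.1e}")
```

This comes out at roughly 2.5×10^22, consistent with the order of magnitude quoted above.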

For example, if you take the lid off a jar of gas in an empty box you intuitively know that the gas won't stay in the jar: it will gradually spread till it evenly fills all the space available. Out of all the possible arrangements of gas particles in the box, only a tiny number correspond to the gas remaining inside the now open jar. These are far outnumbered by the possible arrangements of gas molecules spread through the whole box. That the gas molecules invariably spread out and don't move back into the jar is not a certainty, it's just overwhelmingly more likely.
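The counting behind this fits in a few lines: if molecules roam the box independently and the open jar occupies a fraction f of the box's volume, the chance of finding all N molecules back inside the jar at once is f^N. The numbers below are illustrative choices of ours, not from the article:

```python
def prob_all_in_jar(n_molecules: int, jar_fraction: float) -> float:
    """Chance that every molecule independently happens to sit inside the jar."""
    return jar_fraction ** n_molecules

# Even a handful of molecules makes "all back in the jar" absurdly rare.
for n in (1, 10, 100):
    print(n, prob_all_in_jar(n, 0.1))
```

For a jar that is a tenth of the box, ten molecules give odds of one in ten billion; with the 10^22 molecules of a real gas the probability is unimaginably smaller still.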

It may seem strange at first that a law of nature, such as the second law of thermodynamics, is based on statistical likelihood — after all, laws are supposed to express certainties, while likelihoods leave room for doubt. To illustrate just how unlikely a violation of this law is, the French mathematician Émile Borel used an intriguing metaphor: he said that if a million monkeys typed for ten hours a day for a year, it would be unlikely that their combined writings would exactly equal the content of the world's richest libraries — and that a violation of the laws of statistical mechanics would be even more unlikely than that. The British physicist Arthur Eddington captured the strange link between chance and certainty beautifully when he wrote, "When numbers are large, chance is the best warrant for certainty. Happily in the study of molecules and energy and radiation in bulk we have to deal with a vast population, and we reach a certainty which does not always reward the expectations of those who court the fickle goddess."
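Borel's point can be quantified with a back-of-envelope calculation of our own (the keyboard size and text length are made-up illustrative figures): the chance of a random keystroke sequence matching a given text of length L on a keyboard of k equally likely keys is (1/k)^L.

```python
from math import log10

keys = 30                # hypothetical keyboard of equally likely keys
text_length = 1_000_000  # characters in a modest book, say

# The probability of one random sequence matching the text is (1/keys)**text_length.
# That underflows ordinary floats, so we work with its base-10 logarithm instead.
log_prob = -text_length * log10(keys)
print(f"probability of a match is about 10^({log_prob:.0f})")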

      You can read more about entropy and typing monkeys on Plus.


      This article now forms part of our coverage of the cutting-edge research done at the Isaac Newton Institute for Mathematical Sciences (INI) in Cambridge. The INI is an international research centre and our neighbour here on the University of Cambridge's maths campus. It attracts leading mathematical scientists from all over the world, and is open to all. Visit www.newton.ac.uk to find out more.

      INI logo

      • Log in or register to post comments

      Comments

      Gary

      25 June 2016

      Permalink

      "let alone when their interaction."???

      • Log in or register to post comments

      Marianne

      27 June 2016

      In reply to Word by Gary

      Permalink

      Oops, thanks for spotting that typo. We've corrected it.

      • Log in or register to post comments

      Charles Morelli

      14 July 2016

      In reply to Oops, thanks for spotting by Marianne

      Permalink

      It now reads: '...let their interaction...'.

      As every good rule has an exception, let's wait for an exception to the second observation of thermodynamics (I prefer this term to 'law') to change it.

      • Log in or register to post comments

      JimmyBulloch

      1 July 2016

      Permalink

      How does this really equate to crystals forming on cooling. Surely the formation of crystal structures is a lower energy state than that of the disordered atoms. Therefore instead of entropy increasing it decreases as the atoms take up there place in a lattice. I know I am missing something

      • Log in or register to post comments

      J.

      25 December 2021

      In reply to Crystals and thermodynamics by JimmyBulloch

      Permalink

      On cooling the crystals gives off heat through the release of crystal binding energy, and that heat, being random kinetic energy, is an increase in entropy.

      • Log in or register to post comments

      Read more about...

      thermodynamics
      entropy
      Maths in a minute
      INI
      University of Cambridge logo

      Plus Magazine is part of the family of activities in the Millennium Mathematics Project.
      Copyright © 1997 - 2025. University of Cambridge. All rights reserved.

      Terms