## entropy

What would you think if the nice café latte in your cup suddenly separated itself out into one half containing just milk and the other containing just coffee? Probably that you, or the world, have just gone crazy. There is, perhaps, a theoretical chance that after stirring the coffee all the swirling atoms in your cup just happen to find themselves in the right place for this to occur, but this chance is astronomically small.

Our messy desk is proof of the second law of thermodynamics...

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's *entropy*, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
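As a small illustration of this idea, here is a sketch (not from the original article) of Shannon's formula, which assigns an entropy of $H = -\sum_i p_i \log_2 p_i$ bits to a source that emits symbol $i$ with probability $p_i$:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    Terms with p = 0 contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally surprising: each toss carries exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information
# (roughly 0.47 bits here) and its outcomes can be compressed further.
print(shannon_entropy([0.9, 0.1]))
```

The fair coin needs a full bit per toss, while a run of biased-coin tosses can be encoded in fewer bits on average, which is exactly the sense in which entropy gives the minimal encoding length.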

Can you measure information? It's a tricky question — but people have tried and come up with very interesting ideas.

To create energy from information you would need to break the second law of thermodynamics — that's impossible in the real world, but could theories that do break it shed light on why nature is the way it is?

In the latest online poll of our *Information about information* project you told us that you'd like an answer to this question. We asked Seth Lloyd, an expert on information at the Massachusetts Institute of Technology, and here is his answer. We also bring you two related articles from FQXi, our partners on this project. Happy reading!

Fields medallist Cédric Villani talks to us about our solar system, chaos, and what it's like being a mathematical superstar.