## entropy

What would you think if the nice café latte in your cup suddenly separated itself out into one half containing just milk and the other containing just coffee? Probably that you, or the world, have just gone crazy. There is, perhaps, a theoretical chance that after stirring the coffee all the swirling atoms in your cup just happen to find themselves in the right place for this to occur, but this chance is astronomically small.

In memory of Stephen Hawking we look at the equation he was most proud of.

Fundamental physics says time is symmetric - so why does time move forwards for us in a block universe?

Is time real? Are we just puppets living out a future already written? Marina Cortês explains why she thinks time is fundamental and that we don't live in a block universe.

Marina Cortês is one of a growing number of physicists who believe time is fundamental. We ask her about the alternative theories to the block universe, in which time comes first.

Our messy desk is proof of the second law of thermodynamics...

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the 20th century.
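The idea of measuring information as surprise can be sketched in a few lines of Python. The `surprisal` function below is an illustrative helper (not from the original article): an event with probability $p$ carries $-\log_2 p$ bits of information, so a certain event tells you nothing and a rare one tells you a lot.

```python
import math

def surprisal(p):
    """Information content (in bits) of an event with probability p.
    Rarer events are more surprising, so they carry more information."""
    return -math.log2(p)

# Something you already know (probability 1) is not informative at all...
print(surprisal(1.0))    # 0.0 bits
# ...while a 1-in-8 event carries 3 bits of surprise.
print(surprisal(1 / 8))  # 3.0 bits
```

Halving the probability of an event adds exactly one bit to its surprisal, which is why the logarithm is the natural choice here.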

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's *entropy*, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.

Can you measure information? It's a tricky question — but people have tried and come up with very interesting ideas.