There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
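Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives an upper bound on it: the compressed size is the length of one particular short description. A minimal sketch of this idea in Python, using `zlib` as the (assumed) compressor, shows that a highly regular string has a far shorter description than a pseudo-random one of the same length:

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Approximate description length: size of a zlib-compressed copy.

    This is only an upper bound on Kolmogorov complexity -- the true
    shortest description could be shorter still.
    """
    return len(zlib.compress(data, 9))

# A very regular string: "repeat 'ab' 5000 times" describes it completely.
regular = b"ab" * 5000

# A pseudo-random string of the same length: no obvious shorter description.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10000))

print(description_length(regular))  # far smaller than 10000
print(description_length(noisy))    # close to 10000
```

The gap between the two compressed sizes is exactly the intuition behind the measure: the regular string carries much less information than its raw length suggests.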

When you transmit information long-distance there is always a chance that some of it gets corrupted along the way. Luckily, there are clever ways of encoding information that keep the error rate tiny, even when your communication channel is prone to errors.
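The simplest such encoding is the repetition code: send every bit three times and take a majority vote at the other end, which corrects any single flipped bit per group. A minimal sketch in Python (the function names are illustrative, not from any particular library):

```python
def encode(bits):
    """Repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three.

    Any single flipped bit within a group is outvoted by the
    two surviving copies, so the original bit is recovered.
    """
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1  # the channel corrupts one transmitted bit
assert decode(sent) == message  # the error is corrected
```

Real codes used in practice, such as Hamming or Reed-Solomon codes, achieve the same protection with far less redundancy than tripling every bit.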

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
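Shannon's entropy has a short formula: for a source emitting symbols with probabilities p₁, …, pₙ, it is H = −Σ pᵢ log₂ pᵢ, measured in bits per symbol. A minimal sketch in Python makes the "minimal number of bits" reading concrete:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    This is the minimal average number of bits per symbol
    needed to encode the source. Terms with p = 0 contribute
    nothing and are skipped.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))      # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))      # biased coin: about 0.47 bits
print(entropy([0.25] * 4))      # four equally likely symbols: 2.0 bits
```

A fair coin needs a full bit per toss, while a heavily biased coin carries less than half a bit of information per toss, so a clever encoding can, on average, record its outcomes in fewer bits.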

Take a trip into the never-ending.

Two mathematicians' visit to the desert sheds new light on avalanches.

The London Mathematical Society starts its 150th anniversary year with a bang.

Play with our applets to explore the conic sections and their different definitions.

A Klein bottle can't hold any liquid because it doesn't have an inside. How do you construct this strange thing and why would you want to?