Computers represent information using bits: 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, gives the minimal number of bits you need to encode a piece of information.
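As a rough illustration (the standard definition, not the article's own example): a source emitting symbols with probabilities p_1, ..., p_n has entropy H = -(p_1 log2 p_1 + ... + p_n log2 p_n) bits per symbol. A minimal Python sketch, with a hypothetical helper name shannon_entropy:

```python
from math import log2

def shannon_entropy(probabilities):
    """Entropy H = -sum(p * log2(p)) in bits per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin needs a full bit per toss; a biased coin needs less,
# so its tosses can on average be compressed below one bit each.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47
```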

Take a trip into the never-ending.

Two mathematicians' visit to the desert sheds new light on avalanches.

The London Mathematical Society starts its 150th anniversary year with a bang.

Play with our applets to explore the conic sections and their different definitions.
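The applets themselves aren't reproduced here, but one unifying definition they illustrate fits in a few lines: with a focus at the origin, every conic section is r(θ) = l / (1 + e cos θ), and only the eccentricity e changes. A Python/matplotlib sketch (the semi-latus rectum l = 1 and the cutoff r < 10 are arbitrary choices made for plotting):

```python
import numpy as np
import matplotlib.pyplot as plt

# Focus-directrix form with the focus at the origin:
#   r(theta) = l / (1 + e*cos(theta)),  semi-latus rectum l = 1 here.
# e = 0 gives a circle, 0 < e < 1 an ellipse, e = 1 a parabola,
# and e > 1 one branch of a hyperbola.
theta = np.linspace(-np.pi, np.pi, 1001)
for e, name in [(0.0, "circle"), (0.5, "ellipse"),
                (1.0, "parabola"), (1.5, "hyperbola")]:
    denom = 1 + e * np.cos(theta)
    mask = denom > 0.1        # keep r positive and finite (r < 10)
    r = 1.0 / denom[mask]
    plt.plot(r * np.cos(theta[mask]), r * np.sin(theta[mask]), label=name)
plt.gca().set_aspect("equal")
plt.legend()
plt.show()
```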

A Klein bottle can't hold any liquid because it doesn't have an inside. How do you construct this strange thing and why would you want to?
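The article's own construction isn't shown here, but one standard recipe for drawing the shape is the so-called figure-8 immersion: the Klein bottle can't be embedded in three dimensions without passing through itself, and this parametrisation makes that self-intersection explicit. A Python/matplotlib sketch (the radius r = 2 and the grid resolution are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt

# Figure-8 immersion of the Klein bottle: sweep a figure-8 cross
# section around a circle of radius r, giving it a half-twist on
# the way round so inside and outside join up.
r = 2.0
u, v = np.meshgrid(np.linspace(0, 2 * np.pi, 100),
                   np.linspace(0, 2 * np.pi, 100))
w = r + np.cos(u / 2) * np.sin(v) - np.sin(u / 2) * np.sin(2 * v)
x = w * np.cos(u)
y = w * np.sin(u)
z = np.sin(u / 2) * np.sin(v) + np.cos(u / 2) * np.sin(2 * v)

ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(x, y, z, cmap="viridis")
plt.show()
```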

The company 23andMe made headlines by launching its DNA testing service in the UK. But how are the risks of developing a disease calculated?
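The article's actual method isn't reproduced here, but a common simplified model in consumer genetics starts from the average population risk and multiplies in an odds ratio for each genetic variant, assuming the variants act independently. A hedged Python sketch (the function name and the numbers are made up for illustration):

```python
def absolute_risk(population_risk, odds_ratios):
    """Combine an average population risk with per-variant odds
    ratios under a (hypothetical) independent multiplicative model,
    working on the odds scale so the result stays below 1."""
    odds = population_risk / (1 - population_risk)
    for ratio in odds_ratios:
        odds *= ratio
    return odds / (1 + odds)

# E.g. a 10% baseline risk and two variants with odds ratios 1.3
# (risk-raising) and 0.8 (protective):
print(absolute_risk(0.10, [1.3, 0.8]))  # ~0.104
```

Working on the odds scale keeps the combined figure below 1, which a naive multiplication of probabilities would not guarantee.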

A little trig helps to find the relative distances to the Sun and the Moon.
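One classical route, usually credited to Aristarchus, goes like this: when the Moon is exactly half full, the Earth-Moon-Sun angle is a right angle, so measuring the Moon-Earth-Sun angle α gives cos α = (distance to Moon)/(distance to Sun). Whether the article follows exactly this route is an assumption; a minimal Python sketch:

```python
from math import cos, radians

def sun_to_moon_distance_ratio(alpha_degrees):
    """At half moon the Earth-Moon-Sun angle is 90 degrees, so the
    Moon-Earth-Sun angle alpha satisfies cos(alpha) = d_moon / d_sun."""
    return 1 / cos(radians(alpha_degrees))

print(sun_to_moon_distance_ratio(87.0))   # Aristarchus' estimate: ~19
print(sun_to_moon_distance_ratio(89.85))  # modern angle: ~382
```

The method is exquisitely sensitive to the measured angle: Aristarchus' 87° puts the Sun about 19 times farther away than the Moon, while the modern value of roughly 89.85° puts it a few hundred times farther.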