Articles

When you transmit information over a long distance, there is always a chance that some of it gets mangled and arrives corrupted at the other end. Luckily, there are clever ways of encoding information that keep the error rate tiny, even when your communication channel is prone to errors.
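To give a flavour of how such encodings work (a minimal sketch, not necessarily one of the codes the article covers), the simplest scheme is a three-fold repetition code: send every bit three times and let the receiver take a majority vote. If the channel flips each bit with probability p, the decoded error rate drops to roughly 3p² per bit.

```python
import random

def transmit(bits, flip_prob=0.1):
    """Simulate a noisy channel that flips each bit with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def encode(bits):
    """Repetition code: send each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [random.randint(0, 1) for _ in range(10000)]

# Without coding: roughly 10% of bits arrive wrong.
raw_errors = sum(a != b for a, b in zip(message, transmit(message)))

# With the repetition code: error rate falls to roughly 3 * 0.1**2 = 3%.
coded_errors = sum(a != b for a, b in zip(message, decode(transmit(encode(message)))))

print(f"uncoded error rate: {raw_errors / len(message):.3f}")
print(f"coded error rate:   {coded_errors / len(message):.3f}")
```

Real codes, such as Hamming or Reed-Solomon codes, achieve far better trade-offs between redundancy and error rate, but the repetition code already shows the basic idea.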

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, pins down the minimal number of bits you need to encode a piece of information.
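As a rough illustration of that claim (a sketch, not taken from the article itself): a source that produces symbols with probabilities p_1, ..., p_n has entropy H = -(p_1 log2 p_1 + ... + p_n log2 p_n) bits, and no encoding can use fewer bits per symbol on average. The snippet below computes it for a few simple sources.

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin needs a full bit per toss...
print(entropy([0.5, 0.5]))   # 1.0

# ...but a heavily biased coin carries much less information per toss,
# so long runs of tosses can be compressed below one bit per toss.
print(entropy([0.9, 0.1]))   # about 0.469

# Four equally likely symbols need two bits each.
print(entropy([0.25] * 4))   # 2.0
```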

In the TV game show Two Tribes teams can have unequal sizes. Is that fair?

Folding a piece of paper in half might be easy, but what about into thirds, fifths, or thirteenths? Here is a simple and exact way to fold any fraction, all thanks to the maths of triangles.
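One well-known construction of this kind (sketched here as an illustration; it may not be the exact method in the article) works on a unit square: fold the diagonal, then fold a crease from the top-left corner to a point 1/n of the way along the bottom edge. Similar triangles show the two creases meet a fraction 1/(n+1) of the way across, so from the easy fold 1/2 you can reach 1/3, then 1/4, and so on.

```latex
% Coordinates on the unit square with corners (0,0), (1,0), (1,1), (0,1).
\begin{aligned}
\text{crease 1 (the diagonal):}\quad & y = x,\\
\text{crease 2, from } (0,1) \text{ to } (\tfrac{1}{n},0):\quad & y = 1 - nx,\\
\text{intersection:}\quad & x = 1 - nx \;\Longrightarrow\; x = \tfrac{1}{n+1}.
\end{aligned}
```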

Why the humble average can be grossly misleading.

How to approximate the English language using maths.
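One classic way of doing this, and quite possibly the one the article describes, is Claude Shannon's n-gram approximation: tally how often each short block of letters is followed by each possible next letter in a sample of English, then generate fresh text from those statistics. A minimal sketch, assuming a plain-text sample file (the filename and the order 3 are placeholders):

```python
import random
from collections import defaultdict

def build_model(text, order=3):
    """Count, for each block of `order` letters, how often each next character follows it."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(text) - order):
        context, nxt = text[i:i + order], text[i + order]
        counts[context][nxt] += 1
    return counts

def generate(model, seed, length=200):
    """Grow text by repeatedly sampling the next character given the last `order` characters."""
    order = len(seed)
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:
            break
        chars, weights = zip(*followers.items())
        out += random.choices(chars, weights=weights)[0]
    return out

# Any large chunk of English text will do as training data (placeholder filename).
sample = open("sample_english.txt").read().lower()
model = build_model(sample, order=3)
print(generate(model, seed="the"))
```

With order 0 you get random letters; raising the order produces text that looks increasingly English-like, even though the program knows nothing about grammar or meaning.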