Articles

Computers represent information using bits: 0s and 1s. It turns out that Claude Shannon's entropy, a concept invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
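To see the idea in action, here is a minimal Python sketch (an illustration, not taken from the article) that computes the entropy of a message from its empirical symbol frequencies: a repetitive message needs far fewer bits per symbol than one in which every symbol is equally likely.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Entropy in bits per symbol, using the empirical symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A highly repetitive message needs far fewer bits per symbol
# than one where every symbol is equally likely.
print(shannon_entropy("aaaaaaab"))   # ~0.54 bits per symbol
print(shannon_entropy("abcdefgh"))   # 3.0 bits per symbol
```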

When you transmit information long-distance there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors.
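As a toy illustration (almost certainly not the codes the article discusses), the simplest such scheme is a repetition code: send every bit three times and decode by majority vote. Even over a channel that flips 5% of the bits, the residual error rate drops well below 1%.

```python
import random

def encode(bits):
    """Triple each bit: the simplest (and least efficient) error-correcting code."""
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits, flip_prob=0.05):
    """Flip each transmitted bit independently with probability flip_prob."""
    return [bit ^ (random.random() < flip_prob) for bit in bits]

def decode(bits):
    """Majority vote over each block of three received bits."""
    return [int(sum(bits[i:i+3]) >= 2) for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(10_000)]
received = decode(noisy_channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual error rate: {errors / len(message):.4%}")  # typically well under 1%
```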

Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness essentially meaningless? Shouldn't a measure of information assign a low value to it? The concept of sophistication addresses this question.
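Kolmogorov complexity itself is uncomputable, but the length of a compressed file gives a crude, computable upper-bound proxy for it. The Python sketch below (an illustration only, not the article's definition of sophistication) shows why random-looking strings score so highly: they barely compress at all, while highly patterned ones shrink to almost nothing.

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length after zlib compression: a crude, computable upper-bound
    stand-in for Kolmogorov complexity (which itself is uncomputable)."""
    return len(zlib.compress(data, level=9))

structured = b"ab" * 50_000          # highly patterned: a short program could print it
random_like = os.urandom(100_000)    # essentially incompressible

print(compressed_size(structured))   # a few hundred bytes
print(compressed_size(random_like))  # close to 100,000 bytes: it barely compresses
```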

In the TV game show Two Tribes, teams can have unequal sizes. Is that fair?

Folding a piece of paper in half might be easy, but what about into thirds, fifths, or thirteenths? Here is a simple and exact way to fold any fraction, all thanks to the maths of triangles.
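As a taster of the kind of triangle geometry involved (this may not be the article's exact construction), here is one well-known crossing-creases argument. Take a unit square with corners (0,0), (1,0), (1,1) and (0,1). Crease the diagonal from (0,1) to (1,0), then crease from (0,0) through a mark at the point (1/n, 1) on the top edge.

```latex
\[
  \text{diagonal: } y = 1 - x, \qquad
  \text{crease through } (0,0) \text{ and } \bigl(\tfrac{1}{n},\,1\bigr):\; y = n x .
\]
\[
  n x = 1 - x
  \quad\Longrightarrow\quad
  x = \frac{1}{n+1}.
\]
```

Starting from the easy fold n = 2 (the midpoint of the top edge), the crossing point sits exactly a third of the way along the bottom edge; repeating the construction gives quarters, fifths, and eventually thirteenths or any other fraction you like.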

Why the humble average can be grossly misleading.
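A hypothetical example (the numbers are made up, not from the article): in a workplace of ten people where one person earns vastly more than the rest, the mean salary says very little about what a typical employee takes home.

```python
from statistics import mean, median

# Hypothetical salaries in thousands: nine modest incomes and one outlier.
salaries = [22, 24, 25, 25, 26, 27, 28, 29, 30, 900]

print(mean(salaries))    # 113.6 -- "the average employee earns 113k"?
print(median(salaries))  # 26.5  -- a much better picture of a typical income
```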