Articles

There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
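Kolmogorov complexity itself can't be computed exactly, but a real-world compressor gives a rough upper bound: the shorter the compressed output, the shorter the description. A minimal Python sketch, using zlib as a stand-in compressor (the exact byte counts will vary):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    # The length of the zlib-compressed data is a computable
    # upper bound on its Kolmogorov complexity.
    return len(zlib.compress(data, level=9))

patterned = b"ab" * 500          # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)    # 1000 random bytes: almost surely no short description

print(compressed_size(patterned))   # small: the repetition compresses away
print(compressed_size(random_ish))  # close to 1000: randomness is incompressible
```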

Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness, in a sense, meaningless? Should a measure of information assign a low value to it? The concept of sophistication addresses this question.

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.
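Shannon made "surprise" precise: an event with probability p carries -log2(p) bits of information, so rare events are highly informative and certainties carry nothing. A quick illustration in Python:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon's surprisal: an event of probability p carries -log2(p) bits."""
    return -math.log2(p)

print(surprisal_bits(0.5))    # 1.0 bit: a fair coin flip
print(surprisal_bits(1.0))    # 0.0 bits: no surprise, no information
print(surprisal_bits(0.001))  # ~9.97 bits: a big surprise
```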

An idea called Occam's razor states that the simplest answer is always the best. But is this really true? Computer scientist Noson Yanofsky is trying to find out, applying Kolmogorov complexity to a branch of mathematics known as category theory.

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
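For a source emitting symbols with probabilities p_i, the entropy H = -sum p_i log2(p_i) is the minimum average number of bits per symbol that any lossless code can achieve. A small sketch (the distributions are made up for illustration):

```python
import math

def entropy_bits(probs):
    # Shannon entropy: the minimum average number of bits per
    # symbol needed by any lossless encoding of this source.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0: a fair coin needs one bit per flip
print(entropy_bits([0.25] * 4))  # 2.0: four equally likely symbols need two bits
print(entropy_bits([0.9, 0.1]))  # ~0.47: a biased coin compresses below one bit
```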

In the TV game show Two Tribes, teams can have unequal sizes. Is that fair?

Folding a piece of paper in half might be easy, but what about into thirds, fifths, or thirteenths? Here is a simple and exact way to fold any fraction, all thanks to the maths of triangles.
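The article's own construction isn't reproduced here, but one classic similar-triangles method goes like this: on a unit square, crease the main diagonal, then crease from the top-left corner to a mark at distance a along the bottom edge; the two creases cross at x = a/(a+1). Starting from the corner (a = 1) this generates 1/2, 1/3, 1/4 and so on, with no measuring. A sketch of the arithmetic:

```python
from fractions import Fraction

def next_mark(a: Fraction) -> Fraction:
    # The diagonal y = x and the crease from (0, 1) to (a, 0),
    # i.e. the line y = 1 - x/a, cross where x = a / (a + 1).
    return a / (a + 1)

mark = Fraction(1)  # start from the bottom-right corner: a = 1
for n in range(2, 14):
    mark = next_mark(mark)
    print(f"1/{n} sits at x = {mark}")  # 1/2, 1/3, ... up to 1/13
```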

Why the humble average can be grossly misleading.
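A single extreme value can pull the mean far away from anything typical, which is why the median is often the safer summary. A toy example, with made-up numbers:

```python
from statistics import mean, median

# Hypothetical incomes in a small town: one very rich resident
# drags the average far from what anyone typical earns.
incomes = [25_000, 27_000, 30_000, 32_000, 35_000, 2_000_000]

print(mean(incomes))    # ~358,167: the "average income" sounds affluent
print(median(incomes))  # 31,000: what a typical resident actually earns
```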