Information theory

There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
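Kolmogorov complexity itself turns out to be uncomputable, but a hedged, computable stand-in for the "shortest description" idea is the length of a compressed version of a string: a highly patterned string compresses to far fewer bytes than a random-looking one. A toy sketch in Python (the variable names and example strings are invented for illustration):

```python
import random
import string
import zlib

def compressed_size(s):
    # Bytes needed by zlib at maximum compression: a crude, computable
    # upper bound on the length of a description of s.
    return len(zlib.compress(s.encode("utf-8"), 9))

repetitive = "ab" * 500  # 1000 characters, but describable by a very short rule
rng = random.Random(0)
random_ish = "".join(rng.choice(string.ascii_lowercase) for _ in range(1000))

# The patterned string compresses to a few dozen bytes; the random-looking
# one stays close to its original length.
print(compressed_size(repetitive), compressed_size(random_ish))
```

Compression length only ever gives an upper bound: a string a compressor can't shrink might still have a short description the compressor doesn't know about.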

When you transmit information long-distance there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which can make the error rate as small as you like, even when your communication channel is prone to errors, as long as you don't try to send information faster than the channel's capacity allows.
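The simplest such encoding is the repetition code: send each bit three times and take a majority vote at the receiving end, so a single flipped copy can't corrupt the bit. A minimal Python sketch (the function names, the 5% flip probability and the message length are illustrative choices, not from the article):

```python
import random

def encode(bits, n=3):
    # Repetition code: send each bit n times.
    return [bit for bit in bits for _ in range(n)]

def transmit(bits, flip_prob, rng):
    # Noisy channel: flip each bit independently with probability flip_prob.
    return [bit ^ (rng.random() < flip_prob) for bit in bits]

def decode(received, n=3):
    # Majority vote over each block of n received bits.
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]

# Send the raw message and the encoded message through the same kind of channel.
raw_received = transmit(message, 0.05, rng)
coded_received = decode(transmit(encode(message), 0.05, rng))

raw_errors = sum(a != b for a, b in zip(message, raw_received))
coded_errors = sum(a != b for a, b in zip(message, coded_received))
print(raw_errors, coded_errors)
```

The coded transmission makes far fewer errors, at the price of sending three times as much data; the clever codes alluded to above (and analysed by Shannon) achieve low error rates much more economically.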

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimum average number of bits per symbol you need to encode a piece of information.

Can you measure information? It's a tricky question — but people have tried and come up with very interesting ideas.

How to approximate the English language using maths.
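One classic approach, going back to Shannon, is to model text statistically: record which character tends to follow each short context in a sample of English, then generate new text from those statistics, producing increasingly English-looking gibberish as the context grows. A toy character-level sketch in Python (the corpus and function names are invented for illustration):

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    # Record which character follows each length-`order` context.
    followers = defaultdict(list)
    for i in range(len(text) - order):
        followers[text[i:i + order]].append(text[i + order])
    return followers

def generate(model, context, length, rng):
    # Grow the text one character at a time, always conditioning
    # on the most recent characters.
    order = len(context)
    out = context
    for _ in range(length):
        options = model.get(out[-order:])
        if not options:
            break  # context never seen in the corpus
        out += rng.choice(options)
    return out

corpus = ("the theory of information treats the transmission of information "
          "as the transmission of choices from a set of possible messages")
model = build_model(corpus)
sample = generate(model, "th", 60, random.Random(1))
print(sample)
```

With a real corpus and a longer context, the output drifts from random letter soup towards plausible-looking English words and phrases.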

Why there is a limit to how much better computer chips can get and what it's got to do with black holes.

Books, brains, computers — information comes in many guises. But what exactly is information?

There's no doubt that information is power, but could it be converted into physical energy you could heat a room with or run a machine on? In the 19th century James Clerk Maxwell invented a hypothetical being — a "demon" — that seemed to be able to do just that. The problem was that the little devil blatantly contravened the laws of physics. What is Maxwell's demon and how was it resolved?