Article

Information is noisy

When you transmit information over a long distance, there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information that ensure a tiny error rate, even when your communication channel is prone to errors.
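The teaser doesn't say which encoding it has in mind, but the simplest illustration of the idea is a repetition code: send every bit three times and take a majority vote at the receiving end. A minimal Python sketch, with the channel's flip probability chosen purely for illustration:

```python
import random

def encode(bits, r=3):
    # Triple-repetition code: repeat each bit r times, e.g. 1 -> 1 1 1.
    return [b for b in bits for _ in range(r)]

def transmit(bits, flip_prob=0.1):
    # Noisy channel: each bit is flipped independently with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits, r=3):
    # Majority vote within each block of r copies recovers the original bit
    # unless more than half of the copies were corrupted.
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [random.randint(0, 1) for _ in range(10_000)]
received = decode(transmit(encode(message)))
errors = sum(m != d for m, d in zip(message, received))
print(f"residual error rate: {errors / len(message):.4f}")
```

With a channel that flips 10% of bits, majority voting over three copies brings the error rate down to roughly 2.8%; more sophisticated codes achieve far lower error rates at a far smaller cost in extra bits.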
Article

Information is bits

Computers represent information using bits: 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, gives the minimum number of bits you need to encode a piece of information.
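As a concrete illustration (not taken from the article itself), here is a short Python sketch that computes the entropy of the empirical symbol distribution of a string. The result is the average number of bits per symbol that any encoding of that source needs, and that an optimal code can approach:

```python
from collections import Counter
from math import log2

def entropy_bits(symbols):
    # Shannon entropy of the empirical distribution, in bits per symbol:
    # H = sum over symbols of -p * log2(p).
    counts = Counter(symbols)
    total = len(symbols)
    return sum(-(c / total) * log2(c / total) for c in counts.values())

print(entropy_bits("ABAB" * 100))  # two equally likely symbols: 1.0 bit per symbol
print(entropy_bits("AAAB" * 100))  # skewed distribution: about 0.81 bits per symbol
print(entropy_bits("A" * 400))     # a certain outcome carries 0 bits
```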
Article

Introducing the Klein bottle

A Klein bottle can't hold any liquid because it doesn't have an inside. How do you construct this strange thing and why would you want to?
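A preview of the standard construction (not spelled out in the teaser itself): take a square and glue opposite edges, one pair straight and the other pair with a flip.

```latex
K \;=\; [0,1]^2 / \sim \,, \qquad
(x,0) \sim (x,1), \qquad
(0,y) \sim (1,\, 1-y).
```

Gluing both pairs straight would give a torus instead; the single reversed gluing makes the Klein bottle non-orientable, which is why it has no inside and cannot sit in three-dimensional space without passing through itself.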
Article

The limits of information

Why there is a limit to how much better computer chips can get, and what it has to do with black holes.