Content about “information”

Article

Information: Baby steps

If I tell you that it's Monday today, then you know it's not any of the other six days of the week. Perhaps the information content of my statement should be measured in terms of the number of other possibilities it excludes? Back in the 1920s this consideration led to a very simple formula to measure information.
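A minimal sketch of that idea in Python, assuming the 1920s formula in question is Hartley's measure, the base-2 logarithm of the number of equally likely possibilities (the function name is just illustrative):

    import math

    def hartley_information(num_possibilities: int) -> float:
        """Information, in bits, gained when one of num_possibilities
        equally likely alternatives is singled out."""
        return math.log2(num_possibilities)

    # "It's Monday": one day singled out from seven possible days.
    print(hartley_information(7))   # about 2.81 bits
    # A fair coin toss: one of two possibilities.
    print(hartley_information(2))   # exactly 1 bit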
Article

Information is sophistication

Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness meaningless? Shouldn't a measure of information assign a low value to it? The concept of sophistication addresses this question.
Article

Information is surprise

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise?
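One standard way to make "surprise" precise is Shannon's surprisal, the negative base-2 logarithm of an event's probability; whether the article uses exactly this formulation is an assumption here. A minimal Python sketch:

    import math

    def surprisal(probability: float) -> float:
        """Surprise of an event, in bits: the rarer the event, the larger the value."""
        return math.log2(1 / probability)

    print(surprisal(1.0))      # 0 bits: telling you something you already know
    print(surprisal(0.5))      # 1 bit: the outcome of a fair coin toss
    print(surprisal(1 / 365))  # about 8.5 bits: guessing someone's birthday in one go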

Article

Information is complexity

There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
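Kolmogorov complexity itself is uncomputable, but the length of a compressed encoding gives a crude upper bound on it. A minimal Python sketch using zlib as a stand-in (an illustration of the idea, not anything taken from the article):

    import random
    import zlib

    def compressed_length(s: str) -> int:
        """Length of a zlib-compressed encoding of s: a crude, computable
        upper bound on (not equal to) its Kolmogorov complexity."""
        return len(zlib.compress(s.encode("utf-8")))

    regular = "ab" * 500                                       # highly patterned string
    random.seed(0)
    noisy = "".join(random.choice("ab") for _ in range(1000))  # looks random

    print(len(regular), compressed_length(regular))  # 1000 characters, far fewer bytes
    print(len(noisy), compressed_length(noisy))      # 1000 characters, compresses much less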
Article

Information is noisy

When you transmit information long-distance there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors.
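The simplest scheme of this kind is a three-fold repetition code decoded by majority vote; it is far cruder than the clever codes the article has in mind, but it shows the principle. A minimal Python sketch:

    import random

    def encode(bits):
        """Repetition code: transmit every bit three times."""
        return [b for b in bits for _ in range(3)]

    def decode(received):
        """Majority vote over each group of three received bits."""
        return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

    def noisy_channel(bits, flip_probability=0.05):
        """Flip each transmitted bit independently with the given probability."""
        return [b ^ (random.random() < flip_probability) for b in bits]

    random.seed(1)
    message = [random.randint(0, 1) for _ in range(20)]
    received = decode(noisy_channel(encode(message)))
    print(message == received)  # usually True: a single flip within a triple is corrected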
Article

Information is bits

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
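Shannon's entropy is H = -sum of p_i * log2(p_i) over the possible symbols, and it comes out in bits. A minimal Python sketch (the distributions are made up for illustration):

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy in bits: the minimal average number of bits per
        symbol needed to encode messages drawn from this distribution."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
    print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equally likely symbols
    print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits: a heavily biased coin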
Collection

Can you measure information?

Can you measure information? It's a tricky question — but people have tried and come up with very interesting ideas.
Collection

Are there limits to information?

With recent advances in information technology it seems that there is no limit to how much smaller and better computer chips can get. But is this really true?
Article

The limits of information

Why there is a limit to how much better computer chips can get and what it's got to do with black holes.
Article

What is information?

Books, brains, computers — information comes in many guises. But what exactly is information?
Article

Why did nature choose quantum theory?

To create energy from information you would need to break the second law of thermodynamics — that's impossible in the real world, but could theories that do break it shed light on why nature is the way it is?