# Information about information

A brief introduction to how information can be measured.

If I tell you that it's Monday today, then you know it's not any of the other six days of the week. Perhaps the information content of my statement should be measured in terms of the number of possibilities it excludes? Back in the 1920s this consideration led to a very simple formula to measure information.
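
The 1920s formula alluded to here is Hartley's: count the equally likely possibilities and take a logarithm, so that ruling out more possibilities means more information. A minimal sketch in Python (base 2 gives the answer in bits; the seven days are the example from the paragraph above):

```python
import math

def hartley_information(num_possibilities: int) -> float:
    """Information gained, in bits, when one of `num_possibilities`
    equally likely outcomes is revealed (Hartley's measure, log2 of N)."""
    return math.log2(num_possibilities)

# "It's Monday" singles out one day of the week and excludes the other six.
print(hartley_information(7))  # ~2.81 bits
# A fair coin flip, with only two possibilities, carries exactly 1 bit.
print(hartley_information(2))  # 1.0
```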

Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness essentially meaningless? Should a measure of information assign a low value to it? The concept of sophistication addresses this question.

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.
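
One way to make "information as surprise" concrete: an outcome with probability p carries a surprise of -log2(p) bits, so rare events are more informative than common ones, and Shannon's entropy is the average surprise over a whole distribution. A small sketch (the forecast probabilities below are invented purely for illustration):

```python
import math

def surprise(p: float) -> float:
    """Self-information in bits: the rarer the outcome, the bigger the surprise."""
    return -math.log2(p)

def entropy(probabilities) -> float:
    """Shannon entropy: the expected surprise of a distribution, in bits."""
    return sum(p * surprise(p) for p in probabilities if p > 0)

# An invented forecast: being told "rain" is barely news, "snow" is a big surprise.
forecast = {"rain": 0.7, "sun": 0.2, "snow": 0.1}
for outcome, p in forecast.items():
    print(f"{outcome}: {surprise(p):.2f} bits")
print(f"average surprise (entropy): {entropy(forecast.values()):.2f} bits")
```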

There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
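
Kolmogorov complexity, the length of the shortest program that produces a string, cannot be computed exactly, but an off-the-shelf compressor gives a rough stand-in for "the shortest way of expressing it". In the sketch below (the two strings are invented for illustration), a string with an obvious pattern compresses to far fewer bytes than a patternless one of the same length:

```python
import random
import zlib

def description_length(s: str) -> int:
    """Bytes needed by zlib to describe the string: a crude upper bound
    on the length of its shortest description."""
    return len(zlib.compress(s.encode("utf-8")))

patterned = "ab" * 500  # fully captured by the rule: repeat "ab" 500 times
random.seed(0)
patternless = "".join(random.choice("ab") for _ in range(1000))  # no short rule to exploit

print(len(patterned), description_length(patterned))      # 1000 characters, compresses to a handful of bytes
print(len(patternless), description_length(patternless))  # 1000 characters, compresses far less
```
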
When you transmit information long-distance there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors. |
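
The simplest such encoding is a repetition code: send every bit three times and let the receiver take a majority vote, so any single corrupted copy is outvoted. A minimal sketch of the idea (real schemes, such as Hamming or Reed-Solomon codes, achieve far better rates):

```python
def encode(bits):
    """Repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                      # the noisy channel flips one copy of the second bit
print(decode(sent) == message)   # True: the flipped bit is outvoted
```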

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's measure of information also tells you the smallest number of bits you need, on average, to encode a message.
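
One way to see the connection between entropy and bits is to build a Huffman code, which assigns short bit strings to common symbols and long ones to rare symbols: its average length per symbol never beats the entropy and stays within one bit of it. A sketch under that framing (the word "abracadabra" is just an example source):

```python
import heapq
import math
from collections import Counter

def huffman_code_lengths(freqs):
    """Code length in bits for each symbol of a Huffman code built from frequencies."""
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol in them one level deeper.
        merged = {sym: depth + 1 for sym, depth in {**left, **right}.items()}
        heapq.heappush(heap, (f1 + f2, next_id, merged))
        next_id += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
lengths = huffman_code_lengths(freqs)
avg_bits = sum(freqs[s] * lengths[s] for s in freqs) / len(text)
entropy = -sum(f / len(text) * math.log2(f / len(text)) for f in freqs.values())
print(f"Huffman code:    {avg_bits:.3f} bits/symbol")  # ~2.09
print(f"Shannon entropy: {entropy:.3f} bits/symbol")   # ~2.04, the unbeatable lower bound
```
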
Can you measure information? It's a tricky question — but people have tried and come up with very interesting ideas. |

With recent advances in information technology it seems that there is no limit to how much smaller and better computer chips can get. But is this really true? |
Find out why there is a limit to how much better computer chips can get, and what it has to do with black holes.