Articles

Will computers ever replace human mathematicians?

What are mathematical proofs, why do we need them and what can they say about sheep?

David Spiegelhalter's new book Sex by Numbers takes a statistical peek into the nation's bedrooms. In this interview he tells us some of his favourite stories from the book. Read the article or watch the video!

If I tell you that it's Monday today, then you know it's not any of the other six days of the week. Perhaps the information content of my statement should be measured in terms of the number of all the other possibilities it excludes? Back in the 1920s this consideration led to a very simple formula to measure information.
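The simple formula from the 1920s is Hartley's measure: the information gained from a message is the base-2 logarithm of the number of equally likely possibilities it rules down to one. As a minimal sketch (the function name is ours):

```python
import math

def hartley_information(num_possibilities):
    """Information, in bits, gained by learning which one of
    num_possibilities equally likely alternatives is the case."""
    return math.log2(num_possibilities)

# Learning today's day of the week singles out 1 of 7 possibilities:
print(hartley_information(7))  # ≈ 2.807 bits
```

Telling you it's Monday therefore conveys just under three bits of information, because it excludes the other six days.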

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.

Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
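Shannon's entropy averages the surprise of each possible outcome, weighted by its probability, and the result is the minimal average number of bits per symbol needed to encode the source. A small illustration (the function name is ours):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: -sum(p * log2(p)) over all outcomes.
    This is the minimal average number of bits per symbol needed
    to encode a source with these outcome probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally surprising: 1 full bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A heavily biased coin is mostly predictable, so it carries less
# information per toss and can be compressed below 1 bit on average.
print(shannon_entropy([0.9, 0.1]))  # ≈ 0.469
```

The more predictable the source, the lower its entropy and the fewer bits you need, which is exactly why unexpectedness and compressibility turn out to be two sides of the same coin.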