Author: Marianne Freiberger
Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information. 
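As a quick illustration of the idea behind that article, here is a minimal sketch (not from the article itself) of Shannon entropy computed over a string's empirical symbol frequencies — the result is in bits per symbol:

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Entropy in bits per symbol of the message's empirical distribution."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A string of fair-coin flips carries one bit per symbol:
print(shannon_entropy("0101010011"))  # 1.0

# A perfectly predictable string carries no information at all:
print(shannon_entropy("aaaa"))  # 0.0
```

The function name and the example strings are illustrative choices, not anything specified in the article.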
When you transmit information over long distances there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors.
There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity. 
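Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough, computable stand-in for "the shortest way of expressing" a string — a sketch of the intuition, not the formal definition:

```python
import random
import zlib

def compressed_length(s):
    """Length in bytes of s after zlib compression -- a rough,
    computable proxy for the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(s.encode()))

patterned = "ab" * 500  # 1000 characters with an obvious short description
random.seed(0)
scrambled = "".join(random.choice("ab") for _ in range(1000))  # 1000 random-looking characters

# The regular string compresses far better than the random-looking one:
print(compressed_length(patterned), compressed_length(scrambled))
```

The patterned string compresses to a handful of bytes ("repeat 'ab' 500 times"), while the scrambled one resists compression — mirroring how Kolmogorov complexity assigns low values to regular strings and high values to random ones.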
Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness essentially meaningless? Should a measure of information assign a low value to it? The concept of sophistication addresses this question. 
Two mathematicians' visit to the desert sheds new light on avalanches. 

The London Mathematical Society starts its 150th anniversary year with a bang. 
Play with our applets to explore the conic sections and their different definitions. 
A Klein bottle can't hold any liquid because it doesn't have an inside. How do you construct this strange thing and why would you want to? 
The company 23andMe made headlines by launching its DNA testing service in the UK. But how are the risks of developing a disease calculated? 