When you transmit information long-distance there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors.
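The simplest of these clever encodings is the repetition code: send each bit several times and take a majority vote at the other end. The sketch below (an illustration only; real channels use far more efficient schemes such as Hamming or Reed–Solomon codes) shows a 3-fold repetition code correcting a single flipped bit:

```python
# A minimal sketch of error correction with a 3-fold repetition code.
from collections import Counter

def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Recover each bit by majority vote over its three copies."""
    return [Counter(received[i:i+3]).most_common(1)[0][0]
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
corrupted = sent.copy()
corrupted[4] ^= 1                      # the channel flips one bit in transit
assert decode(corrupted) == message    # the single error is corrected
```

The price of this reliability is bandwidth: every message triples in length, which is why practical codes are designed to correct errors with far less redundancy.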
Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
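Shannon's formula is short enough to compute directly: for a source emitting symbols with probabilities \(p_i\), the entropy is \(H = -\sum_i p_i \log_2 p_i\) bits per symbol. A small sketch:

```python
# Shannon entropy: the minimal average number of bits per symbol
# needed to encode messages drawn from a probability distribution.
from math import log2

def entropy(probs):
    """H(p) = -sum of p_i * log2(p_i), skipping zero-probability symbols."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss...
assert entropy([0.5, 0.5]) == 1.0
# ...while a heavily biased coin carries less, so its outcomes
# can be compressed below one bit per toss on average.
assert entropy([0.9, 0.1]) < 1.0
```

The biased coin illustrates the point of the measure: the more predictable a source is, the fewer bits you need to describe what it does.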
An idea called Occam's razor states that the simplest answer is always the best. But is this really true? Computer scientist Noson Yanofsky is trying to find out, applying Kolmogorov complexity to a branch of mathematics known as category theory.
Folding a piece of paper in half might be easy, but what about into thirds, fifths, or thirteenths? Here is a simple and exact way to fold any fraction, all thanks to the maths of triangles.
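One well-known similar-triangles construction (assumed here for illustration) works like this: on a unit square, crease the diagonal from the top-left corner (0, 1) to the bottom-right corner (1, 0), then crease a line from the bottom-left corner (0, 0) through the point (1/n, 1) on the top edge. The creases y = 1 − x and y = nx cross at x = 1/(n + 1), so being able to fold 1/n gives you 1/(n + 1) exactly, and iterating reaches any fraction:

```python
# Sketch of the similar-triangles step: from a mark at x = 1/n,
# the crossing of the two creases lands at x = 1/(n + 1).
from fractions import Fraction

def next_fold(one_over_n):
    """Solve n*x = 1 - x, i.e. x / one_over_n = 1 - x, for the crossing point."""
    return one_over_n / (one_over_n + 1)

x = Fraction(1, 2)      # halving is the one fold everybody can do
for _ in range(3):      # 1/2 -> 1/3 -> 1/4 -> 1/5
    x = next_fold(x)
assert x == Fraction(1, 5)
```

Because every step is an exact geometric intersection rather than a guess, no error accumulates no matter how small the fraction gets.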
Can a mathematical model of zombies' movements allow the human race to survive impending doom?
Observers are, of course, vital in physics: we test our theories by comparing them to our observations. But in cosmology, as Jim Hartle explains, we could be one of many possible observers in the Universe and knowing which one we are is vital in testing our theories.
Play with our applets to explore the conic sections and their different definitions.