List by Author: Marianne Freiberger

Information is noisy
When you transmit information long-distance, there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors.
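The simplest illustration of such an encoding is a triple-repetition code: send each bit three times and take a majority vote at the other end. This is only a toy sketch of the general idea (real codes are far more efficient), not a scheme taken from the article:

```python
def encode(bits, n=3):
    """Repetition code: transmit each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    """Majority vote over each block of n copies recovers the original bit
    as long as fewer than half the copies in that block were flipped."""
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

message = [1, 0, 1]
sent = encode(message)   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[4] = 1              # the noisy channel flips one bit
assert decode(sent) == message  # a single flip per block is corrected
```

The price of this robustness is bandwidth: the message becomes three times longer, which is why cleverer codes matter.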
Information is bits
Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
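Shannon's entropy of a source with outcome probabilities p_i is H = -Σ p_i log2(p_i), measured in bits. A minimal sketch (the coin probabilities are illustrative, not from the article):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(entropy([0.5, 0.5]))            # 1.0
# A biased coin (90/10) carries less than half a bit per toss,
# so long runs of tosses can be compressed accordingly.
print(round(entropy([0.9, 0.1]), 3))  # 0.469
```

The second value is what makes compression possible: a source with entropy 0.469 bits per symbol can, on average, be encoded in about 0.469 bits per symbol rather than a full bit.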
What is infinity?
Take a trip into the never-ending.
Two-faced conic sections
Play with our applets to explore the conic sections and their different definitions.
Introducing the Klein bottle
A Klein bottle can't hold any liquid because it doesn't have an inside. How do you construct this strange thing and why would you want to?
The Sun, the Moon and trigonometry
A little trig helps to find the relative distance to the Sun and Moon.
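One classical way to do this (whether or not it is exactly the method the article describes) is Aristarchus's half-moon argument: when the Moon appears exactly half lit, the Earth-Moon-Sun angle is a right angle, so the ratio of the Sun's distance to the Moon's distance is 1/cos of the measured Sun-Moon elongation. A sketch using Aristarchus's historical estimate of 87 degrees:

```python
import math

# Aristarchus's ancient measurement of the Sun-Moon elongation at half moon.
angle_deg = 87.0

# Right triangle Earth-Moon-Sun: d_sun / d_moon = 1 / cos(elongation).
ratio = 1 / math.cos(math.radians(angle_deg))
print(round(ratio, 1))  # roughly 19: the Sun about 19 times farther than the Moon
```

The true elongation is closer to 89.85 degrees, which is why Aristarchus's answer of about 19 falls far short of the real ratio of roughly 390 — the geometry is sound, the measurement was hard.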
Ebola: Evidence from numbers
Why maths is an important tool in the fight against Ebola.
The limits of information
Why there is a limit to how much better computer chips can get, and what it's got to do with black holes.
Stubborn equations and the study of symmetry
An impossible equation, two tragic heroes and the mathematical study of symmetry.
The multiverse: Science or speculation?
If you like to have your mind blown, cosmology is a great field to go into. But is it science?
In a lower dimension
Could the world be simpler than our senses suggest?
Who made the laws of nature?
What gives an equation the right to call itself a law?