Computers represent information using bits: 0s and 1s. It turns out that Claude Shannon's entropy, a concept invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
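As a rough illustration, here is a minimal Python sketch (the function name and the example distributions are made up for this sketch): the entropy H = -Σ p log₂(p) of a source gives the average number of bits per symbol that the best possible lossless encoding needs.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin genuinely needs 1 bit per toss...
print(shannon_entropy([0.5, 0.5]))  # 1.0
# ...but a heavily biased coin can be compressed to under half a bit per toss.
print(shannon_entropy([0.9, 0.1]))  # ~0.47
```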
Play with our applets to explore the conic sections and their different definitions.
A Klein bottle can't hold any liquid because it doesn't have an inside. How do you construct this strange thing and why would you want to?
A little trigonometry helps to find the relative distances of the Sun and the Moon from the Earth.
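One classic route, due to Aristarchus, goes like this: when the Moon appears exactly half full, the Earth-Moon-Sun triangle has a right angle at the Moon, so measuring the angle between the Moon and the Sun in the sky fixes the shape of the triangle. A minimal sketch in Python (the 87° figure is Aristarchus's historical estimate, not the modern value):

```python
import math

# At half Moon the Earth-Moon-Sun triangle has a right angle at the Moon.
# The Moon-Sun angle seen from Earth then determines the distance ratio:
# distance_to_sun / distance_to_moon = 1 / cos(angle).
angle_deg = 87  # Aristarchus's estimate; the true value is closer to 89.85
ratio = 1 / math.cos(math.radians(angle_deg))
print(f"The Sun is about {ratio:.0f} times further away than the Moon")  # ~19
```

With the modern angle of about 89.85°, the same formula gives a ratio of roughly 390, which shows how sensitive the method is to the measured angle.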
Why there is a limit to how much better computer chips can get, and what it's got to do with black holes.
An impossible equation, two tragic heroes and the mathematical study of symmetry.
If you like to have your mind blown, cosmology is a great field to go into. But is it science?
Why cosmologists worry about isolated brains that randomly fluctuate into existence.