Kolmogorov complexity assigns a high value to strings of symbols that are essentially random. But isn't randomness meaningless? Shouldn't a measure of information assign it a low value instead? The concept of sophistication addresses this question.
There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
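The true Kolmogorov complexity of a string is uncomputable, but the idea of "shortest way of expressing it" can be illustrated with an ordinary compressor: the compressed size is a crude, computable upper bound on the length of the shortest description. This is an illustrative sketch, not part of the original articles; the helper name `compressed_size` is our own.

```python
import random
import string
import zlib

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a rough,
    computable stand-in for 'length of the shortest description'."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# A highly regular string has a very short description ("repeat 'ab' 500 times"),
# so it compresses far better than a random string of the same length.
regular = "ab" * 500
random.seed(0)
noisy = "".join(random.choice(string.ascii_lowercase) for _ in range(1000))

print(compressed_size(regular) < compressed_size(noisy))  # True
```

Note the connection to the previous blurb: the random string gets the *larger* value, which is exactly the feature of Kolmogorov complexity that sophistication was invented to remedy.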
On the face of it the Universe is a fairly complex place. But could mathematics ultimately lead to a simple description of it? In fact, should simplicity be a defining feature of a "theory of everything"? We ponder the answers.
A traditional view of science holds that every system — including ourselves — is no more than the sum of its parts. To understand it, all you have to do is take it apart and see what's happening to the smallest constituents. But the mathematician and cosmologist George Ellis disagrees. He believes that complexity can arise from simple components and that physical effects can have non-physical causes, opening a door for our free will to make a difference in a physical world.
The human brain faces a difficult trade-off. On the one hand it needs to be complex to ensure high performance; on the other it needs to minimise "wiring cost" — the sum of the lengths of all its connections — because communication over distance takes a lot of energy. It's a problem well known to computer scientists. And it seems that market-driven human invention and natural selection have come up with similar solutions.
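The notion of "wiring cost" is easy to make concrete: place the nodes of a network in space and sum the Euclidean lengths of its connections. The toy layout below is our own illustration (the node positions and the `wiring_cost` helper are hypothetical), showing why a sparse hub-and-spoke design is cheaper to wire than connecting everything to everything.

```python
import math

# Hypothetical toy network: four nodes with 2-D positions.
positions = {"A": (0.0, 0.0), "B": (1.0, 0.0), "C": (0.0, 1.0), "D": (1.0, 1.0)}

def wiring_cost(edges):
    """Sum of the Euclidean lengths of all connections."""
    return sum(math.dist(positions[u], positions[v]) for u, v in edges)

# Hub-and-spoke: everything routes through A.
star = [("A", "B"), ("A", "C"), ("A", "D")]
# Fully connected: every pair wired directly (higher performance, more wire).
clique = star + [("B", "C"), ("B", "D"), ("C", "D")]

print(wiring_cost(star) < wiring_cost(clique))  # True
```

The star layout saves wire but forces signals between B, C and D to travel further via the hub — a miniature version of the performance-versus-cost trade-off the blurb describes.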
One of the amazing things about life is its sheer complexity. How can a bunch of mindless cells combine to form something as complex as the human brain, or as delicate, beautiful and highly organised as the patterns on a butterfly's wing? Maths has some surprising answers you can explore yourself with this interactive activity.