Everyone knows what time is. We can practically feel it ticking away,
marching on in the same direction with horrifying regularity. Time has
enslaved the Western world and become our most precious commodity. Turn it
over to the physicists, however, and it begins to morph, twist, and even crumble away. So what is time?
What would you think if the nice café latte in your cup suddenly separated itself out into one half containing just milk and the other containing just coffee? Probably that you, or the world, have just gone crazy. There is, perhaps, a theoretical chance that after stirring the coffee all the swirling atoms in your cup just happen to find themselves in the right place for this to occur, but this chance is astronomically small.
If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.
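The idea that information is surprise can be made precise: in standard information theory (not spelled out in the article itself), the information content of an event with probability p is −log₂(p) bits, so certain events carry no information and rare ones carry a lot. A minimal sketch:

```python
import math

def surprisal(p):
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

# A sure thing tells you nothing; rarer events are more informative.
print(surprisal(1.0))    # a certainty: 0 bits
print(surprisal(0.5))    # a fair coin flip: 1 bit
print(surprisal(1 / 64)) # a rare event: 6 bits
```

Note how halving the probability adds exactly one bit of surprise, which is why the logarithm is the natural choice here.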
Computers represent information using bits — that's 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimal number of bits you need to encode a piece of information.
To create energy from information you would need to break the second law of thermodynamics — that's impossible in the real world, but could theories that do break it shed light on why nature is the way it is?
In the latest online poll of our Information about information project you told us that you'd like an answer to this question. We asked Seth Lloyd, an expert on information at the Massachusetts Institute of Technology, and here is his answer. We also bring you two related articles from FQXi, our partner on this project. Happy reading!