# Information theory

Mark Braverman has won the Abacus Medal for developing the theory of information complexity. He told us about the role of communication in computation, and why a mathematical view can help you understand how to solve problems while sharing as little information as possible.

Elliott Lieb has been awarded the 2022 Gauss Prize for outstanding contributions to physics, chemistry, and pure mathematics.

We talk to Chiara Marletto about a new way of looking at the physical world that may solve some of the problems physicists are currently struggling with.

A new way of looking at the physical world promises to shed light on some of the problems physics as we know it can't deal with.

This article explores how constructor theory may be able to provide answers to the questions posed in the first part of the article.

We explore some problems physics as we know it has trouble dealing with and a new theory that may provide answers.

Information is supremely powerful, yet it can't be described by traditional physics. Constructor theory provides a potential answer.

A brief introduction to *bits* and why they're not the same as 0s and 1s.

If I tell you that it's Monday today, then you know it's not any of the other six days of the week. Perhaps the information content of my statement should be measured in terms of the number of all the other possibilities it excludes? Back in the 1920s this consideration led to a very simple formula to measure information.
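The "count the possibilities" idea leads to Ralph Hartley's 1928 formula: the information in singling out one option from $N$ equally likely alternatives is $\log_2 N$ bits. A minimal sketch (the function name is ours, for illustration):

```python
import math

def hartley_information(num_possibilities: int) -> float:
    """Information, in bits, gained by singling out one option
    from num_possibilities equally likely alternatives."""
    return math.log2(num_possibilities)

# Learning the day of the week rules out six of seven possibilities:
print(hartley_information(7))  # ≈ 2.807 bits
```

Doubling the number of possibilities adds exactly one bit, which is why the logarithm (rather than the raw count) is the natural measure.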

Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness essentially meaningless? Should a measure of information assign a low value to it? The concept of sophistication addresses this question.
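Kolmogorov complexity (the length of the shortest program producing a string) is uncomputable, but the length of a compressed version of the string is a crude, computable stand-in. A sketch of that proxy idea, using Python's standard zlib (an illustration, not how Kolmogorov complexity is formally defined):

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a rough, computable
    stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

structured = b"ab" * 500  # highly regular: a short program describes it
random.seed(0)
noise = bytes(random.randrange(256) for _ in range(1000))  # essentially random

# The regular string compresses far better than the random one,
# mirroring its much lower Kolmogorov complexity.
print(compressed_size(structured), "<", compressed_size(noise))
```

The random string scores highest on this measure even though it carries no meaning, which is exactly the tension that the notion of sophistication is meant to resolve.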

If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.
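Shannon made "surprise" precise: an event of probability $p$ carries $-\log_2 p$ bits of surprisal, and a source's entropy is its average surprisal. A minimal sketch (function names are ours):

```python
import math

def surprisal(p: float) -> float:
    """Bits of surprise in observing an event of probability p."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Shannon entropy: the average surprisal of a distribution."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A certain event carries no surprise; a fair coin flip carries one bit.
print(surprisal(1.0))       # 0.0
print(entropy([0.5, 0.5]))  # 1.0
```

Note how this matches intuition: telling you something you already know (probability 1) conveys zero bits.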