

Quoted from https://en.wikipedia.org/wiki/Entropy_(information_theory): "in the view of (Edwin Thompson) Jaynes (1957), thermodynamic entropy, as explained by statistical mechanics, should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics, with the constant of proportionality being just the Boltzmann constant."

I find this view quite appealing. Since the "bit" of information theory is dimensionless (it is just a number), this interpretation is at least plausible, and with this model we may one day be able to unify physics and computation[0][1].
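The proportionality in the quote can be made concrete: with entropy measured in bits, the constant relating Shannon entropy H to thermodynamic entropy S is k_B·ln 2, i.e. S = k_B·ln(2)·H. A minimal sketch (the function names here are my own, not from any particular library):

```python
import math

# Boltzmann constant in J/K (exact value in the 2019 SI redefinition)
K_B = 1.380649e-23

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def thermodynamic_entropy(probs):
    """Jaynes' reading: S = k_B * ln(2) * H, converting bits to J/K."""
    return K_B * math.log(2) * shannon_entropy_bits(probs)

# One unresolved fair-coin-like microscopic degree of freedom carries
# exactly 1 bit of missing information, i.e. k_B * ln(2) joules per kelvin.
fair = [0.5, 0.5]
print(shannon_entropy_bits(fair))   # 1.0 bit
print(thermodynamic_entropy(fair))  # ~9.57e-24 J/K, which is k_B * ln(2)
```

The k_B·ln(2) per bit figure is the same quantity that appears in Landauer's bound on the minimum heat cost of erasing one bit.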

We can view the current physical laws as constraints applied to a logical world of "bits". Each law adds further constraints on how the world (particles, waves, or any other physical representation) must behave. Our world may be just one of many possible worlds describable by mathematics.

[0]: https://en.wikipedia.org/wiki/Limits_of_computation

[1]: https://physics.stackexchange.com/questions/403016/is-there-a-physical-…