Cited from https://en.wikipedia.org/wiki/Entropy_(information_theory) : "in the view of (Edwin Thompson) Jaynes (1957), thermodynamic entropy, as explained by statistical mechanics, should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics, with the constant of proportionality being just the Boltzmann constant."
I find this quite appealing. Considering that the "bit" in information theory is unitless, just a number, this hypothesis may well be true. With this model, we may be able to unify physics and computation in the future [0][1].
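To make the "constant of proportionality" in the quote concrete, here is a minimal sketch (the two-state distribution is a purely hypothetical example, not from the article) that converts a Shannon entropy measured in bits into a thermodynamic entropy in J/K via S = k_B ln(2) x H:

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def shannon_entropy_bits(probs):
    """Shannon entropy in bits (dimensionless) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical system whose microstate is one of two equally likely possibilities.
probs = [0.5, 0.5]
h_bits = shannon_entropy_bits(probs)     # = 1 bit of missing information
s_thermo = K_B * math.log(2) * h_bits    # thermodynamic entropy in J/K

print(f"Shannon entropy: {h_bits} bit(s)")
print(f"Thermodynamic entropy: {s_thermo:.3e} J/K")  # about 9.57e-24 J/K

One bit of missing microscopic information corresponds to roughly 9.57e-24 J/K of thermodynamic entropy, which is why the proportionality constant is "just" the Boltzmann constant.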
We can view current physical laws as constraints applied to the logical world of "bits". Each law adds a further set of constraints on how the world (particles, waves, or any other physical representation) should behave. Our world may be just one of many possible worlds described by mathematics.
[0]: https://en.wikipedia.org/wiki/Limits_of_computation
[1]: https://physics.stackexchange.com/questions/403016/is-there-a-physical-…