Quoting https://en.wikipedia.org/wiki/Entropy_(information_theory): "in the view of (Edwin Thompson) Jaynes (1957), thermodynamic entropy, as explained by statistical mechanics, should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics, with the constant of proportionality being just the Boltzmann constant."

I find this quite appealing. Considering that a "bit" in information theory is unitless, just a number, this hypothesis may be true. And with this model, we may be able to unify physics and computation in the future [0][1].
We can view current physical laws as constraints applied to the logical world of bits. Each law adds a further set of constraints on how the world (particles, waves, or any other physical representation) should behave. Our world may be just one of many possible worlds described by mathematics.
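To make the proportionality concrete, here is a minimal sketch in Python (the four-microstate distribution is a toy assumption for illustration, not something from the quoted article): it computes the Shannon entropy H of a microstate distribution in bits, then converts it to thermodynamic entropy via S = k_B * ln(2) * H.

import math

# Boltzmann constant in J/K (CODATA 2018 exact value)
K_B = 1.380649e-23

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy assumption: a macrostate compatible with 4 equally likely microstates.
microstate_probs = [0.25] * 4

h_bits = shannon_entropy_bits(microstate_probs)   # 2 bits
s_thermo = K_B * math.log(2) * h_bits             # J/K

print(f"Shannon information needed: {h_bits} bits")
print(f"Thermodynamic entropy:      {s_thermo:.3e} J/K")

For W equally likely microstates this reduces to Boltzmann's S = k_B * ln(W), since log2(W) bits times k_B * ln(2) gives exactly k_B * ln(W), which is the proportionality the quote describes.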

[0]: https://en.wikipedia.org/wiki/Limits_of_computation
[1]: https://physics.stackexchange.com/questions/403016/is-there-a-physical-…
