I'd like to hear more about the relationship between entropy and information. At first blush they seem like opposites: more entropy, more chaos and disorder; more information, more order and regularity. And yet I keep reading about connections between certain interpretations of information and entropy, for instance that storing a lot of complex information will raise the entropy of the medium storing it. But does it work the other way? If a gridboard of black and white squares (digital, 1 or 0) has high entropy (pretty mixed up in appearance, no one section looking much different from another), is there necessarily information behind the way those squares are arranged? Maybe I'm asking: does information have to be "informative" to be information? If you break the word into syllables, you get IN -- FORM -- ATION, implying that something is being formed from within, and implying that information is relative: if the tree falls in the forest and nothing else is there to be informed about it . . .
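To make the gridboard question concrete, here is a minimal Python sketch (the function name and board sizes are my own, purely for illustration). It shows that the per-square Shannon entropy of a thoroughly mixed-up board sits near its maximum of 1 bit, while an all-white board carries 0 bits; notably, the formula alone cannot say whether anything meaningful is encoded in the mixed-up board:

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(bits):
    """Shannon entropy, in bits per symbol, of a sequence of 0s and 1s."""
    counts = Counter(bits)
    n = len(bits)
    return -sum((c / n) * log2(c / n) for c in counts.values())

random.seed(0)
mixed = [random.randint(0, 1) for _ in range(10_000)]  # "pretty mixed up" board
ordered = [0] * 10_000                                 # all-white board

print(shannon_entropy(mixed))    # close to 1 bit per square
print(shannon_entropy(ordered))  # exactly 0 bits: no surprise at all
```

A board of fair coin flips and a board holding an encrypted message can score identically by this measure: the statistic quantifies how mixed up the squares are, not whether anyone is informed by them, which is exactly the tension the question is circling.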

Oh, and then there is all that stuff about black holes, entropy, and information: how the amount of info that went into a black hole is proportional to the surface area of the horizon, not the volume . . . cool stuff, but what is it all pointing to? Just what do they mean by "info" when a book falls into a black hole? Is it just the quantum state of every particle in the book, or does it include the letters, words, and numbers on the pages, the size of the pages, the thickness of the cover, etc.? Thx, JIM G.
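P.S. The area law I have in mind is, I believe, the Bekenstein-Hawking formula, which gives a black hole's entropy in terms of its horizon area A rather than its volume:

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B\, c^{3} A}{4 \hbar G}
\;=\; \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad
\ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}}
```

where k_B is Boltzmann's constant and \ell_P is the Planck length, so the entropy counts roughly one unit per Planck-sized patch of horizon area.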
