If I tell you that it's Monday today, then you know it's not any of the other six days of the week. Perhaps the information content of my statement should be measured in terms of the number of possibilities it rules out? Back in the 1920s this idea led to a very simple formula for measuring information.
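The formula in question (Hartley's, from the late 1920s) simply takes the logarithm of the number of possibilities: learning which of N equally likely options holds gives you log₂ N bits of information. Here is a minimal sketch of that calculation; the function name is mine:

```python
import math

def hartley_information(num_possibilities: int) -> float:
    """Bits of information gained by learning which of
    `num_possibilities` equally likely alternatives is the case."""
    return math.log2(num_possibilities)

# Learning that it's Monday rules out the other six days of the week:
print(hartley_information(7))   # about 2.81 bits
# A fair coin flip, with only two possibilities, gives exactly 1 bit:
print(hartley_information(2))   # 1.0
```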
Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness meaningless? Shouldn't a measure of information assign a low value to it? The concept of sophistication addresses this question.
If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise?
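The usual way of making "surprise" precise, and presumably where this idea is heading, is Shannon's surprisal: an event with probability p carries -log₂(p) bits of information, so certain events carry none and rare events carry a lot. A quick sketch, with probabilities chosen purely for illustration:

```python
import math

def surprisal(probability: float) -> float:
    """Bits of information in observing an event with the given probability."""
    return -math.log2(probability)

print(surprisal(1.0))      # 0.0 bits: something you already knew
print(surprisal(0.5))      # 1.0 bit: a fair coin landing heads
print(surprisal(1 / 365))  # about 8.5 bits: correctly naming a stranger's birthday
```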
There are many ways of saying the same thing — you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
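Kolmogorov complexity itself isn't computable, but a general-purpose compressor gives a feel for the idea: regular strings have short descriptions, while random-looking ones can't be expressed much more briefly than by writing them out. A rough sketch, using zlib purely as a stand-in for "shortest description":

```python
import random
import zlib

def description_length(s: str) -> int:
    """Length in bytes of a zlib-compressed copy of s: a crude,
    computable stand-in for the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(s.encode("utf-8"), 9))

patterned = "01" * 500  # 1000 characters, fully described by "repeat '01' 500 times"
random.seed(0)
scrambled = "".join(random.choice("01") for _ in range(1000))  # 1000 essentially random bits

print(description_length(patterned))  # small
print(description_length(scrambled))  # considerably larger
```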