      Can you measure information?

      24 March 2015

      It's a tricky question: whether or not you find something informative depends on your personal point of view, and there are many different ways of expressing the same thing. Objectivity seems impossible. Yet, as communication technology became ever more important over the last century or so, objective measures of information became necessary, and people's attempts to find them have led to some very interesting ideas. The articles below present some highlights of information theory. You can read them in sequence, but each also works on its own.

      These articles are part of our Information about information project. We also bring you a related article from FQXi, our partner on this project. Happy reading!

      Information: Baby steps — If I tell you that it's Monday today, then you know it's not any of the other six days of the week. Perhaps the information content of my statement should be measured in terms of the number of all the other possibilities it excludes? Back in the 1920s this consideration led to a very simple formula to measure information.
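
      As a rough sketch (in Python, with a function name of our own choosing; the underlying idea is the one the article describes), the measure is simply the logarithm of the number of equally likely possibilities:

        import math

        # Hartley's measure: learning one outcome out of N equally
        # likely possibilities conveys log2(N) bits of information.
        def hartley_information(num_possibilities: int) -> float:
            return math.log2(num_possibilities)

        # "It's Monday" rules out the other six days of the week:
        print(hartley_information(7))   # about 2.81 bits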

      Information is surprise — If I tell you something you already know, then that's not very informative. So perhaps information should be measured in terms of unexpectedness, or surprise? In the 1940s Claude Shannon put this idea to use in one of the greatest scientific works of the century.
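
      As a sketch (the formula is Shannon's; the code and names are ours), measure the surprise of an outcome with probability p as -log2(p), and take entropy to be the average surprise over all outcomes:

        import math

        # Shannon entropy: the average surprise of a source, where an
        # outcome of probability p carries surprise -log2(p) bits.
        def entropy(probabilities) -> float:
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        print(entropy([0.5, 0.5]))      # fair coin: 1.0 bit per toss
        print(entropy([0.99, 0.01]))    # heavily biased coin: about 0.08 bits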

      Information is bits — Computers represent information using bits, written as 0s and 1s. It turns out that Claude Shannon's entropy, a measure of information invented long before computers became mainstream, measures the minimum number of bits you need to encode a piece of information.
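
      For illustration (a sketch assuming symbols are drawn independently with the message's own letter frequencies), the empirical entropy of a text gives a lower bound on the average number of bits per symbol that any lossless encoding can achieve:

        import math
        from collections import Counter

        # Empirical entropy per symbol: a lower bound on the average
        # number of bits per symbol needed by any lossless code.
        def bits_per_symbol(text: str) -> float:
            n = len(text)
            return -sum((c / n) * math.log2(c / n)
                        for c in Counter(text).values())

        message = "aaaaaaab"             # mostly 'a', rarely 'b'
        print(bits_per_symbol(message))  # about 0.54 bits, far below the
                                         # 8 bits per character of ASCII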

      Information is noisy — When you transmit information long-distance there is always a chance that some of it gets mangled and arrives at the other end corrupted. Luckily, there are clever ways of encoding information which ensure a tiny error rate, even when your communication channel is prone to errors.
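
      The simplest illustration (a minimal sketch, far cruder than the clever encodings the article alludes to) is the triple repetition code: send every bit three times and let the receiver take a majority vote, so any single flipped copy is corrected automatically.

        from collections import Counter

        # Triple repetition code: each bit is sent three times and the
        # receiver decodes by majority vote, correcting any single flip.
        def encode(bits):
            return [b for b in bits for _ in range(3)]

        def decode(received):
            return [Counter(received[i:i + 3]).most_common(1)[0][0]
                    for i in range(0, len(received), 3)]

        sent = encode([1, 0, 1])    # [1, 1, 1, 0, 0, 0, 1, 1, 1]
        sent[1] = 0                 # noise flips one transmitted copy
        print(decode(sent))         # [1, 0, 1]: the message survives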

      Information is complexity — There are many ways of saying the same thing; you can use many words, or few. Perhaps information should be measured in terms of the shortest way of expressing it? In the 1960s this idea led to a measure of information called Kolmogorov complexity.
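
      Kolmogorov complexity itself is uncomputable, but as a rough stand-in a general-purpose compressor shows the idea: a highly patterned string compresses to a short description, while a random-looking one barely compresses at all.

        import random
        import zlib

        # Compressed length as a crude upper bound on description
        # length: patterned data shrinks, random data does not.
        patterned = ("01" * 5000).encode()      # 10,000 highly regular bytes
        random.seed(0)
        noisy = bytes(random.getrandbits(8) for _ in range(10_000))

        print(len(zlib.compress(patterned)))    # a few dozen bytes
        print(len(zlib.compress(noisy)))        # close to 10,000 bytes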

      Information is sophistication — Kolmogorov complexity gives a high value to strings of symbols that are essentially random. But isn't randomness essentially meaningless? Shouldn't a measure of information assign a low value to it? The concept of sophistication addresses this question.

      The following article first appeared on the FQXi website.

      Quantifying Occam — An idea called Occam's razor states that the simplest answer is always the best. But is this really true? Computer scientist Noson Yanofsky is trying to find out, applying Kolmogorov complexity to a branch of mathematics known as category theory.


      Read more about...
      history of mathematics
      error-correcting code
      Information theory
      entropy
      complexity
      information about information

      Anonymous

      25 March 2015


      I carry a plank of wood along to the carpenter and ask her/him to cut me another to the same length. Next time I'm a bit more savvy: I mark off a length of string against the plank, roll it up and put it in my pocket, and take that along to the carpenter instead. The third time I'm even cuter: I measure the length in numbers on a tape measure, and just go along with nothing but tiny puffs of air on my lips or marks on a bit of paper.

      Each step progressively liberates information from its cumbersome physical representation or embodiment for the purposes of transmission, storage and eventual reconstitution. Living things do the same with DNA.

      So surely one kind of measurement we're concerned with must be a ratio demonstrating the degree to which this liberation is accomplished, such as gigabytes per chip or per second in electronics, or quantity per symbol in arithmetic. For example, that ratio is 37 for the number 111, since we use just three symbols to express the quantity one hundred and eleven, which in primitive times would have had to be transmitted as one hundred and eleven separate symbols.
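
      As a quick sketch of that ratio (the function name is made up for illustration): divide the quantity a numeral names by the number of symbols used to write it.

        # "Liberation" ratio: quantity expressed per symbol used.
        def liberation_ratio(n: int) -> float:
            return n / len(str(n))

        print(liberation_ratio(111))        # 37.0: three symbols stand in
                                            # for 111 tally marks
        print(liberation_ratio(1_000_000))  # about 142857: the gain grows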

      This approach also means that information can be instructions as well as descriptions; those two words simply represent different directions: to and from physical embodiment.

      Chris G

