Markov chains are exceptionally useful tools for calculating probabilities, with applications in fields such as economics, biology, gambling, computing (for example, Google's PageRank search algorithm) and marketing. They can be used whenever the probability of a future event depends only on the current state. The Russian mathematician Andrei Markov was the first to work in this field, in the late 1800s.
As an example of a situation which can be modelled using a Markov chain, suppose we have two states: Beach (state 1) and Home (state 2). In our happy life we spend all the hours of the day in one of these two states. If we are on the beach then the probability we are still on the beach in one hour is 0.6 and the probability we are home in one hour is 0.4. If we are at home the probability we remain at home in one hour is 0.8 and the probability we go to the beach is 0.2.
First we need to represent our information in matrix form. A general 2×2 matrix is written as

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

For our Markov chain we define the following transition matrix, where row 1 gives the probabilities of moving from the beach and row 2 the probabilities of moving from home:

$$P = \begin{pmatrix} 0.6 & 0.4 \\ 0.2 & 0.8 \end{pmatrix}$$
Where will we be in the future?
For someone starting on the beach, the probability they'll be on the beach in two hours is the probability they stay on the beach that whole time, $0.6 \times 0.6 = 0.36$, plus the probability that they go home and then come back to the beach, $0.4 \times 0.2 = 0.08$, giving $0.44$ in total. Here we see the benefit of Markov chains: they allow us to utilise computer power to calculate where someone will be in the future, simply by taking powers of the matrix. To find all the probabilities after two hours I can square my matrix:
Using the rules of matrix multiplication this then gives:

$$P^2 = \begin{pmatrix} 0.44 & 0.56 \\ 0.28 & 0.72 \end{pmatrix}$$
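The squaring step is easy to check by machine. A minimal sketch using NumPy (the variable names are my own):

```python
import numpy as np

# Transition matrix from the article: row 1 = beach, row 2 = home.
P = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# Squaring the matrix gives the two-hour transition probabilities.
P2 = P @ P
print(P2)
# Entry [0, 0] is 0.6*0.6 + 0.4*0.2 = 0.44, the chance someone who
# started on the beach is still on the beach two hours later.
```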
I can then carry on with matrix multiplication to work out where someone will likely be after any given number of hours $n$ in the future: the matrix of $n$-hour probabilities is simply $P^n$.
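Repeated multiplication can be sketched with `numpy.linalg.matrix_power`. For this particular chain the rows of $P^n$ converge to the steady state $(1/3, 2/3)$ (found by solving $\pi = \pi P$), so in the long run it no longer matters where you started:

```python
import numpy as np

P = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# The n-step transition probabilities are the n-th matrix power.
for n in (2, 5, 24):
    Pn = np.linalg.matrix_power(P, n)
    print(f"after {n} hours:\n{Pn}")

# Both rows approach the steady state (1/3, 2/3): whatever the
# starting state, the long-run chance of being on the beach is 1/3.
```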
A more demanding beach life
We can see that things get much more complicated, even by adding just one extra state. Now we have 3 possible states: Beach (state 1), Home (state 2) and SCUBA (state 3). This time we need a 3×3 matrix:
After 24 hours we have the following probability matrix:
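The article's 3×3 matrix is not reproduced here, so the numbers below are illustrative assumptions of my own; the only requirement is that each row sums to 1. The computation of the 24-hour matrix is otherwise identical to the two-state case:

```python
import numpy as np

# Illustrative 3-state transition matrix (Beach, Home, SCUBA).
# These values are assumed for demonstration, not the article's;
# each row must sum to 1 to be a valid probability distribution.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.7, 0.1],
              [0.4, 0.4, 0.2]])

# The 24-hour probability matrix is the 24th matrix power.
P24 = np.linalg.matrix_power(P, 24)
print(P24.round(3))
```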
Hopefully this quick example demonstrates the power of Markov chains for working with probabilities. There is a lot more to explore!
About the article
Andrew Chambers has an MSc in mathematics and is a teacher at the British International School Phuket, Thailand. He has written news and features articles for the Guardian, Observer and Bangkok Post, and was previously shortlisted for the Guardian's International Development Journalist of the Year Awards. Combining his love of both writing and mathematics, his website ibmathsresources.com contains hundreds of ideas for mathematical investigations (and some code breaking challenges) for gifted and talented students.
This article first appeared on ibmathsresources.com.