A controversy is afoot. Some people claim that the Hebrew text of the Old Testament contains encoded predictions which are valid today. For example, the 1995 murder of the Israeli Prime Minister Mr Rabin is apparently foretold. Other people deny these claims. They agree that such messages may be found but say that similar coincidences will necessarily occur whenever a large book is studied in such a way. Feelings are running rather high.

"Coincidences" are familiar to all of us. They can involve startling conjunctions of events, such as chance meetings with old friends, or coincidences of personal habits or of anniversaries. Should we be surprised by coincidences?

Think of it this way. There is a great number of events which could possibly occur (*N*, say) and which might deserve the title "coincidence". Each of these may have a very small probability of occurring in any given week (*p*, say). Now *N* is large, and *p* is small, but the average number of occurrences (i.e., the product *Np*) may be of reasonable size. It follows "by the laws of chance" that coincidences will definitely occur sooner or later. Indeed, the world would be more surprising if coincidences never occurred.
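The flavour of the argument can be captured with a small calculation. The numbers below are invented purely for illustration: one million candidate "coincidences", each with a one-in-a-million chance in any given week, so that *Np* = 1. The chance that *no* coincidence at all occurs is then close to e^(-1), about 37% — so in most weeks something apparently remarkable happens.

```python
import math

# Purely illustrative numbers, not measurements of anything real.
N = 1_000_000   # candidate events that would count as a "coincidence"
p = 1e-6        # chance of each one occurring in a given week

expected = N * p            # the average number of coincidences, Np
p_none = (1 - p) ** N       # chance that no coincidence occurs at all

# When p is small, (1 - p)^N is very close to exp(-Np).
print(expected)                      # 1.0
print(p_none, math.exp(-expected))   # both about 0.368
```

The point is that a small chance, multiplied over a vast number of opportunities, yields a respectable expected count.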

## Laws of chance

What are the "laws of chance"? Mathematicians give credit to Andrei Kolmogorov for first publishing the laws or *axioms* of probability theory in 1933. Helped by his axioms, probability theory was transformed from a questionable activity of gamblers into a reputable topic of pure and applied mathematics.

## Birthday problem

The laws of probability may be used to answer many questions. For example, how many people are required in order to have a better than evens chance that two or more of them share a birthday? This is an example of what is called an "ill posed question". That is to say, we cannot give a mathematical answer until we know all the assumptions.

If we assume that each individual's birthday is equally likely to be any of the 365 days of the year, and that one person's birthday has no effect on anybody else's, then the answer is that just 23 people suffice.
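Under those two assumptions, the chance that *n* people all have distinct birthdays is (365/365) × (364/365) × ... × ((365 − *n* + 1)/365), and we seek the smallest *n* for which this falls below a half. A minimal sketch of that calculation:

```python
def p_shared_birthday(n):
    """Chance that at least two of n people share a birthday,
    assuming 365 equally likely, independent birthdays."""
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (365 - k) / 365
    return 1 - p_all_distinct

# Find the smallest n giving a better-than-evens chance.
n = 1
while p_shared_birthday(n) <= 0.5:
    n += 1
print(n)                    # 23
print(p_shared_birthday(n)) # about 0.507
```

With 22 people the chance is just under a half; the 23rd person tips it over.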

This is rather a small number, and tends to come as a surprise. Indeed, one talk-show host denied the answer on the grounds that he had asked an audience of 100 people whether anyone had the same birthday as him, and the answer was "no". This is a good example of getting the question wrong. The answer to the new question, "how many people are required in order that there is a better than evens chance that someone has *your* birthday", is 253.
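The talk-show host's question is easier to analyse: under the same assumptions, the chance that none of *n* people has your birthday is (364/365)^*n*, and we want the smallest *n* for which this drops below a half. A quick check:

```python
# Smallest n for which someone in the room has *your* birthday
# with better-than-evens chance, assuming 365 equally likely,
# independent birthdays.
smallest = next(n for n in range(1, 1000)
                if 1 - (364 / 365) ** n > 0.5)
print(smallest)  # 253
```

Note that 253 is far more than 23: with 23 people there are 253 *pairs* of people, but only 23 chances to match one fixed birthday.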

## Runs of heads

Surprises emerge from calculations of probabilities. One US professor challenges his students to write down a sequence of 36 heads and tails, using any rule whatsoever. For example, they might flip a coin 36 times and record the outcomes (say, HTHHHHTTHHHTTHHT...), or they might just write some "fixed" sequence (perhaps 18 heads followed by 18 tails).

The professor claims to be able to tell which sequences are truly random, and which have been artificially contrived. The students are challenged to defeat him.

What does he do when faced with a given sequence? There are many rules he might use. For example, he might count the number of heads, and decide that the sequence is random whenever this number lies between 15 and 21. But the second, "fixed" sequence given above (18 heads followed by 18 tails) passes this test, so counting heads alone is not enough.

He will seek a rule which is a reasonable diagnostic for the problem, but which is unlikely to be guessed in advance by his students. His actual strategy is to decide that a sequence is random if the length of the longest run of heads is 4, 5, or 6. He certainly makes mistakes, but he guesses correctly most of the time.

Toss a coin *n* times, and call the length of the longest run of heads *L*_{n}. It turns out that *L*_{n} is generally close to log_{2}(*n*) when *n* is large.

log_{2}(*n*) is the number *x* such that, when 2 is raised to the *x*th power, the answer is *n*.

For an explanation of this relationship, see the coin-tossing puzzle ("Coin-tossing") elsewhere in this issue.
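The claim is easy to test empirically. The sketch below tosses a simulated fair coin *n* times, records the longest run of heads, and compares it with log_{2}(*n*); the seed is fixed only so that the experiment is repeatable.

```python
import math
import random

def longest_head_run(n, rng):
    """Length of the longest run of heads in n fair coin tosses."""
    longest = current = 0
    for _ in range(n):
        if rng.random() < 0.5:   # heads
            current += 1
            longest = max(longest, current)
        else:                    # tails: the run is broken
            current = 0
    return longest

rng = random.Random(42)          # fixed seed, so the sketch is repeatable
n = 1_000_000
run = longest_head_run(n, rng)
print(run, math.log2(n))         # the two numbers are typically close
```

For a million tosses, log_{2}(*n*) is just under 20, and the longest run of heads in repeated experiments usually lands within a few of that.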

### Rare events

In our coin-flipping example, substantially longer runs than log_{2}(*n*) are exceedingly unlikely, while substantially shorter runs are commonplace.

An unusually long run of heads is an example of a "rare event". Rare events can be important for ordinary people. For example, a very high tide is a rare event, but it can cause enormous flood damage in countries such as Holland. Inhabitants of such low-lying countries invest a lot of money in protection against these events, building dams and barriers for flood defence, even though the chance of any given extreme tide is small. How high should they build their barriers?

Insurance companies need to assess the chances and costs of major catastrophes, even though such events are rare. These companies employ actuaries to do this work, and these people use statistics and probability in their day to day work.

*High tide in Cornwall during a freak storm, January 1998. (Source: APEX picture agency.)*

Major falls in the Stock Exchange are rare, and can have gigantic effects on the worldwide market. Many individuals and companies are heavily involved in gambling on such markets, and they try to understand the forces acting upon them.

### Probability paradoxes

You can hone your probability skills by thinking about some of the famous paradoxes of probability theory. One of these is a "proof" by Lewis Carroll of the following statement:

if an urn contains two balls which are either red or black, then it must contain exactly one red ball and exactly one black ball.

Probabilists are fond of problems involving balls in urns. This is partly because many early probabilists, including Poisson, Laplace, and Pascal, were French; the expression *aller aux urnes* is still in current use in France, meaning *to vote*.

One especially famous paradox is the *prisoners' paradox*, a variation of which is used for this issue's puzzle (see "Three doors").

One version of the prisoners' paradox is as follows.

Three prisoners, named Martin, Naomi, and Olga, are held incommunicado in a distant country. They know that one of them will be shot, and the others freed, but they do not know which. Martin surmises that the chance he will be shot is 1/3. He asks the warder who will be shot, but the warder refuses to answer. Instead the warder tells him that Naomi will be freed. What now is the chance that Martin will be shot?

Could it be 1/2, since either Martin or Olga will be shot? Could it be 1/3, since one (at least) of the others will necessarily be freed?

What do you think?
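Before settling on an answer, you can investigate by simulating the story many times over. One modelling assumption is needed which the story leaves unstated: what the warder does when Martin is the one to be shot and *both* Naomi and Olga will be freed. The sketch below assumes he then names one of the two at random; with a different warder, the answer can change.

```python
import random

def estimate_martin_shot(trials=200_000, seed=0):
    """Estimate P(Martin is shot | warder says Naomi will be freed),
    assuming the warder names a freed prisoner uniformly at random
    when he has a choice, and never names the condemned prisoner."""
    rng = random.Random(seed)
    warder_says_naomi = 0
    martin_shot_too = 0
    for _ in range(trials):
        shot = rng.choice(["Martin", "Naomi", "Olga"])
        if shot == "Martin":
            named = rng.choice(["Naomi", "Olga"])  # both freed: pick at random
        elif shot == "Naomi":
            named = "Olga"
        else:
            named = "Naomi"
        if named == "Naomi":
            warder_says_naomi += 1
            if shot == "Martin":
                martin_shot_too += 1
    return martin_shot_too / warder_says_naomi

print(estimate_martin_shot())
```

Run it, and compare the estimate with the two candidate answers of 1/2 and 1/3.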

### Probability in employment

Almost all industries and employers need access to individuals who understand probability and statistics. Here is a very selective list of examples:

- insurance companies employ actuaries to assess risks and costs,
- financial companies employ mathematicians to analyse the finance market,
- spaceship manufacturers employ people to do "risk analysis",
- computer companies need to understand the behaviour of large imperfect systems, such as operating systems and software,
- telephone companies need to understand communication networks whose inputs are largely out of their control (See "Call routing in telephone networks" in Issue No. 2),
- law firms need access to experts on randomness, in matters such as DNA fingerprinting.

### Further reading

There are many introductions to probability theory, including:

- G.R. Grimmett and D.J.A. Welsh, *Probability Theory, an Introduction*, Oxford University Press, 1986,
- C.M. Grinstead and J.L. Snell, *Introduction to Probability*, American Mathematical Society (available in principle via the web),
- S.M. Ross, *A First Course in Probability Theory*, Collier Macmillan, 1984,
- D.R. Stirzaker, *Elementary Probability*, Cambridge University Press, 1994.

Try these web links:

Biographies of the mathematicians mentioned in the article are available from the "MacTutor history of mathematics archive":

- Andrei Kolmogorov
- Charles Dodgson (alias Lewis Carroll)
- Siméon Poisson
- Pierre-Simon Laplace
- Blaise Pascal

## About the author

Geoffrey Grimmett is Professor of Mathematical Statistics in the Statistical Laboratory of the University of Cambridge.