Here's a dilemma. Suppose you and a friend have been arrested for a crime and you're being interviewed separately. The police offer each of you the same deal. You can either confess, incriminating your partner, or remain silent. If you confess and your partner doesn't, then you get 2 years in jail (as a reward for talking), while your partner gets 10 years. If you both confess, then you both get 8 years (reduced from 10 years because at least you talked). If you both remain silent, you both get 5 years, as the evidence is only sufficient to convict you of a lesser crime.
What should your strategy be? As a selfish and rational individual, you should talk. If your partner also talks, then your confession gets you 8 years instead of 10. If your partner doesn't talk, then it gets you 2 years instead of 5. Talking is your dominant strategy: it leaves you better off than silence, no matter what your partner does.
The trouble is that your partner, just as selfish and rational as you, will come to the same conclusion. You'll both decide to talk and get 8 years each. Paradoxically, your dominant strategy leaves both of you worse off than mutual silence would have.
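The reasoning above can be checked mechanically. The sketch below (names and structure are my own, only the sentences come from the article) tabulates the four outcomes and confirms that talking is the better choice against either move by your partner, even though mutual silence beats mutual confession:

```python
# Years in jail for (you, partner) under each pair of choices,
# using the numbers from the article; fewer years is better.
SENTENCES = {
    ("talk", "talk"): (8, 8),
    ("talk", "silent"): (2, 10),
    ("silent", "talk"): (10, 2),
    ("silent", "silent"): (5, 5),
}

def best_response(partner_choice):
    """Your choice with the shortest sentence, given your partner's move."""
    return min(("talk", "silent"),
               key=lambda me: SENTENCES[(me, partner_choice)][0])

# Talking wins in both cases, so it is a dominant strategy...
print(best_response("talk"))    # -> talk (8 years beats 10)
print(best_response("silent"))  # -> talk (2 years beats 5)

# ...yet mutual talking, (8, 8), is worse for both than mutual silence, (5, 5).
```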
The prisoner's dilemma is one of game theory's most famous games because it illustrates why people might refuse to cooperate when they would be better off doing so. One real-life situation that is similar to the dilemma is an arms race between two countries, in which both countries increase their military might when it would be better for both to disarm.
The dilemma has been used extensively in mathematical research into altruism. Mathematical research into altruism? Yes, that's right! Using the dilemma as the basis for computer simulations in which simulated individuals can either cooperate or defect has shown how altruism can evolve as a survival strategy, even in large societies.
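A minimal sketch of the kind of simulation the paragraph describes, with hypothetical strategies and a standard cooperate/defect payoff scheme of my own choosing (the article doesn't specify either): individuals play the dilemma repeatedly, and a cooperative strategy such as tit-for-tat does far better against itself than pure defection does.

```python
# Points scored per round for (my move, their move); here higher is better.
# C = cooperate (stay silent), D = defect (talk). These payoffs are an
# assumption, not taken from the article.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

def play(strat_a, strat_b, rounds=100):
    """Play repeated rounds; each strategy sees the other's past moves."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

# Mutual cooperation accumulates far more than mutual defection.
print(play(tit_for_tat, tit_for_tat))      # (300, 300)
print(play(always_defect, always_defect))  # (100, 100)
```

In evolutionary versions of such simulations, strategies that score well leave more "offspring" in the next generation, which is how cooperative behaviour can spread through a population.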