
Maths in a minute: The two envelopes problem


Martin Hairer talking at the Heidelberg Laureate Forum 2017. Photo: Bernhard Kreutzer for HLFF © Pressefoto Kreutzer

Plus is currently in Heidelberg, Germany, visiting the Heidelberg Laureate Forum 2017. The Forum gives young researchers in maths and computer science the chance to meet some of the best minds in their fields — and it gives us the opportunity to meet and interview both the laureates and the next generation. We'll be bringing you some in-depth coverage soon, but in the meantime we leave you with a brain teaser we learnt from Fields medallist Martin Hairer yesterday.

Imagine you have two envelopes. They both contain money, one twice as much as the other. You can pick one and keep the money inside. But just before you open your chosen envelope you are given the chance to change your mind. What should you do?

Write $x$ for the amount that's in your chosen envelope. This means that the amount of money in the other envelope is either $2x$ or $x/2$. The probability that it's $2x$ is $1/2$ and so is the probability that it's $x/2$. So the expected amount you'll get is

  \[ \frac{1}{2}\left(2x+\frac{x}{2}\right) =x+\frac{x}{4} = \frac{5x}{4}. \]    

Since that’s bigger than $x$, you should swap envelopes. But what if you are given another chance to swap envelopes after you have changed your mind once? By the same reasoning as above you should swap back again. And then, by the same argument again, you should swap a third time, and so on, forever. You end up in an infinite loop of swapping and never get any money at all. Is there something wrong with the reasoning? If yes, then what? Post your thoughts as a comment below!
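If you'd like to experiment before committing to an answer, here is a minimal simulation sketch in Python (the smaller amount is arbitrarily fixed at 10, so the envelopes always contain 10 and 20, and the function name is just illustrative). It plays the game many times and compares always sticking with always switching.

    import random

    # Minimal sketch: the smaller amount is fixed at 10 (an arbitrary choice),
    # so the two envelopes always contain 10 and 20.
    def play(switch, trials=100_000, small=10):
        total = 0
        for _ in range(trials):
            envelopes = [small, 2 * small]
            random.shuffle(envelopes)     # deal the two amounts at random
            chosen, other = envelopes
            total += other if switch else chosen
        return total / trials

    print("average when sticking: ", play(switch=False))  # roughly 15
    print("average when switching:", play(switch=True))   # roughly 15

Both strategies average about 15, halfway between the two amounts.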


Comments


Your envelope has either x or 2x with equal probability, for an expected value of 1.5x. If you have x and switch you get 2x, and if you have 2x and switch you get x, each with equal probability --- you still have an expected value of 1.5x. So, no advantage or disadvantage in switching.

I agree - no advantage to switching. I think this is a variation of the Monty Hall problem - where you have three choices to make, but after you choose the host opens one of the remaining curtains to show there's no prize behind it and asks if you want to switch - which you always should, to better your odds from 1/3 to 2/3. In this case there are only two options and no new information - why change!

That's not quite right.

If you assume x is the amount in your current envelope, switching will either double your money to 2x, or halve it to x/2, leading to an expected gain when switching as shown in the article.

You need to look a little deeper and think about what the implications are of saying that for any x the envelope has either x or 2x with equal probability.

Ken

If x and 2x are the amounts in the envelopes, then the expected value at the beginning of the game is (3/2)x.
You choose an envelope.
If you switch envelopes, the expected value of the money in the envelope you switch to is STILL (3/2)x.

The error is that when you say "switching can double your money to 2x or halve it to x/2", you are making a tacit assumption that you have information about your current choice - that you know what x itself is in some sense. This isn't the Monty Hall problem at all - in the MH problem you've gained information before you are offered the option of changing doors. Here, you've gained NO information in between, so there's no advantage to switching.

I think your last sentence contains a simple but key point, Wessen. Here's my take on it:

We're told to call our choice of envelope "x", and presumably nothing else.

And to consider that the other envelope has either twice as much or half as much.

So far so good. But then comes the false step. If we now go back and use the very same original designation "x" in expressing these two possibilities as 2x and x/2, we're caught up in a self-contradiction, since there are now at least two possibilities compared to one at the beginning. And if we add that beginning one to the present two, we get three values, which we inevitably write as x, 2x and x/2. (And of course if we then continue and use the term x to designate our choice between these three, we're definitely sliding down a long road of confusion.)

To put it another way: it's as if in one envelope, instead of cash, we have a piece of paper with "x" written on it, and in the other just a piece with "2x" written on it. If we then go ahead and call our actual choice of envelope "x" as well, the logical clash (or collapse) appears to allow the possibilities to multiply.

This explanation needs no reference to probabilities or expected values.

So few commenters are thinking about the distribution - they are all getting hung up on the algebra and what 'x' might be, etc. But if you think about the distribution you realise the whole problem is broken. It is not possible for x and 2x (or x/2, for that matter) to be equally likely for every x, because a uniform probability distribution over all possible amounts would need a maximum amount, beyond which the probability is 0.

Indeed, if the amounts in the two envelopes are x and 2x then your expected payoff is 1.5x, but here x is not the amount in the envelope you chose, it is the amount in the smaller envelope. So I agree with everything here, except the last sentence about the advantage of switching - noting that the expected payoff is 1.5x doesn't tell us anything about the advantage of switching.

(Just stumbled on this from two years ago.) I disagree with this reply. In your analysis, what is x when you say "your envelope is either x or 2x"? Please identify the variable "x" first, i.e. start with "let x be .....". It seems you are letting the symbol "x" represent two different things here, two different variables.

But if one says "Let x be the value in the envelope chosen", then it IS correct to say the other envelope either has value x/2 or value 2x, and therefore an expected value of 5x/4. So, change!


I think the point is that you should only switch if the expected value of switching is greater than the expected value of not switching. Not switching gives 5x/4 (or 3x/2 if the problem description is taken literally) and so does switching. So your return is the same either way. Maybe we get lulled into thinking there's a trick because the expected value is greater than x. But that's not important - there's more than 2x on the table, split between two envelopes, so we should expect an expected value greater than x.

This is different from the Monty Hall problem, in which the expected value of switching *is* different from the expected value of sticking with the original choice, and only because the situation has changed (a door has been opened). That's why you can't keep applying it.


The trick appears to be about what exactly it is that is denoted by letter "x" in the calculation in question, and what is implicitly assumed about relevant probabilities.

Let e1 and e2 be random variables denoting amounts in the envelopes marked 1 and 2, respectively. Then, obviously:

p(e1 > e2) = p(e2 > e1) = 1/2

However, the probability of finding more in envelope 2 than in envelope 1, given amount x in envelope 1, must be conditional on x:

p(e2 > e1 | e1 = x) = p(e2 > x | e1 = x)

It depends on x, and is either 1 or 0 depending on whether x is the minimum or the maximum of its two possible values. It cannot be 1/2 irrespective of x, as assumed by the calculation in question.
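A small numerical check of this point (a sketch in Python; the fixed pair of amounts 10 and 20 is an arbitrary stand-in for whatever is in the envelopes): once the pair is fixed, the conditional probability that envelope 2 holds more, given what envelope 1 holds, comes out as 1 or 0, never 1/2.

    import random
    from collections import defaultdict

    PAIR = (10, 20)  # an arbitrary fixed pair: the smaller amount and its double

    # Estimate p(e2 > e1 | e1 = x) for each possible x by simulation.
    counts = defaultdict(lambda: [0, 0])  # x -> [times e2 > e1, times e1 = x]
    for _ in range(100_000):
        e1, e2 = random.sample(PAIR, 2)   # randomly deal the pair into envelopes 1 and 2
        counts[e1][1] += 1
        if e2 > e1:
            counts[e1][0] += 1

    for x, (more, total) in sorted(counts.items()):
        print(f"p(e2 > e1 | e1 = {x}) is about {more / total:.2f}")
    # Prints about 1.00 for x = 10 and about 0.00 for x = 20.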


It is not true that the probabilities of the other amount being 2x and x/2 are both equal to 1/2.
In fact they depend on the size of x relative to whatever probability distribution arises from the process by which the envelopes were filled.


Let us say that the envelopes contain amounts of money Y and 2Y. You can only call the amount x after you have already chosen an envelope. So there are two cases, and you have to analyse what you will get if you switch:

you have picked Y and called it x
you have picked 2Y and called it x

and this solves the paradox?


I think the description of "expected value" should be clarified to "expected value of the other envelope". If we are talking about the expected value of both envelopes, there is no paradox, because the expected value of the whole game is the same whether or not you switch envelopes.

But the paradox arises when we try to calculate the "expected value of the other envelope", which has been calculated correctly above.


One weak point seems to me to be the formula for the expected amount.

I'd have thought it would be less appropriate for a one-off choice between hanging on to an envelope and swapping it for just one other containing either half or double, and more appropriate for a long series of such envelopes, each either doubling or halving the previous one, making it more and more likely as you go on that if you were to stop you'd be 25% ahead of that first envelope. If you started by swapping £10 for the chance of getting £5 or £20, and then getting and swapping back £5 for £10 or £2.50, or instead getting and swapping back £20 for £10 or £40, the formula suggests to me that the longer you go on doing this the more likely you are to have 5 × £10/4 = £12.50 when you arbitrarily stop.

Swapping under these conditions becomes more reasonable as you go on, given that you never know what's in the envelope you have now and thus whether you could have retired earlier with bigger winnings. But at least you never come out an overall loser compared with where you started; at worst you're just ahead by a ridiculously small fraction of a penny.


Write a and 2a for the amounts of money in the two envelopes, and write x for the amount of money in your chosen envelope. Then x can be a or 2a, each with probability 1/2. When x is a, the amount in the other envelope must be 2a, and vice versa. If you change your mind and choose the other envelope, the expected amount you will get is:
"the expected value when your chosen envelope contains a" plus "the expected value when your chosen envelope contains 2a", which is 1/2*(2x) (when x equals a) plus 1/2*(x/2) (when x equals 2a).

So the expected amount you will get is 1/2*(2x (when x = a) + x/2 (when x = 2a)) = 1/2*(2a + a) = 3/2*a.

At the same time, the expected value of the first envelope is 1/2*a + 1/2*2a = 3/2*a.

In conclusion, you do not have to change your mind once you have chosen an envelope.

The key point is that the "x" in the chosen envelope is a random variable; the x in "2x" and the x in "x/2" for the amount in the other envelope are two different values.
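A tiny enumeration along these lines (in Python; the smaller amount a is set to 1 purely for illustration) makes the bookkeeping explicit: averaging over the two equally likely cases gives the same expected value, 3/2*a, whether you stick or swap.

    a = 1  # arbitrary value of the smaller amount; the envelopes hold a and 2a

    # The two equally likely cases, as (amount you hold, amount in the other envelope).
    cases = [(a, 2 * a), (2 * a, a)]

    expected_if_you_stick = sum(hold for hold, other in cases) / len(cases)
    expected_if_you_swap = sum(other for hold, other in cases) / len(cases)

    print(expected_if_you_stick, expected_if_you_swap)  # both equal 1.5 * a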


Presumably twice as much money is heavier...so if you've swapped it once, you can just compare the weights to know whether you should swap it back or not :-)


I think the article should state more clearly that 5x/4 is the expected amount of "taking the other envelope". The expected amount of the whole game does not change whether or not you swap the envelope; the paradox arises at the moment you consider switching (taking the other envelope).


Isn't the problem in calling the amount in your first envelope x? Imagine that the quantities in the envelopes are 1 and 2 pounds. In one scenario x = 1 and by swapping you end up with 2 pounds, but in the other scenario x = 2 and by swapping you end up with one pound; x is a variable, not a fixed value.

What is described here would be the situation if you had to decide between a guaranteed one pound OR picking one of two envelopes which contain either 2 pounds or 50 pence (either doubling or halving the one pound). In that situation, if you looked at expected winnings, you would go for the second option.


A hint:

There are two amounts, the smaller, S, and the larger, L.
When asked to "write x for the amount that's in your chosen envelope", what does x equal?


I think the basic confusion is a contradiction between two uses of the word "choose", and hence of "swap".

I can dither in winter between Ibiza and Majorca for my summer holidays, choosing Ibiza one day, swapping it for Majorca the next, and swapping back again to Ibiza the day after, and so on endlessly, or at least for a week or two. But although I've used the word "choose" here, it's equally true to say the opposite, namely that I've actually failed to choose as long as I fail to take any action that changes things irreversibly, such as make a booking. And if I haven't made a choice, how can I swap it?

Similarly, if my choice of envelope doesn't result in any irreversible change, for example by going on to be offered a further choice between a further pair of envelopes, one of which is half and the other double that original choice, then to say that I've chosen and swapped anything at all is to say no more than that I've dithered, just as in my holiday resort example.


Nice one!! Similar to the three doors problem. If I am correct, the point is the implication: if the expected value is bigger than the one you have, switch envelopes.

Before choosing either of the envelopes, depending on how the amount in the envelope is parametrised we can have an expected value of 3/4 x (envelope 1 = x, envelope 2 = 0.5x) or 3/2 x (envelope 1 = x, envelope 2 = 2x). Using the same implication as above, in the first case it is not convenient to play the game (pessimistic parametrisation!), in the second it is convenient (optimistic parametrisation!).

This probably makes sense only if x is the quantity that we put into the game from our own pocket, and in game A we can receive x or 0.5x while in game B we can receive x or 2x.

If, after "normalising", there are two quantities we can win, 1 dollar or 2 dollars, and the probability of winning one or the other is 0.5, then the expected value is always 3/2, both when we start the game by choosing an envelope for the first time and when we are asked to change it during the game, as no new information is acquired from one situation to the other.

So, I would not waste time changing envelopes an infinite number of times; instead I would play as many times as possible :-)


The probability that you have the larger amount is indeed 1/2. The probability that you have the smaller amount is also 1/2. But the joint probability that you have the larger amount *AND* that amount is X is undetermined.

Let me explain with an example. Say I prepare six envelopes. I put a $10 bill inside one, and set it aside. I distribute one $20 bill and four $5 bills between the rest, pick one at random, and give it to you along with the one I set aside. I can now truthfully say "They both contain money, one twice as much as the other."

The probability that you have the smaller envelope is 50%. But this is broken down into a 40% chance that you have $5 and the smaller envelope, a 10% chance that you have $10 and the smaller envelope, and a 0% chance that you have $20 and the smaller envelope. Similarly, there is a 0% chance that you have $5 and the larger envelope, a 40% chance that you have $10 and the larger envelope, and a 10% chance that you have $20 and the larger envelope. If we call the amount in your envelope X, then X is clearly either $5, $10, or $20. BUT THERE IS NO VALUE OF X WHERE THE PROBABILITY THAT X IS THE SMALLER AMOUNT IS THE SAME AS THE PROBABILITY THAT X IS THE LARGER AMOUNT.

If I let you look in your envelope, and you see a $10 bill, there is only a (10%)/(10%+40%) = 20% chance that it is the smaller amount. This is called a conditional probability; it is found by dividing the probability of the outcome you know ($10) and the outcome you are interested in (smaller) happening together, by the total probability of the outcome you know. Similarly, the chance that it is the larger amount is (40%)/(10%+40%) = 80%.

The point is that the solution suggested in the article is wrong. When you assign the unknown X to your amount, you can't use the simple probabilities of picking the smaller or larger envelope. You need to weigh the joint probability of picking the smaller envelope *AND* the pair being (X, 2X) against the joint probability of picking the larger envelope *AND* the pair being (X/2, X). If you don't know the relative likelihoods of (X/2, X) and (X, 2X), you can't make this calculation.

But if you call the total amount in both envelopes 3X, then you don't need to know any relative likelihoods. There is a 50% chance that switching will gain X, and a 50% chance that switching will lose X.
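A quick simulation of this six-envelope set-up (a sketch in Python, following the filling scheme described above; the helper name is just illustrative) reproduces the 20%/80% split in the conditional probabilities once you have seen a $10 bill.

    import random

    def one_round():
        # One $10 envelope is set aside; the other is drawn from one $20 and
        # four $5 envelopes. You then pick one of the two at random.
        other = random.choice([20, 5, 5, 5, 5])
        pair = [10, other]
        random.shuffle(pair)
        return pair[0], pair[1]  # (amount you hold, amount in the other envelope)

    seen_10 = 0
    seen_10_and_smaller = 0
    for _ in range(200_000):
        yours, theirs = one_round()
        if yours == 10:
            seen_10 += 1
            if yours < theirs:
                seen_10_and_smaller += 1

    # Conditional probability that $10 is the smaller amount, given that you see $10.
    print(seen_10_and_smaller / seen_10)  # about 0.2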


There are always the same two envelopes. Whether one of the envelopes is in my hand or on the moon makes no difference to the probabilities involved. A sequence of choices makes no difference to the probabilities involved - there are always the same two envelopes to choose between. The average value of the contents of the envelopes also has no relevance to the probabilities involved. My choice - no matter how many times I make it - always has a 50% chance of x and a 50% chance of 2x.


This is different from the Monty Hall problem, which makes use of Bayes' theorem: there, after the initial choice of door, information is gained by Monty opening a door to reveal that nothing is behind it.
In this envelope situation, no information about the probable contents of the other envelope is gained after making our initial choice - the expected value is the same as it was at the outset, so there is no probabilistic reasoning to justify altering our decision.


Consider averaging the logarithm of the amounts instead: assign probability 1/2 to ln(2x) and probability 1/2 to ln(x/2). Then the expected value for the other envelope becomes
ln(2x)*1/2 + ln(x/2)*1/2 = ln(x),
which is the same as the logarithm of the value in your envelope.


Suppose that instead of picking an envelope I (equivalently) toss a coin, and let us say I choose 'Heads'. Question: isn't the situation above akin to then changing my mind to 'Tails' -- and so on?

What is the fundamental reason that one should keep flipping from 'Heads' to 'Tails' and back and forth, and assume there is more money to be made from (equivalent) future iterations?

This is a Ponzi scheme. :-)

A Ponzi scheme is a fraudulent scheme set up to attract new investors in order to pay off what is owed to earlier investors, with its organisers managing the flow of funds, keeping a record of new investments and monitoring payments. It is an age-old racket that has continued to this day, named after Charles Ponzi, its originator, who became notorious for embezzlement in the 1920s.


What if the envelopes contained, say $1 and $100? As far as I understand it, the math above goes out the window, but the decision of whether or not to swap envelopes doesn't change. It's basically 50/50 odds.


x is the amount in your chosen envelope. Let L = lower amount and H = higher amount. H=2L.
Expected value of x is 1/2*L + 1/2*H = 1/2*L + 1/2*2L = 3/2 * L
Expected value of the other envelope is also 3/2 * L by symmetry, so there's no point in switching.

The (deliberate) flaw in the formula they set out is that they say the amount in the other envelope is either 2x or x/2, but this means that one envelope is worth 4 times the other, which we know isn't the case!


Very interesting. If I swap twice do I get 5x/4 squared? Maybe I will get someone to swap on my behalf and go on an expensive holiday knowing full well it will be paid for when I get back!


It's a function of x, which you don't know.


I call my choice of envelope x.
Then the other must be 2x or x/2.
I like the sound of that, so I swap.
So my first choice is now the other envelope and is 2(2x or x/2) or (1/2)(2x or x/2) = 4x or x or x or x/4 = 4x or x or x/4
I like the sound of that even better, so I swap again
So the other envelope is now 2(4x or x or x/4) or (1/2)(4x or x or x/4) = 8x or 2x or x/2 or 2x or x/2 or x/8 = 8x or 2x or x/2 or x/8.
Yes, you got it, I swap again ... and again till the first term is like (2^100)x or something, then I open the envelope.
That's how I get to be one of those mysterious billionaires.


Instead of thinking about the two outcomes of the situation, it is better to evaluate the change you experience by switching. You can be holding either the 2x or the x with equal odds, so if you don't make the switch, it is fifty-fifty either way. If you hold the x and switch to the 2x, that is a net gain of x. However, if you hold the 2x and switch to the x, that is a net loss of x. Averaging these outcomes, you get 0.5(x - x) = 0, which shows that there is no net gain or loss in making the switch.