The Ask Dr. Math site includes a Frequently Asked Questions page with extended discussions of common topics like Fractions, Order of Operations, and Prime Numbers. Some topics get a lot of push-back from readers who disagree, some out of curiosity, others with virulence. The next few posts will examine our answers to some of these challenges, starting with a perennial favorite, the Monty Hall Problem. (I’m focusing on individual answers rather than the FAQ itself, because the latter is already all in one place. Be sure to read each FAQ as well; it will be more coherent than my compilation, though it will miss the argumentation.)
The problem and how to think about it
So, what is the Monty Hall Problem? We can start with this amicable question from 1996, which began with a clear statement of the problem and its correct answer:
Probability: Let's Make a Deal

I have been having difficulty understanding certain aspects of the "Let's Make a Deal" problem, and I was hoping you could help me. I teach middle and high school mathematics classes. I will state the problem, indicate my understanding of the solution and pose the related question that perplexes me.

Here is the problem: There are three closed doors at the conclusion of a game show. A contestant chooses one of the doors, behind which he or she hopes lies the GRAND PRIZE. The host of the show opens one of the remaining two doors to reveal a wimpy prize. The contestant is then given the choice to stay with the original choice of a door or to switch and choose the remaining door that the host did not open. The problem is: Should the contestant stick with the original choice of doors or switch and choose the other door?

The solution is that the contestant should switch and choose the other door. The reason for this is as follows. Let's say that the contestant chose door #1. This means that the probability that the grand prize is behind door #1 is 1/3 while the probability that the grand prize is behind one of the other two doors is 2/3. Once door #3 is opened, the probabilities do not change.
So, put simply, when host Monty Hall opens one of the other doors and shows that it is not the prize, giving you a chance to change your mind, you should take it, because you are more likely to have chosen wrongly than rightly. If this scenario is repeated many times, you will win twice as often if you choose to switch.
Bob continued with an extended scenario:
This would be analogous to a situation where I had a choice of picking one lottery ticket among 100 tickets where the winning ticket was among the 100. Clearly, the probability that the winning ticket is among the other 99 is 99/100, so that if all but one of those 99 tickets were revealed to be losing tickets, it would be in my interest to switch, as the winning ticket was (and still is) among the other 99.
This strategy of using an extreme example to clarify a situation is a good one, which we have used ourselves.
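Bob's extreme example can also be tested directly. Here is a minimal simulation sketch of my own (not part of the original exchange) that generalizes the game to n doors, where the host opens every unchosen door except one; always switching should win about (n-1)/n of the time.

    import random

    def play(n_doors, switch):
        """Play one n-door game; the host opens all but one of the unchosen doors."""
        car = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        # The door left closed is the car if we missed it; otherwise the host
        # leaves some arbitrary losing door closed.
        remaining = car if car != pick else (pick + 1) % n_doors
        return (remaining if switch else pick) == car

    trials = 100_000
    for n in (3, 100):
        wins = sum(play(n, switch=True) for _ in range(trials))
        print(f"{n} doors, always switching: won {wins/trials:.3f} (expect {(n-1)/n:.3f})")

With 100 doors the benefit of switching is overwhelming, which is exactly why the extreme version makes the 3-door answer easier to believe.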
What if two people play?
So Bob knew the “correct” answer; but he saw an apparent flaw:
Here is my question: Suppose that there is a second contestant from the beginning of the door choices. Contestant A chooses door #1. Contestant B chooses door #2. Door #3 is opened and a silly prize is behind it. According to the above logic:

1) both contestants should switch choices of doors; and

2) the probability that the grand prize is behind door #2 is 2/3 AND the probability that the grand prize is behind door #1 is 2/3.

Clearly, something is amiss. I suspect that the error lies with the language I have used, the referent for the probabilities or a misuse of definitions. I would appreciate your help, please. Thank you.
It can’t really be better for both to switch, can it? There is a very subtle error here, which Doctor Aaron addressed:
Interesting problem. The problem is that you can't use the logic from the one-player game to explain the two-player game.

In the one player game, we have only 2 distinct sets: the door I have chosen (door 1), and the doors I have not chosen. Because the probability that the Grand Prize is behind a given door is 1/3 for each door, the probability that it is behind my door is 1/3, while the probability that it is behind one of the two remaining doors is 2/3.

When I get information that, of the doors I have not chosen, the prize cannot be behind door 3, the probability is still 2/3 for the doors I have not chosen; but the only unknown of these is door 2, so the probability the prize is behind door 2 is 2/3, therefore I should switch. This logic is sound, but it does not hold in the two player game.
In the classic problem, the door is opened with knowledge that it will not be the winner; it doesn’t tell us anything new about which of the two sets (chosen/unchosen) holds the prize, and so does not affect the probability that the original choice was wrong.
In the two player game, we have 3 distinct sets of doors: The door I chose (#1), the door player B chose (#2) and the door that neither of us chose (#3), each of which has a 1/3 probability of containing the prize. Now Monty has less freedom in choosing which door to reveal - he must reveal the door that neither of us has chosen. This gives us information about an entire set of doors instead of about a single door within a set. In the 2-player case, the information we gain is sufficient to update the probabilities, so each of the remaining doors has a 50/50 chance of containing the prize.
Here, because Monty is required to open the one door that neither player chose, he is forced to give us genuine new information: either that third door is the winner, or it is not. Either way, we can reevaluate our choice.
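A quick simulation (my own sketch, not from the original answer) confirms the 50/50 conclusion for the two-player variant: contestant A has door #1, contestant B has door #2, Monty must open door #3, and we keep only the games in which door #3 turns out to hide the silly prize.

    import random

    trials = 300_000
    kept = stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.choice([1, 2, 3])
        if car == 3:              # Monty is forced to reveal the prize; discard this game
            continue
        kept += 1
        stay_wins += (car == 1)   # contestant A stays with door #1
        switch_wins += (car == 2) # contestant A switches to door #2

    print("P(A wins by staying)   =", stay_wins / kept)    # about 1/2
    print("P(A wins by switching) =", switch_wins / kept)  # about 1/2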
For a thorough analysis of the one- and two-player versions, with the same conclusion, see
Card Game Analogous to Monty Hall?
Doctor Steve also answered a different part of Bob’s question, which I have omitted for the sake of space; but the following piece of his answer deserves inclusion:
I tend to like rephrasing the problem along the following lines: You pick a door. Monty offers to let you stay with your pick or take both of the others. It's clearly a good deal to get the other two doors instead of the one you first picked. When Monty shows you door #3, in essence he is merely showing you which of those two doors you should look behind if you're going to find a prize.

We get new information from Monty's choice because we know he is going to pick a door with a poor prize. This means that 2 out of 3 times he will have to avoid showing you the good prize behind the other door. He has a choice between two doors and he uses information about what is behind them to make that choice. His choice passes along some of the information to us if we are clever enough to figure it out.
The entire probability of the two doors we didn’t pick has fallen upon the one he didn’t open; he’s given us a bonus!
Why isn’t the probability 1/2?
Now we’ll turn from a teacher wanting to defend the answer against a doubt, to a student who seriously questions its validity. Note that this exchange took place on the third day of operation for Ask Dr. Math (Nov. 3, 1994)!
Monty Hall Strikes Again

Basically the problem goes like this. There are three cups, one of which is covering a coin. I know the whereabouts of the coin, but you don't. You pick a cup, and I take one of the remaining cups, one which DOESN'T contain a coin. Both you and I know the cup I pick doesn't contain a coin. You then have the option to swap your cup with the third, remaining cup, or keep your first choice. What is the probability of the coin being in the cup if you keep your first choice, or if you decide to swap them?

To summarise: Three cups with a coin under one. You pick one, I pick one that DOESN'T have the coin. You then either stay with your choice, or swap it with the remaining cup. What is the probability of getting the coin, either way?

BTW the answer I get is 50% either way - though it has been suggested that you have a 2/3 chance if you swap cups.... I disagree with this, but I don't think my maths is capable of giving a definitive answer (which is why you're reading this!)

(This problem has placed a cool Australian $50 on the line here in a bet - so I need an answer proving me right! :-) My mate would never let me forget it if he was right...)
Sean doesn’t explicitly state why he thinks the probability is 1/2, but it’s probably because there are two choices that seem, superficially, to be equally likely. Doctor Phil took this question, focusing instead on outcomes that really are equally likely:
This question is a famous brain teaser that is usually described in terms of the game show Let's Make a Deal. Indeed, you do have a 2/3 chance of winning if you swap cups. The answer is anti-intuitive. Basically, the solution is as follows. Let's say it was under cup 1.

Option 1: You originally choose cup 1. Then, you get shown one of the empty cups (let's say cup 2 or 3), at which point if you CHANGE (to the remaining empty cup, either cup 2 or 3), you LOSE, but if you REMAIN with cup 1, you WIN.

Option 2: You originally choose cup 2. Then, you get shown the remaining empty cup (cup 3), at which point if you CHANGE (to cup 1), you WIN, but if you REMAIN with cup 2, you LOSE.

Option 3: You originally choose cup 3. Then, you get shown the remaining empty cup (cup 2), at which point if you CHANGE (to cup 1), you WIN, but if you REMAIN with cup 3, you LOSE.

Each of the three options has an equal chance of occurring, based upon your original random pick. If you change, 1/3 of the time you lose (in the case of option 1, which has a 1/3 probability of occurring), while 2/3 of the time (with options 2 or 3, which have a 2/3 probability of occurring) you win. Therefore, the proper choice is always to change!

Sorry about that 50 dollars. :( I think the place where things skew to the anti-intuitive is the fact that you will -always- get shown an EMPTY cup by the game-show host/tricker, not necessarily a random cup.
So, 1/3 of the time you chose right and shouldn’t change; 2/3 of the time you chose wrong, and will win if you change.
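Doctor Phil's case analysis amounts to an exact enumeration, which can be written out as a short sketch (mine, not from the archive), with the coin under cup 1 as in his answer and each initial pick weighted 1/3:

    from fractions import Fraction

    COIN = 1                           # as in the answer above, the coin is under cup 1
    p_stay_wins = Fraction(0)
    p_switch_wins = Fraction(0)

    for pick in (1, 2, 3):             # each initial pick has probability 1/3
        p = Fraction(1, 3)
        if pick == COIN:               # Option 1: staying wins, switching loses
            p_stay_wins += p
        else:                          # Options 2 and 3: the host must leave the coin covered
            p_switch_wins += p

    print("P(win by staying)   =", p_stay_wins)    # 1/3
    print("P(win by switching) =", p_switch_wins)  # 2/3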
Monty Hall Fallacy?
Now let’s look at a 2001 question claiming that our answer, quoted from our FAQ, is fallacious:
Monty Hall Logic

The Dr. Math FAQ states that

"We will assume that there is a winning door and that the two remaining doors, A and B, both have goats behind them. There are three options:

"The contestant first chooses the door with the car behind it. She is then shown either door A or door B, which reveals a goat. If she changes her choice of doors, she loses. If she stays with her original choice, she wins.

"The contestant first chooses door A. She is then shown door B, which has a goat behind it. If she switches to the remaining door, she wins the car. Otherwise, she loses.

"The contestant first chooses door B. She is then shown door A, which has a goat behind it. If she switches to the remaining door, she wins the car. Otherwise, she loses."

There are, in fact, four options. The first one described is actually two separate options:

1A. the contestant chooses the car, is shown a goat behind A; or
1B. the contestant chooses the car, is shown a goat behind B.

There are three choice points, not just two. There is a logical fallacy in suggesting there is any connection between the two contestant choice points.

There is a mathematical fallacy in here also. At the first choice, I have a 1/3 chance of choosing the right door. Let's presume I choose door A. If the prize is behind door C, Monty has a 100% chance of opening door B. If the prize is behind door B, he has a 100% chance of opening door C. If the prize is behind door A, he has a 50% chance of opening B and a 50% chance of opening C. He cannot open either the door I've chosen or the door with the prize. This is left out of the published "analysis."

So, we are left with these possibilities:

        A       B       C
   1  prize   open    empty
   2  prize   empty   open
   3  empty   prize   open
   4  empty   open    prize

Hmmm... I now have a 50% probability of having the right door. The mathematical fallacy is in considering the first two possibilities as a single possibility when they are, in fact, distinct.

There are three choice points, not just two. The host has a choice also, but that choice is sometimes 100% (if I've chosen an empty door) and sometimes 50% (if I've chosen the correct door).
Do you see the fallacy in George’s thinking? Doctor Anthony answered, just going back to the most basic explanation of the correct answer, which I will omit. Doctor Rick added a response to George’s own reasoning:
You say yourself that I have a 1/3 probability of choosing the correct door first. But then you argue that because Monty can make two choices if I have chosen the correct door, and only one choice if I haven't, therefore there is a 1/2 probability that I have chosen the correct door. You are therefore saying that the probability that I have chosen the correct door changes depending on what Monty is going to do after I have made the choice!

No, there is a 1/3 probability that I have chosen the correct door, regardless of what Monty does next. Suppose Monty flips a coin to decide which door to open next if he has two choices. The possibilities are then (if the correct door is C):

1. I choose door C, Monty opens door A (probability 1/3 * 1/2 = 1/6)
2. I choose door C, Monty opens door B (probability 1/3 * 1/2 = 1/6)
3. I choose door B, Monty opens door A (probability 1/3)
4. I choose door A, Monty opens door B (probability 1/3)

Analyzing the game in terms of four choices doesn't change the remainder of the argument, because the probabilities are the same as in the published solution. Your fallacy is in assuming that, if you can enumerate N possibilities, the probability of each must be equal.
George didn’t take into account the individual probabilities of his cases, which are not all 1/4. Rather, the sum of the probabilities of winning if I switch (cases 3 and 4 in this list) is 1/3 + 1/3 = 2/3 as before.
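Doctor Rick's weighted cases can be tallied exactly. Here is a small sketch of that calculation (mine, with the car behind door C as in his list), showing that the four cases are not equally likely and that switching still wins with probability 2/3:

    from fractions import Fraction

    half, third = Fraction(1, 2), Fraction(1, 3)

    # (my pick, door Monty opens, probability of this case, does switching win?)
    cases = [
        ("C", "A", third * half, False),
        ("C", "B", third * half, False),
        ("B", "A", third,        True),
        ("A", "B", third,        True),
    ]

    p_picked_car = sum(p for pick, _, p, _ in cases if pick == "C")
    p_switch_win = sum(p for _, _, p, wins in cases if wins)

    print("P(my first pick is the car) =", p_picked_car)  # 1/6 + 1/6 = 1/3, not 1/2
    print("P(win by switching)         =", p_switch_win)  # 1/3 + 1/3 = 2/3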
The last resort: Experiment
Only a small percentage of the questions we received made it into our archive: I count at least 300 questions about this problem, of which only about 6 were archived. Not surprisingly, the harshest challenges and the longest discussions are among those that were not made public.
For example, in 2010 a former statistics professor wrote under the heading “Your erroneous answer to the Monty Hall problem”, which engaged several of us in a 6-day marathon discussion that finally ended with her declaration, “I just want to say thank you for putting up with my indignant stance. I dreamt the solution last night and felt like I was back in university experiencing a eureka moment. I will read your comments later, but once again thanks for not giving up on me.” (Success!)
Another writer in 2010 (anonymous) wrote, “The popular solution to the Monty Hall and related problems is flawed.” Here is part of Doctor Roy’s response:
I'll use the same argument I use for everybody who comes up with "flaws" in the Monty Hall problem - try it out for yourself. It's as simple as that. I understand that the standard (and correct) solution is often confusing to people. So, the simplest method is to actually play the game and work out the probability that way. I'll guarantee that if you run sufficiently many trials (say 100 or more), you'll find that after Monty gives you a choice, switching is better for you by a measurable amount than sticking with the original door choice.

...

Get a friend to play Monty. Have your friend use some random and fair way to choose which door contains the car and which doors contain goats (say dice or a random number generator, if you're getting fancy). Make sure that you don't see the result (use a screen or something). Now, guess a door. Have your friend pick one of the goat doors. Here's the important part - record whether or not you would win if you switched doors or stayed with your current door.

If you are correct, it shouldn't matter - you'll get 50/50. If you are not correct, there will be a measurable benefit to switching doors.
Similarly, Doctor Tom said,
My arguments below may not convince you, but if you really need convincing, find a friend who wants to earn some money and play a few hundred rounds with him where he takes the position of Monty Hall and you always stick with your first choice. You get a dollar if you win the car and he gets a dollar if you don't.
While putting this post together, I made a simulation of the problem in Excel, with a column for the location of the car and one for my initial choice (both random), then a third column for Monty’s door (either the one remaining door with a goat, or a random choice between the two goat doors when my first pick was right). Then I made a column saying whether I won by switching. Just writing the formula for this made the answer obvious, because it merely had to compare the first two columns – what Monty does makes no difference. The result, over 500 rows, varies with each refresh, but a typical answer was that I won 30.20% of the time by staying, and 69.80% by switching. Pretty close to our prediction!
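For readers who would rather script it than build a spreadsheet, here is a rough Python equivalent of that simulation (a sketch of my own, not the original workbook). Note that the switching result again depends only on whether the first pick matched the car.

    import random

    rows = 500
    stay_wins = switch_wins = 0
    for _ in range(rows):
        car = random.choice([1, 2, 3])           # column 1: where the car is
        pick = random.choice([1, 2, 3])          # column 2: my initial choice
        # column 3: Monty opens a door that is neither my pick nor the car,
        # choosing at random when my pick happens to be the car
        monty = random.choice([d for d in (1, 2, 3) if d not in (pick, car)])
        switched_to = next(d for d in (1, 2, 3) if d not in (pick, monty))
        stay_wins += (pick == car)
        switch_wins += (switched_to == car)      # always equals (pick != car)

    print(f"won by staying:   {stay_wins / rows:.2%}")
    print(f"won by switching: {switch_wins / rows:.2%}")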