This post is about how everyone gets the Monty Hall problem wrong, but probably not in the way you’re thinking.

If you’re not familiar with the Monty Hall problem, you should probably read the Wikipedia page about it. Here’s a typical description of the problem, taken from Marilyn vos Savant’s famous article on the problem:

Suppose you’re on a game show, and you’re given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what’s behind the doors, opens another door, say No. 3, which has a goat. He then says to you, “Do you want to pick door No. 2?” Is it to your advantage to switch your choice?

The answer you’re supposed to give is that there’s no advantage to switching. The surprise is supposedly that switching raises your odds of winning from 1/3 to 2/3.

The logic is that when you first pick, your odds of picking a losing door are 2/3. When you are offered the switch, the odds that you picked a losing door are still 2/3, so the odds that the door you are offered is the winning door are also 2/3: there are only two doors left, exactly one of them hides the car, and the odds that one is the winner equal the odds that the other is the loser.

Thus, the claim is, switching doors doubles your odds of winning.
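The standard argument can be checked by simulation. Here is a minimal Monte Carlo sketch, assuming the rules that the argument implicitly relies on: the host *always* opens a losing door you didn’t pick and *always* offers the switch. The function name `play` is just for illustration.

```python
import random

def play(switch, trials=100_000):
    """Simulate the standard Monty Hall game, assuming the host always
    opens a losing door and always offers the switch."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)   # door hiding the car
        pick = random.randrange(3)  # contestant's first pick
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")  # ~0.333
print(f"switch: {play(switch=True):.3f}")   # ~0.667
```

Under *those* assumptions, switching really does win about 2/3 of the time. The catch, as we’ll see, is that the problem statement never grants those assumptions.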

However, this is not true. Why? Let’s try all the possibilities:

Suppose you have chosen the winning door. If you switch, you lose. So at least 1/3 of the time, you definitely lose.

Now suppose you have chosen a losing door. This is not a repeated trial, and nothing in the problem says that the host *must* offer to let you switch doors. We are only told that the host let you switch. For all we know, the host wants you to lose and only offers the switch when he knows you picked a winning door.

So while we know your odds of choosing a winning door are 1/3 at the beginning and your odds of choosing a losing door are 2/3 at the beginning, we have no idea how those odds change once we condition on the host’s actions. It’s entirely possible that, given that you were offered the switch, your odds of having chosen the winning door are 100%, because that’s the only time this host makes the offer.

So, suppose that’s the case. What happens? 2/3 of the time (though not this time) you pick a losing door and the host never offers a switch. 1/3 of the time, including this time, you pick the winning door and the host offers the switch, in which case switching guarantees you lose.

So in the version of the problem Marilyn vos Savant gave, switching guarantees a loss if the host only offers the switch when you have chosen the winning door, a possibility the problem description does not rule out.
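The adversarial host can be simulated too. This sketch assumes a host who stays silent unless the contestant has picked the car; we count outcomes only over the games where a switch is actually offered, since that is the situation the contestant faces. The function name `adversarial_host` is my own.

```python
import random

def adversarial_host(trials=100_000):
    """Simulate a host who only offers the switch when the contestant
    picked the car. Returns the win rate for a contestant who always
    switches, measured only over games where a switch was offered."""
    offered = wins_if_switch = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        if pick != car:
            continue  # host stays silent; no offer, no decision to make
        offered += 1
        # Host opens a losing door, then the contestant switches.
        opened = next(d for d in range(3) if d != pick and d != car)
        new_pick = next(d for d in range(3) if d != pick and d != opened)
        wins_if_switch += (new_pick == car)
    return wins_if_switch / offered

print(adversarial_host())  # 0.0: switching always loses against this host
```

Conditioned on the offer, the contestant holds the car every time, so switching loses every time. Same puzzle wording, opposite conclusion.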

As the Wikipedia article explains, the “you should switch” conclusion depends on a variety of assumptions that are not included in the problem. As the problem is typically stated, and as it was described by vos Savant, *you should not switch*.

To phrase the problem such that switching increases your odds of winning to 2/3, it is vital to specify that the host *must* open a losing door and *must* offer you the opportunity to switch. Anyone who states the problem without doing this doesn’t actually understand the Monty Hall problem.