PROBABILITY PARADOXES?

Monty Hall Problem

Let’s consider the following fixed scenario to help us understand the solution:

Just read the following as plain text; don’t stop to reason about the problem (or apply your own logic to it) yet. Give it one quick read, take everything I say on faith, and it will all make sense in the end.

We choose door number 1 (without loss of generality), and the treasure could be behind any of the 3 doors:

Scenario 1:    1’    2(T)    3

Scenario 2:    1’    2    3(T)

Scenario 3:    1’(T)    2    3

Step 1: Looking at all the scenarios, in Scenario 1 the host will open door 3, and in Scenario 2 the host will open door 2. In both cases, switching (from door 1 to door 2, or from door 1 to door 3, respectively) wins us the Treasure. In other words, in 2 out of 3 scenarios we get the Treasure simply by switching. Hence it is more beneficial for us to switch.
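The Step 1 enumeration above can be checked in a few lines of Python (an illustrative sketch of my own, not part of the original argument — we always pick door 1 and walk through the three treasure positions):

```python
# Enumerate the three scenarios: treasure behind door 1, 2, or 3.
# The host opens a door that is neither our pick nor the treasure door.
switch_wins = 0
for treasure in (1, 2, 3):
    pick = 1  # we chose door 1 without loss of generality
    # Host opens some other door that does not hide the treasure.
    opened = next(d for d in (1, 2, 3) if d != pick and d != treasure)
    # Switching means taking the door that is neither picked nor opened.
    switched = next(d for d in (1, 2, 3) if d != pick and d != opened)
    if switched == treasure:
        switch_wins += 1

print(switch_wins, "out of 3 scenarios win by switching")  # 2 out of 3
```

Exactly as the tables show: switching wins in Scenarios 1 and 2 and loses only in Scenario 3.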

Now, wait wait wait!! How did this make sense? Where was the part where I thought the probability was 50-50 and not ⅔?

Scenario 1:    1’    2(T)    3

Scenario 2:    1’    2   3(T)

Scenario 3:    1’(T)    2    3

Step 2: Let me tell you where. See, when the show host opens a door, say door 2, you conclude that you are in a universe where the Treasure is not behind door 2, so you remove that scenario from the picture (in this case, Scenario 1).

So it comes down to this: you are either in Scenario 2 or Scenario 3, and hence the probability of the Treasure being behind door 1 is ½.

But wait, that makes sense! Right?

Scenario 1:    1’    2(T)    3

Scenario 2:    1’    2    3(T)

Scenario 3:    1’(T)    2    3

Step 3: Wrong! You forgot the part where the game show host is obligated not to open the door with the Treasure. When you choose one of the 3 doors, you have a ⅓ chance of getting the Treasure, with a ⅔ chance of the Treasure being behind the other 2 doors combined. Think of it as a choice between sticking with 1 door, or effectively choosing 2 doors: the host will open one of the other 2 doors, and by switching you open the second one as well, so you are really opening both of the doors you did not select in the first place. By randomly selecting one of the 3 doors and then choosing to open both of the others, you get ⅓ + ⅓ = ⅔. You are taking the ⅔ option instead of the ⅓ of your chosen door.

Step 4: But wait! Two doors end up open whether you switch or not, right? So by that same logic I get the ⅓ + ⅓ = ⅔ probability either way, which is the same as the switching scenario, hence 50-50. Right?

Scenario 1:    1’    2(T)    3

Scenario 2:    1’    2    3(T)

Scenario 3:    1’(T)    2    3

Wrong! The probability of choosing the correct door is ⅓, we all know that, but the probability of choosing the incorrect door is ⅔! So in 2 out of 3 scenarios we end up choosing a door that does not have the Treasure (Scenarios 1 and 2), and in only 1 out of 3 scenarios do we choose the Treasure (Scenario 3). So in 2 out of 3 scenarios the Treasure is behind the unopened door, and in only 1 out of 3 scenarios is it behind the door you chose (door 1).

So, to conclude: your probability of choosing the wrong door is ⅔, and hence the probability of the Treasure being behind the other 2 doors is ⅔. Since the game show host opens one of those 2 doors, and you open the remaining one by switching, you are effectively opening both of the doors you did not choose. In 2 out of 3 scenarios one of them will have the Treasure behind it, because your chance of picking the right door on the first try is just 1 out of 3.
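If you would rather trust a computer than an argument, here is a quick Monte Carlo sketch of the full game (the function name, seed, and trial count are my own choices; the host always opens a non-treasure, non-chosen door, exactly as the problem requires):

```python
import random

def play(switch, rng):
    """Play one round of the game; return True if we win the Treasure."""
    doors = [1, 2, 3]
    treasure = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that is neither our pick nor the treasure door.
    opened = rng.choice([d for d in doors if d != pick and d != treasure])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == treasure

rng = random.Random(0)  # fixed seed so the run is repeatable
trials = 100_000
switch_rate = sum(play(True, rng) for _ in range(trials)) / trials
stay_rate = sum(play(False, rng) for _ in range(trials)) / trials
print(f"switch wins ~{switch_rate:.3f}, stay wins ~{stay_rate:.3f}")
```

Run it and the switching strategy hovers around 0.667 while staying hovers around 0.333, matching the ⅔ versus ⅓ argument above.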

Bertrand’s Box Problem

The problem involves 3 boxes containing Gold (G) and Silver (S) coins:

Box 1:    2 G

Box 2:    2 S

Box 3:    1G    1 S

The problem is simple: you choose a box at random and pick out a coin, also at random. If the coin is gold, what is the probability that the other coin in the box is also gold?

You have chosen a box with at least 1 gold coin, hence your box is not Box 2, as that box has two silver coins. So your box is either Box 1 or Box 3, hence the other coin is either silver or gold, with probability 50%, right?

Wrong! It is the same problem as our favorite Monty Hall Problem.

Let me explain, let’s name all the coins:

Box 1:    G1    G2

Box 2:    S1    S2

Box 3:    G3    S3

Now we either have Box 1 or Box 3:

Box 1:    G1    G2

Box 3:    G3    S3

Now, we know the coin we have chosen is gold, so it is one of G1, G2, or G3, each equally likely.

Scenario 1: G1 is chosen. G1 -> G2

Scenario 2: G2 is chosen. G2 -> G1

Scenario 3: G3 is chosen. G3 -> S3

Hence in 2 out of 3 scenarios the other coin is gold.

Therefore the probability that the second coin is gold, given that our first coin is gold, is ⅔ and not ½.
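The three equally likely gold-coin scenarios above can be enumerated directly (again a sketch of my own; the coin labels follow the tables above):

```python
# Each box maps to its two labelled coins, as in the tables above.
boxes = {"Box 1": ["G1", "G2"], "Box 2": ["S1", "S2"], "Box 3": ["G3", "S3"]}

# Every (first coin, other coin) pair where the first coin drawn is gold.
gold_draws = []
for coins in boxes.values():
    for first in coins:
        if first.startswith("G"):  # condition: the first coin is gold
            other = next(c for c in coins if c != first)
            gold_draws.append((first, other))

both_gold = sum(other.startswith("G") for _, other in gold_draws)
print(f"{both_gold} of {len(gold_draws)} equally likely cases give gold")
```

This prints "2 of 3 equally likely cases give gold": the three draws are exactly G1 -> G2, G2 -> G1, and G3 -> S3, matching the scenarios listed above.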
