Salanis
SuperDork
10/31/11 5:21 p.m.
4 envelopes. You pick one with a 25% chance of winning.
If you are offered a chance to trade your 1 for all three, should you? Yes. You trade 25% for 75%.
Of those three you didn't choose, what is the chance that at least two will be empty? 100%. You know at least two that you did not pick are empty. The money can only be in one envelope, not split between the three.
When the dealer reveals that two are empty, he has not revealed any new information. You knew two were empty. The only thing he told you was which two are empty. The chance that the money was in one of those three envelopes is still 75%.
Tuna 55 wrote out all the possible permutations. Look back at what he wrote.
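If you'd rather watch it happen than argue about it, here is a quick Monte Carlo sketch of the game as described in this thread (Python; the function name and structure are mine, not anything from the OP):

```python
import random

def play(switch, trials=100_000):
    """Simulate the 4-envelope game; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        winner = random.randrange(4)   # dealer hides the money
        pick = random.randrange(4)     # your initial 25% guess
        # Dealer removes two empty envelopes from the three you didn't pick,
        # leaving your envelope and one other (the winner, if you missed it).
        remaining = winner if winner != pick else random.choice(
            [e for e in range(4) if e != pick])
        final = remaining if switch else pick
        wins += (final == winner)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")   # close to 0.25
print(f"switch: {play(switch=True):.3f}")    # close to 0.75
```

Run it a few times: staying hovers around 25% and switching around 75%, which is exactly the 25/75 split Salanis describes.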
wbjones
SuperDork
10/31/11 5:24 p.m.
mtn wrote:
wbjones wrote:
mtn wrote:
Apis_Mellifera wrote:
The two removed aren't actually irrelevant, because at the time of your selection they were viable choices. Each envelope has a 1 in 4 chance of containing the money. If you stay with your original selection, your odds are 1 in 4 of having the money. Because you have the opportunity of making a second selection from two choices, the notion is that you now have a 1 in 2 chance. This would be true if the two envelopes were collected and shuffled and the choice was now random. However, the envelopes were not shuffled. Two wrong choices were just removed from the three you didn't choose (which logic dictates to be a known whether they're removed or not). After the initial selection the choices are no longer random, thus the probability does not change from 25/75. I think that's where the 50/50 folks are getting tripped up (whether they realize it or not).
Bingo. It isn't random, and since somebody is "in on it" the chances are not 50/50.
not sure how "somebody being in on it " changes anything.
I meant that someone knows which ones are empty, and removes two of them.
that is exactly what the OP stated... that two would be removed, and they would be two of the three empty envelopes....
wbjones
SuperDork
10/31/11 5:30 p.m.
Apis_Mellifera wrote:
If you still incorrectly believe the answer is 50/50, then I suggest you google Conditional Probability. This problem is an example of conditional probability.
I studied probability and possibility many yrs ago.... so yes it suggests itself to be an example of conditional probability .. but I still contend that two of the envelopes are nothing more than examples of "noise"
in the original post .. if I'm the one given the choice, I know going in that three are empty, and I also know that regardless of which I choose, two of the empties will be thrown away... therefore I'm actually just picking one of two... not one of four... two of them don't exist
ransom
Dork
10/31/11 5:39 p.m.
wbjones wrote:
in the original post .. if I'm the one given the choice, I know going in that three are empty, and I also know that regardless of which I choose, two of the empties will be thrown away... therefore I'm actually just picking one of two... not one of four... two of them don't exist
You know you will be facing a choice between two envelopes, but you are also determining which one will be the "in-hand" envelope, and which one will be the "on-table" envelope.
Your first decision determines whether the winner will be in your hand or on the table. You have a 25% chance of picking it up, and a 75% chance of leaving it on the table.
After you have taken the action which determines whether the winner is in your hand or on the table, you are asked to pick between the two.
Salanis
SuperDork
10/31/11 6:03 p.m.
wbjones wrote:
in the original post .. if I'm the one given the choice, I know going in that three are empty, and I also know that regardless of which I choose, two of the empties will be thrown away... therefore I'm actually just picking one of two... not one of four... two of them don't exist
Yes they do. You could still have picked them. The dealer will never throw away an envelope that you have picked. They are viable initial choices. If the winning envelope is C, that does not mean that B and D get removed. You could still pick one.
The dealer does not reveal that two are null; the dealer reveals that they have a zero value. Zero and null are not the same thing.
Wow. While there are a number of things to be argumentative or opinionated about, I'm sure a math proof should not be one of them.
I think the majority of the responses are constructive, not argumentative, and as a 50/50 person I genuinely want to figure it out and have that ah-HA moment where it finally clicks for me, so I very much appreciate everyone's input.
I still have some concerns where the stuff gets grouped together. My mind sees the scenario as this: once you have picked an envelope you have essentially made two sets out of the envelopes, one that contains the money and one which does not. For the sake of discussion it does not matter how many units make up each set, just that their value is either $ or 0 when viewed as an entire set. One set (it may be the set containing one envelope, or it may be the set containing three envelopes, all of which are unopened at this point) will be $, and the other set will be 0. There can be no other options.
Now, we know that each envelope, when looked at individually, ignoring its status as part of a set, will have a 25% probability of containing the $. We can also agree that any envelope that does not contain $ also has no probability whatsoever of containing $, as we have just stated that it does not.
So, and this is definitely where I get tripped up: once two envelopes are confirmed to be empty, we have more information about the sets, and in this example both empties are removed from one set. This leaves us with two sets, each consisting of one unit. I can't wrap my brain around the suggestion that just because one set had contained more units, it is considered to have a larger probability of containing $ even now that its known quantity of viable units has been reduced to equal the other set's. I will read more; I am sure I will figure it out.
Salanis wrote:
4 envelopes. You pick one with a 25% chance of winning. I am with you on this.
If you are offered a chance to trade your 1 for all three, should you? Yes. You trade 25% for 75%. When does this happen? By my count you are offered a chance to trade your one for the unopened one on the table.
Of those three you didn't choose, what is the chance that at least two will be empty? 100% You know at least two that you did not pick are empty. The money can only be in one envelope, not split between the three. I agree on this.
When the dealer reveals that two are empty, he has not revealed any new information. You knew two were empty. The only thing he told you was *which* two are empty. The chance that the money was in one of those three envelopes is still 75%. This also makes sense to me.
Tuna 55 wrote out all the possible permutations. Look back at what he wrote. I am headed back to study that some more.
wbjones
SuperDork
10/31/11 7:04 p.m.
nderwater wrote:
If you chose a playing card from a deck, there's a 1/52 chance that it's the ace of spades...
Without turning your card over, are the odds greater that yours is the ace, or that the rest of the deck contains the ace? The odds are 98% (51/52) that the ace is still in the deck.
So then the dealer privately looks through the rest of the deck and chooses 50 cards and turns them over, showing that they are not the Ace.
The last card in his hand has a 51/52 chance of being the ace. Your card has a 1/52 chance. Sure, if you set them both down, shuffle, and choose one at random, your chance of getting the ace is 50%. But you haven't done that - you're still holding the card with the 1/52 chance.
just got back from supper... wanted to re-read this one... it actually makes some sense... but so does ECM
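nderwater's card version can be simulated the same way as the envelopes; here's a hypothetical Python sketch (names mine) of the dealer stripping 50 known non-aces from the deck:

```python
import random

def ace_game(switch, trials=50_000):
    """52-card version: dealer turns over 50 cards he knows are not the ace."""
    wins = 0
    for _ in range(trials):
        ace = random.randrange(52)
        pick = random.randrange(52)
        # After the dealer reveals 50 non-aces, one card remains in his hand:
        # the ace itself, unless you are already holding it.
        held = ace if ace != pick else random.choice(
            [c for c in range(52) if c != pick])
        wins += ((held if switch else pick) == ace)
    return wins / trials

print(f"keep your card: {ace_game(switch=False):.3f}")   # near 1/52, about 0.019
print(f"take his card:  {ace_game(switch=True):.3f}")    # near 51/52, about 0.981
```

The lopsided result makes the envelope version easier to swallow: the dealer's reveals never touch the card you chose, so your 1/52 doesn't budge.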
EastCoastMojo wrote:
I think the majority of the responses are constructive, not argumentative, and as a 50/50 person I genuinely want to figure it out and have that ah-HA moment where it finally clicks for me, so I very much appreciate everyone's input.
...
nderwater wrote:
Sperlo - If I told you that the envelope in your hand was empty, and I knew because i marked the correct envelope - then would you trade?
I dunnooo... It might be some kind of trick!
With four possible envelopes, each must represent a 1/4 chance of winning. The ONE envelope that you choose represents a 1/4 chance of winning, while remaining THREE envelopes must therefore represent a 3/4 chance of winning.
That can be expressed mathematically as A = 0.25 and B+C+D = 0.75, with each letter expressing a variable.
When the host shows that two of the envelopes on the table are 0 value, we can substitute those into the second equation (example C=0 and D=0).
Now we have A = 0.25 and B+0+0 = 0.75.
Solving, B = 0.75
This is because B still represents the chances of all the envelopes left on the table.
Since 0.75 > 0.25 it is in your benefit to switch envelopes.
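That substitution can also be checked by brute-force counting. A small sketch (Python; fixing the player's pick as A by symmetry is my own simplification):

```python
from fractions import Fraction

# Exact check of the algebra: the winning envelope is equally likely to be
# A, B, C, or D, and the player holds A.
stay = switch = total = 0
for winner in "ABCD":
    pick = "A"
    # The dealer leaves the pick plus one other envelope on the table; that
    # other envelope is the winner whenever the pick missed it.
    other = winner if winner != pick else "B"   # arbitrary empty when A wins
    total += 1
    stay += (pick == winner)
    switch += (other == winner)

print(Fraction(stay, total), Fraction(switch, total))   # 1/4 3/4
```

Exactly the A = 0.25 and B = 0.75 from the equations above, with no simulation noise.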
Thanks for posting that list, Tuna; it has helped a lot to look at all of the possible outcomes. I was getting all tripped up on the "it can only be in one envelope" part of the equation and not seeing the "chances are better that it is in the other set" part of the solution. If I had just stopped at "chances are better that you didn't pick right the first time" I would have gotten this a few pages back. OK, I'm a 75 percenter now.
Duke
SuperDork
10/31/11 8:50 p.m.
I'm telling you, THE PLANE TAKES OFF.
Duke
SuperDork
10/31/11 8:58 p.m.
Salanis wrote:
Tuna 55 wrote out all the possible permutations. Look back at what he wrote.
Except that in Tuna's permutations, C and D are still valid choices, instead of being discarded as necessarily empty. So of course that adds up to 75%. But it doesn't prove that the remaining odds are 25/75 in favor of the single envelope on the table - it just proves that 3x25=75, which is obvious.
Duke wrote:
Salanis wrote:
Tuna 55 wrote out all the possible permutations. Look back at what he wrote.
Except that in Tuna's permutations, C and D are still valid choices, instead of being discarded as necessarily empty. So of course that adds up to 75%. But it doesn't prove that the remaining odds are 25/75 in favor of the single envelope on the table - it just proves that 3x25=75, which is obvious.
Right, and the claim is that since you picked from a group of four, the chance that you picked wrong is high; if they shuffled the two remaining, it would in fact be 50/50. It's odds, and they don't work in your favor either way, because it isn't accurate; that's how casinos make money: tricking you into thinking you have more of a chance with one option over another. I still think the true chance would be 50/50 even though I understand your 25/75 equation, but the equation would be far more advanced than that, and I don't really want to cram my head with more useless E36 M3 tonight.
wbjones wrote:
I studied probability and possibility many yrs ago....
two of them don't exist
So did I. They do exist and at one time were equal, viable choices.
Think of it this way: You flip a coin. As it flips in the air you know only one side will land up. Does the other side exist as an option? Yes. After the coin lands the side that didn't land up is now not an option, but prior to selection each outcome had an equal probability.
Prior to selection, each envelope has an equal probability because we aren't told which ones are empty. The second selection is conditional upon the previous selection. Initially you had a 25/25/25/25 option. The only options for the second selection are keeping your original 25 or taking the 25+25+25 that you didn't take previously. The second selection is not random and thus not 50/50.
N Sperlo wrote:
so if they shuffled the two remaining, it would in fact be 50/50.
Tricking you into thinking you have more of a chance with one option over another.
I still think the true chance would be 50/50 even though I understand your 25/75 equation, but the equation would be far more advanced than that...
I don't really want to cram my head with more useless E36 M3 tonight.
It's interesting that you are completely correct on this until you wrote that you still believe it's 50/50, despite having just written why 50/50 is incorrect. The "equation" is actually so simple that people are missing it. As you said, ONLY if the remaining two are shuffled will the odds be 50/50. Until then it remains at 25/75. It's actually not useless. It's critical thinking, logic, and a bit of math. All of which come in handy from time to time.
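The shuffle distinction Apis is making can be tested directly. A hypothetical Python sketch (function name mine) comparing "keep your pick" against "shuffle the last two and draw blind":

```python
import random

def final_choice(shuffle, trials=100_000):
    """Win rate when keeping your envelope vs. drawing at random from the last two."""
    wins = 0
    for _ in range(trials):
        winner, pick = random.randrange(4), random.randrange(4)
        # Dealer leaves your pick plus one other envelope (the winner if you missed).
        other = winner if winner != pick else random.choice(
            [e for e in range(4) if e != pick])
        # Shuffling erases which envelope was which; keeping preserves your 25%.
        final = random.choice([pick, other]) if shuffle else pick
        wins += (final == winner)
    return wins / trials

print(f"keep your pick:    {final_choice(shuffle=False):.3f}")  # near 0.25
print(f"shuffled last two: {final_choice(shuffle=True):.3f}")   # near 0.50
```

So both camps are right about different games: a blind draw from the shuffled pair really is 50/50, but nobody in the OP's game is shuffling.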
Salanis
SuperDork
10/31/11 9:36 p.m.
EastCoastMojo wrote:
We can also agree that any envelope that does not contain $ also has no probability whatsoever of containing $, as we have just stated that it does not.
I think this is where you are getting confused. They do have a probability of having that money. Just because someone else knows what it is, doesn't change the probability.
Say you flip a coin; someone catches it and looks at it but doesn't show you. They know what it is. They ask you to guess. You have a 50% probability of guessing correctly. You say "heads", and it turns out it was tails.
The outcome of the coinflip had been decided, but your choice still had a 50% probability of being correct.
Salanis wrote:
4 envelopes. You pick one with a 25% chance of winning. I am with you on this.
If you are offered a chance to trade your 1 for all three, should you? Yes. You trade 25% for 75%. When does this happen? By my count you are offered a chance to trade your one for the unopened one on the table.
Of those three you didn't choose, what is the chance that at least two will be empty? 100% You know at least two that you did not pick are empty. The money can only be in one envelope, not split between the three. I agree on this.
When the dealer reveals that two are empty, he has not revealed any new information. You knew two were empty. The only thing he told you was *which* two are empty. The chance that the money was in one of those three envelopes is still 75%. This also makes sense to me.
Therefore, the chance of it being in the unrevealed envelope is 75%.
The location of the money is already determined. The dealer knows it, just like the coin flipper knew what the result of the coin flip was. However, the probability of your initial guess being correct is still 25%.
Yep, after looking at the outcomes it became apparent that you guess correctly in only 4 of the 16 possible outcomes, so there is a much better chance that you did not pick correctly the first time. So, switch, definitely.
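For anyone else counting along, that 4-out-of-16 table is quick to reconstruct in code (a sketch, assuming Tuna's table was the 4 money locations crossed with the 4 first picks):

```python
# All 16 equally likely (money location, first pick) combinations.
outcomes = [(money, pick) for money in "ABCD" for pick in "ABCD"]
stay_wins = sum(money == pick for money, pick in outcomes)
# Switching wins exactly when the first pick missed the money.
switch_wins = sum(money != pick for money, pick in outcomes)
print(stay_wins, switch_wins, len(outcomes))   # 4 12 16
```

Staying wins in 4 of 16 outcomes; switching wins in the other 12, i.e. 25% vs. 75%.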