Interesting Brainteaser

Quant
I was asked this during an interview at a bank (for which I was successful) and thought it would be interesting to see what everyone's response would be.

Suppose you play the following game:

There is a container with 50 yellow balls and 50 blue balls. You have to pick a color. If the ball drawn [at random by an official] is the same color as the color you chose, you win $10,000. If it isn't, you get nothing.

Now you play the following game:

There is a container with an unknown proportion of yellow and blue balls, but you know there are 100 balls. The proportion is chosen randomly beforehand by an official. You have to pick a color. If the ball drawn [at random by an official] is the same color as the color you chose, you win $10,000. If it isn't, you get nothing.

Which game do you prefer? Specifically, how much would you pay to play the first game? How much would you pay to play the second game? Would you pay the same amount? Why or why not?
12 replies
 
StudentAtStPats
I wouldn't really call it a brain teaser...... But I'd personally go with the second game. You're already taking a chance, so why not take a more interesting one? ;) And I guess it'd make sense to pay at least $5000 for the first game.. The second one, no. I'd pay less just because you don't know what your exact chances are, unlike in the first one. And that's a really interesting question for an interview.. what'd you answer?

 
Quant
There isn't necessarily a right answer in terms of how much you pay for each individual game in absolute terms (as it is a matter of risk preference) but I was told that I came to the right conclusion in terms of how much I would pay for the second game in relation to how much I paid for the first. So I will post my answer later, after a few more people answer, to keep it interesting.

Also, notice that I specified that you know there are exactly 100 balls in the container in the second game. I should have specified that in the beginning. Maybe that will affect your answer.

 
Quant

@StudentAtStPats wrote: [quoted in full above]

Since no one else is going to have a stab at this, I will respond to your post.

You actually do know what the odds are for the second game - and it turns out they are exactly the same as in the first game (this can be proven). Just enumerate the possible proportions: you can have 100 blue balls and 0 yellow balls, 99 blue and 1 yellow, 98 blue and 2 yellow, ... all the way to 0 blue and 100 yellow. That is 101 possible distributions in total, and because the distribution is chosen at random, they are all equally likely, each with probability 1/101.

If you take the probability of winning under each distribution and weight each one by 1/101, the weighted sum works out to 50/50 odds. So even though the risks are well-defined in the first game and uncertain in the second, the odds turn out to be identical, and you should pay the same amount for both games.
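
In Python, that weighted sum can be checked directly (a minimal sketch, assuming the uniform 1/101 prior over the 101 possible compositions described above):

from fractions import Fraction

# Game 2: the number of blue balls n is assumed to be chosen uniformly
# from 0..100, so each of the 101 compositions has probability 1/101.
# Average the chance of drawing blue over all compositions.
p_blue = sum(Fraction(1, 101) * Fraction(n, 100) for n in range(101))

print(p_blue)  # 1/2 -- the same 50/50 odds as game 1

The sum is exact, so under that reading of "chosen randomly" the win probability in game 2 is exactly 1/2.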

 
superstar2011

@Quant wrote: [quoted in full above]

I agree with expecting 50/50 odds for the second game, but the distribution of the balls in the box here isn't necessarily uniform (it could be, but that's a strong assumption). There is no proof that each box combination has a 1/101 probability. And even if the distribution is uniform, you still don't know which composition you're actually going to get, or even which one to expect.

It's more likely to be a binomial distribution than anything else. That is, 100 blue balls and 0 yellow balls, or 100 yellow balls and 0 blue balls, are a hell of a lot less likely to be the chosen distribution than 50/50. Think about it: you can only make 1 combination of 100 blue and 0 yellow (choosing a blue ball each time), but you can make about 1.0*10^29 different combinations of 50 blue and 50 yellow (that is, 100C50 on your scientific calculator).
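
That count is easy to verify (a small sketch in Python, just to check the arithmetic above):

from math import comb

# Number of distinct orderings of 50 blue and 50 yellow balls
print(comb(100, 50))   # about 1.01 * 10**29

# ...versus exactly one ordering of 100 blue and 0 yellow
print(comb(100, 100))  # 1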

Also, your last sentence there is a normative statement. I would leave it at "it would make sense for you to pay the same amount to play the second game", not that you should. People generally want extra compensation for uncertain risk.

 
Quant

@superstar2011 wrote: [quoted in full above]

Your claim that "100 blue balls and 0 yellow balls or 100 yellow balls and 0 blue balls are a hell of a lot less likely to be the chosen distribution than 50/50" makes no sense here. The question was posed so that each distribution would be equally likely: the second game clearly states that the proportion is chosen at random. So by the principle of insufficient reason, the probability of each outcome is 1/n, where n is the number of possible distributions. It's not as if this is even up for debate. The fact that the distribution is chosen "at random" (and that the possibilities are indistinguishable other than by name, which is a given) is sufficient to conclude that the probability of each occurring is 1/101.

But if you want an illustration of a mechanism that would randomly select a given distribution, here is an example: you have a ball for each possible distribution placed inside a container, and the official randomly picks one out to determine the proportion of blue and yellow balls. Thus, they are equally likely. It's not as if there are more balls with a 50/50 distribution than, say, a 100 blue/0 yellow distribution.

Everything else follows from this.
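
That mechanism is easy to simulate, if it helps settle things (a rough sketch only; the function name and trial count are made up for illustration):

import random

def play_game_two(trials=1_000_000):
    """Simulate game 2 with the mechanism described above: the official
    first picks one of the 101 compositions uniformly at random, then a
    single ball is drawn from that box. We committed to blue beforehand."""
    wins = 0
    for _ in range(trials):
        blue = random.randint(0, 100)          # number of blue balls in the box
        wins += random.random() < blue / 100   # True if the drawn ball is blue
    return wins / trials

print(play_game_two())  # hovers around 0.5, i.e. the same odds as game 1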

 
This post was deleted

 
superstar2011

@Quant wrote: [quoted in full above]

If the distribution is chosen at random, aren't the balls chosen at random too?

Here's how you choose 100 blue and 0 yellow.
BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB

Here's how you choose 50 blue and 50 yellow.
BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY

YBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY

YYBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY

Or like this.

BYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBYBY

and so on...

There may be only 101 possible outcomes for the proportion, but there are 2^100 combinations of balls. There are just so many more ways to "randomly" select a 50/50 distribution.
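
Putting numbers on that (a quick sketch, assuming the model being argued for here: each ball is independently blue or yellow with probability 1/2):

from math import comb

# Under independent fair coin flips per ball, the number of blue balls
# follows Binomial(100, 1/2). Compare the two compositions mentioned above.
total_sequences = 2 ** 100

p_half_half = comb(100, 50) / total_sequences   # ~0.0796
p_all_blue  = comb(100, 100) / total_sequences  # ~7.9e-31

print(p_half_half, p_all_blue)
print(p_half_half / p_all_blue)  # the 50/50 split is ~10^29 times more likely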

Tell me how the question was worded (as in give me the original source)

 
Quant

@superstar2011 wrote: [quoted in full above]

It was worded exactly how I stated it in the OP, and the original source was my interviewer (this was for a position in credit risk).

I understand what you're trying to say about the different combinations, but the question says the distribution is chosen randomly, not that each ball is chosen randomly. You aren't choosing each ball at random; you are only selecting the proportion. The question is clear about this, and there is no reason to assume otherwise.

In fact, I think this is a variation on a problem from economics, which is supposed to demonstrate the difference between risk and uncertainty.

Anyway, ask one of your profs to take a look at this problem. They will confirm what I am saying.

 
This post was deleted

 
superstar2011

@Quant wrote: [quoted in full above]

Alright, then I guess I must have misunderstood what you meant by the distribution being chosen randomly. I apologize for that. I guess what you meant was that there are 101 boxes, each with 100 balls in a different proportion, and one is picked beforehand. Now I see what makes this distribution uniform. In either case, though, it makes sense to assume a 50/50 chance for the second game.

 
Quant

@superstar2011 wrote

I guess what you meant was that there are 101 boxes, each with 100 balls in a different proportion, and one is picked beforehand.



Yes, that's certainly another way to look at it.

I think this is similar to, or even a variation of, the Ellsberg paradox, which demonstrates how people make decisions under uncertainty (where probabilities may not be known, as is most often the case in the real world) and how those decisions violate the assumptions of expected utility theory, since most people would probably not pay the same amount for each game even though the expected value of the winnings is the same.

Also, 50/50 odds for the second game is not an assumption, it's a conclusion. This can be easily proven:

P(draw blue) = (1/101)(0/100) + (1/101)(1/100) + ... + (1/101)(100/100) = (1/101)(5050/100) = 1/2

by the law of total probability, where 1/101 represents the probability of obtaining each of the 101 distributions, and n/100 represents the associated probability of drawing a blue ball under the distribution with n blue balls. P(draw yellow) would just be the complement, which is also 1/2.

 
superstar2011
Yeah, I used the wrong word, lol

What I should have written was that you can prove that either distribution works out to a 50/50 chance of picking a ball of either colour, so you can expect a probability of 1/2, regardless of whether the distribution is uniform or binomial.
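
That claim checks out numerically under both priors discussed in this thread (a minimal sketch; exact fractions are used just to keep the arithmetic error-free):

from fractions import Fraction
from math import comb

# Prior 1 (uniform): each of the 101 compositions has probability 1/101.
p_blue_uniform = sum(Fraction(1, 101) * Fraction(n, 100) for n in range(101))

# Prior 2 (binomial): each ball is independently blue with probability 1/2,
# so the number of blue balls n follows Binomial(100, 1/2).
p_blue_binomial = sum(
    Fraction(comb(100, n), 2 ** 100) * Fraction(n, 100) for n in range(101)
)

print(p_blue_uniform, p_blue_binomial)  # both print 1/2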