Counting in Probability

Question: In the card game bridge, the 52 cards are dealt out equally to 4 players, called East, West, North and South. If North and South have a total of 8 spades between them, what is the probability that East has 3 of the remaining 5 spades?

Discussion: East gets 3 spades out of the remaining 5 in \binom{5}{3} ways. The sample space should contain all possible ways East can get spades. The total number of outcomes is \binom{5}{0} + \binom{5}{1} + \binom{5}{2} + \binom{5}{3} + \binom{5}{4} + \binom{5}{5}, i.e., East gets no spade, East gets exactly one spade, and so on, up to East getting all 5 remaining spades.

Therefore the desired probability is \frac{\binom{5}{3}}{\binom{5}{0}+\binom{5}{1}+\binom{5}{2}+\binom{5}{3}+\binom{5}{4}+\binom{5}{5}} = 0.3125. But this does not agree with the answer Sheldon Ross gets! So, where is the problem?

The problem is in the counting. The number of ways East's hand can contain three of the five spades is not exactly \binom{5}{3}. It is in fact \binom{5}{3}\binom{21}{10}, because East's remaining 10 cards can be chosen from the 21 non-spade cards among the 26 cards East and West share. I cannot ignore this factor, since it is not proportionally the same across the terms in the denominator and therefore does not cancel. In the denominator, the number of ways East can get no spades is not \binom{5}{0}; it is \binom{5}{0}\binom{21}{13}, because East then chooses all 13 cards from the 21 non-spade cards. Similarly, when East has exactly one spade, the rest of the hand can be chosen in \binom{21}{12} ways, and of course East has five choices for that one spade. Therefore, East has \binom{5}{1}\binom{21}{12} ways to hold exactly one spade!

So, this works:

\frac{\binom{5}{3}\binom{21}{10}}{\binom{5}{0}\binom{21}{13}+\binom{5}{1}\binom{21}{12}+\binom{5}{2}\binom{21}{11}+\binom{5}{3}\binom{21}{10}+\binom{5}{4}\binom{21}{9}+\binom{5}{5}\binom{21}{8}} = 0.339.
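Both numbers are easy to verify numerically. Here is a small sketch in Python, using only the standard library; it also checks that the corrected denominator collapses to \binom{26}{13} by Vandermonde's identity, since East is simply choosing any 13 of the 26 cards held by East and West.

```python
from math import comb

# Naive (incorrect) count: ignores the non-spade cards entirely.
naive = comb(5, 3) / sum(comb(5, k) for k in range(6))

# Corrected count: East's remaining cards come from the 21 non-spades.
num = comb(5, 3) * comb(21, 10)
den = sum(comb(5, k) * comb(21, 13 - k) for k in range(6))
p = num / den

# By Vandermonde's identity, the denominator is just comb(26, 13).
assert den == comb(26, 13)

print(round(naive, 4))  # 0.3125
print(round(p, 3))      # 0.339
```

In other words, the corrected computation is exactly the hypergeometric probability \frac{\binom{5}{3}\binom{21}{10}}{\binom{26}{13}}.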

Two morals from this story: 1) In most introductory probability questions, the mistake we make is counting incorrectly. 2) There are often multiple ways of solving the same problem. Try the problem yourself in your own way and double-check against the answer to ensure your approach is correct. Do not read a probability book like a novel.


Power of algebra in probability

Algebra plays a big role in the analysis of complicated problems. An otherwise very tedious solution can become trivial with the right use of algebra. Last week, I was solving the clock solitaire problem for the case where you may pick a random pile and then a random card in that pile. The problem was to argue whether the probability of winning is still 1/13.

It turns out that you have 13 options for the starting pile, and any particular pile-and-card combination is picked with probability (1/13)(1/4). Picking, say, the second card of a pile as your starting card can be viewed as one of 4 possible arrangements of that pile with the chosen card on top. More generally, having selected any pile, you have 4 ways of picking the first card, 3 ways of picking the second, and so on. It is mind-boggling to count all these possibilities without the power of algebra.

The probability of winning in this context is

[Image: the winning-probability expression, not reproduced here]

Look at this from the perspective of the total possible arrangements and the arrangements that win. Picking a random card is the same as one rearrangement with the selected card at the top of the pile. This gives 4 possibilities for the first place (since you may pick any one card in the pile for the first place), 3 for the second, 2 for the third and 1 for the last. So, we now have all the ammunition required to apply the principle of deferred decisions.
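The 1/13 claim is also easy to sanity-check by simulation. Below is a minimal sketch of standard clock solitaire in Python (pile 13 is the Kings pile; you win iff all 52 cards get revealed); the pile layout and starting rule here are my assumptions about the standard game, not taken from the image above.

```python
import random

def play_clock_solitaire(rng):
    """Deal 52 cards (4 of each rank 1..13) into 13 piles of 4, then play."""
    deck = [rank for rank in range(1, 14) for _ in range(4)]
    rng.shuffle(deck)
    piles = {r: deck[4 * (r - 1):4 * r] for r in range(1, 14)}
    current, revealed = 13, 0          # start at the Kings pile
    while piles[current]:
        card = piles[current].pop()    # draw the top card of the current pile
        revealed += 1
        current = card                 # move to the pile of that card's rank
    return revealed == 52              # win iff every card was revealed

rng = random.Random(42)
trials = 20000
wins = sum(play_clock_solitaire(rng) for _ in range(trials))
print(wins / trials)                   # close to 1/13 ≈ 0.0769
```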

Monty Hall over 5 doors

It's commonplace to note that you should switch in the case of 3 doors. How do we carry forward or generalize this idea to multiple doors?

Let’s assume there are five doors and the car is behind door 5. Also, Monty knows what you have selected; let’s say it is door 3. So, if Monty has to open two doors and then ask whether you would like to switch, here’s how we can compute Monty’s probability of opening doors 1 and 2 (let’s call this event A).

P(A) = P(A|carin1)P(carin1) + P(A|carin2)P(carin2) + … + P(A|carin5)P(carin5)

P(A|carin5) = 1/3C2

This is because, if the car is behind door 5, Monty has three doors to choose from (1, 2 and 4), since door 5 has the car and door 3 is your selection. Monty can open any two of these in 3C2 equally likely ways, so the probability is 1/3C2.

Therefore, P(A) = 0 + 0 + (1/4C2)(1/5) + (1/3C2)(1/5) + (1/3C2)(1/5) = 1/6

Now, all we need is a way to turn the problem around from P(carin5|monty_opens_1_and_2) to a function of P(monty_opens_1_and_2|carin5). To do this, we can use the definition of conditional probability (Bayes' rule):

P(X|Y) = P(Y|X)P(X)/P(Y)

The probability of winning by staying with the same door (in our case, 3) is:

P(carin3|monty_opens_1_and_2) = P(monty_opens_1_and_2|carin3)P(carin3) / X

where X = P(monty_opens_1_and_2) = P(monty_opens_1_and_2|carin1)P(carin1) + … + P(monty_opens_1_and_2|carin5)P(carin5), which is exactly the P(A) computed earlier.

Working this out, P(carin3|monty_opens_1_and_2) turns out to be 1/5.

The same computation for switching to door 4 or door 5 gives 2/5 for each.

So, it turns out that we should still switch!
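The whole calculation can be carried out with exact fractions. Here is a sketch in Python; the `likelihood` function simply encodes Monty's choices as derived above (he never opens your door or the car's door, and picks uniformly among the rest).

```python
from fractions import Fraction
from math import comb

doors = [1, 2, 3, 4, 5]
picked = 3                      # your door
opened = {1, 2}                 # event A: Monty opens doors 1 and 2
prior = Fraction(1, 5)          # car equally likely behind any door

def likelihood(car):
    """P(Monty opens doors 1 and 2 | car is behind `car`)."""
    if car in opened:
        return Fraction(0)      # Monty never opens the car's door
    # Monty picks 2 doors uniformly among those that are neither yours nor the car's.
    free = [d for d in doors if d != picked and d != car]
    return Fraction(1, comb(len(free), 2))

p_A = sum(likelihood(car) * prior for car in doors)
posterior = {car: likelihood(car) * prior / p_A for car in doors}

print(p_A)                          # 1/6
print(posterior[3])                 # stay:   1/5
print(posterior[4], posterior[5])   # switch: 2/5 each
```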

Markov Logic Networks

Intelligent systems are concerned with inference: given some known facts, how can they infer new ones? The progress in this domain is just mind-blowing. In this post, I ponder the thinking behind MLNs, or Markov Logic Networks.

Coding in Prolog introduced me to the world of first-order logic (FOL) and how facts can be represented using it. Prolog goes beyond mere representation and can act as an intelligent engine behind your Java programs. How does it do that? Well, once concrete rules are coded in FOL, it is just a matter of finding a contradicting example to prove a query false, or satisfying all the formulae to prove it true. Prolog, even though slow in some respects, achieves exactly this.

However, in the world of Prolog there are only facts, i.e., something is either true or false. In the real world, this may not hold all the time. For instance, assume that you expect a bus at 10 AM. At least if you are in India, how many times has the bus come exactly at 10? We are forced to mix probabilities into our facts, all the time!

How wonderful it would be if we had a tool where FOL and probabilities are merged! MLNs seem to do exactly this. An MLN attaches a weight to each FOL formula, and the grounded formulae define the cliques of a Markov network. This relaxes the hard 0-or-1 truth values of FOL into graded ones.
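Concretely, the standard MLN definition (this is the formula from the Richardson and Domingos paper, quoted rather than derived here) turns a set of FOL formulae F_i with weights w_i into a probability distribution over possible worlds x:

P(X = x) = \frac{1}{Z} \exp\left( \sum_i w_i \, n_i(x) \right)

where n_i(x) is the number of true groundings of formula F_i in world x and Z is a normalizing constant. Letting a weight tend to infinity recovers a hard FOL constraint, so ordinary first-order logic sits inside this framework as a limiting case.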

As with any probabilistic method, things are very confusing to start with. The UWash paper and the Wikipedia article throw some light. Enjoy MLN!