Monty Hall problem.
This problem became popular after the famous American game show in which a participant chooses one of three doors at random. Behind one door there is a car; behind the others there are goats. The host knows where the car is: after you make your choice, he leaves your door closed and opens one of the two remaining doors, revealing a goat. And now the host asks the player: “Would you change the door?”

For example, you chose the first door. The host opened the second door, showed you a goat, and then asked whether you want to stay with the first door or change to the third.

Probability says yes, you should change! There are many explanations of this problem on the internet, but most of them are complicated, so I will try to keep it simple. Look at the probability tree:
[Figure: probability tree for the Monty Hall problem]

If you choose the “Stay” strategy, you win 1 car and 2 goats.
If you choose the “Change” strategy, you win 2 cars and 1 goat.
So it is better to change: you raise your chance of winning from 1/3 to 2/3!
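Here is a minimal C# sketch (mine, not a program from this blog) that simulates both strategies; it relies on the fact that changing wins exactly when the initial pick was a goat.

```csharp
using System;

class MontyHallSimulation
{
    static void Main()
    {
        var rng = new Random();
        int trials = 100000, stayWins = 0, changeWins = 0;

        for (int i = 0; i < trials; i++)
        {
            int carDoor = rng.Next(3);     // door hiding the car: 0, 1 or 2
            int firstPick = rng.Next(3);   // player's initial random choice

            if (firstPick == carDoor)
                stayWins++;                // staying wins only if the first pick was the car
            else
                changeWins++;              // changing wins whenever the first pick was a goat,
                                           // because the host removes the other goat
        }

        Console.WriteLine($"Stay:   {(double)stayWins / trials:F3}");   // ~0.333
        Console.WriteLine($"Change: {(double)changeWins / trials:F3}"); // ~0.667
    }
}
```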

Two envelopes problem.
There are two envelopes with money. In one envelope the sum is twice the sum in the other, but you do not know which envelope holds the larger sum. You choose an envelope at random, open it and see the sum. Would you take the second envelope instead of the opened one? For example, you see $10; if you change, you can get $5 or $20 in the second envelope, or you can stay with your $10.

Here is where the problem appears.

1. Consider the case when the two sums in the envelopes are $10 and $20. Look at the picture showing what happens when you make the decision:
[Figure: decision tree for envelopes containing $10 and $20]
As you can see, there is no difference.

2. Now consider the case where you see $10 in the opened envelope, so the second envelope may contain $5 or $20:
[Figure: decision tree when the opened envelope contains $10]
The expected values are E(stay)=\frac{1}{2} * 10=5, ~~ E(change)=\frac{1}{4} * 5+\frac{1}{4} * 20 = 6.25.
So in this example it looks better to change: the expected value with the change strategy is 1.25 times larger. The secret is that if you change, you can lose $5 or gain $10, which is not the same.

A simple C# program shows the following results for the two considered methods (in every play the amounts in the envelopes are random):
[Figure: output of the simulation program]
In this simulation the final sums are approximately equal in the first method; in the second method, changing the envelope raises the final sum.
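Since the original program is shown only as a screenshot, here is a minimal C# sketch of how such a simulation could look; the class and variable names are my own, and the two loops correspond to the two cases described above.

```csharp
using System;

class TwoEnvelopes
{
    static void Main()
    {
        var rng = new Random();
        int trials = 100000;

        // Method 1: a fixed pair (x, 2x) is prepared first, then one envelope is opened.
        double stay1 = 0, change1 = 0;
        for (int i = 0; i < trials; i++)
        {
            double x = rng.Next(1, 101);            // smaller sum, $1..$100
            double[] envelopes = { x, 2 * x };
            int pick = rng.Next(2);
            stay1 += envelopes[pick];               // keep the opened envelope
            change1 += envelopes[1 - pick];         // take the other one
        }

        // Method 2: the observed sum is fixed; the other envelope holds half or double of it.
        double stay2 = 0, change2 = 0;
        for (int i = 0; i < trials; i++)
        {
            double seen = rng.Next(1, 101);         // sum seen in the opened envelope
            double other = rng.Next(2) == 0 ? seen / 2 : seen * 2;
            stay2 += seen;
            change2 += other;
        }

        Console.WriteLine($"Method 1: stay {stay1 / trials:F2}, change {change1 / trials:F2}"); // roughly equal
        Console.WriteLine($"Method 2: stay {stay2 / trials:F2}, change {change2 / trials:F2}"); // change is ~1.25x stay
    }
}
```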


Slot machines

August 25, 2009


A brief history of the slot machine.

The first slot machines were developed in the late 19th century. They had a wheel with different pictures and a handle to start the game. A player gave money to a bartender and tried to get a winning picture on the spun wheel.

In 1895 Charles Fey, a car mechanic, introduced a new slot machine – the Liberty Bell. It had 3 reels with 10 pictures on each reel. The name came from its winning combination: the highest pay-off was given for 3 bell pictures stopped in a row.

In 1907 Herbert Mills started mass production of new slot machines, in which a player could put a coin into the machine and play. Mills added the famous fruit symbols. The game was so convenient and required no special knowledge that it spread widely. Later the machines were improved, and reels grew to 20 and then 22 symbols.

In the mid-1980s slot machines appeared that determine a win with a computer chip rather than with the rotating mechanism, and that is how they work today. How does it work?
There is a chip with a random number generator. The chip generates 3 numbers and then looks up a picture for each of them in a programmed table.

Consider the large table used in the Red White & Blue slot machine:
[Table: mapping of random numbers to reel symbols in Red White & Blue]
Suppose the machine’s chip generates the three numbers 57, 59, 57. The computer then orders the motors to spin the reels so that the first reel shows “2 bar”, the second reel shows “1 bar”, and the third reel shows “blank”. There may also be LCD panels instead of real reels.
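The real mapping table is shown only as an image above, so here is a tiny C# sketch of the idea with a made-up, abbreviated table; the number ranges below are illustrative and are not the actual Red White & Blue values.

```csharp
using System;

class ReelMapping
{
    // Hypothetical mapping: each of the 64 possible numbers falls into a symbol range.
    // The real Red White & Blue table differs; these ranges are only an example.
    static string SymbolFor(int n)
    {
        if (n < 20) return "blank";
        if (n < 35) return "1 bar";
        if (n < 48) return "2 bar";
        if (n < 58) return "3 bar";
        if (n < 62) return "white 7";
        return "red 7";
    }

    static void Main()
    {
        var rng = new Random();
        // The chip draws one number per reel (0..63) and looks each one up in the table.
        int[] stops = { rng.Next(64), rng.Next(64), rng.Next(64) };
        Console.WriteLine($"{SymbolFor(stops[0])} | {SymbolFor(stops[1])} | {SymbolFor(stops[2])}");
    }
}
```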

What are the pay-offs?
[Table: pay-offs for each winning combination]

The next table shows how often each picture appears on each reel:
[Table: number of occurrences of each picture per reel]

Now we can calculate the probabilities of winning and losing. For instance, the probability of three white 7s in a row is P(E)=\frac{6*1*7}{64*64*64}=0.00016:
[Table: probabilities of the winning combinations]

The probability of losing a spin is P(lose)=0.826. It means that over 1000 attempts, in the long run a player on average loses 826 times and wins 174 times. However, in the long run the machine returns money at the rate P(money)=0.865, which is the figure published by the producer of the machine.

Since each attempt does not depend on any previous attempt, one could claim that the odds of winning are the same every time, so nothing changes however long you play. The odds of a single attempt are indeed the same, but the odds over a series of attempts are not. Consider an example with a coin. If you bet on Tails, what is the probability of getting 6 Heads in a row? It is \frac{1}{64}, while the probability that at least 1 Tails appears in the 6 tosses is \frac{63}{64}. At the same time the probability of Tails (or Heads) on each single toss is \frac{1}{2}, because the tosses are independent events.

It may look like a moot point, but statistical analysis makes it clear: if you play 10 times on the Red White & Blue machine, the probability that you win at least once is P(10)=1- (0.8265)^{10}=0.851.

Now suppose somebody played before you and had 50 losses, and you sit down after them and play 10 times as well.
The probability of at least one win in those 60 attempts is P(60)=1- (0.8265)^{60}\approx 0.99999. If the machine is not cheating, you can raise your odds of a win this way, even though every single attempt is random.
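As a quick check of these numbers, here is a small C# sketch that evaluates 1-(0.8265)^n for a few values of n; the single-spin loss probability 0.8265 is taken from the text above.

```csharp
using System;

class AtLeastOneWin
{
    static void Main()
    {
        double pLose = 0.8265;   // probability of losing a single spin (from the text)

        foreach (int n in new[] { 1, 10, 60 })
        {
            // P(at least one win in n spins) = 1 - P(lose all n spins)
            double pWin = 1 - Math.Pow(pLose, n);
            Console.WriteLine($"n = {n,2}: P(at least one win) = {pWin:F5}");
        }
        // Expected output: about 0.1735 for n = 1, 0.8514 for n = 10, 0.99999 for n = 60
    }
}
```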

It would be surprising if manufacturers did not manipulate slot machines and produced them completely fair. The best way to earn good money from any game is to create addiction, and a good way to do that is to give a player frequent small pay-offs at first and rare large pay-offs later. With a fair random chip, however, a player could win a large pay-off right away and lose interest in subsequent games.

One more point where slot machine owners can manipulate is the top pay-off. For an ordinary player, winning $1 million is huge, $3 million is huge and $20 million is huge; an ordinary player does not feel the real difference. Thus it is better to let three people win $3 million each and advertise the winners widely than to let one person win $40 million.

In the mid-1950s, 50% of the revenue of all casinos in the state of Nevada came from slot machines. By the late 1980s this was up to 70%. Today the slot machine is one of the best ways to make a profit.

Statistical data: The Wizard of Odds, Slot Machines (http://wizardofodds.com/slots).

Basic Probability, part 6

August 24, 2009

Another way to present probability is a table of outcomes and their frequencies. For example, suppose we have been watching the games of a basketball team with a stable roster of 5 players, and we have counted how many times each player has scored:
[Table: number of scores for each of the 5 players]

Suppose we are watching a random scene of the games and our team scores. The probability that this score was made by player 3 is P(E)=\frac{number~ of~all~scores~by~ player~3}{number~of ~total ~scores}=\frac{300}{1000}=\frac{3}{10}.
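As a small illustration, here is a C# sketch of such a frequency table; only player 3's 300 scores and the total of 1000 come from the text, the other four numbers are made up so that they sum to 1000.

```csharp
using System;
using System.Linq;

class FrequencyTable
{
    static void Main()
    {
        // Scores per player 1..5. Only the 300 for player 3 and the total of 1000
        // come from the text; the remaining numbers are invented for the example.
        int[] scores = { 150, 250, 300, 200, 100 };
        int total = scores.Sum();                   // 1000

        double pPlayer3 = (double)scores[2] / total;
        Console.WriteLine($"P(score by player 3) = {scores[2]}/{total} = {pPlayer3}"); // 0.3
    }
}
```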

Expectation.

Consider an example: there is a one-armed bandit machine in a casino, and it works as follows: in every 400 attempts the machine gives 2 wins of $25 and 1 win of $45. If each attempt costs $1, what can we expect as the average winnings in the long run?

To answer this question we need the notion of “expectation”. The expectation, or expected value, is a long-run average and is calculated as:

E(X)=X_1 *P(X_1)+X_2*P(X_2)+...+X_n*P(X_n),
where X_1 is the numerical value of the first outcome and P(X_1) is the probability of the first outcome, and so on for the rest of the outcomes.

Now let us look at our casino example. Denote the first outcome as a loss, so X_1 = -1 dollar and P(X_1)=\frac{397}{400}; the second outcome is a win of $25, so X_2 =25 dollars and P(X_2)=\frac{2}{400}; the third outcome is a win of $45, so X_3 =45 dollars and P(X_3)=\frac{1}{400}. Put all the information together in the table:
[Table: outcomes, values and probabilities for the one-armed bandit]

Calculate expected value or expectation:
E(X)=-1*\frac{397}{400}+25*\frac{2}{400}+45*\frac{1}{400}=-\frac{302}{400}
Round the value to -\frac{3}{4}.
And what does it mean? It means that if we play the machine for a long time, every pull of the arm will cost us about $\frac{3}{4} on average. It is the average value that we lose.
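Here is a minimal C# sketch (my own) that computes this expected value directly and also estimates it by simulating many pulls with the stated 2-out-of-400 and 1-out-of-400 win frequencies.

```csharp
using System;

class BanditExpectation
{
    static void Main()
    {
        // Outcomes and probabilities from the example: lose $1, win $25, win $45.
        double[] values = { -1, 25, 45 };
        double[] probs = { 397.0 / 400, 2.0 / 400, 1.0 / 400 };

        // Direct expected value: E(X) = sum of value * probability.
        double expected = 0;
        for (int i = 0; i < values.Length; i++)
            expected += values[i] * probs[i];
        Console.WriteLine($"E(X) = {expected:F3}");        // -0.755, about -3/4 dollar per pull

        // Simulation estimate over many pulls.
        var rng = new Random();
        double total = 0;
        int pulls = 400000;
        for (int i = 0; i < pulls; i++)
        {
            int r = rng.Next(400);        // pick one of 400 equally likely slots
            if (r == 0) total += 45;      // 1 slot out of 400 pays $45
            else if (r <= 2) total += 25; // 2 slots out of 400 pay $25
            else total -= 1;              // the remaining 397 slots lose the $1 stake
        }
        Console.WriteLine($"Simulated average per pull = {total / pulls:F3}");
    }
}
```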

Imagine the following case.
We sat all night in a casino playing the one-armed bandit. We pulled the arm 400 times; on the 25th and 142nd attempts we won $25, on the 177th pull we won $45, and the rest of the attempts were losses.

You can see that sometimes we lose and sometimes we win; however, in total it is as if we came to the machine and simply paid $ \frac{3}{4} for every attempt, instead of losing $1 here and winning $25 or $45 there.

Now let us take the casino owner’s point of view. When somebody comes to play the one-armed bandit in our casino saying “I am going to win”, he or she is wrong. We are going to win! And on average, on every pull we win about $ \frac{3}{4}, that is, roughly 75 cents.

Basic Probability, part 5

August 24, 2009


Let us revise some information.

Suppose we choose one card at random from a deck and we want to know the probability that it is a queen or a king. As we know, these two events are mutually exclusive, because one card cannot be both a king and a queen at the same time, so we need to add the two probabilities: P(queen ~ or ~ king)=\frac{4}{52}+\frac{4}{52}=\frac{8}{52}=\frac{2}{13}.

Independent events.

Now consider the case when we toss a coin and then choose one card at random from a deck. What is the probability that we get Heads and an ace? As we know, we need to multiply the probabilities of the two events: P(H ~ and ~ Ace)=\frac{1}{2}*\frac{4}{52}=\frac{1}{26}. These two events are called independent, because the second event “choose an ace” does not depend on the first event “get Heads”. The probability of getting an ace is P(Ace)=\frac{4}{52} whether we toss the coin or not, and the same holds for the coin.

The multiplication rule 1:
If event A and event B are independent, then
P(A~ and~ B)=P(A)*P(B).

Dependent events.
Let us draw a queen from an ordinary deck of 52 cards. The probability of this event is P(Queen~ 1)=\frac{4}{52}. Now put that queen in a pocket and try to draw another queen from the deck. Notice that we now have 51 cards and only 3 queens! So the second event, drawing a queen again, has probability P(Queen ~2)=\frac{3}{51}.

These two events are called dependent events: the second event depends on the first. The probability that event B occurs given that event A has already occurred is denoted P(B~|~A).

In our case with two queens, the probability that we draw a queen on the first pick, do not replace it, and then draw a queen on the second pick is:

P(Queen ~1 ~ and~Queen~2)=P(Queen~1)*P(Queen~2|~Queen~1)=\\ = \frac{4}{52}*\frac{3}{51}=\frac{1}{221}.

The multiplication rule 2:

If event A and event B are dependent, then
P(A~ and~ B)=P(A)*P(B~|~A).
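As a sanity check of the two-queens calculation, here is a short C# sketch (my own) that simulates drawing two cards without replacement and estimates the probability that both are queens.

```csharp
using System;

class TwoQueens
{
    static void Main()
    {
        var rng = new Random();
        int trials = 1000000, bothQueens = 0;

        for (int i = 0; i < trials; i++)
        {
            // Model cards as 0..51 with rank = card % 13; any fixed rank (4 cards) plays the role of the queens.
            int first = rng.Next(52);
            int second = rng.Next(52);
            while (second == first) second = rng.Next(52);   // draw without replacement

            if (first % 13 == 11 && second % 13 == 11) bothQueens++;
        }

        // Theoretical value: 4/52 * 3/51 = 1/221 ≈ 0.00452
        Console.WriteLine($"Estimated P(two queens) = {(double)bothQueens / trials:F5}");
        Console.WriteLine($"Theoretical value       = {1.0 / 221:F5}");
    }
}
```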

Basic Probability, part 4

August 23, 2009

All previous examples were with mutually exclusive events. These are events that have no common cases, i.e. they cannot occur at the same time. For example, the events “select a king” and “select an ace” from a deck are mutually exclusive, because we cannot get both if we take one card.

The addition rule 1:
If two events (A and B) are mutually exclusive, then
P(A~or~B)=P(A)+P(B).

Thus the probability for the event “select a king or a queen” from a deck is
P(Queen~or~King)=P(Queen)+P(King)=\frac{4}{52}+\frac{4}{52}=\frac{8}{52}=\frac{2}{13}.

An example of events that are not mutually exclusive is “select an ace” and “select a spade” from a deck, because there are 4 aces, 13 spades, and one common case – the ace of spades! The ace of spades belongs to the group “aces” and to the group “spades” at the same time.

The addition rule 2:
If two events (A and B) are not mutually exclusive, then
P(A~or~B)=P(A)+P(B)-P(A~ and~ B).

In our case the probability of the event “select an ace or a spade” is
P(Ace~or~Spades)=P(Ace)+P(Spades)-P(Ace~ and~ Spades ~at ~the~ same~ time)=\frac{4}{52}+\frac{13}{52}-\frac{1}{52}=\frac{16}{52}=\frac{4}{13}.
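To make the subtraction concrete, here is a small C# sketch (my own) that counts the favourable cards directly and compares the result with the addition rule.

```csharp
using System;
using System.Linq;

class AceOrSpade
{
    static void Main()
    {
        // Model the deck as 0..51: suit = card / 13 (0 = spades), rank = card % 13 (0 = ace).
        var deck = Enumerable.Range(0, 52).ToArray();

        int aces = deck.Count(c => c % 13 == 0);
        int spades = deck.Count(c => c / 13 == 0);
        int aceOfSpades = deck.Count(c => c % 13 == 0 && c / 13 == 0);
        int aceOrSpade = deck.Count(c => c % 13 == 0 || c / 13 == 0);

        Console.WriteLine($"Direct count:        {aceOrSpade}/52");                  // 16/52
        Console.WriteLine($"Addition rule count: {aces + spades - aceOfSpades}/52"); // 4 + 13 - 1 = 16
    }
}
```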

Why do we need to subtract the probability of common cases P(A ~ and ~ B)?
Consider this issue in the next example. Look at the picture:
[Figure: 12 figures of different shapes and colours]

Let us find the probability of the event “randomly select a triangle or a yellow figure”. There are 12 figures in total, and as we can see, the following 7 figures fit our event:
[Figure: the 7 figures that are triangles or yellow]

So, the probability of the event “randomly select a triangle or a yellow figure” is P(Triangle~ or~ Yellow~figure)=\frac{7}{12}.

Good, but what happens when we simply add the probability of the event “select a triangle” to the probability of the event “select a yellow figure”? There are 5 triangles and 5 yellow figures, so after adding them up we get 10 figures, while we need 7. The three excess figures appear because they are common: there are three yellow triangles, which belong to the group “triangles” and to the group “yellow figures” at the same time. Notice that after adding we count the common figures twice: first when we count triangles and second when we count yellow figures:
[Figure: the three yellow triangles counted twice]

Thus we need to subtract the number of common figures.

The probability of the event “randomly select a triangle or a yellow figure” is:
P(Triangle~ or~ Yellow~figure)=P(Triangle)+P(Yellow~figure)- \\-P(Triangle~ and~Yellow~figure)=\frac{5}{12}+\frac{5}{12}-\frac{3}{12}=\frac{7}{12}.

Basic Probability, part 3

August 23, 2009


Dice are very old gaming instruments; they have been found in Egyptian tombs dated to 2000 BC. Dice gambling was hugely popular in ancient Greece, Rome and Asia, then in medieval Europe, and it remains popular today. In ancient times the throw of the dice was believed to be controlled by the gods. A famous game involved throwing two dice and guessing the sum of the numbers. People noticed that the sum 7 turned up more frequently and thought the cause was the gods’ favour. Number 7 is the lucky number!

Today, with the help of maths, we can understand why the sum 7 appears more frequently.

Consider all possible cases when rolling two dice. The first die has 6 sides with 6 numbers, and the same goes for the second die. All possible sums of these numbers range from 1+1=2 to 6+6=12.

The picture below presents a table of all possible cases; each cell contains the sum of the numbers on the two dice:
[Table: sums for all 36 combinations of two dice]

As you can see, the sum 7 appears more frequently than any other sum. There are 6 cases with the sum 7 out of 36 possible cases in total. Hence the probability that the sum of two rolled dice is 7 is \frac{6}{36}=\frac{1}{6}.

The probability for the sum 7 is the greatest. For example, for the sum 5 the probability is \frac{4}{36}=\frac{1}{9}, since there are only four ways to get the sum 5.

Moreover, the probability of getting the sum 7 is equal to the probability of getting the sum 2, 3, 11 or 12, since P(Sum ~2 ~or~ 3~ or~ 11~ or~ 12)= \frac{1}{36}+\frac{2}{36}+\frac{2}{36}+\frac{1}{36}=\frac{6}{36}=\frac{1}{6}.
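Here is a short C# sketch (my own) that enumerates all 36 outcomes and counts how often each sum occurs, reproducing the table above.

```csharp
using System;

class TwoDiceSums
{
    static void Main()
    {
        int[] counts = new int[13];   // counts[s] = number of ways to roll the sum s

        // Enumerate all 6 * 6 = 36 equally likely outcomes.
        for (int first = 1; first <= 6; first++)
            for (int second = 1; second <= 6; second++)
                counts[first + second]++;

        for (int sum = 2; sum <= 12; sum++)
            Console.WriteLine($"Sum {sum,2}: {counts[sum]} / 36 = {counts[sum] / 36.0:F3}");
        // Sum 7 has 6 ways, so its probability 6/36 = 1/6 is the largest of all.
    }
}
```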

Ancient people used the sum 7 of two dice to check for cheating. If the sum 7 appeared often, the dice were fair; if not, the dice were loaded. A brilliant method!

Basic Probability, part 2

August 23, 2009

Another way to look at probability is the graphical way.

Consider the following case:
there is a coin and a bag with five balls numbered 1, 2, 3, 4, 5. We toss the coin and then take a ball out of the bag.
Let us draw the first event – tossing the coin; it can give H with probability \frac{1}{2} and T with probability \frac{1}{2}:
[Figure: tree for the coin toss]
Suppose we get H; then we take a ball, with probability \frac{1}{5} for each number. Notice that the situation is the same if we get T and then take a ball, again with probability \frac{1}{5} for each number:
[Figure: tree with the ball branches added after H and T]

As we know, to calculate the probability of two such events occurring together we need to multiply their probabilities. Thus the probability of the event “T on the coin and the ball number 1” is P(E)=\frac{1}{2}* \frac{1}{5}=\frac{1}{10}.
The probability of the event “T on the coin and the ball number 2” is P(E)=\frac{1}{2}* \frac{1}{5}=\frac{1}{10}.
The probability of the event “H on the coin and the ball number 3” is P(E)=\frac{1}{2}* \frac{1}{5}=\frac{1}{10}, and so on.

[Figure: full tree with the probability of each of the 10 outcomes]

And as we know, the probabilities of all cases sum to 1:
the probability of getting H or T on the coin is P(E)=\frac{1}{2}+ \frac{1}{2}=1;
the probability of taking the ball with the number 1 or 2 or 3 or 4 or 5 is P(E)=\frac{1}{5}+ \frac{1}{5}+ \frac{1}{5}+ \frac{1}{5}+ \frac{1}{5}=1;
the probability of getting H or T on the coin and then taking the ball with the number 1 or 2 or 3 or 4 or 5 is P(E)=\frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}+ \frac{1}{10}=1. You can see it on the previous picture.
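Here is a tiny C# sketch (my own) that lists all 10 branches of this tree and checks that their probabilities add up to 1.

```csharp
using System;

class CoinAndBalls
{
    static void Main()
    {
        string[] coin = { "H", "T" };
        int[] balls = { 1, 2, 3, 4, 5 };
        double total = 0;

        // Each branch of the tree has probability 1/2 (coin) * 1/5 (ball) = 1/10.
        foreach (var side in coin)
            foreach (var ball in balls)
            {
                double p = 0.5 * 0.2;
                total += p;
                Console.WriteLine($"{side} and ball {ball}: {p:F2}");
            }

        Console.WriteLine($"Sum over all 10 branches = {total:F2}");   // 1.00
    }
}
```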

Score a goal.

Let us take another example. We are watching a football game and we know that the forward has scored 300 goals in 1000 attacks, so the probability that he scores a goal in an attack is P(E)=\frac{3}{10}. If there are 2 attacks, what is the probability that the forward scores a goal on the first attack and misses on the second? To get the answer we need the probability of missing in an attack, which is P(Miss)=1-P(Score)=1- \frac{3}{10}=  \frac{7}{10}, since the forward can only miss or score. Denote “Score a goal” as S and “Miss a goal” as M, and show all possible variants in the picture:

[Figure: tree of the four outcomes SS, SM, MS, MM for two attacks]

As before, we multiply the probabilities. The probability that the forward scores on the first attack and misses on the second is P(SM)=\frac{3}{10}*\frac{7}{10}=\frac{21}{100}.

For 3 or more events, for instance when 3 attacks occur, we need to draw a larger tree (scheme), but the idea is the same.

Score a goal in five attempts.

Consider the case when we play basketball and have 5 attempts to throw the ball. If we know that we score with probability P(S)=\frac{2}{3}, what is the probability that we score at least once in the 5 attempts? What does that mean? It means that we need to find all possible variants of scoring except one – missing every time in all five attempts. In other words, we find P(E)=1 - P(MMMMM), and that will be the answer.
To calculate P(MMMMM), let us find the probability of missing in a single attempt: P(M)=1-P(S)=1-\frac{2}{3}=\frac{1}{3}.
Hence the probability that we lose all attempts is P(MMMMM)=P(M)*P(M)*P(M)*P(M)*P(M)=\\ \frac{1}{3}*\frac{1}{3}*\frac{1}{3}*\frac{1}{3}*\frac{1}{3}=\frac{1}{243}.

Then the probability that we score at least once in the 5 attempts is 1-\frac{1}{243}=\frac{242}{243}. It is very high, so we will most probably score.
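Here is a short C# sketch (my own) that estimates this probability by simulating many series of five attempts with scoring probability 2/3.

```csharp
using System;

class AtLeastOneScore
{
    static void Main()
    {
        var rng = new Random();
        int trials = 1000000, atLeastOne = 0;

        for (int i = 0; i < trials; i++)
        {
            bool scored = false;
            for (int attempt = 0; attempt < 5 && !scored; attempt++)
            {
                // Each attempt scores with probability 2/3.
                if (rng.NextDouble() < 2.0 / 3.0) scored = true;
            }
            if (scored) atLeastOne++;
        }

        // Theoretical value: 1 - (1/3)^5 = 242/243 ≈ 0.99588
        Console.WriteLine($"Estimated   = {(double)atLeastOne / trials:F5}");
        Console.WriteLine($"Theoretical = {242.0 / 243:F5}");
    }
}
```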