## Monty Hall problem and Two envelopes problem

### August 29, 2009

**Monty Hall problem.**

This problem became popular thanks to the famous American game show in which a contestant chooses one of three doors at random. Behind one door there is a car; behind the others there are goats. The host knows where the car is. After you choose, he leaves your door closed and opens one of the two remaining doors, revealing a goat. And now the question from the host to the player: “Would you like to change your door?”

For example, you chose the first door. The host opened the second door, showed you a goat, and then asked whether you would stay with the first door or change to the third.

Probability says yes, you should change! There are many explanations of this problem on the internet, but they are complicated, so I will try to make it simple. Look at the probability tree:

If you choose the strategy “Stay”, you win 1 car and 2 goats.

If you choose the strategy “Change”, you win 2 cars and 1 goat.

So it is better to change: you raise your chances of winning from 1/3 to 2/3!
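
The tree can also be checked by a short simulation. This is my own sketch in Python (the `play` helper is not from the original post):

```python
import random

def play(switch, trials=100_000):
    """Simulate the Monty Hall game and return the win frequency."""
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        car = random.choice(doors)
        choice = random.choice(doors)
        # The host opens a door that is neither the player's nor the car's.
        opened = random.choice([d for d in doors if d != choice and d != car])
        if switch:
            # Switch to the remaining closed door.
            choice = next(d for d in doors if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

print(play(switch=False))  # close to 1/3
print(play(switch=True))   # close to 2/3
```

Running it a few times shows the frequencies settling near 1/3 for staying and 2/3 for changing.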

**Two envelopes problem.**

There are two envelopes with money; the sum in one is twice the sum in the other. You do not know which envelope holds the larger sum. You choose an envelope at random, open it and see the sum. Would you take the second envelope instead of the opened one? For example, you see $10: if you change, you can get $5 or $20 from the second envelope, or you can stay with $10.

Here the problem appears.

1. Consider the case when the two envelopes contain the sums $10 and $20, and look at the picture of what happens when you make the decision:

As you can see, there is no difference.

2. Consider another case, where you see $10 in the envelope; hence the second envelope may contain $5 or $20:

The expected value of changing is 1/2 · $5 + 1/2 · $20 = $12.50.

So in this example it is better to change: with the change strategy your expected sum is 1.25 times the $10 you hold. The secret is that if you change you can lose $5 or gain $10, which is not the same.

A simple program in C# shows the following results for the two methods considered (at every play there are random sums in the envelopes):

In this simulation the final sums are approximately equal in the first method; in the second method, if you change the envelope, you raise the sum.
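
The C# program itself is not shown in the post; a Python sketch of the same two experiments (the names `method1` and `method2` are mine) might look like this:

```python
import random

def method1(trials=100_000):
    """Two fixed amounts a and 2a; you open one at random.
    Returns (average if you stay, average if you switch)."""
    stay = switch = 0.0
    for _ in range(trials):
        a = random.randint(1, 100)      # random money each play
        envelopes = [a, 2 * a]
        random.shuffle(envelopes)
        stay += envelopes[0]
        switch += envelopes[1]
    return stay / trials, switch / trials

def method2(trials=100_000):
    """You see an amount x; the other envelope holds x/2 or 2x equally likely."""
    stay = switch = 0.0
    for _ in range(trials):
        x = random.randint(1, 100)
        stay += x
        switch += random.choice([x / 2, 2 * x])
    return stay / trials, switch / trials

print(method1())   # the two averages are approximately equal
print(method2())   # switching averages about 1.25 times staying
```

The two functions model the two pictures above: in the first, the pair of sums is fixed before you choose, so switching changes nothing; in the second, the unseen envelope is modelled as x/2 or 2x with equal probability, which is exactly where the apparent gain comes from.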

## Slot machines

### August 25, 2009

**Brief history of a slot machine.**

The first slot machines were developed in the late 19th century. Such a machine had a wheel with different pictures and a handle to start playing. The player gave money to a bartender and tried to get a winning picture on the spun wheel.

In 1895 Charles Fey, a car mechanic, introduced a new slot machine – the Liberty Bell. It had 3 reels and 10 pictures on each reel. The name came from the winning combination: the highest pay-off was for 3 bell pictures stopped in a row.

In 1907 Herbert Mills started mass production of new slot machines, in which the player could put a coin into the machine and play. Mills added the famous fruit symbols. The game was so convenient, and required no special knowledge, that it spread widely. The machines were then improved: reels grew to 20, and later 22, symbols.

In the mid-1980s slot machines appeared with computer chips alone, without a rotating mechanism deciding the win. They are still in use now. How does it work?

There is a chip with a random number generator. The chip generates 3 numbers and then looks up a picture for each from a programmed table.

Consider the huge table of the Red White & Blue slot machine:

If the machine’s chip generates the three numbers 57, 59, 57, the computer orders the motors to spin the reels so that the first reel shows “2 bar”, the second “1 bar”, and the third a blank. There can also be LCD panels without real reels.

What are the pay-offs?

The next table shows how often each picture appears:

Now we can calculate the probabilities to win and to lose. For instance, the probability of three white 7s in a row is the product of each reel’s chance of showing a white 7.

The probability to lose is about 0.826. It means that in 1000 attempts, on average in the long run, a player loses 826 times and wins 174 times. However, in the long run you can get your money back at the payback rate that the producer of the machine has published.

If each attempt does not depend on any previous attempt, one could claim that the odds to win are the same every time. No, they are not. Consider an example with a coin. If you bet on Tails, what is the probability that there will be 6 Heads in a row? The probability of 6 Heads is (1/2)^6 = 1/64, and the probability of at least 1 Tails in the 6 tosses is 1 − 1/64 = 63/64. At the same time, the probability of Tails or Heads on each single toss is 1/2, because the tosses are independent events.

So it looks like a moot point. Still, statistical analysis makes it clear that if you play the Red White & Blue machine 40 times, the probability that you win at least once is 1 − 0.826^40 ≈ 0.9995.

However, suppose somebody played and had 50 losses, and you sat down to play after him and played 10 times yourself.

The probability of winning at least once in those 60 attempts is 1 − 0.826^60 ≈ 0.99999. If the machine is not cheating, you can raise your odds to win this way, even though every single attempt is random.
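
Using the loss probability of 0.826 per play, these at-least-one-win figures can be checked with a few lines of Python (the helper name is mine):

```python
def p_at_least_one_win(n, p_lose=0.826):
    """Probability of at least one winning combination in n independent plays."""
    return 1 - p_lose ** n

print(round(p_at_least_one_win(40), 4))   # about 0.9995
print(round(p_at_least_one_win(60), 5))   # about 0.99999
```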

It would look silly if manufacturers did not manipulate slot machines and produced them fair. The best way to get good money from any game is to create addiction. A good way to do that is to give a player frequent small pay-offs at first and rare large pay-offs later. However, with a fair random chip a player could win a large pay-off right at the start and then lose interest in subsequent games.

One more point where slot machine owners can manipulate is the highest pay-off. For a usual player, $1 million is huge, $3 million is huge and $20 million is huge; the usual player does not feel the real difference. Thus it is better to let three people win $3 million each, with wide advertisement of the winners, than to let one person win $40 million.

In the mid-1950s, 50% of the revenue of all casinos in the state of Nevada came from slot machines. By the late 1980s this was up to 70%. Today the slot machine is one of the best ways to make a profit.

Statistical data: The Wizard of Odds, Slot Machines (http://wizardofodds.com/slots).

## Basic Probability, part 6

### August 24, 2009

Another way to present a probability is a table of outcomes and their frequencies. For example, suppose we have been watching a basketball game where the team keeps the same 5 players, and we counted the following scores for each player:

If we watch a random scene of the game and our team scores, the probability that this score was made by player 3 is player 3’s number of scores divided by the team’s total.

**Expectation.**

Consider an example: there is a one-armed bandit machine in a casino, and it works on the following idea: in every 400 attempts, the machine gives 2 wins of $25 and 1 win of $45. If each attempt costs $1, what can we expect as the average winning in the long run?

To answer this question we need to know the notion of “expectation”. **Expectation or expected value** is a long-run average and is calculated as:

E = x1·p1 + x2·p2 + … + xn·pn,

where x1 is the first outcome in numbers and p1 is the probability of the first outcome, and so on for the rest of the outcomes.

Now let us look at our casino example. Denote the first outcome as a loss, so x1 = −1 doll. and p1 = 397/400; the second outcome is a win of $25, so x2 = 24 doll. and p2 = 2/400; the third outcome is a win of $45, so x3 = 44 doll. and p3 = 1/400. Put all the information together in the table:

Calculate the expected value or expectation: E = (−1)·397/400 + 24·2/400 + 44·1/400 = −305/400 = −0.7625.

Round the value to −$0.76.

And what does it mean? It means that if we are going to play the machine for a long time, every pull of the arm will cost us about $0.76. It is the average value that we lose.

Imagine the following case.

We sat all night in a casino and played the one-armed bandit. We pulled the arm 400 times; on the 25th and 142nd attempts we won $25, on the 177th pull we won $45, and the rest of the attempts were losses.

You can see that sometimes we lose and sometimes we win, but overall it is as if we came to the machine and gave about $0.76 for every attempt, instead of losing $1 here and winning $25 or $45 there.

Now let us see the view from the casino owners’ side. When somebody is going to play the one-armed bandit in our casino and says “I am going to win”, he (she) is wrong. We are going to win! And on average, on every pull we win $0.7625, or approximately 76 cents.
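
The expectation can be checked directly in Python from the table values (net pay-offs per $1 attempt: lose $1, win $25 minus the $1 cost, win $45 minus the $1 cost):

```python
# Net pay-offs per $1 attempt and their probabilities, from the table above.
outcomes = [(-1, 397/400), (24, 2/400), (44, 1/400)]

# E = x1*p1 + x2*p2 + x3*p3
expectation = sum(x * p for x, p in outcomes)
print(expectation)  # -0.7625
```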

## Basic Probability, part 5

### August 24, 2009

Let us revise some information.

Pretend we draw one card at random from a deck and want to know the probability that we get a queen or a king. As we know, these two events are mutually exclusive, because one card cannot be both a king and a queen at the same time, so we need to add the two probabilities: 4/52 + 4/52 = 8/52 = 2/13.

**Independent events.**

And now consider the case when we toss a coin and then choose a card at random from a deck. What is the probability that we get Heads and an ace? As we know, we need to multiply the probabilities of those two events: 1/2 · 4/52 = 1/26. These two events are called **independent**, because the second event “choose an ace” does not depend on the first event “get Heads”. The probability of getting an ace is 4/52 whether we toss a coin or not, and the same holds for the coin.

**The multiplication rule 1:**

If event A and event B are independent, then P(A and B) = P(A) · P(B).

**Dependent events.**

Let us draw a queen from an ordinary deck of 52 cards. The probability of this event is 4/52 = 1/13. Now put the queen in a pocket and try to draw another queen from the deck. Notice that we now have 51 cards and 3 queens! So the second event, drawing a queen, has probability 3/51 = 1/17.

These two events are called dependent events: the second event depends on the first. The probability that event B occurs given that event A has already occurred is denoted P(B|A).

In our case with two queens the probability that we get a queen in the first choose and it is not replaced, and we get a queen in the second choose is:

**The multiplication rule 2:**

If event A and event B are dependent, then P(A and B) = P(A) · P(B|A).
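
Multiplication rule 2 can be checked by simulation; this Python sketch (the `two_queens` helper is mine) draws two cards without replacement:

```python
import random

def two_queens(trials=200_000):
    """Draw two cards without replacement; count draws where both are queens."""
    deck = ['Q'] * 4 + ['x'] * 48   # 4 queens in a 52-card deck
    hits = 0
    for _ in range(trials):
        first, second = random.sample(deck, 2)   # sampling without replacement
        hits += (first == 'Q' and second == 'Q')
    return hits / trials

print(two_queens())   # close to 4/52 * 3/51 = 1/221, about 0.0045
```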

## Basic Probability, part 4

### August 23, 2009

All previous examples were with mutually exclusive events. These are events that have no common cases and cannot occur at the same time. For example, the events “select a king” and “select an ace” from a deck are mutually exclusive, because we cannot select both if we take one card.

**The addition rule 1:**

If two events A and B are mutually exclusive, then P(A or B) = P(A) + P(B).

Thus the probability of the event “select a king or a queen” from a deck is 4/52 + 4/52 = 8/52 = 2/13.

An example of not mutually exclusive events is “select an ace” and “select a spade” from a deck, because there are 4 aces and 13 spade cards, and there is one common case – the ace of spades! The ace of spades belongs to the group “aces” and to the group “spades” at the same time.

**The addition rule 2:**

If two events A and B are not mutually exclusive, then P(A or B) = P(A) + P(B) − P(A and B).

In our case the probability of the event “select an ace or a spade card” is 4/52 + 13/52 − 1/52 = 16/52 = 4/13.

Why do we need to subtract the probability of the common cases P(A and B)?

Consider this issue in the next example. Look at the picture:

Let us find the probability of the event “randomly select a triangle or a yellow figure”. There are 12 figures in all, and as we can see, the following 7 figures fit our request:

So, the probability of the event “randomly select a triangle or a yellow figure” is 7/12.

Good, but what happens when we simply add the probability of the event “select a triangle” to the probability of the event “select a yellow figure”? There are 5 triangles and 5 yellow figures, so after adding them up we get 10 figures, while we need 7. The three excess figures occur because they are common: there are three yellow triangles, which belong to the group “triangles” and to the group “yellow figures” at the same time. Notice that by adding we count the common figures twice: first when we count the triangles and second when we count the yellow figures:

Thus we need to subtract the number of common figures.

The probability of the event “randomly select a triangle or a yellow figure” is: 5/12 + 5/12 − 3/12 = 7/12.
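
The double-counting argument is easy to verify in Python with sets (the figure ids below are made up; only the counts match the picture: 5 triangles, 5 yellow figures, 3 yellow triangles among 12 figures):

```python
# Hypothetical ids for the figures; 3, 4, 5 are the yellow triangles.
triangles = {1, 2, 3, 4, 5}
yellow = {3, 4, 5, 6, 7}
total = 12

# Direct count of the union vs. the addition rule 2.
p_union = len(triangles | yellow) / total
p_rule = len(triangles) / total + len(yellow) / total - len(triangles & yellow) / total
print(p_union, p_rule)  # both equal 7/12
```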

## Basic Probability, part 3

### August 23, 2009

Dice are very old instruments for games: they have been found in Egyptian tombs dated to 2000 BC. Dice gambling was greatly popular in ancient Greece, Rome and Asia, then in medieval Europe, and it remains popular today. In ancient times the throw of the dice was believed to be controlled by the gods. A famous game was to throw two dice and guess the sum of the numbers. People noticed that the sum 7 turns up more frequently than others, and they thought the cause was the gods’ favour. Number 7 is the lucky number!

Today, with the help of maths, we can understand why the sum 7 appears most frequently.

Consider all possible cases in rolling two dice. The first die has 6 sides with 6 numbers, and the same holds for the second die. All possible sums of these numbers run from 1+1=2 to 6+6=12.

The picture below presents the table with all possible cases, each intersection presents the sum of the numbers of two dice:

As you can see, the sum 7 appears more frequently than any other sum. There are 6 cases with the sum 7, out of 36 possible cases in all. Hence the probability that the sum of two rolled dice is 7 equals 6/36 = 1/6.

The probability for the sum 7 is the greatest. For example, for the sum 5 the probability is 4/36 = 1/9, since there are only four cases that give the sum 5.

Moreover, the probability of getting the sum 7 is equal to the probability of getting the sum (2 or 3 or 11 or 12), since 1/36 + 2/36 + 2/36 + 1/36 = 6/36.

Ancient people used the sum 7 of two dice to check for cheating. If the sum 7 appeared often, the dice were fair; if not, the dice were loaded. A brilliant method!
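
The table of sums can be rebuilt in a few lines of Python instead of counting by hand:

```python
from collections import Counter

# Count how often each sum appears among all 36 equally likely outcomes.
sums = Counter(a + b for a in range(1, 7) for b in range(1, 7))

print(sums[7], sums[7] / 36)    # 6 cases, probability 1/6
print(sums[5], sums[5] / 36)    # 4 cases, probability 1/9
print(sums[2] + sums[3] + sums[11] + sums[12])   # also 6 cases in total
```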

## Basic Probability, part 2

### August 23, 2009

Another way to consider probability is the graphical way.

Consider the following case:

there is a coin, and five balls numbered 1, 2, 3, 4, 5 in a bag. We toss the coin and then take a ball out of the bag.

Let us draw the first event – tossing the coin: there can be H with the probability 1/2 and T with the probability 1/2:

Pretend that we get H; then we take a ball, each number with the probability 1/5. Notice that the outcomes will be the same if we get T and then take a ball with the probability 1/5:

As we know, to calculate the probability of two events we need to multiply them. Thus the probability of the event “T on the coin and the ball number 1” is 1/2 · 1/5 = 1/10.

The probability of the event “T on the coin and the ball number 2” is also 1/2 · 1/5 = 1/10.

The probability of the event “H on the coin and the ball number 3” is 1/2 · 1/5 = 1/10, and so on.

And as we know, the probabilities of all cases sum to 1:

the probability of getting H or T on the coin is 1/2 + 1/2 = 1;

the probability of taking the ball with the number 1 or 2 or 3 or 4 or 5 is 1/5 + 1/5 + 1/5 + 1/5 + 1/5 = 1;

the probability of getting H or T on the coin and then taking the ball with the number 1 or 2 or 3 or 4 or 5 is the sum of all ten branches, 10 · 1/10 = 1. You can see it in the previous picture.

**Score a goal**.

Take another example. We are watching a football game and we know that the forward scored 300 goals in 1000 attacks. So the probability that he scores a goal in an attack is 300/1000 = 0.3. If there were 2 attacks, what is the probability that the forward scored a goal on the first attack and missed on the second? To get the answer we need the probability of missing in an attack: it is 1 − 0.3 = 0.7, since the forward can only miss or score and nothing else. Denote “score a goal” as S and “miss a goal” as M, and show all possible variants in the picture:

As before, we multiply the probabilities. The probability that the forward scored a goal on the first attack and missed on the second is 0.3 · 0.7 = 0.21.

For 3 or more events, for instance when 3 attacks occur, we need to draw a picture with a larger tree (scheme). The idea is the same.

**Score a goal in the five attempts.**

Consider the case when we play basketball and have 5 attempts to throw the ball. If we know that we score with the probability 0.3 on each attempt, what is the probability that we score at least once in the 5 attempts? What does it mean? It means we need all possible variants of scoring except one – missing every time in all five attempts. In other words, we find 1 − P(miss all five), and that will be the answer.

To calculate it, let us find the probability of missing in an attempt: it is 1 − 0.3 = 0.7.

Hence the probability that we miss all 5 attempts is 0.7^5 ≈ 0.168.

Then the probability that we score at least once in the 5 attempts is 1 − 0.7^5 ≈ 0.832. And it is very high, so probably we will score.
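
As a quick check, here is the same computation in Python, using 0.3 as the scoring probability (an assumed value for illustration; the helper name is mine):

```python
def p_at_least_one(p_score, attempts=5):
    """P(at least one score) = 1 - P(miss every attempt)."""
    return 1 - (1 - p_score) ** attempts

print(p_at_least_one(0.3))   # 1 - 0.7**5, about 0.832
```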