Chapter 3:
A First Look at Probability
Hildebrand, Ott and Gray
Basic Statistical Ideas for Managers
Second Edition
Hildebrand, Ott & Gray, Basic Statistical Ideas for Managers, 2nd edition, Chapter 3.
Copyright © 2005 Brooks/Cole, a division of Thomson Learning, Inc.
Learning Objectives for Ch. 3

• Probability interpretations
  • Classical approach
  • Relative frequency approach
  • Subjective approach
• Manipulation of probabilities
  • Complements law
  • Conditional probability
  • Multiplication law
• Statistical Independence
• Probability Trees
• Simulation
Section 3.1
Basic Principles of Probability
3.1 Basic Principles of Probability
• Synonyms for probability: chance, likelihood.
• We consider only random experiments.
• Characteristics of a random experiment:
  • All possible outcomes can be specified.
  • A specific outcome cannot be predicted with certainty.
  Example: today's closing price of a security relative to yesterday's.
• Probability measures the uncertainty of the outcomes.
3.1 Basic Principles of Probability
• Three interpretations of probability
1. The classical interpretation
If a random experiment has N mutually exclusive, equally likely outcomes, the probability of an event E is the ratio of the number of outcomes NE pertaining to E to the total number of outcomes N:

P(E) = NE / N

Example: Roll a fair die once and observe the up face.
P(rolling a 6) = 1/6
Roll a pair of fair dice and observe the up faces.
P(both up faces are 6's) = (1/6)(1/6) = 1/36 (why?)
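The classical calculation can be sketched in Python (not from the text): list the equally likely outcomes and count the favorable ones.

```python
from itertools import product

# Classical interpretation: P(E) = (outcomes in E) / (total outcomes).
die = range(1, 7)
p_six = sum(1 for face in die if face == 6) / 6

pairs = list(product(die, repeat=2))  # 36 equally likely outcomes
p_double_six = sum(1 for a, b in pairs if a == b == 6) / len(pairs)

print(p_six, p_double_six)  # 1/6 and 1/36
```

Enumerating the 36 pairs also answers the "why?": exactly one pair, (6, 6), is favorable.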
3.1 Basic Principles of Probability
Exercise 3.3:
An automobile dealer sells two brands of new cars. One, C, is primarily American in origin; the other, G, is primarily Japanese. The dealer performs repair work under warranty for both brands. Each warranty job is classified according to the primary problem to be fixed. If there is more than one problem in a given job, all problems are listed separately. Records for the past year indicate the following numbers of problems:

Problem Area   Engine  Transmission  Exhaust  Fit/Finish  Other  Total
Brand C          106        211         67       133        24    541
Brand G           21        115         16        24         6    182
Total            127        326         83       157        30    723

a. What is the probability that a randomly chosen problem comes from brand C?

P(randomly chosen problem comes from brand C) = 541/723 = .748
3.1 Basic Principles of Probability
2. The relative frequency interpretation
If an experiment has been repeated n times under identical conditions, and ne
of these trials have resulted in event E, then the relative frequency of event
E is ne/n. Assume the experiment can be repeated indefinitely under identical
conditions. Under these assumptions, the long-run relative frequency with
which event E occurs is the probability of event E:
P(E) = limit of (ne/n) as n → ∞
     ≈ ne/n, for n sufficiently large.
Example: Flipping a coin to estimate P(head).
[Figure: plot of the relative frequency of heads against the number of tosses n (n = 5, 10, 15, ...). The relative frequency fluctuates at first (e.g., 3 heads in the first 5 tosses gives 0.6) and settles toward P(E) as n grows.]
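The settling behavior in the figure can be sketched in Python (an illustration, not from the text): simulate tosses and watch the relative frequency approach 0.5.

```python
import random

random.seed(1)

# Relative frequency of heads in n tosses of a simulated fair coin.
def relative_frequency(n, p_head=0.5):
    heads = sum(random.random() < p_head for _ in range(n))
    return heads / n

# Small n wanders; large n settles near P(E) = 0.5.
for n in (10, 100, 10000):
    print(n, relative_frequency(n))
```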
3.1 Basic Principles of Probability
3. The subjective interpretation
The subjective probability of an event E is the value p at which you are indifferent between wagering on E and wagering on the drawing of a red bead from an urn in which a fraction p of the beads are red.

Example: What is the probability that the Ravens will win the Super Bowl in 2007? Suppose you say .2.

  Event: Ravens win the Super Bowl; P(E) = .2
  Event: Pick a red bead from an urn with 10 beads, 2 of them red; P(pick red bead) = .2

• Are you indifferent between wagering on either event?
• If not, then P(Ravens win the Super Bowl in 2007) ≠ .2
3.1 Basic Principles of Probability
• Let A and B denote any events in a random experiment.
• P(A) ≥ 0
• Addition Law for Mutually Exclusive Events
If A and B are mutually exclusive events
(A and B = empty set), then
P(A or B) = P(A) + P(B)
[Venn diagram: sample space S containing disjoint events A and B]
3.1 Basic Principles of Probability
Exercise 3.3:

Problem Area   Engine  Transmission  Exhaust  Fit/Finish  Other  Total
Brand C          106        211         67       133        24    541
Brand G           21        115         16        24         6    182
Total            127        326         83       157        30    723

b. Serious problems are those involving the engine or transmission. What is the probability that a randomly chosen problem is serious?

P(serious problem) = P(E) + P(T)
                   = 127/723 + 326/723
                   = 453/723
                   = .627
3.1 Basic Principles of Probability
• General Addition Law
P(A or B) = P(A) + P(B) – P(A and B)
[Venn diagram: sample space S with overlapping events A and B; the region (A and B) has been counted twice in P(A) + P(B), so subtract it once]
Exercise 3.3:
What is the probability that a randomly chosen problem comes
from Brand C or is an engine problem?
C denotes "Brand C". P(C) = 541/723
E denotes "Engine problem". P(E) = 127/723
P(C or E) = P(C) + P(E) – P(C and E)
= 541/723 + 127/723 - 106/723
= 562/723 = .777
3.1 Basic Principles of Probability
• Complements Law
  • "Not A" is called the complement of A; notation: Ā.

P(not A) = 1 – P(A)
or P(A) = 1 – P(not A)

[Venn diagram: sample space S split into A and its complement Ā]
3.1 Basic Principles of Probability
Example: Birthday Problem
There are n people in a room. What is the probability that at
least two of them have the same birthday?
Assumptions:
• No birthdays fall on Feb 29;
• All 365 days are equally likely for each person; and
• Birthdays are independent
A = {at least two people have the same birthday}
Ā = {no two people have the same birthday}

P(Ā) = (365)(364)(363)···[365 – (n – 1)] / 365^n
     = .29 for n = 30
⇒ P(A) = 1 – P(Ā) = .71

  n      23    30    50
  P(A)   .51   .71   .97
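The table values can be reproduced with a short Python sketch (not from the text), under the slide's assumptions of 365 equally likely days and independent birthdays:

```python
# P(at least two of n people share a birthday) via the complements law:
# multiply the per-person "no match so far" factors, then subtract from 1.
def p_shared_birthday(n):
    p_none = 1.0
    for k in range(n):
        p_none *= (365 - k) / 365
    return 1 - p_none

for n in (23, 30, 50):
    print(n, round(p_shared_birthday(n), 2))  # .51, .71, .97
```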
3.1 Basic Principles of Probability
• Conditional Probability
The conditional probability of event A given that event B has occurred, denoted P(A|B), is:

P(A|B) = P(A and B) / P(B), provided P(B) > 0.

[Venn diagram: sample space S with overlapping events A and B; the intersection is (A and B)]
3.1 Basic Principles of Probability
Exercise 3.4:

Problem Area   Engine  Transmission  Exhaust  Fit/Finish  Other  Total
Brand C          106        211         67       133        24    541
Brand G           21        115         16        24         6    182
Total            127        326         83       157        30    723

a. For the auto dealer's data in Ex 3.3, what is the probability that a randomly chosen problem is an engine problem, given that it comes from brand C?

P(engine problem | brand C) = P(E|C) = P(E and C)/P(C)
  = (106/723)/(541/723)
  = .196

where P(brand C and engine problem) = P(C and E) = 106/723
and P(brand C) = P(C) = 541/723

• Note: P(E|C) = .196 while P(E) = .176
  The occurrence of event C affects P(E)
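The table arithmetic can be checked in Python (a sketch, not from the text), keeping exact fractions so no rounding creeps in before the final step:

```python
from fractions import Fraction

# Warranty-problem counts from Exercises 3.3/3.4.
counts = {
    "C": {"Engine": 106, "Transmission": 211, "Exhaust": 67,
          "Fit/Finish": 133, "Other": 24},
    "G": {"Engine": 21, "Transmission": 115, "Exhaust": 16,
          "Fit/Finish": 24, "Other": 6},
}
total = sum(sum(row.values()) for row in counts.values())  # 723

p_C = Fraction(sum(counts["C"].values()), total)                       # 541/723
p_E = Fraction(counts["C"]["Engine"] + counts["G"]["Engine"], total)   # 127/723
p_E_and_C = Fraction(counts["C"]["Engine"], total)                     # 106/723

p_E_given_C = p_E_and_C / p_C  # definition of conditional probability
print(float(p_E_given_C))  # ≈ .196
print(float(p_E))          # ≈ .176, so C and E are not independent
```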
3.1 Basic Principles of Probability
Exercise 3.4:
b. Construct a table of conditional probabilities of problem areas, given
the brand.
Table of conditional probabilities of problem areas, given the brand:

Problem Area   Engine  Transmission  Exhaust  Fit/Finish  Other  Sum
Brand C         .196      .390        .124      .246      .044   1.0
Brand G         .115      .632        .088      .132      .033   1.0

Are the probability distributions similar for the two brands?
Section 3.2
Statistical Independence
3.2 Statistical Independence
• Multiplication Rule
From the definition of conditional probability,

  P(A and B) = P(B) P(A|B)

and, exchanging the roles of A and B,

  P(A and B) = P(A) P(B|A)

Example: Randomly pick (without replacement) 2 cards from a standard deck. Find the probability of 2 hearts.
A = {1st card is a heart}, B = {2nd card is a heart}
P(A and B) = P(A) P(B|A) = (13/52)(12/51) = .0588

• The concept of statistical independence is introduced after Section 3.3.
• The multiplication rule is useful in probability trees.
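The two-hearts calculation can be cross-checked by simulation in Python (an illustrative sketch, not from the text):

```python
import random

# Multiplication rule: P(2 hearts) = P(A) * P(B|A) = (13/52)(12/51).
p_exact = (13 / 52) * (12 / 51)

# Cross-check by simulating draws without replacement.
random.seed(0)
deck = ["heart"] * 13 + ["other"] * 39
trials = 100_000
hits = sum(
    all(card == "heart" for card in random.sample(deck, 2))
    for _ in range(trials)
)
print(p_exact, hits / trials)  # both near .0588
```

`random.sample` draws without replacement, which is exactly the conditioning the rule captures: after the first heart is removed, 12 hearts remain among 51 cards.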
Section 3.3
Probability Trees and Simulation
3.3 Probability Trees and Simulation
• Probability trees
In a probability tree, the probability for a specific path is
found by using the multiplication rule.
Exercise 3.44:
A purchasing dept finds that 75% of its special orders are received on
time. Of those orders that are on time, 80% meet specifications
completely; of those orders that are late, 60% meet them.
T = {order is on time}; P(T) = .75
M = {meets specifications}
P(M | T) = .80
P(M | not T) = .60
3.3 Probability Trees and Simulation
Probability tree (multiply along each path):

On time, P(T) = .75:
  Meets specs, P(M | T) = .80:      P(T and M) = .75 × .80 = .60
  Fails specs, P(not M | T) = .20:  P(T and not M) = .75 × .20 = .15
Late, P(not T) = .25:
  Meets specs, P(M | not T) = .60:      P(not T and M) = .25 × .60 = .15
  Fails specs, P(not M | not T) = .40:  P(not T and not M) = .25 × .40 = .10
3.3 Probability Trees and Simulation
Exercise 3.44:
a. Find the probability that an order is on time and meets
specifications.
P(T and M) = .60
c. Find the probability that an order meets specifications.
P(M) = P(M and T) + P(M and not T)
     = .60 + .15
     = .75
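The tree calculation is a few lines of Python (a sketch, not from the text): multiply along each path, then add the paths that end in M.

```python
# Exercise 3.44: multiplication rule along each branch of the tree.
p_T = 0.75              # order is on time
p_M_given_T = 0.80      # meets specs, given on time
p_M_given_notT = 0.60   # meets specs, given late

p_T_and_M = p_T * p_M_given_T               # .60
p_notT_and_M = (1 - p_T) * p_M_given_notT   # .15

# Add the disjoint paths ending in M (law of total probability).
p_M = p_T_and_M + p_notT_and_M              # .75
print(p_T_and_M, p_M)
```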
3.3 Probability Trees and Simulation
• Simulation
An approach to finding approximate answers to
probability problems
Example (simple):
Suppose a coin is biased so that P(H) = 0.4.
Use simulation to find an estimate for P(H).
3.3 Probability Trees and Simulation
Typical output follows, where '1' denotes a 'head':

1000 replications:  0 1 1 1 0 …
10000 replications: 1 1 0 1 0 …
3.3 Probability Trees and Simulation
Estimated probability of a head:
For n = 1000, probability is 0.386
For n = 10000, probability is 0.3965
3.3 Probability Trees and Simulation
• Procedure for generating the replications in Minitab:
  Click on 'Calc' → 'Random Data' → 'Bernoulli'
  Enter the number of replications in 'Generate __ rows of data'
  Enter the column in which to store the replications
  Enter "0.4" for 'Probability of success'
• Procedure to obtain the estimated probability of a head in Minitab:
  Click on 'Calc' → 'Column Statistics'
  Select 'Mean'
  Enter the column where the replications are stored in 'Input variable __'
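The same two steps, generate Bernoulli(0.4) replications and take their mean, can be sketched in Python instead of Minitab (an illustration, not from the text):

```python
import random

random.seed(42)

# Generate n Bernoulli(p) replications (1 = head), then estimate
# P(head) by the sample mean, mirroring the Minitab procedure.
def estimate_p_head(n, p=0.4):
    flips = [1 if random.random() < p else 0 for _ in range(n)]
    return sum(flips) / n

print(estimate_p_head(1000))
print(estimate_p_head(10000))
```

As on the slides, the estimate for n = 10000 should sit closer to 0.4 than the estimate for n = 1000 typically does.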
3.3 Probability Trees and Simulation
• Why use simulation?
Some processes are easier to model by simulation than by analytic methods.
Example: Inventory models where demand is stochastic, and
time to place and receive an order is stochastic.
• Simulation will be used in Chapter 6 to demonstrate the
Central Limit Theorem.
3.2 (Revisited) Statistical Independence
• Statistical Independence
A and B are independent events if and only if
P(A|B) = P(A)
Example: Problems 3.3 and 3.4
Are events “Brand C” and “Engine Problem” independent?
P(Engine Problem) = P(E) = 127/723 = .176
P(Engine Problem|Brand C) = P(E|C) = .196
Conclusion: The events “Brand C” and “Engine Problem”
are not independent because
P(E|C) ≠ P(E)
3.2 (Revisited) Statistical Independence
• Multiplication Law for Independent Events
• If A and B are independent, then
P(A and B) = P(A) P(B)
• Reason: from the multiplication rule,
    P(A and B) = P(A|B) P(B)
  From independence, P(A|B) = P(A), so
    P(A and B) = P(A) P(B)
Putting It All Together
Putting It All Together
Exercise 3.56 [Pepys asked Newton this question]:
A famous problem in the history of probability theory
(which started with the analysis of games of chance) goes
like this: In one game, a player rolls 6 fair, six-sided dice,
and wins if one or more 6's appear. In a second game, the
player rolls 12 fair dice and wins if at least two 6's appear.
In a third game, the player rolls 18 fair dice and wins if at
least three 6's appear. It was argued that a 6 would
appear, on average, one-sixth of the time, so that the three
games should have an equal chance of winning. Is this
argument correct?
Putting It All Together
Game 1:
Roll six fair dice.
Win if one or more 6’s appear.
P(Win) = P(one or more 6's)
       = 1 – P(no 6's)       {by the complements law}
       = 1 – (5/6)^6         {by independence}
       = .6651
Putting It All Together
Game 2:
Roll twelve fair dice.
Win if at least two 6's appear.
P(Win) = P(two or more 6's)
       = 1 – P("no 6" or "one 6")            {by the complements law}
       = 1 – P("no 6") – P("one 6")          {mutually exclusive events}
       = 1 – (5/6)^12 – (12)(1/6)(5/6)^11    {by independence}
       = .6187

The "one 6" could be any one of the 12 dice.
Putting It All Together
Game 3:
Roll eighteen fair dice.
Win if at least three 6's appear.
P(Win) = P(Three or more 6's)
= .5973
Can you show how to find this answer?
Putting It All Together
P(Win) = P(three or more 6's)
       = 1 – (5/6)^18 – (18)(1/6)(5/6)^17 – [(18)(17)/2](1/6)^2(5/6)^16
       = .5973

The 1st '6' can be any one of the 18 dice, and the 2nd '6' any one of the remaining 17; divide by 2 because the order of the 1st and 2nd '6' doesn't matter.
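All three games follow the same pattern, so they can be checked at once in Python (a sketch, not from the text), using the complements law with binomial counts:

```python
from math import comb

# P(at least k sixes in n rolls of a fair die): subtract the binomial
# probabilities of 0, 1, ..., k-1 sixes from 1 (complements law).
def p_at_least(k, n, p=1/6):
    return 1 - sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k))

print(round(p_at_least(1, 6), 4))   # Game 1: 0.6651
print(round(p_at_least(2, 12), 4))  # Game 2: 0.6187
print(round(p_at_least(3, 18), 4))  # Game 3: 0.5973
```

The decreasing values confirm that the "one-sixth on average" argument is wrong: the three games are not equally favorable, and Game 1 is the best bet.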
Keywords: Chapter 3
• Classical interpretation
• Relative frequency interpretation
• Subjective interpretation
• Addition Law for Mutually Exclusive Events
• General Addition Law
• Complements Law
• Conditional Probability
• Multiplication Law
• Statistical Independence
• Multiplication Law for Independent Events
• Probability Tree
• Simulation
Summary of Chapter 3
• 3 Interpretations of Probability
• Classical approach (equally likely outcomes)
• Relative frequency approach
• Subjective approach
Summary of Chapter 3
• Manipulating Probabilities
• P(not A) = 1 – P(A)
• Are events A and B mutually exclusive?
If yes: P(A or B) = P(A) + P(B)
If no: P(A or B) = P(A) + P(B) – P(A and B)
• Conditional probability
P(A | B) = P(A and B) / P(B)
• Multiplication Law
  P(A and B) = P(A | B) P(B)   {useful in probability trees}
Summary of Chapter 3
• Determining Independence of events A and B
• If P(A | B) = P(A), then A and B are independent events
• If A and B are independent events, then
P(A and B) = P(A) P(B)
• Simulation