Basic Concepts and Definitions

INTRODUCTION TO PROBABILITY

- An experiment is any process that generates well-defined outcomes.
  - Experiment: Record an age
  - Experiment: Toss a die
  - Experiment: Record an opinion (yes, no)
  - Experiment: Toss two coins
- The sample space for an experiment is the set of all experimental outcomes (or simple events).
What is Probability?

- In previous lectures we used graphs and numerical measures to describe data sets, which were usually samples.
- We measured "how often" using the relative frequency = f/n.
- As n gets larger, the sample approaches the population, and "how often" (the relative frequency) approaches the probability.

Basic Concepts

- A simple event is the outcome that is observed on a single repetition of the experiment.
  - It is the basic element to which probability is applied.
  - One and only one simple event can occur when the experiment is performed.
- A simple event is denoted by E with a subscript.
Basic Concepts

- Each simple event will be assigned a probability, measuring "how often" it occurs.
- The set of all simple events of an experiment is called the sample space, S.

What is Probability? (again)

- Probability is a numerical measure of the likelihood that an event will occur.
- Probability values are always assigned on a scale from 0 to 1.
  - A probability near 0 indicates an event is very unlikely to occur.
  - A probability near 1 indicates an event is almost certain to occur.
  - A probability of 0.5 indicates the occurrence of the event is just as likely as it is unlikely.
Probability as a Numerical Measure of the Likelihood of Occurrence

Probability scale (direction of increasing likelihood of occurrence):

    0 ------------ .5 ------------ 1
    (at .5, the occurrence of the event is just as likely as it is unlikely)

Assigning Probabilities

- Classical Method: assigning probabilities based on the assumption of equally likely outcomes.
- Relative Frequency Method: assigning probabilities based on experimental or historical data.
- Subjective Method: assigning probabilities based on the assignor's judgment.
Classical Method

If an experiment has n possible outcomes, using this method one would assign a probability of 1/n to each outcome.

Example:
  Experiment: Rolling a die
  Sample space: S = {E1, E2, E3, E4, E5, E6}
  Probabilities: each simple event has a 1/6 chance of occurring.

Thus we may generalize: if the simple events in an experiment are equally likely, one may find

    P(A) = nA / N = (number of simple events in A) / (total number of simple events)

Relative Frequency Method

Example: The following table summarizes data on daily rentals of floor polishers for the last 40 days.

  Number of Polishers Rented | Number of Days | Probability
  0                          |  4             | 4/40 = .10
  1                          |  6             | 6/40 = .15
  2                          | 18             | .45
  3                          | 10             | .25
  4                          |  2             | .05
  Total                      | 40             | 1.00

The probability assignments are given by dividing the number-of-days frequencies by the total frequency (total number of days).

Subjective Method

- When economic conditions and a company's circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.
- We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.
- The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimates.

Example

- The die toss:
- Simple events:

  Upper face | Simple event
  1          | E1
  2          | E2
  3          | E3
  4          | E4
  5          | E5
  6          | E6

- Sample space: S = {E1, E2, E3, E4, E5, E6}
  (Venn diagram: S contains the points E1, E2, E3, E4, E5, E6.)
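The relative-frequency assignments above can be reproduced in a few lines of Python (a minimal sketch; the variable names are ours, not the slides'):

```python
from fractions import Fraction

# Relative Frequency Method: probability = frequency / total frequency.
# Daily rentals of floor polishers over the last 40 days (table above).
days_by_rentals = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}

total_days = sum(days_by_rentals.values())                    # 40
probs = {x: Fraction(n, total_days) for x, n in days_by_rentals.items()}

# The assignments match the table: .10, .15, .45, .25, .05 ...
assert probs[0] == Fraction(1, 10)
# ... and, as required of any probability assignment, they sum to 1.
assert sum(probs.values()) == 1
```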
The Probability of an Event

- The probability of an event A measures "how often" we think A will occur; we denote it P(A).
- P(A) must be between 0 and 1.
  - If event A can never occur, P(A) = 0. On the contrary, if event A always occurs when the experiment is performed, P(A) = 1.
- The sum of the probabilities for all simple events in S equals 1.
- The probability of an event A is found by adding the probabilities of all the simple events contained in A.

Example: a six-sided fair die toss

- Simple events: E1, E2, E3, E4, E5, E6, where the subscript denotes the number that has occurred on the upper surface of the die.
- Sample space: S = {E1, E2, E3, E4, E5, E6}
- Events:
    A: an even number occurs;          A = {E2, E4, E6}
    B: the number is greater than 3;   B = {E4, E5, E6}
- Mutually exclusive events:
    C: the number is less than 2;
    D: the number is "5";
    How about A and C? A and D?
Basic Concepts

- An event is a collection of one or more simple events.
- The die toss:
    A: an odd number        A = {E1, E3, E5}
    B: a number > 2         B = {E3, E4, E5, E6}

Basic Concepts

- Two events are mutually exclusive if, when one event occurs, the other cannot, and vice versa.
- Experiment: Toss a die
    A: observe an odd number
    B: observe a number greater than 2
    C: observe a 6
    D: observe a 3
  Which pairs are mutually exclusive? A and C are (an odd number cannot be a 6), while A and B are not (both contain E3 and E5). How about B and C? B and D?
The Probability of an Event

- The probability of an event A measures "how often" we think A will occur. We write P(A).
- Suppose that an experiment is performed n times. The relative frequency for an event A is

    f/n = (number of times A occurs) / n

- If we let n get infinitely large,

    P(A) = lim_{n→∞} f/n

The Probability of an Event

- P(A) must be between 0 and 1.
  - If event A can never occur, P(A) = 0. If event A always occurs when the experiment is performed, P(A) = 1.
- The sum of the probabilities for all simple events in S equals 1.
- The probability of an event A is found by adding the probabilities of all the simple events contained in A.
Finding Probabilities

- Probabilities can be found using
  - Estimates from empirical studies
  - Common sense estimates based on equally likely events.
- Examples:
  - Toss a fair coin. P(Head) = 1/2
  - 10% of the U.S. population has red hair. Select a person at random. P(Red hair) = .10

Example

- Toss a fair coin twice. What is the probability of observing at least one head?

  1st Coin | 2nd Coin | Ei | P(Ei)
  H        | H        | HH | 1/4
  H        | T        | HT | 1/4
  T        | H        | TH | 1/4
  T        | T        | TT | 1/4

  P(at least 1 head) = P(E1) + P(E2) + P(E3) = 1/4 + 1/4 + 1/4 = 3/4
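The tree-diagram calculation above can be checked by enumerating the sample space directly (a sketch; `outcomes` and `p` are our names):

```python
from itertools import product
from fractions import Fraction

# Sample space for two tosses of a fair coin: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))
assert len(outcomes) == 4

# Each simple event has probability 1/4; sum over those with at least one head.
p = Fraction(sum(1 for o in outcomes if "H" in o), len(outcomes))
assert p == Fraction(3, 4)
```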
Example

- A bowl contains three M&Ms®, one red, one blue and one green. A child selects two M&Ms at random. What is the probability that at least one is red?

  1st M&M | 2nd M&M | Ei | P(Ei)
  R       | B       | RB | 1/6
  R       | G       | RG | 1/6
  B       | R       | BR | 1/6
  B       | G       | BG | 1/6
  G       | B       | GB | 1/6
  G       | R       | GR | 1/6

  P(at least 1 red) = P(RB) + P(BR) + P(RG) + P(GR) = 4/6 = 2/3

Counting Rules

- If the simple events in an experiment are equally likely, you can calculate

    P(A) = nA / N = (number of simple events in A) / (total number of simple events)

- You can use counting rules to find nA and N.
The mn Rule

- If an experiment is performed in two stages, with m ways to accomplish the first stage and n ways to accomplish the second stage, then there are mn ways to accomplish the experiment.
- This rule is easily extended to k stages, with the number of ways equal to

    n1 n2 n3 ... nk

Examples

  Example: Toss two coins. The total number of simple events is: 2 × 2 = 4

  Example: Toss three coins. The total number of simple events is: 2 × 2 × 2 = 8

  Example: Toss two dice. The total number of simple events is: 6 × 6 = 36

  Example: Two M&Ms are drawn from a dish containing two red and two blue candies. The total number of simple events is: 4 × 3 = 12
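Each mn-rule count above can be verified by brute-force enumeration (a sketch; labels such as `"R1"` distinguish otherwise identical candies):

```python
from itertools import product, permutations

# mn Rule: multiply the number of ways at each stage.
assert len(list(product("HT", repeat=2))) == 2 * 2          # two coins: 4
assert len(list(product("HT", repeat=3))) == 2 * 2 * 2      # three coins: 8
assert len(list(product(range(1, 7), repeat=2))) == 6 * 6   # two dice: 36

# Two M&Ms drawn in order from {R1, R2, B1, B2}:
# 4 choices for the first draw, 3 for the second.
assert len(list(permutations(["R1", "R2", "B1", "B2"], 2))) == 4 * 3  # 12
```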
Permutations

- The number of ways you can arrange n distinct objects, taking them r at a time, is

    P^n_r = n! / (n − r)!

  where n! = n(n − 1)(n − 2)...(2)(1) and 0! ≡ 1.

Examples

  Example: A lock consists of five parts and can be assembled in any order. A quality control engineer wants to test each order for efficiency of assembly. How many orders are there?

  The order of the choice is important!

    P^5_5 = 5!/0! = 5(4)(3)(2)(1) = 120

  Example: How many 3-digit lock combinations can we make from the numbers 1, 2, 3, and 4?

  The order of the choice is important!

    P^4_3 = 4!/1! = 4(3)(2) = 24

Combinations

- The number of distinct combinations of n distinct objects that can be formed, taking them r at a time, is

    C^n_r = n! / (r!(n − r)!)

  Example: Three members of a 5-person committee must be chosen to form a subcommittee. How many different subcommittees could be formed?

  The order of the choice is not important!

    C^5_3 = 5!/(3!(5 − 3)!) = 5(4)(3)(2)(1)/(3(2)(1)(2)(1)) = 10

Example

- A box contains six M&Ms®, four red and two green. A child selects two M&Ms at random. What is the probability that exactly one is red?

  The order of the choice is not important!

    C^6_2 = 6!/(2!4!) = 6(5)/2(1) = 15 ways to choose 2 M&Ms.
    C^4_1 = 4!/(1!3!) = 4 ways to choose 1 red M&M.
    C^2_1 = 2!/(1!1!) = 2 ways to choose 1 green M&M.
    4 × 2 = 8 ways to choose 1 red and 1 green M&M.

    P(exactly one red) = 8/15

Basic Relationships Between Events

There are some basic probability relationships that can be used to compute the probability of an event without knowledge of all the sample point probabilities.

- Complement of event A: A^C consists of all sample points that are not in event A.
- Union of events A and B: A ∪ B consists of all sample points belonging to A OR B OR both.
- Intersection of events A and B: A ∩ B consists of all sample points belonging to both A AND B.
- Mutually exclusive events (already discussed)
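The permutation and combination counts above can be checked with the standard library (a sketch using `math.perm` and `math.comb`, Python 3.8+):

```python
from math import perm, comb
from fractions import Fraction

# Permutations (order matters) and combinations (order does not).
assert perm(5, 5) == 120   # orders to assemble the 5-part lock
assert perm(4, 3) == 24    # 3-digit codes from the digits 1, 2, 3, 4
assert comb(5, 3) == 10    # 3-person subcommittees from 5 people

# P(exactly one red) when choosing 2 from 4 red and 2 green M&Ms:
n_A = comb(4, 1) * comb(2, 1)   # 1 red and 1 green: 4 * 2 = 8 ways
N = comb(6, 2)                  # any 2 of the 6 candies: 15 ways
p_one_red = Fraction(n_A, N)
assert p_one_red == Fraction(8, 15)
```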
Event Relations

- The union of two events, A and B, is the event that either A or B or both occur when the experiment is performed. We write A ∪ B.
  (Venn diagram: S containing overlapping circles A and B; the shaded region A ∪ B covers both.)

Event Relations

- The intersection of two events, A and B, is the event that both A and B occur when the experiment is performed. We write A ∩ B.
  (Venn diagram: the shaded region A ∩ B is the overlap of circles A and B.)
- If two events A and B are mutually exclusive, then P(A ∩ B) = 0.

Event Relations

- The complement of an event A consists of all outcomes of the experiment that do not result in event A. We write A^C.
  (Venn diagram: A^C is the region of S outside circle A.)

Graphical Representation of Probability Relationships (continued)

- The union of events A and B is the event containing all sample points that are in A or B or both.
- The union is denoted by A ∪ B.
  (Diagram: Sample Space S with Event A and Event B.)

Graphical Representation of Probability Relationships (concluded)

- The intersection of events A and B is the set of all sample points that are in both A and B.
- The intersection is denoted by A ∩ B.
- The intersection of A and B is the area of overlap in the illustration.
  (Diagram: Sample Space S with Event A and Event B; the intersection is their overlap.)
Graphical Representation of Probability Relationships

- The complement of event A is defined to be the event consisting of all sample points that are not in A.
- The complement of A is denoted by A^c.
  (Diagram: Sample Space S containing Event A and, outside it, A^c.)

Example

- Select a student from the classroom and record his/her hair color and gender.
    A: student has brown hair
    B: student is female
    C: student is male
- What is the relationship between events B and C?
    Mutually exclusive; B = C^C
- A^C: student does not have brown hair
- B ∩ C: student is both male and female = ∅
- B ∪ C: student is either male or female = all students = S
Calculating Probabilities for Unions and Complements

- There are special rules that will allow you to calculate probabilities for composite events.
- The Additive Rule for Unions: for any two events, A and B, the probability of their union, P(A ∪ B), is

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Example: Additive Rule

Example: Suppose that there were 120 students in the classroom, and that they could be classified as follows:

    A: brown hair
    B: female

             Brown   Not Brown
    Male      20        40
    Female    30        30

    P(A) = 50/120
    P(B) = 60/120

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
             = 50/120 + 60/120 − 30/120
             = 80/120 = 2/3

    Check: P(A ∪ B) = (20 + 30 + 30)/120
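The Additive Rule calculation above can be done in code, exactly from the table counts (a sketch; the dictionary layout is ours):

```python
from fractions import Fraction

# 120 students cross-classified by gender and hair color (table above).
counts = {("male", "brown"): 20, ("male", "not"): 40,
          ("female", "brown"): 30, ("female", "not"): 30}
n = sum(counts.values())                                      # 120

p_A = Fraction(counts[("male", "brown")] + counts[("female", "brown")], n)   # brown hair
p_B = Fraction(counts[("female", "brown")] + counts[("female", "not")], n)   # female
p_AB = Fraction(counts[("female", "brown")], n)               # brown-haired female

# Additive Rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
p_union = p_A + p_B - p_AB
assert p_union == Fraction(2, 3)

# Check by counting directly: (20 + 30 + 30)/120
assert p_union == Fraction(20 + 30 + 30, n)
```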
A Special Case

When two events A and B are mutually exclusive, P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B).

    A: male with brown hair      P(A) = 20/120
    B: female with brown hair    P(B) = 30/120

             Brown   Not Brown
    Male      20        40
    Female    30        30

    A and B are mutually exclusive, so that

    P(A ∪ B) = P(A) + P(B) = 20/120 + 30/120 = 50/120

Calculating Probabilities for Complements

- We know that for any event A: P(A ∩ A^C) = 0
- Since either A or A^C must occur, P(A ∪ A^C) = 1
- so that P(A ∪ A^C) = P(A) + P(A^C) = 1, which gives

    P(A^C) = 1 − P(A)
Example

Select a student at random from the classroom. Define:

    A: male      P(A) = 60/120
    B: female

             Brown   Not Brown
    Male      20        40
    Female    30        30

    A and B are complementary, so that

    P(B) = 1 − P(A) = 1 − 60/120 = 60/120

Calculating Probabilities for Intersections

- In the previous example, we found P(A ∩ B) directly from the table. Sometimes this is impractical or impossible. The rule for calculating P(A ∩ B) depends on the idea of independent and dependent events.
- Two events, A and B, are said to be independent if and only if the probability that event A occurs does not change, depending on whether or not event B has occurred.
Conditional Probability

- The probability of an event given that another event has occurred is called a conditional probability.
- The conditional probability of A given B is denoted by P(A|B), where the vertical line stands for "given."
- The probability that A occurs, given that event B has occurred, is defined as

    P(A|B) = P(A ∩ B) / P(B)

  And of course the event B must be such that P(B) ≠ 0.
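Applied to the student table from the earlier example (A: brown hair, B: female), the definition gives a concrete number (a sketch; variable names are ours):

```python
from fractions import Fraction

# From the 120-student table: P(B) = 60/120, P(A ∩ B) = 30/120.
p_B = Fraction(60, 120)          # female
p_A_and_B = Fraction(30, 120)    # brown-haired female

# Definition: P(A|B) = P(A ∩ B) / P(B), valid because P(B) != 0.
p_A_given_B = p_A_and_B / p_B
assert p_A_given_B == Fraction(1, 2)
```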
Example 1

- Toss a fair coin twice. Define
    A: head on second toss
    B: head on first toss

    HH  1/4        P(A|B) = 1/2
    HT  1/4        P(A|not B) = 1/2
    TH  1/4
    TT  1/4

  P(A) does not change, whether B happens or not...
  A and B are independent!

Example 2

- A bowl contains five M&Ms®, two red and three blue. Randomly select two candies, and define
    A: second candy is red.
    B: first candy is blue.

    P(A|B) = P(2nd red | 1st blue) = 2/4 = 1/2
    P(A|not B) = P(2nd red | 1st red) = 1/4

  P(A) does change, depending on whether B happens or not...
  A and B are dependent!
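The two conditional probabilities in Example 2 can be verified by enumerating all ordered draws from the bowl (a sketch; the helper `cond` is ours):

```python
from itertools import permutations
from fractions import Fraction

# Bowl with two red and three blue M&Ms; draw two without replacement.
bowl = ["R", "R", "B", "B", "B"]
draws = list(permutations(range(5), 2))   # 20 equally likely ordered draws

def cond(second, first):
    """P(second candy has colour `second` | first candy has colour `first`)."""
    given = [(i, j) for i, j in draws if bowl[i] == first]
    hits = [(i, j) for i, j in given if bowl[j] == second]
    return Fraction(len(hits), len(given))

assert cond("R", "B") == Fraction(1, 2)   # P(2nd red | 1st blue) = 2/4
assert cond("R", "R") == Fraction(1, 4)   # P(2nd red | 1st red)  = 1/4
```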
Defining Independence

- We can redefine independence in terms of conditional probabilities:

  Two events A and B are independent if and only if

    P(A|B) = P(A)    or, equivalently,    P(B|A) = P(B)

  Otherwise, they are dependent.

- In other words, the occurrence of event A (or B) has no effect on the probability of occurrence of event B (or A).
- Once you've decided whether or not two events are independent, you can use the following rule to calculate their intersection.
The Multiplicative Rule for Intersections

- For any two events, A and B, the probability that both A and B occur is

    P(A ∩ B) = P(A) P(B given that A occurred) = P(A)P(B|A)

- If the events A and B are independent, the multiplication rule becomes

    P(A ∩ B) = P(A)P(B)

  This rule can be used as a test for independence of two events.
Multiplication and Addition Laws for Computing Probabilities

- The multiplication law provides a way to compute the probability of an intersection of two events. The law is written as:

    P(A ∩ B) = P(B)P(A|B)

- The addition law, on the other hand, provides a way to compute the probability of a union of two events, and is written as follows:

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

  - The addition law for mutually exclusive events becomes P(A ∪ B) = P(A) + P(B), since by definition of mutually exclusive events P(A ∩ B) = 0.
Example 1

In a certain population, 10% of the people can be classified as being high risk for a heart attack. Three people are randomly selected from this population. What is the probability that exactly one of the three is high risk?

Define H: high risk    N: not high risk

    P(exactly one high risk) = P(HNN) + P(NHN) + P(NNH)
      = P(H)P(N)P(N) + P(N)P(H)P(N) + P(N)P(N)P(H)
      = (.1)(.9)(.9) + (.9)(.1)(.9) + (.9)(.9)(.1) = 3(.1)(.9)² = .243
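The three-term sum above can be checked numerically (a sketch; the independence of the three selections is the assumption the slide makes):

```python
# P(exactly one of three independent selections is high risk),
# with P(H) = .1 and P(N) = .9 on each selection.
p_h, p_n = 0.1, 0.9

# Three mutually exclusive orderings: HNN, NHN, NNH.
p_exactly_one = p_h*p_n*p_n + p_n*p_h*p_n + p_n*p_n*p_h

assert abs(p_exactly_one - 3 * p_h * p_n**2) < 1e-12
assert round(p_exactly_one, 3) == 0.243
```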
Example 2

Suppose we have additional information in the previous example. We know that only 49% of the population are female. Also, of the female patients, 8% are high risk. A single person is selected at random. What is the probability that it is a high risk female?

Define H: high risk    F: female

From the example, P(F) = .49 and P(H|F) = .08.
Use the Multiplicative Rule:

    P(high risk female) = P(H ∩ F) = P(F)P(H|F) = .49(.08) = .0392

The Law of Total Probability

- Let S1, S2, S3, ..., Sk be mutually exclusive and exhaustive events (that is, one and only one must happen). Then the probability of another event A can be written as

    P(A) = P(A ∩ S1) + P(A ∩ S2) + ... + P(A ∩ Sk)
         = P(S1)P(A|S1) + P(S2)P(A|S2) + ... + P(Sk)P(A|Sk)
The Law of Total Probability

(Venn diagram: S is partitioned into S1, S2, ..., Sk; event A overlaps the pieces, so A = (A ∩ S1) ∪ (A ∩ S2) ∪ ... ∪ (A ∩ Sk).)

Let E1, E2, E3, ..., Ek be mutually exclusive and exhaustive events (that is, one and only one must happen). Then the probability of another event A can be written as:

    P(A) = P(A ∩ E1) + P(A ∩ E2) + ... + P(A ∩ Ek)
         = P(E1)P(A|E1) + P(E2)P(A|E2) + ... + P(Ek)P(A|Ek)
Bayes’ Rule
Bayes
y Let S1 , S2 , S3 ,..., Sk be mutually exclusive and
exhaustive events with prior probabilities P(S1),
P(S
( 2)),…,P(S
( k)). If an event A occurs, the p
posterior
probability of Si, given that A occurred is
Example
p
From a previous example, we know that 49% of the
population are female. Of the female patients, 8% are
high risk for heart attack, while 12% of the male patients
are high risk. A single person is selected at random and
f
found
d to be
b hi
high
h risk.
i k Wh
What iis the
h probability
b bili that
h iit iis a
male?
Define H: high risk
P( Si | A) =
We know:
P(F) =
P(M) =
P(H|F) =
P(H|M) =
P( Si ) P( A | Si )
for i = 1, 2 ,...k
∑ P( Si ) P( A | Si )
57
.49
.51
.08
F: female
P(M | H) =
=
P(M)P(H | M)
P(M)P(H | M) + P(F)P(H | F)
.51(.12)
=.61
.51((.12) +.49((.08)
.12
58
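The Bayes' Rule example above can be computed in a few lines; the denominator is exactly the Law of Total Probability applied to the partition {M, F} (a sketch; variable names are ours):

```python
# Prior probabilities and conditional probabilities from the example.
p_F, p_M = 0.49, 0.51
p_H_given_F, p_H_given_M = 0.08, 0.12

# Law of Total Probability: P(H) = P(M)P(H|M) + P(F)P(H|F)
p_H = p_M * p_H_given_M + p_F * p_H_given_F

# Bayes' Rule: posterior probability of "male" given "high risk".
p_M_given_H = p_M * p_H_given_M / p_H

assert round(p_H, 4) == 0.1004
assert round(p_M_given_H, 2) == 0.61
```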
Random Variables

- A quantitative variable x is a random variable if the value that it assumes, corresponding to the outcome of an experiment, is a chance or random event.
- Random variables can be discrete or continuous.
- Examples:
  - x = SAT score for a randomly selected student
  - x = number of people in a room at a randomly selected time of day
  - x = number on the upper face of a randomly tossed die

Probability Distributions for Discrete Random Variables

- The probability distribution for a discrete random variable x resembles the relative frequency distributions we constructed in Chapter 1. It is a graph, table or formula that gives the possible values of x and the probability p(x) associated with each value.
- We must have

    0 ≤ p(x) ≤ 1   and   Σ p(x) = 1
Example

- Toss a fair coin three times and define x = number of heads.

  Outcome | P(outcome) | x
  HHH     | 1/8        | 3
  HHT     | 1/8        | 2
  HTH     | 1/8        | 2
  THH     | 1/8        | 2
  HTT     | 1/8        | 1
  THT     | 1/8        | 1
  TTH     | 1/8        | 1
  TTT     | 1/8        | 0

  P(x = 0) = 1/8
  P(x = 1) = 3/8
  P(x = 2) = 3/8
  P(x = 3) = 1/8

  x    | 0   | 1   | 2   | 3
  p(x) | 1/8 | 3/8 | 3/8 | 1/8

  (Probability histogram for x.)

Probability Distributions

- Probability distributions can be used to describe the population, just as we described samples in Chapter 1.
  - Shape: symmetric, skewed, mound-shaped...
  - Outliers: unusual or unlikely measurements
  - Center and spread: mean and standard deviation. A population mean is called μ and a population standard deviation is called σ.
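The distribution table above can be built programmatically from the sample space (a sketch; `p` is our name for the distribution):

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# p(x) for x = number of heads in three tosses of a fair coin.
outcomes = list(product("HT", repeat=3))          # 8 equally likely outcomes
counts = Counter(o.count("H") for o in outcomes)  # heads per outcome
p = {x: Fraction(n, 8) for x, n in counts.items()}

assert p == {0: Fraction(1, 8), 1: Fraction(3, 8),
             2: Fraction(3, 8), 3: Fraction(1, 8)}
assert sum(p.values()) == 1   # a valid probability distribution
```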
The Mean and Standard Deviation

- Let x be a discrete random variable with probability distribution p(x). Then the mean, variance and standard deviation of x are given as

    Mean:                μ = Σ x p(x)
    Variance:            σ² = Σ (x − μ)² p(x)
    Standard deviation:  σ = √σ²

Example

- Toss a fair coin 3 times and record x, the number of heads.

  x | p(x) | x p(x) | (x − μ)² p(x)
  0 | 1/8  | 0      | (−1.5)²(1/8)
  1 | 3/8  | 3/8    | (−0.5)²(3/8)
  2 | 3/8  | 6/8    | (0.5)²(3/8)
  3 | 1/8  | 3/8    | (1.5)²(1/8)

  μ = Σ x p(x) = 12/8 = 1.5

  σ² = Σ (x − μ)² p(x) = .28125 + .09375 + .09375 + .28125 = .75

  σ = √.75 = .866
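The mean and standard deviation in the example can be computed directly from the definitions (a sketch; note √.75 ≈ .866):

```python
from math import sqrt

# Distribution of x = number of heads in 3 tosses of a fair coin.
p = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

mu = sum(x * px for x, px in p.items())               # μ = Σ x p(x)
var = sum((x - mu) ** 2 * px for x, px in p.items())  # σ² = Σ (x − μ)² p(x)
sigma = sqrt(var)                                     # σ = √σ²

assert mu == 1.5
assert var == 0.75
assert round(sigma, 3) == 0.866
```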
Example

- The probability distribution for x, the number of heads in tossing 3 fair coins:
  - Shape? Symmetric; mound-shaped
  - Outliers? None
  - Center? μ = 1.5
  - Spread? σ = .866

Key Concepts

I. Experiments and the Sample Space
  1. Experiments, events, mutually exclusive events, simple events
  2. The sample space
  3. Venn diagrams, tree diagrams, probability tables

II. Probabilities
  1. Relative frequency definition of probability
  2. Properties of probabilities
     a. Each probability lies between 0 and 1.
     b. Sum of all simple-event probabilities equals 1.
  3. P(A), the sum of the probabilities for all simple events in A
Key Concepts

III. Counting Rules
  1. mn Rule; extended mn Rule
  2. Permutations: P^n_r = n! / (n − r)!
  3. Combinations: C^n_r = n! / (r!(n − r)!)

IV. Event Relations
  1. Unions and intersections
  2. Events
     a. Disjoint or mutually exclusive: P(A ∩ B) = 0
     b. Complementary: P(A) = 1 − P(A^C)
  3. Conditional probability: P(A|B) = P(A ∩ B) / P(B)
  4. Independent and dependent events
  5. Additive Rule of Probability: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
  6. Multiplicative Rule of Probability: P(A ∩ B) = P(A)P(B|A)
  7. Law of Total Probability
  8. Bayes' Rule

Key Concepts

V. Discrete Random Variables and Probability Distributions
  1. Random variables, discrete and continuous
  2. Properties of probability distributions: 0 ≤ p(x) ≤ 1 and Σ p(x) = 1
  3. Mean or expected value of a discrete random variable: μ = Σ x p(x)
  4. Variance and standard deviation of a discrete random variable:
     σ² = Σ (x − μ)² p(x);  σ = √σ²