Exam 2: Rules
Bring a cheat sheet: one page, two sides.
Bring a calculator.
Bring your book to use the tables in the back.
Section 2.1
Five questions:
One fill in the blanks
One multiple choice
Three to solve
One of those three is from your homework
Show your work; if you do, you may get partial credit.
In studying uncertainty:
1) Identify the experiment of interest and understand it
well (including the associated population)
2) Identify the sample space (all possible outcomes)
3) Identify an appropriate random variable that reflects
what you are studying (and simple events based on
this random variable)
4) Construct the probability distribution associated
with the simple events based on the random variable
Ch3:
3.1 Random Variables
3.2 Probability distributions for discrete
random variables
3.3 Expected values
3.4 The Binomial Distribution
3.5 Negative Binomial and
Hypergeometric
3.6 The Poisson
Random Variables
Section 3.1
A random variable is a function on the outcomes of an experiment, i.e. a function on the outcomes in S.
A discrete random variable is one with a sample space that is finite or countably infinite. (Countably infinite means infinite, yet the outcomes can be matched one-to-one with the integers.)
A continuous random variable is one with a continuous sample space.
Probability distributions for discrete rvs
Section 3.2
For discrete random variables, we call p(x) = P(X = x) the probability mass function (pmf).
From the axioms of probability, we can show that:
1. p(x) ≥ 0 for all x
2. Σ_x p(x) = 1
The CDF, F(x), is defined to be
F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y)
So, for any two numbers a, b where a < b,
P(a ≤ X ≤ b) = F(b) − F(a−),
where a− denotes the largest possible X value that is strictly less than a (for integer-valued X, this is F(b) − F(a − 1)).
We can also find the pmf using the CDF if we note that:
p(x) = F(x) − F(x−)
Expected values
Section 3.3
The expected value E(X) of a discrete random variable is the weighted average, or mean, of that random variable:
E(X) = μ = Σ_x x·p(x)
The variance of a discrete random variable is the weighted average of the squared distance from the mean:
V(X) = σ² = Σ_x (x − μ)²·p(x) = E(X²) − [E(X)]²
The standard deviation: σ = √V(X)
Let h(X) be a function, and a and b be constants; then
E[h(X)] = Σ_x h(x)·p(x)
E(aX + b) = a·E(X) + b,  V(aX + b) = a²·V(X)
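The definitions above can be sketched directly in code. This is a minimal illustration, not from the slides; the pmf used below is a made-up example.

```python
# Sketch: E(X), V(X), and sigma for a discrete rv given its pmf as a dict.

def mean(pmf):
    """E(X) = sum of x * p(x)."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """V(X) = E(X^2) - [E(X)]^2 (the shortcut formula)."""
    mu = mean(pmf)
    return sum(x**2 * p for x, p in pmf.items()) - mu**2

# Hypothetical pmf; the probabilities must sum to 1.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}
mu = mean(pmf)                 # weighted average of the x values
sigma = variance(pmf) ** 0.5   # standard deviation
```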
Discrete Probability & Expected values
Talked about the following distributions:
Bernoulli
Binomial
Hypergeometric
Negative Binomial
Geometric
Poisson
Section 3.2-6
Bernoulli
Two possible outcomes, S and F, with probability of success P(S) = p.
S = {S, F}
pmf: p(x) = pˣ(1 − p)^(1−x) for x ∈ {0, 1}, i.e. p(1) = p and p(0) = 1 − p
E(X) = p,  V(X) = p(1 − p)
Binomial
The experiment consists of a group of n independent
Bernoulli sub-experiments, where n is fixed in advance of
the experiment and the probability of a success is p.
What we are interested in studying is the number of
successes that we may observe in any run of such an
experiment.
The binomial random variable X = the number of successes (S's) among n Bernoulli trials or sub-experiments.
We say X is distributed Binomial with parameters n and p, X ~ Bin(n, p).
The pmf can be written (depending on the book):
b(x; n, p) = C(n, x)·pˣ·(1 − p)^(n−x),  x = 0, 1, …, n
The CDF can be written (also depending on the book):
B(x; n, p) = P(X ≤ x) = Σ_{y=0}^{x} b(y; n, p)
Tabulated in Table A.1, pages 664-666
E(X) = np,  V(X) = np(1 − p)
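Instead of the table, the binomial pmf and CDF can be computed directly; a minimal sketch using only the standard library:

```python
# Sketch: binomial pmf b(x; n, p) and CDF B(x; n, p) via math.comb.
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) * p**x * (1-p)**(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(x, n, p):
    """P(X <= x) = running sum of the pmf from 0 to x."""
    return sum(binom_pmf(y, n, p) for y in range(x + 1))
```

For example, `binom_pmf(2, 4, 0.5)` gives C(4, 2)·(0.5)⁴ = 6/16 = 0.375.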
The Binomial Distribution
Section 3.4
When to use the binomial distribution?
1. When we have n independent Bernoulli trials.
2. When each Bernoulli trial is formed from a sample of n individuals (parts, animals, …) from a population, drawn with replacement.
3. When each Bernoulli trial is formed from a sample of n individuals (parts, animals, …) from a population of size N, drawn without replacement, if n/N < 5%.
Hypergeometric
The experiment consists of a group of n dependent Bernoulli sub-experiments (draws without replacement), where n is fixed in advance of the experiment, from a population of N individuals of which M are successes, so the probability of a success is p = M/N.
What we are interested in studying is the number of successes that we may observe in any run of such an experiment.
The hypergeometric random variable X = the number of successes (S's) among the n trials or sub-experiments.
We say X is distributed Hypergeometric with parameters N, M and n.
The pmf can be written (depending on the book):
h(x; n, M, N) = C(M, x)·C(N − M, n − x) / C(N, n),  max(0, n − N + M) ≤ x ≤ min(n, M)
The CDF is the running sum of the pmf:
H(x; n, M, N) = Σ_{y ≤ x} h(y; n, M, N)
E(X) = n·(M/N),  V(X) = n·(M/N)·(1 − M/N)·(N − n)/(N − 1)
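A minimal sketch of the hypergeometric pmf; the population numbers below (N = 10, M = 5, n = 2) are made-up examples.

```python
# Sketch: hypergeometric pmf h(x; n, M, N) via math.comb.
from math import comb

def hypergeom_pmf(x, n, M, N):
    """P(X = x) = C(M, x) * C(N-M, n-x) / C(N, n)."""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# Example: drawing n=2 from N=10 items of which M=5 are successes.
p_one_success = hypergeom_pmf(1, 2, 5, 10)   # C(5,1)*C(5,1)/C(10,2) = 25/45
```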
Negative Binomial
The experiment consists of a group of independent
Bernoulli sub-experiments, where r (not n), the number of
successes we are looking to observe, is fixed in advance of
the experiment and the probability of a success is p.
What we are interested in studying is the number of
failures that precede the rth success.
Called negative binomial because instead of fixing the
number of trials n we fix the number of successes r.
The negative binomial random variable X = the number of failures (F's) before the rth success.
We say X is distributed Negative Binomial with parameters r and p.
The pmf is:
nb(x; r, p) = C(x + r − 1, r − 1)·p^r·(1 − p)ˣ,  x = 0, 1, 2, …
The CDF is the running sum of the pmf (it has no simple closed form):
NB(x; r, p) = Σ_{y=0}^{x} nb(y; r, p)
E(X) = r(1 − p)/p,  V(X) = r(1 − p)/p²
Geometric
A special case of the negative binomial is when r = 1; we then call the distribution geometric.
The geometric random variable X = the number of failures (F's) before the 1st success.
We say X is distributed Geometric with parameter p.
The pmf is: p(x) = p·(1 − p)ˣ,  x = 0, 1, 2, …
The CDF is: F(x) = 1 − (1 − p)^(x+1)
E(X) = (1 − p)/p,  V(X) = (1 − p)/p²
Poisson
We can get to the Poisson model in two ways:
1. As an approximation of the Binomial distribution
2. As a model describing the Poisson process

1. Approximating the Binomial distribution
Rules for approximation:
The mathematical ones are:
If n → ∞, p → 0, and np → λ > 0, then b(x; n, p) → p(x; λ).
In practice:
If n is large (> 50) and p is small, such that np < 5, then we can approximate b(x; n, p) with p(x; λ), where λ = np.
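The approximation rule can be checked numerically; a minimal sketch with made-up values n = 400, p = 0.005 (so np = 2 < 5):

```python
# Sketch: comparing the binomial pmf with its Poisson approximation
# when n is large, p is small, and np < 5.
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    """p(x; lam) = e**(-lam) * lam**x / x!"""
    return exp(-lam) * lam**x / factorial(x)

n, p = 400, 0.005            # np = 2, well within the rule of thumb
for x in range(6):
    # The two pmfs agree to within ~0.001 here.
    assert abs(binom_pmf(x, n, p) - poisson_pmf(x, n * p)) < 0.01
```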
Poisson random variable X = the number of successes (S's).
We say X is distributed Poisson with parameter λ.
pmf: p(x; λ) = e^(−λ)·λˣ / x!,  x = 0, 1, 2, …
CDF: P(x; λ) = Σ_{y=0}^{x} p(y; λ)
Tabulated in Table A.2, page 667
E(X) = V(X) = λ
2. As a model describing the Poisson process
This is a process of counting events, usually over time.
Assumptions:
a. There exists a parameter α > 0 such that, for any short time interval Δt, P(exactly one event in Δt) = α·Δt + o(Δt).
b. There is a very small chance that 2 or more events will occur in Δt: P(two or more events in Δt) = o(Δt).
c. The number of events observed in Δt is independent of the number occurring in any other non-overlapping interval.
Poisson random variable X = the number of events within time period t.
We say X is distributed Poisson with parameter αt.
pmf: p(x; αt) = e^(−αt)·(αt)ˣ / x!,  x = 0, 1, 2, …
CDF: P(x; αt) = Σ_{y=0}^{x} p(y; αt)
E(X) = V(X) = αt
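A minimal sketch of the Poisson-process count distribution; the rate α = 2 events per unit time and window t = 3 are made-up values.

```python
# Sketch: pmf of the number of events in a window of length t for a
# Poisson process with rate alpha (so the Poisson parameter is alpha * t).
from math import exp, factorial

def poisson_process_pmf(x, alpha, t):
    """P(X = x) = e**(-alpha*t) * (alpha*t)**x / x!"""
    lam = alpha * t
    return exp(-lam) * lam**x / factorial(x)

# With alpha = 2 and t = 3, counts follow Poisson(6): mean = variance = 6.
mean_count = sum(x * poisson_process_pmf(x, 2, 3) for x in range(100))
```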
Ch4:
4.1 Probability Density Functions
4.2 CDFs and Expected Values
4.3 The Normal Distribution
4.4 The Exponential Distribution
Continuous pdfs, CDFs and Expectation
Section 4.1-2
For continuous random variables, we call f(x) the probability density function (pdf).
From the axioms of probability, we can show that:
1. f(x) ≥ 0 for all x
2. ∫_{−∞}^{∞} f(x) dx = 1
The CDF: F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(y) dy
So, for any two numbers a, b where a < b,
P(a ≤ X ≤ b) = F(b) − F(a)
We can also find the pdf using the CDF if we note that f(x) = F′(x), wherever the derivative exists.
Expected values:
E(X) = μ = ∫ x·f(x) dx,  V(X) = ∫ (x − μ)²·f(x) dx = E(X²) − [E(X)]²
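The pdf/CDF relationship can be checked numerically; a minimal sketch using the made-up density f(x) = 2x on [0, 1]:

```python
# Sketch: the CDF as a numerical integral of the pdf (trapezoid rule)
# for the example density f(x) = 2x on [0, 1], whose exact CDF is x**2.
def f(x):
    return 2 * x if 0 <= x <= 1 else 0.0

def F(x, steps=10000):
    """Approximate F(x) = integral of f from 0 to x."""
    if x <= 0:
        return 0.0
    x = min(x, 1.0)
    h = x / steps
    return h * (f(0) / 2 + sum(f(i * h) for i in range(1, steps)) + f(x) / 2)

# F(1) should be 1 (total probability), and F(0.5) should be 0.25.
```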
Continuous random variables
Talked about the following distributions:
Uniform
Normal
Exponential
Section 4.2-6
Uniform
We say X ~ Uniform(A, B) if its pdf is
f(x; A, B) = 1/(B − A) for A ≤ x ≤ B (and 0 otherwise)
CDF: F(x) = (x − A)/(B − A) for A ≤ x ≤ B
E(X) = (A + B)/2,  V(X) = (B − A)²/12
Normal
The most important distribution of classical and applied statistics.
We say X ~ N(μ, σ²) if its pdf is
f(x; μ, σ) = (1/(σ√(2π)))·e^(−(x−μ)²/(2σ²)),  −∞ < x < ∞
CDF: F(x) has no closed form; probabilities are found by standardizing and using the standard normal table.
Expectation: E(X) = μ,  V(X) = σ²
The standard Normal
Z is said to have a standard normal distribution with mean = μ = 0 and standard deviation = σ = 1.
pdf: f(z) = (1/√(2π))·e^(−z²/2)
CDF: Φ(z) = P(Z ≤ z), as provided by Table A.3, pages 668-669
If X ~ N(μ, σ²), then Z = (X − μ)/σ ~ N(0, 1).
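Outside an exam, Φ(z) can be computed instead of looked up, since Φ(z) = (1 + erf(z/√2))/2; a minimal sketch:

```python
# Sketch: standard normal CDF via the error function, replacing Table A.3.
from math import erf, sqrt

def phi(z):
    """Phi(z) = P(Z <= z) = (1 + erf(z / sqrt(2))) / 2."""
    return (1 + erf(z / sqrt(2))) / 2

# phi(0) = 0.5 by symmetry; phi(1.96) is about 0.975 (the familiar table value).
```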
Percentiles
z_α = the value of Z with area α to its right; equivalently, z_α is the 100(1 − α)th percentile of the standard normal distribution.
If X ~ Bin(n, p), with np ≥ 10 and n(1 − p) ≥ 10, then we can use the normal distribution to approximate this distribution as follows:
P(X ≤ x) ≈ Φ((x + 0.5 − np) / √(np(1 − p)))
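The approximation with continuity correction can be compared against the exact binomial CDF; a minimal sketch with the made-up values n = 100, p = 0.5:

```python
# Sketch: normal approximation to the binomial CDF (continuity correction)
# versus the exact sum of binomial pmf terms.
from math import comb, erf, sqrt

def binom_cdf(x, n, p):
    """Exact P(X <= x) for X ~ Bin(n, p)."""
    return sum(comb(n, y) * p**y * (1 - p)**(n - y) for y in range(x + 1))

def normal_approx_cdf(x, n, p):
    """P(X <= x) ~= Phi((x + 0.5 - np) / sqrt(np(1-p)))."""
    mu, sigma = n * p, sqrt(n * p * (1 - p))
    z = (x + 0.5 - mu) / sigma
    return (1 + erf(z / sqrt(2))) / 2

# n = 100, p = 0.5 satisfies np >= 10 and n(1-p) >= 10.
exact = binom_cdf(50, 100, 0.5)
approx = normal_approx_cdf(50, 100, 0.5)
```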
Exponential
Commonly used to model component lifetime (if that component can be assumed not to change over time) and times between occurrences of events in a Poisson process. It is the continuous analogue of, and a good approximation to, the geometric distribution.
We say that a random variable X is exponentially distributed, X ~ Exponential(λ), governed by parameter λ, if the pdf of its distribution is
f(x; λ) = λ·e^(−λx) for x ≥ 0 (and 0 otherwise)
CDF: F(x; λ) = 1 − e^(−λx) for x ≥ 0
Expectation: E(X) = 1/λ,  V(X) = 1/λ²
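A minimal sketch of the exponential survival function, plus a numeric check of its memoryless property P(X > s + t | X > s) = P(X > t); the rate λ = 0.5 is a made-up value:

```python
# Sketch: exponential survival function and the memoryless property.
from math import exp

def expo_sf(x, lam):
    """Survival function P(X > x) = e**(-lam * x), i.e. 1 - F(x)."""
    return exp(-lam * x)

lam, s, t = 0.5, 2.0, 3.0
# Conditional survival given X > s equals unconditional survival past t.
conditional = expo_sf(s + t, lam) / expo_sf(s, lam)
```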
For any type of random variable
Chebyshev's rule:
Says that, no matter what probability distribution you are looking at, the chance that an observed simple event of an experiment (from now on we will hand-wave it and call it an outcome) will fall within k standard deviations of the mean of the distribution is at least 1 − 1/k².
In simple math:
P(|X − μ| < kσ) ≥ 1 − 1/k²,  for any k ≥ 1
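The bound can be checked on any distribution; a minimal sketch using a made-up discrete pmf:

```python
# Sketch: verifying Chebyshev's bound P(|X - mu| < k*sigma) >= 1 - 1/k**2
# on an arbitrary (made-up) discrete pmf.
pmf = {-2: 0.1, 0: 0.8, 2: 0.1}

mu = sum(x * p for x, p in pmf.items())
sigma = sum((x - mu)**2 * p for x, p in pmf.items()) ** 0.5

k = 2
within = sum(p for x, p in pmf.items() if abs(x - mu) < k * sigma)
bound = 1 - 1 / k**2
# The observed probability "within" must be at least the Chebyshev bound.
```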