Distribution, Gamma Function, Stochastic Process
Tutorial 4, STAT1301 Fall 2010,
12OCT2010, MB103@HKU
By Joseph Dong
Reference
Wikipedia
Recall: Distribution of a Random Variable
• One way to describe the random behavior of a
random variable is to give its probability
distribution, specifying the probability of taking
each element in its range (the sample space).
• The representation of a probability distribution
comes either in a differential form: the pdf/pmf,
or in an integral form: the cdf.
• The cdf is a never-decreasing, right-continuous function from ℝ to [0, 1].
• The pdf/pmf is a non-negative, normalized function from the range of the random variable to a subset of [0, ∞).
Recall: F (cdf) versus f (pdf/pmf)
• F (the cdf):
▫ F is never-decreasing
▫ F is rightward continuous
▫ F(−∞) = 0, F(+∞) = 1
• f (the pdf/pmf):
▫ f(x) = lim_{h↓0} [F(x + h) − F(x)] / h = F′(x)  (a slightly modified formula applies to the pmf of a discrete random variable)
▫ f(x) ≥ 0
▫ ∫ f(x) dx = 1 (a sum equal to 1 in the discrete case)
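To make these facts concrete, here is a small numerical check in Python (my own illustration; the Exponential(λ = 2) example and the SciPy tooling are not from the slides): the difference quotient of F reproduces f, the pdf integrates to 1, and F runs from 0 to 1.

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import expon

    lam = 2.0
    dist = expon(scale=1 / lam)            # Exponential distribution with rate 2

    x, h = 1.3, 1e-6
    print((dist.cdf(x + h) - dist.cdf(x)) / h, dist.pdf(x))   # f(x) = F'(x): both ~0.149

    print(quad(dist.pdf, 0, np.inf)[0])    # pdf is normalized: integral = 1
    print(dist.cdf(0.0), dist.cdf(1e9))    # cdf runs from 0 up to (essentially) 1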
Gamma Function
• Γ(α) = ∫_0^∞ t^(α−1) e^(−t) dt, for α > 0
• Γ(α + 1) = α Γ(α)
• Γ(n) = (n − 1)! for every positive integer n
• Γ(1/2) = √π
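A quick numerical confirmation of these identities (an illustrative sketch using scipy.special.gamma; the test values α = 3.7 and n = 5 are arbitrary choices of mine):

    import math
    from scipy.special import gamma

    alpha = 3.7
    print(gamma(alpha + 1), alpha * gamma(alpha))   # recursion: both ~15.43
    print(gamma(5), math.factorial(4))              # Gamma(n) = (n-1)!: both 24
    print(gamma(0.5), math.sqrt(math.pi))           # both ~1.77245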
Handout Problems 6 & 7
• Problem 6:
▫ Gamma function and integration
practice
• Problem 7:
▫ important continuous distributions
and their relationships
From Bernoulli Trials to Discrete
Waiting Time (Handout Problems 1-4)
• A single Bernoulli trial:
▫ Tossing a coin
▫ Only two outcomes and they are complementary to each
other.
• Bernoulli trials: we want to count #successes; this gives
rise to a Binomial random variable
• Bernoulli trials: we want to know how long we should
wait until the first success (Geometric random
variable).
• Bernoulli trials: we want to know how long we should wait until the r-th success (Negative Binomial)
• Bernoulli trials: we want to know how long we should wait between two successes (?)
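As a rough illustration of this slide (not part of the handout), the following sketch simulates Bernoulli trials and checks the two discrete waiting times against their textbook means; p = 0.3 and r = 5 are arbitrary choices of mine:

    import numpy as np

    rng = np.random.default_rng(0)
    p, r, n_rep = 0.3, 5, 50_000

    first, rth = [], []
    for _ in range(n_rep):
        successes, trial = 0, 0
        while successes < r:
            trial += 1
            if rng.random() < p:           # one Bernoulli(p) trial
                successes += 1
                if successes == 1:
                    first.append(trial)    # waiting time until the first success
        rth.append(trial)                  # waiting time until the r-th success

    print(np.mean(first), 1 / p)           # Geometric mean: ~3.33 vs 3.33
    print(np.mean(rth), r / p)             # Negative Binomial mean: ~16.7 vs 16.7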
Poisson [pwasɔ̃] Distribution
• Poisson Approximation to Binomial
(PAB)
▫ Handout Problem 5
• The true utility of Poisson
distribution—Poisson process:
▫ Sort of the limiting case of Bernoulli
trials (use PAB to facilitate thinking)
▫ “continuous” Bernoulli trials
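A small numerical illustration of PAB (the parameters n = 1000 and p = 0.004 are my own choice, giving λ = np = 4):

    from scipy.stats import binom, poisson

    n, p = 1000, 0.004                     # large n, small p; lambda = n*p = 4
    for k in range(9):
        print(k, binom.pmf(k, n, p), poisson.pmf(k, n * p))
    # the Binomial and Poisson columns agree to about 3 decimal places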
Sequence of Random Variables
• A sequence of random variables is an ordered and countable collection of random variables, usually indexed by integers starting from one: X₁, X₂, ⋯, Xₙ, where n can be finite or ∞.
▫ Shortly written as {Xᵢ | i = 1, 2, ⋯, n}
▫ A sequence of Random Variables is a discrete-time
stochastic process.
▫ For example, a sequence of Bernoulli trials is a
discrete-time stochastic process called a Bernoulli
process.
Stochastic Process: Discrete-time and Continuous-time
• A stochastic process is (nothing but) an ordered, not necessarily countable, collection of random variables, indexed by an index set T.
▫ Shortly written as {Xₜ | t ∈ T}
▫ T usually bears a physical meaning of Time
▫ If T is a continuous (discrete) set, we call the indexed r.v.’s a “continuous(discrete)-time process.”
▫ In many continuous-time cases, we choose T = [0, ∞), and in that case, we can write the stochastic process as {Xₜ : t ≥ 0}.
Stochastic Process = Set of RVs + Index Set
Sample Path of a Stochastic Process
[Figure: example sample paths of a discrete-time process and of a continuous-time process]
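In place of the figure, a minimal sketch (illustrative parameters only; it anticipates the Poisson process defined two slides below) that generates one sample path of each kind:

    import numpy as np

    rng = np.random.default_rng(4)

    # Discrete time: X_1, ..., X_20 iid Bernoulli(0.4) -- one Bernoulli-process path
    print((rng.random(20) < 0.4).astype(int))

    # Continuous time: arrival epochs of a rate-2 Poisson process on [0, 10]
    gaps = rng.exponential(scale=1 / 2.0, size=100)   # Exp(2) inter-arrival times
    arrivals = np.cumsum(gaps)
    print(arrivals[arrivals <= 10.0])                 # the jump times of the counting path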
Bernoulli Trials (Bernoulli Process)
• Bernoulli Trials (with success probability p)
▫ Discrete-time process
▫ A sequence of independent and identically distributed (iid) Bernoulli random variables following the common distribution Bernoulli(p).
▫ Written {Xᵢ | i = 1, 2, ⋯}, where the Xᵢ are independent and all Xᵢ ~ Bernoulli(p).
Poisson Process
• Poisson Process (with intensity λ)
▫ Continuous-time process
▫ Limiting case of Bernoulli trials when the index set becomes continuous.
▫ “Poisson” in the name because the count of successes on any interval of length t follows Poisson(λt), irrespective of the location of the chosen interval on the time axis.
▫ Also, if two disjoint time intervals I₁ and I₂ are chosen, then the counts of successes on each of them are independent.
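A simulation sketch of these two defining properties (the intensity λ = 3, the intervals (0, 2] and (3, 5], and the exponential-gaps construction are my own illustrative choices):

    import numpy as np

    rng = np.random.default_rng(1)
    lam, n_rep = 3.0, 20_000

    counts_a, counts_b = [], []
    for _ in range(n_rep):
        gaps = rng.exponential(scale=1 / lam, size=60)    # enough Exp(lam) gaps to pass t = 5
        arrivals = np.cumsum(gaps)
        counts_a.append(np.sum((arrivals > 0) & (arrivals <= 2)))
        counts_b.append(np.sum((arrivals > 3) & (arrivals <= 5)))

    counts_a, counts_b = np.array(counts_a), np.array(counts_b)
    print(counts_a.mean(), counts_a.var())          # both ~ lam * 2 = 6, as for Poisson(6)
    print(np.corrcoef(counts_a, counts_b)[0, 1])    # ~ 0: counts on disjoint intervals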
Discrete Distributions Based on Bernoulli Trials
• Bernoulli Distribution, Bernoulli(p): one trial
• Binomial Distribution, Bin(n, p): n trials
• Poisson Distribution, Poisson(λ): the limit of infinitely many trials (via the Poisson approximation to the Binomial)
• Geometric Distribution, Geom(p): indefinitely many but at least one trial
• Negative Binomial Distribution, NB(r, p): indefinitely many but at least r trials
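Two of the relationships implicit in this list can be checked directly with SciPy (noting that SciPy's nbinom counts failures rather than trials, an implementation detail not discussed in the slides):

    from scipy.stats import bernoulli, binom, geom, nbinom

    p, r = 0.3, 1
    # Bernoulli(p) is Bin(1, p)
    print(bernoulli.pmf(1, p), binom.pmf(1, 1, p))        # both 0.3
    # Geometric(p) is the Negative Binomial with r = 1; scipy's nbinom counts
    # failures before the r-th success, so shift the argument by r
    print(geom.pmf(4, p), nbinom.pmf(4 - r, r, p))        # both ~0.1029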
Continuous Distributions Based on the Poisson Process
• Poisson Distribution (discrete) as building block
▫ The distribution of the count on any infinitesimal time interval of length dt is Poisson(λ dt), where λ represents the intensity (a differential concept).
▫ Additive: if X ~ Poisson(λ₁), Y ~ Poisson(λ₂), and X and Y are independent, then X + Y ~ Poisson(λ₁ + λ₂). (Proof: use MGFs.)
• Exponential Distribution Exp(λ): the waiting time until the first success/arrival/occurrence, or the inter-arrival time.
• Gamma Distribution Gamma(r, λ): the waiting time until the r-th success/arrival/occurrence.
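A simulation sketch (parameter values chosen by me for illustration) of the additivity property and of the Gamma waiting time:

    import numpy as np
    from scipy.stats import gamma

    rng = np.random.default_rng(2)
    n = 200_000

    # Additivity: X ~ Poisson(2), Y ~ Poisson(3) independent  =>  X + Y ~ Poisson(5)
    s = rng.poisson(2.0, n) + rng.poisson(3.0, n)
    print(s.mean(), s.var())                               # both ~5, as for Poisson(5)

    # Waiting time until the r-th arrival = sum of r iid Exp(lam) gaps ~ Gamma(r, lam)
    r, lam = 4, 3.87
    wait = rng.exponential(scale=1 / lam, size=(n, r)).sum(axis=1)
    print(wait.mean(), gamma(a=r, scale=1 / lam).mean())   # both ~ r/lam = 1.034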
Examples of Poisson Process
• Radioactive disintegrations
• Flying-bomb hits on London
• Chromosome interchanges in cells
• Connections to wrong number
• Bacteria and blood counts
Feller: An Introduction to Probability Theory and Its Applications (3e) Vol. 1. §VI.6.
Radioactive Disintegrations
[Photos: Geiger, Rutherford, Chadwick; a Geiger counter]
Rutherford, Chadwick, and Ellis’ 1920 Experiment

k (emissions per 7.5 s interval)   #intervals (recorded)   #intervals predicted by 2608 × Poisson(3.87) pmf
  0      57      54.54
  1     203     210.94
  2     383     407.89
  3     525     525.81
  4     532     508.37
  5     408     393.21
  6     273     253.44
  7     139     140.02
  8      45      67.69
  9      27      29.09
≥10      16      17.00
Total: N = 2608 intervals; estimated intensity per 7.5 s interval = 3.867331 ≈ 3.87
Explanation
• There are 57 time intervals (7.5 sec each) that recorded zero emissions.
• There are 203 time intervals (7.5 sec each) that recorded 1 emission.
• ……
• There are in total 2608 time intervals (7.5 sec each) involved.
• On average, each interval recorded 3.87 emissions.
• Use 3.87 as the intensity of the Poisson process that models the counts of emissions on each of the 2608 intervals.
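The predicted column of the table above can be reproduced from the fitted intensity 3.867331, treating the last row as the tail P(X ≥ 10); a minimal sketch:

    from scipy.stats import poisson

    N, lam = 2608, 3.867331                             # total intervals and fitted intensity
    for k in range(10):
        print(k, round(N * poisson.pmf(k, lam), 2))     # 54.54, 210.94, 407.89, ...
    print(">=10", round(N * poisson.sf(9, lam), 2))     # sf(9) = P(X >= 10), ~17.0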
What’s the waiting time until recording 40 emissions?
• Assuming the emission mechanism follows a Poisson process with intensity λ = 3.87 per 7.5 s interval, the waiting time until recording the r-th emission follows Gamma(r, λ).
• The waiting time until recording the 40th emission therefore follows Gamma(40, 3.87), and its expected value is 40/3.87 ≈ 10.3 intervals (each 7.5 s long), i.e. about 78 seconds (a bit more than a minute).
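A quick check of this expectation, both in closed form and by simulating the 40 exponential inter-arrival gaps (SciPy/NumPy are my tooling choice, not the handout's):

    import numpy as np
    from scipy.stats import gamma

    lam, r = 3.87, 40
    mean_intervals = gamma(a=r, scale=1 / lam).mean()    # = r / lam
    print(mean_intervals, mean_intervals * 7.5)          # ~10.34 intervals, ~77.5 seconds

    rng = np.random.default_rng(3)
    sims = rng.exponential(scale=1 / lam, size=(100_000, r)).sum(axis=1)
    print(sims.mean() * 7.5)                             # simulation agrees: ~77.5 seconds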