
Lecture Note in Probability and Statistics Theory
Combinations
To understand the equally-likely probability model and random experiments, we cover the concepts of counting, permutations, and combinations.
An experiment can be repeated indefinitely and may yield different results depending on its conditions; the results belong to a well-defined set of outcomes. If the conditions of an experiment determine its result, it is called a deterministic experiment; if several different results are possible, it is called a random experiment.
Basic principle of counting: suppose two different experiments are performed. If the first experiment can result in any one of m possible outcomes, and for each of these the second experiment can result in any one of n possible outcomes, then together the two experiments have mn possible outcomes.
Generalized basic principle of counting
There are r experiments. Ep 1 results in n1 possible outcomes; for each outcome of Ep 1, there are n2 possible outcomes of Ep 2; and so on. Altogether the r experiments have n1 n2 · · · nr possible outcomes.
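The generalized counting principle can be checked by brute-force enumeration; a minimal sketch, where the experiment sizes n1 = 3, n2 = 4, n3 = 2 are arbitrary example values:

```python
from itertools import product

# Hypothetical outcome sets for r = 3 experiments (sizes 3, 4, 2 are made up)
outcomes = [range(3), range(4), range(2)]

# Enumerate every combined result of the three experiments
combined = list(product(*outcomes))

# The counting principle says there are n1 * n2 * n3 combined outcomes
n_total = 1
for s in outcomes:
    n_total *= len(s)

print(len(combined), n_total)  # 24 24
```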
Binomial theorem (induction step): assuming the theorem holds for $n-1$,
$$
\begin{aligned}
(x+y)^n &= (x+y)(x+y)^{n-1} = (x+y)\sum_{k=0}^{n-1}\binom{n-1}{k}x^k y^{n-1-k} \\
&= \sum_{k=0}^{n-1}\binom{n-1}{k}x^{k+1}y^{n-1-k} + \sum_{k=0}^{n-1}\binom{n-1}{k}x^k y^{n-k} \\
&= \sum_{i=1}^{n}\binom{n-1}{i-1}x^i y^{n-i} + \sum_{i=0}^{n-1}\binom{n-1}{i}x^i y^{n-i} \\
&= \sum_{i=1}^{n-1}\binom{n-1}{i-1}x^i y^{n-i} + \binom{n-1}{n-1}x^n y^0 + \binom{n-1}{0}x^0 y^n + \sum_{i=1}^{n-1}\binom{n-1}{i}x^i y^{n-i} \\
&= x^n + \sum_{i=1}^{n-1}\left[\binom{n-1}{i-1}+\binom{n-1}{i}\right]x^i y^{n-i} + y^n \\
&= \sum_{k=0}^{n}\binom{n}{k}x^k y^{n-k},
\end{aligned}
$$
where the last step uses Pascal's rule $\binom{n-1}{i-1}+\binom{n-1}{i}=\binom{n}{i}$.
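The identity can be verified numerically; a small sketch, where x = 2, y = 3, n = 7 are arbitrary sample values:

```python
from math import comb

# Check (x + y)^n against the binomial expansion for sample values
x, y, n = 2.0, 3.0, 7
lhs = (x + y) ** n
rhs = sum(comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
print(lhs, rhs)  # both 78125.0
```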
Example
If a set A has n elements, how many subsets does it have?
$$\sum_{k=0}^{n}\binom{n}{k} = (1+1)^n = 2^n.$$
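The count can be confirmed by listing every subset; a minimal sketch with an arbitrary 4-element set:

```python
from itertools import combinations

# Enumerate all subsets of a 4-element set by choosing k elements for each k
A = {"a", "b", "c", "d"}
n = len(A)
subsets = [frozenset(c) for k in range(n + 1) for c in combinations(A, k)]

print(len(subsets))  # 2**4 = 16
```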
Multinomial Coefficients Ex: A set of n distinct items is to be divided into
r distinct groups of respective sizes n1 , n2 , . . . , nr , where
n1 + n2 + . . . + nr = n. How many divisions are possible?
$$\binom{n}{n_1, n_2, \ldots, n_r} = \frac{n!}{n_1!\, n_2! \cdots n_r!}$$
Theorem
Multinomial theorem
$$(x_1 + \cdots + x_r)^n = \sum_{n_1+\cdots+n_r=n} \binom{n}{n_1, n_2, \ldots, n_r}\, x_1^{n_1} \cdots x_r^{n_r}$$
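The multinomial theorem can be checked numerically for small r and n; a sketch for r = 3, n = 4, with arbitrary values x = (1, 2, 3):

```python
from math import factorial

def multinomial(n, parts):
    """n! / (n1! n2! ... nr!): ways to split n distinct items into the given group sizes."""
    out = factorial(n)
    for ni in parts:
        out //= factorial(ni)
    return out

# Sum over all compositions n1 + n2 + n3 = 4 and compare with (x1 + x2 + x3)^4
x = (1.0, 2.0, 3.0)
n = 4
lhs = sum(x) ** n
rhs = sum(
    multinomial(n, (n1, n2, n - n1 - n2)) * x[0]**n1 * x[1]**n2 * x[2]**(n - n1 - n2)
    for n1 in range(n + 1)
    for n2 in range(n - n1 + 1)
)
print(lhs, rhs)  # both 1296.0
```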
Conditional probability
If partial information related to the outcome of an experiment is known, how does it affect the probability of the outcome? Even when no partial information is available, it is often useful to use conditional probability as a tool that enables us to compute the desired probability more easily.
Definition
P (E|F ) : conditional probability that E occurs given F has occurred, if
P (F ) > 0 then P (E|F ) = P (E ∩ F )/P (F ).
Remark
(1) Our definition of P (E|F ) is consistent with the interpretation of
probability as being a long-run relative frequency.
(2) Multiplicative rule :
P (E1 E2 ...En ) = P (E1 )P (E2 |E1 )...P (En |E1 ...En−1 ).
Proposition (Bayes formula)
Let $F_j$, $j = 1, \ldots, n$, be mutually exclusive events whose union is the sample space. Then
$$P(F_j \mid E) = \frac{P(E \mid F_j)\, P(F_j)}{\sum_{i=1}^{n} P(E \mid F_i)\, P(F_i)}.$$
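Bayes' formula can be exercised on a made-up partition; the priors and likelihoods below are arbitrary illustrative values:

```python
# F1, F2, F3 partition the sample space with these (made-up) probabilities
priors = [0.5, 0.3, 0.2]          # P(F_j), sum to 1
likelihoods = [0.1, 0.4, 0.8]     # P(E | F_j)

# Denominator: total probability P(E) = sum_j P(E|F_j) P(F_j)
p_e = sum(l * p for l, p in zip(likelihoods, priors))

# Bayes formula: P(F_j | E) = P(E|F_j) P(F_j) / P(E)
posteriors = [l * p / p_e for l, p in zip(likelihoods, priors)]
print(p_e, posteriors)  # posteriors sum to 1
```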
Proposition
Fix an event F with P (F ) > 0. Then for any event E,
0 ≤ P (E|F ) ≤ 1,
P (S|F ) = 1,
and if E_1, E_2, . . . are mutually exclusive events, then
$$P\Big(\bigcup_{j=1}^{\infty} E_j \,\Big|\, F\Big) = \sum_{j=1}^{\infty} P(E_j \mid F).$$
Remark
By the result of the proposition, the conditional probability is a probability.
Independent Events
Definition
Two events E and F are independent if
P (EF ) = P (E)P (F ).
Remark
Assume E and F are independent. Then
(1) P (E|F ) = P (E).
(2) E and F^c are also independent.
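Independence can be checked by enumeration on an equally-likely sample space; a sketch with two fair dice (the choice of events is an illustrative assumption):

```python
from itertools import product

# 36 equally likely outcomes of rolling two fair dice
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event under the equally-likely model."""
    return sum(1 for w in omega if event(w)) / len(omega)

E = lambda w: w[0] == 1        # first die shows 1
F = lambda w: sum(w) == 7      # sum is 7
EF = lambda w: E(w) and F(w)

# E and F are independent: P(EF) = P(E) P(F) = 1/36
print(prob(EF), prob(E) * prob(F))
```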
Various probability models
Concepts and terminology related to probability:
Sample space
Event
Elementary event
Relative frequency of an event
Probability
Sample space and Events
Axioms of probability
Basic properties of probability
Objectives: to define notation and answer some fundamental questions:
1. What does a probability represent?
2. How is a probability defined, and does it always exist?
3. What properties does a probability have?
We cannot know in advance which result an experiment will produce, but we can assume that the set of all possible results is known. This set of possible outcomes of an experiment is called the sample space, usually denoted S. A subset of the sample space, E ⊂ S, is called an event.
Note: the sample space may be defined differently depending on what the experimenter is interested in; even for the same experiment, different interests lead to different sample spaces.
Example
Ep: toss two coins.
S1 = {HH, HT, TH, TT}
S2 = {(2, 0), (1, 1), (0, 2)}, where an outcome is (number of tails, number of heads)
S3 = {A, D}, where A means the two coins are alike and D means they are different.
Note: (1) Any designated collection of possible outcomes of an Ep
constitutes an event, including individual outcomes, the entire sample space,
and the null set.
(2) An event is said to occur if the outcome of an Ep is one of the constituent members of the event.
Elementary event (e): a single element of S (one possible outcome).
S = {e_i; i = 1, 2, . . . , n}: finite S
S = {e_i; i = 1, 2, . . .}: countably infinite S
Example
Definition of probability: axioms of probability
A lottery consists of 5 tickets, two of which will be drawn and designated as the winners. If the two prizes to be awarded are the same, what is the sample space? Ep: select 2 winning tickets out of 5. Let the tickets be numbered 1 through 5; then there are 10 possible pairs of numbers that can be drawn.
S = {(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 4), (2, 5), (3, 4), (3, 5), (4, 5)}
If the prizes are different, then
S = {(1, 2), (1, 3), (1, 4), (1, 5), (2, 3), (2, 4), (2, 5), (3, 4), (3, 5), (4, 5),
(2, 1), (3, 1), (4, 1), (5, 1), (3, 2), (4, 2), (5, 2), (4, 3), (5, 3), (5, 4)}
where (x, y): x- ticket # of 1st prize y- ticket # of 2nd prize
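The two sample spaces above can be generated directly; a minimal sketch of this example:

```python
from itertools import combinations, permutations

tickets = range(1, 6)  # tickets numbered 1..5

# Same prizes: order does not matter, so unordered pairs: C(5,2) = 10
same_prizes = list(combinations(tickets, 2))

# Different prizes: order matters, so ordered pairs: 5 * 4 = 20
diff_prizes = list(permutations(tickets, 2))

print(len(same_prizes), len(diff_prizes))  # 10 20
```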
How do we define the probability using the sample space?
Relative frequency: let S be a sample space, E an event, and n(E) the number of times that the event E occurs in the first n experiments. Then
$$P(E) = \lim_{n\to\infty} \frac{n(E)}{n},$$
the limiting percentage of times that E occurs.
Axioms of probability:
Axiom 1: for any event A ⊂ Ω, 0 ≤ P (A) ≤ 1.
Axiom 2: P (Ω) = 1.
Axiom 3 (countable additivity): if A_1, A_2, . . . are mutually exclusive events, then
$$P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i).$$
A consequence of the axioms is that the set function $P(\cdot): A \mapsto P(A)$ is continuous, in the sense made precise below.
Definition
Continuity of probability
Limiting events: for events A_1, A_2, . . . with
$$A_1 \subset A_2 \subset \cdots \subset A_n \subset \cdots,$$
we say that {A_n} is an increasing sequence of events, and $\bigcup_{n=1}^{\infty} A_n$ is the limiting event:
$$\lim_{n\to\infty} A_n = \bigcup_{n=1}^{\infty} A_n.$$
Similarly, for {B_n} such that
$$B_1 \supset B_2 \supset \cdots \supset B_n \supset \cdots,$$
{B_n} is called a decreasing sequence of events, and $\bigcap_{n=1}^{\infty} B_n$ is the limiting event:
$$\lim_{n\to\infty} B_n = \bigcap_{n=1}^{\infty} B_n.$$
Example
For the sample space Ω = (0, 1), consider the sequence of events A_n = (1/n, 1), n = 1, 2, . . .. Then {A_n} is an increasing sequence, so the limiting event is
$$\lim_{n\to\infty} A_n = \bigcup_{n=1}^{\infty} A_n = \bigcup_{n=1}^{\infty} (1/n, 1) = (0, 1).$$
Also, B_n = (0, 1/n) is a decreasing sequence of events, so the limiting event is
$$\lim_{n\to\infty} B_n = \bigcap_{n=1}^{\infty} B_n = \bigcap_{n=1}^{\infty} (0, 1/n) = \emptyset.$$
Definition
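Taking interval length as the probability on Ω = (0, 1) (an assumption for this illustration), P(A_n) = 1 − 1/n increases to 1 = P((0, 1)) and P(B_n) = 1/n decreases to 0 = P(∅), matching the limiting events:

```python
# Interval lengths of A_n = (1/n, 1) and B_n = (0, 1/n) as n grows,
# using length as the probability on Omega = (0, 1)
ns = (1, 10, 100, 1000)
p_a = [1 - 1 / n for n in ns]  # approaches P((0, 1)) = 1
p_b = [1 / n for n in ns]      # approaches P(empty set) = 0
print(p_a)
print(p_b)
```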
For {A_n} such that
$$A_1 \subset A_2 \subset \cdots \subset A_n \subset \cdots,$$
if
$$P\big(\lim_{n\to\infty} A_n\big) = \lim_{n\to\infty} P(A_n),$$
then P is called continuous from below.
Theorem
If {A_n} ↑, then
$$\lim_{n\to\infty} P(A_n) = P\big(\lim_{n\to\infty} A_n\big).$$
Proof.
For {A_n} ↑, define {B_n} by
$$B_1 = A_1,\quad B_2 = A_2 - A_1,\ \ldots,\ B_n = A_n - A_{n-1},\ \ldots.$$
Then the {B_n} are mutually exclusive, and $A_n = \bigcup_{k=1}^{n} B_k$. Since
$$\bigcup_{n=1}^{\infty} A_n = \bigcup_{n=1}^{\infty}\bigcup_{k=1}^{n} B_k = \bigcup_{k=1}^{\infty} B_k,$$
then
$$
\begin{aligned}
P\big(\lim_{n\to\infty} A_n\big) &= P\Big(\bigcup_{k=1}^{\infty} B_k\Big) \\
&= \sum_{k=1}^{\infty} P(B_k) \quad \text{by countable additivity} \\
&= \lim_{n\to\infty} \sum_{k=1}^{n} P(B_k) \\
&= \lim_{n\to\infty} P\Big(\bigcup_{k=1}^{n} B_k\Big) \quad \text{by finite additivity} \\
&= \lim_{n\to\infty} P(A_n).
\end{aligned}
$$
Distribution function
Random variables
Let S be a sample space and P a probability defined on S. X is called a r.v if it is a mapping
X : S → R.
F is called the cumulative distribution function (cdf) of X, defined by
F (x) = P (X ≤ x) = P ({ω : X(ω) ≤ x}).
Discrete r.v’s
Remark (Properties of cdf)
F is a nondecreasing function.
lim_{x→∞} F (x) = 1.
lim_{x→−∞} F (x) = 0.
F is right continuous: for any x and any decreasing sequence {x_n} converging to x,
lim_{n→∞} F (x_n) = F (x).
A r.v that can take on at most a countable number of possible values is said
to be discrete.
Remark
(1) The probability mass function p(x) of a r.v X is defined by
p(x) = P (X = x).
(2) Properties of the p.m.f: assume that X ∈ {x_1, x_2, . . .}. Then
p(x_i) ≥ 0, i = 1, 2, . . ., and $\sum_i p(x_i) = 1$.
Example
Let X be a discrete r.v whose support is the non-negative integers, with p.m.f
$$p(x) = \frac{c\lambda^x}{x!}, \quad \lambda > 0.$$
Find P (X = 0).
(Sol) Since $\sum_x p(x) = 1$ and $e^a = \sum_{x=0}^{\infty} a^x/x!$,
$$1 = c\sum_{x=0}^{\infty} \frac{\lambda^x}{x!} = c\, e^{\lambda},$$
thus $c = e^{-\lambda}$. Hence
$$P(X = 0) = e^{-\lambda}\lambda^0/0! = e^{-\lambda}.$$
Exercise: find P (X > 2).
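The normalization and the exercise can be evaluated numerically; a sketch where λ = 2 is an arbitrary choice:

```python
from math import exp, factorial

lam = 2.0  # arbitrary value of lambda for the check

# With c = e^{-lambda}, the pmf is p(x) = e^{-lam} lam^x / x!
c = exp(-lam)
p = lambda x: c * lam**x / factorial(x)

# P(X = 0) = e^{-lambda}
p0 = p(0)

# P(X > 2) = 1 - p(0) - p(1) - p(2), the exercise above
p_gt_2 = 1 - (p(0) + p(1) + p(2))
print(p0, p_gt_2)
```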
Expected values
Remark
The cumulative distribution function F can be expressed in terms of p(x) by
$$F(x) = P(X \le x) = \sum_{a \le x} p(a).$$
The expectation of the r.v X is defined by
$$E(X) = \sum_x x\, p(x),$$
the weighted average of the possible values that X takes on, where each weight is the probability of that value.
Example
Expectation of a function of a r.v
Let I be the indicator of the event A, that is,
I = 1 if A occurs and 0 otherwise.
Find E(I).
$$E(I) = \sum_x x\,p(x) = 0 \times p(0) + 1 \times p(1) = 0 \times P(A^c) + 1 \times P(A) = P(A).$$
Example
Let X ∈ {−1, 0, 1} be a r.v with probabilities
P (X = −1) = 0.2, P (X = 0) = 0.5, P (X = 1) = 0.3.
Compute E(X²).
Let Y = X², so Y ∈ {0, 1} with
P (Y = 0) = P (X = 0) = 0.5 and
P (Y = 1) = P (X ∈ {−1, 1}) = P (X = −1) + P (X = 1) = 0.5.
Hence
E(X²) = E(Y) = 0(0.5) + 1(0.5) = 0.5.
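The same value can be computed two ways for this distribution, which also previews the proposition that follows:

```python
# pmf of X from the example above
pmf = {-1: 0.2, 0: 0.5, 1: 0.3}

# (a) sum g(x) p(x) directly over the values of X
e_x2_direct = sum(x**2 * p for x, p in pmf.items())

# (b) via the distribution of Y = X^2
pmf_y = {0: 0.5, 1: 0.2 + 0.3}
e_x2_via_y = sum(y * p for y, p in pmf_y.items())

print(e_x2_direct, e_x2_via_y)  # both 0.5
```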
Proposition
If X is a r.v that takes on the values {x_i, i ≥ 1} with respective probabilities p(x_i), then for any real-valued function g,
$$E(g(X)) = \sum_i g(x_i)\, p(x_i).$$
Proof. Group the terms according to the distinct values y_j taken by g:
$$
\begin{aligned}
\sum_i g(x_i)\, p(x_i) &= \sum_j \sum_{i:\, g(x_i)=y_j} g(x_i)\, p(x_i) \\
&= \sum_j y_j \sum_{i:\, g(x_i)=y_j} p(x_i) \\
&= \sum_j y_j\, P(g(X) = y_j) \\
&= E(g(X)).
\end{aligned}
$$
Corollary
If a, b are constants and X is a r.v, then
$$E(aX + b) = aE(X) + b.$$
Proof.
$$
\begin{aligned}
E(aX + b) &= \sum_{x:\, p(x)>0} (ax + b)\, p(x) \\
&= a \sum_{x:\, p(x)>0} x\, p(x) + b \sum_{x:\, p(x)>0} p(x) \\
&= aE(X) + b.
\end{aligned}
$$
Read the textbook.
The poisson r.v.
A r.v X, taking on one of the values 0, 1, 2, . . ., is said to be a Poisson r.v with parameter λ if for some λ > 0 the p.m.f is
$$p(i) = P(X = i) = \frac{e^{-\lambda}\lambda^i}{i!}.$$
Poisson approximation of the binomial r.v
Suppose that X ∼ Bin(n, p) and let λ = np. Then
$$
\begin{aligned}
P(X = i) &= \frac{n!}{(n-i)!\,i!}\, p^i (1-p)^{n-i} \\
&= \frac{n!}{(n-i)!\,i!} \left(\frac{\lambda}{n}\right)^i \left(1-\frac{\lambda}{n}\right)^{n-i} \\
&= \frac{n(n-1)\cdots(n-i+1)}{n^i} \cdot \frac{\lambda^i}{i!} \cdot \frac{(1-\lambda/n)^n}{(1-\lambda/n)^i}.
\end{aligned}
$$
Now, for n sufficiently large and λ moderate,
$$\left(1-\frac{\lambda}{n}\right)^n \approx e^{-\lambda}, \qquad \frac{n(n-1)\cdots(n-i+1)}{n^i} \approx 1, \qquad \left(1-\frac{\lambda}{n}\right)^i \approx 1.$$
Hence
$$P(X = i) \approx \frac{e^{-\lambda}\lambda^i}{i!}.$$
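The quality of the approximation can be seen numerically; a sketch with arbitrary values n = 1000 and p = 0.003 (so λ = 3):

```python
from math import comb, exp, factorial

# Compare the Bin(n, p) pmf with its Poisson(lambda = n p) approximation
n, p = 1000, 0.003   # large n, small p (arbitrary example values)
lam = n * p          # lambda = 3.0

binom = lambda i: comb(n, i) * p**i * (1 - p)**(n - i)
poisson = lambda i: exp(-lam) * lam**i / factorial(i)

for i in range(5):
    print(i, binom(i), poisson(i))  # the two columns nearly agree
```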
Examples of Poisson r.v’s that obey the Poisson probability law
The number of misprints on a page of a book.
The number of people aged at least 100 in a community.
The number of wrong telephone numbers that are dialed in a day.
The number of customers entering a post office during a given period of time.
The number of α-particles discharged in a fixed period of time from
some radioactive material.
Expectations of Poisson r.v's
$$E(X) = \sum_{i=0}^{\infty} i\,\frac{e^{-\lambda}\lambda^i}{i!} = \lambda \sum_{i=1}^{\infty} \frac{e^{-\lambda}\lambda^{i-1}}{(i-1)!} = \lambda e^{-\lambda} \sum_{j=0}^{\infty} \frac{\lambda^j}{j!} = \lambda,$$
since $\sum_{j=0}^{\infty} \lambda^j/j! = e^{\lambda}$.
Other discrete r.v’s
$$
\begin{aligned}
E(X^2) &= \sum_{i=0}^{\infty} i^2\, \frac{e^{-\lambda}\lambda^i}{i!} = \lambda e^{-\lambda} \sum_{i=1}^{\infty} \frac{i\,\lambda^{i-1}}{(i-1)!} \\
&= \lambda e^{-\lambda} \sum_{j=0}^{\infty} \frac{(j+1)\lambda^j}{j!} \\
&= \lambda e^{-\lambda}\left(\sum_{j=0}^{\infty} \frac{j\,\lambda^j}{j!} + \sum_{j=0}^{\infty} \frac{\lambda^j}{j!}\right) \\
&= \lambda\big(E(X) + e^{-\lambda}e^{\lambda}\big) = \lambda(\lambda + 1).
\end{aligned}
$$
Therefore
$$\mathrm{Var}(X) = E(X^2) - (E(X))^2 = \lambda(\lambda + 1) - \lambda^2 = \lambda.$$
Poisson process and Poisson r.v's (see chapter 9)
Let X be the number of events that occur in a given interval of length h. Then
P (X = 1) = λh + o(h),
P (X ≥ 2) = o(h),
and for any integers n, j_1, . . . , j_n and any set of n non-overlapping intervals, the events that exactly j_k events occur in the k-th interval, k = 1, . . . , n, are independent.
The geometric r.v
X ∼ Geo(p) if its pmf is
p(x) = (1 − p)^{x−1} p, x = 1, 2, . . . .
Intuition: independent trials, each having success probability p ∈ (0, 1), are performed until a success occurs; X is the number of trials required.
Remark
(1) p(x) is a pmf. Check:
$$\sum_{x=1}^{\infty} (1-p)^{x-1} p = \frac{p}{1-(1-p)} = 1.$$
(2)
$$E(X) = \frac{1}{p}, \qquad E(X^2) = \frac{2}{p^2} - \frac{1}{p}.$$
(3)
$$\mathrm{Var}(X) = \frac{1-p}{p^2}.$$
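The closed forms in the Remark can be confirmed by truncating the infinite sums; a sketch where p = 0.3 is an arbitrary success probability:

```python
from math import isclose

p = 0.3        # arbitrary success probability
N = 10_000     # truncation point; the neglected tail is negligible

# pmf of Geo(p): p(x) = (1-p)^(x-1) p, x = 1, 2, ...
pmf = lambda x: (1 - p) ** (x - 1) * p

total = sum(pmf(x) for x in range(1, N))           # should be 1
mean = sum(x * pmf(x) for x in range(1, N))        # should be 1/p
second = sum(x * x * pmf(x) for x in range(1, N))  # should be 2/p^2 - 1/p
var = second - mean**2                             # should be (1-p)/p^2

print(total, mean, var)
```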