Convergence of Probability distributions. Limit theorems

Convergence in distribution
Suppose that F1, F2, ... is a sequence of cumulative distribution
functions corresponding to random variables X1, X2, ..., and that F
is a distribution function corresponding to a random variable X.
We say that the sequence Xn converges towards X in distribution if

lim (n→∞) Fn(a) = F(a)

for every real number a at which F is continuous.
Since F(a) = Pr(X ≤ a), this means that the probability that the
value of X is in a given range is very similar to the probability
that the value of Xn is in that range, provided n is sufficiently
large. Convergence in distribution is the weakest form of
convergence, and is sometimes called weak convergence.
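A minimal numeric sketch of this definition, using an example chosen for illustration (not from the text): Xn uniform on the grid {1/n, 2/n, ..., 1} converges in distribution to X ~ Uniform(0, 1), since Fn(a) = floor(n·a)/n differs from F(a) = a by at most 1/n.

```python
import math

def F_n(a, n):
    """CDF of the discrete uniform on {1/n, 2/n, ..., n/n}, for 0 <= a <= 1."""
    return math.floor(n * a) / n

def F(a):
    """CDF of Uniform(0, 1) on [0, 1]."""
    return a

a = 1 / 3                      # any continuity point of F in (0, 1)
gaps = [abs(F_n(a, n) - F(a)) for n in (10, 100, 1000)]
print(gaps)                    # the gap |Fn(a) - F(a)| shrinks as n grows
```

Here F is continuous everywhere on (0, 1), so the pointwise CDF convergence holds at every a in that interval.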
Convergence in probability
To say that the sequence Xn converges towards X in probability
means

lim (n→∞) Pr( |Xn − X| ≥ ε ) = 0

for every ε > 0. Formally, pick any ε > 0 and any δ > 0. Let Pn be
the probability that Xn is outside a tolerance ε of X. Then, if Xn
converges in probability to X then there exists a value N such
that, for all n ≥ N, Pn is itself less than δ.
Convergence in probability is often denoted by adding the letter
'P' over an arrow indicating convergence: Xn -P-> X.
Convergence in probability implies convergence in distribution.
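The ε–δ description above can be checked by simulation. The construction below is an assumed example (not from the text): Xn = X + Zn/n with Zn standard normal, so Pn = Pr(|Xn − X| > ε) = Pr(|Zn| > nε) shrinks towards 0.

```python
import random

random.seed(0)

def P_n(n, eps=0.1, trials=10_000):
    """Monte Carlo estimate of Pn = Pr(|Xn - X| > eps) for Xn = X + Zn/n."""
    hits = sum(1 for _ in range(trials) if abs(random.gauss(0, 1)) / n > eps)
    return hits / trials

estimates = [P_n(n) for n in (1, 10, 100)]
print(estimates)   # decreasing towards 0 as n grows
```

Given any δ > 0, reading off the n at which the estimate drops below δ is exactly the N in the formal definition.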
Almost sure convergence
To say that the sequence Xn converges almost surely or
almost everywhere or with probability 1 or strongly towards
X means

Pr( lim (n→∞) Xn = X ) = 1.
This means that the values of Xn approach the value of X, in the
sense that events for which Xn does not converge to X have
probability 0. Using the probability space (Ω, F, P) and the
concept of the random variable as a function from Ω to R, this
is equivalent to the statement

P( ω ∈ Ω : lim (n→∞) Xn(ω) = X(ω) ) = 1.
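A small sketch of the "function from Ω to R" view, using an assumed example (not from the text): on Ω = [0, 1) with P uniform, let Xn(ω) = ω^n. For every fixed ω < 1 the sequence tends to 0, so Xn → 0 almost surely; the exceptional set is empty here, and in general must have probability 0.

```python
import random

random.seed(1)
omegas = [random.random() for _ in range(1000)]   # sampled outcomes in [0, 1)

# By n = 1000, X_n(omega) = omega**1000 is already tiny for almost every
# sampled omega; the only stragglers are outcomes very close to 1, and
# even those eventually converge to 0.
fraction_small = sum(1 for w in omegas if w ** 1000 < 1e-3) / len(omegas)
print(fraction_small)   # close to 1
```

The key contrast with convergence in probability: here we track each fixed outcome ω along the whole sequence, rather than one index n at a time.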
Independent random variables
Let X1 and X2 be two random variables. X1 and X2 are called
stochastically independent (or independent) iff

∀ B1, B2 ∈ B(R):  P( (X1, X2) ∈ B1 × B2 ) = P( X1 ∈ B1 ) · P( X2 ∈ B2 ),

where B(R) denotes the Borel σ-algebra on R.
Remarks
1. If X1 and X2 are independent, then X1 and X2 are
uncorrelated: Cov( X1, X2 ) = 0.
2. Cov( X1, X2 ) = 0 does not necessarily imply
independence. However, uncorrelated jointly Gaussian
random variables are independent.
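Remark 2 can be made concrete with a standard counterexample (my choice for illustration, not from the text): X uniform on {−1, 0, 1} and Y = X². Then Cov(X, Y) = E[X³] = 0, yet Y is a function of X, so the pair is clearly dependent.

```python
outcomes = [-1, 0, 1]           # values of X, each with probability 1/3
p = 1 / 3

E_X  = sum(x * p for x in outcomes)
E_Y  = sum(x**2 * p for x in outcomes)
E_XY = sum(x * x**2 * p for x in outcomes)
cov  = E_XY - E_X * E_Y
print(cov)                      # 0 -> uncorrelated

# Dependence: Pr(X = 0, Y = 0) = 1/3, but Pr(X = 0) * Pr(Y = 0) = 1/9,
# so the product rule for independence fails on B1 = B2 = {0}.
joint = p                       # Pr(X = 0 and Y = 0)
product_of_marginals = p * p
print(joint, product_of_marginals)
```

The failing event pair (B1 = B2 = {0}) is exactly a witness against the ∀ B1, B2 condition in the definition above.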
The central limit theorem (Lindeberg-Levy)
Let X1, X2, X3, ..., Xn, ... be a sequence of independent
random variables, all with the same distribution
with expected value m and variance σ². Then

( X1 + X2 + X3 + ... + Xn − nm ) / ( √n · σ )

converges in distribution to a standard normal random
variable.
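A Monte Carlo check of the theorem, with Uniform(0, 1) summands chosen for illustration (m = 1/2, σ² = 1/12): the standardized sum should look standard normal for large n, so its first two moments should be near 0 and 1, and Pr(Z ≤ 0) near 1/2.

```python
import math
import random

random.seed(42)
n, trials = 1_000, 5_000
m, sigma = 0.5, math.sqrt(1 / 12)   # mean and std of Uniform(0, 1)

def standardized_sum():
    """One draw of (X1 + ... + Xn - n*m) / (sqrt(n) * sigma)."""
    s = sum(random.random() for _ in range(n))
    return (s - n * m) / (math.sqrt(n) * sigma)

zs = [standardized_sum() for _ in range(trials)]
mean   = sum(zs) / trials
var    = sum(z * z for z in zs) / trials
p_le_0 = sum(1 for z in zs if z <= 0) / trials
print(mean, var, p_le_0)   # near 0, 1 and 0.5 respectively
```

Nothing about Uniform(0, 1) is special here; any summand distribution with finite variance gives the same standard normal limit.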
The weak law
The weak law of large numbers states that the sample average
(X1 + X2 + ... + Xn)/n converges in probability towards the
expected value µ. That is to say, for any positive number ε,

lim (n→∞) Pr( |(X1 + X2 + ... + Xn)/n − µ| > ε ) = 0.
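The statement can be illustrated by simulation, using fair coin flips as an assumed example (µ = 0.5): for a fixed ε, the probability that the sample average misses µ by more than ε shrinks as n grows.

```python
import random

random.seed(7)
mu, eps, trials = 0.5, 0.05, 2_000

def p_outside(n):
    """Fraction of runs whose sample average of n coin flips misses mu by > eps."""
    count = 0
    for _ in range(trials):
        avg = sum(random.random() < mu for _ in range(n)) / n
        if abs(avg - mu) > eps:
            count += 1
    return count / trials

probs = [p_outside(n) for n in (10, 100, 1000)]
print(probs)   # decreasing towards 0
```

This is precisely convergence in probability of the sample average to µ, with the sequence indexed by the sample size n.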
The strong law
The strong law of large numbers (Kolmogorov) states that the
sample average converges almost surely to the expected value µ:

Pr( lim (n→∞) (X1 + X2 + ... + Xn)/n = µ ) = 1.
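Because the convergence is almost sure, a single realization already shows the effect: one trajectory of running averages settles at µ. The sketch below uses die rolls as an assumed example (µ = 3.5).

```python
import random

random.seed(3)
N = 100_000
total = 0.0
running = []
for i in range(1, N + 1):
    total += random.randint(1, 6)       # one roll of a fair die
    if i in (100, 10_000, 100_000):
        running.append(total / i)       # running average at checkpoints
print(running)   # the trajectory settles near 3.5
```

Contrast with the weak law: there we averaged over many independent runs at each fixed n; here one fixed outcome ω is followed along the whole sequence.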
Thank you