15 Random Generators (facultative)
The subject of this section is the generation of realizations
of a probability measure P by means of random generators. The Glivenko–Cantelli theorem offers a possibility of an
empirical examination.
15.1 The necessity of sample realizations
Probability measures (distributions) reveal themselves
in statistics through their realizations, which are the basis for estimations with respect to the
(unknown) probability measure generating the realizations.
For many tasks, artificially generated (sample)
realizations are needed whose probability law (measure) is known: if one wants to evaluate empirically
an estimation procedure for the specification of a certain probability measure based on its realizations, then
the knowledge of this probability measure is presupposed.
This stresses the need for techniques for the
generation of realizations of a given probability
measure P.
Anticipating later explications, the generation of realizations according to a given probability measure P
can be reduced to the generation of realizations according to
the uniform distribution λ(0;1) over the interval (0; 1).
15.2 (0;1)-Realizations
The generation of random numbers by means of random generators means first the generation of random
numbers on a finite segment of N ∪ {0}.
The generation of random numbers from the interval
(0; 1), the so-called (0;1)-realizations, results from
an appropriate normalization.
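As a minimal sketch of such a normalization (assuming a generator that delivers integers z from the finite segment {0, …, m−1}; the shift by 0.5 is one illustrative choice that avoids the boundary values 0 and 1):

```python
def normalize(z: int, m: int) -> float:
    """Map an integer z in {0, ..., m-1} into the open interval (0, 1).

    Shifting by 0.5 before dividing by m keeps the result strictly
    between 0 and 1, as required for (0;1)-realizations.
    """
    return (z + 0.5) / m

# With m = 8, the integers 0..7 map to 0.0625, 0.1875, ..., 0.9375.
values = [normalize(z, 8) for z in range(8)]
assert all(0.0 < u < 1.0 for u in values)
```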
15.3 The provision of artificial (0;1)-realizations
In principle, there are two possibilities for generating
random numbers and consequently (0;1)-realizations.
The first possibility consists in using a suitable physical experiment, for example processing certain electromagnetic or electric signals. In this case one speaks of
generating sample values by a physical random
generator.
The other possibility consists in the algorithmic generation of a deterministic sequence of numbers, where
the iteration rule is chosen so as to imitate randomness
as well as possible. In this case
one speaks of pseudo random generators.
Although much can be said for the utilization of physically generated realizations, the use of so-called pseudo
random generators is preferred. This results from
the fact that random data can be reconstructed
at any time without great effort when using pseudo
random generators.
15.4 Requirements
Even if (pseudo) random generators deliver, by their
construction, principally a deterministic sequence of
(0;1)-realizations, they have to satisfy at least aspects
of randomness within stochastics, in the sense that
a statistical refutation is only possible on the basis
of big sample sizes.
Aspired to in stochastics are realizations which may
be interpreted as realizations of independent random variables following the uniform distribution on (0; 1).
15.5 Types of (Pseudo) random generators
(Pseudo) random generators are algorithmic procedures which generate a deterministic sequence of numbers from a finite segment A ⊂ N ∪ {0} (which then
are normalized to (0; 1)).
If a random generator is defined by an iteration
map f, i.e. if for a sequence of random numbers (zi)

zi+1 := f(zi),    i ∈ N,

holds true, then the random generator is periodic
because of zi ∈ A, i ∈ N, i.e. there exists a so-called period (number) p ∈ N such that

zi = zi+p,    i ∈ N.
Various types of random generators have become well known, the most well-known of which is the so-called linear (congruence) generator.
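A minimal sketch of such a linear congruence generator, with the iteration map f(z) = (a·z + c) mod m (the constants a, c, m below are illustrative choices, not prescribed by the text):

```python
from itertools import islice

def lcg(seed: int, a: int = 1103515245, c: int = 12345, m: int = 2**31):
    """Linear congruence generator: z_{i+1} = (a*z_i + c) mod m.

    Since every state lies in the finite set {0, ..., m-1}, the
    sequence of states is periodic. Each state is normalized by m
    to yield a (0;1)-realization.
    """
    z = seed
    while True:
        z = (a * z + c) % m
        yield z / m

# Deterministic: the same seed always reproduces the same sequence.
us = list(islice(lcg(seed=42), 5))
assert us == list(islice(lcg(seed=42), 5))
assert all(0.0 <= u < 1.0 for u in us)
```

The determinism shown in the last assertions is exactly the practical advantage mentioned in 15.3: the data can be reconstructed at any time from the seed.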
In Experimental Stochastics, O. Moeschlin et al., Berlin
Heidelberg New York, 1998, a presentation of types
of random generators can be found; it is also shown there
how the realizations of such random generators may be
tested with respect to the desired requirements.
15.6 Realizations according to a probability measure P
Random generators offered today within software packages
deliver (as a rule) realizations ui, i ∈ N, which indeed may to a large extent be interpreted as those of
an independent sequence of random variables following
the uniform distribution over (0; 1).
If P is a probability measure (on B over R) with
the generalized inverse distribution function F inv,
then the numbers F inv(ui) are realizations of an
independent sequence of random variables following P, cf. 8.5.
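For example, for the exponential distribution with parameter λ one has F(t) = 1 − e^(−λt) and hence F inv(u) = −ln(1 − u)/λ; a sketch (using Python's random as a stand-in for the (0;1)-generator):

```python
import math
import random

def exp_inv(u: float, lam: float) -> float:
    """Generalized inverse F^inv of the Exp(lam) distribution function:
    F(t) = 1 - exp(-lam*t), so F^inv(u) = -ln(1 - u)/lam."""
    return -math.log(1.0 - u) / lam

random.seed(0)
# Transform (0;1)-realizations u_i into realizations x_i = F^inv(u_i).
xs = [exp_inv(random.random(), lam=2.0) for _ in range(10_000)]

# The sample mean should be close to the expectation 1/lam = 0.5.
mean = sum(xs) / len(xs)
assert abs(mean - 0.5) < 0.05
```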
15.7 The evaluation of the generation of realizations of a probability measure P
15.7.1 Definition (empirical distribution function)
Let Xi : (Ω, A, P) → (R, B), i ∈ N, be random variables distributed according to P over (R, B), i.e. PXi =
P, i ∈ N.
For a realization ω̄ ∈ Ω the xi := Xi(ω̄) represent
realizations of the r.v. Xi, i ∈ N. Let further be
X = (Xi)i∈N and x = (xi)i∈N.
The mapping Fn : R × Ω → [0, 1], which is defined by

(15.7.1.1)    Fn(t, ω̄) := (1/n) Σ_{j=1}^{n} 1(−∞;t] ◦ Xj(ω̄),    (t ∈ R, ω̄ ∈ Ω),
turns out to be the distribution function of a special
probability measure.
Fn is called the empirical distribution function of
the r.v. X1 , . . . , Xn .
Notice: For a fixed t ∈ R resp. for a fixed ω̄ ∈ Ω the
term

1(−∞;t] ◦ Xj(ω̄)

according to the definition of an indicator function attains either the value 1 or 0, depending on whether

Xj(ω̄) ∈ (−∞; t]

or

Xj(ω̄) ∉ (−∞; t]

holds true.
Therefore Fn (t, ω̄) represents the percentage of those
Xj (ω̄), j ∈ Nn , lying within the interval (−∞; t].
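A direct transcription of (15.7.1.1) for a fixed outcome, i.e. for given realizations x1, …, xn:

```python
def empirical_df(xs: list[float], t: float) -> float:
    """Empirical distribution function F_n(t): the fraction of the
    realizations x_1, ..., x_n lying in the interval (-inf, t]."""
    return sum(1 for x in xs if x <= t) / len(xs)

xs = [0.2, 0.5, 0.5, 0.9]
assert empirical_df(xs, 0.5) == 0.75  # three of the four values are <= 0.5
assert empirical_df(xs, 0.1) == 0.0   # no value is <= 0.1
assert empirical_df(xs, 1.0) == 1.0   # all values are <= 1.0
```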
As a first step on the way to the Glivenko–Cantelli main theorem of
statistics, the following theorem concerning the empirical distribution function Fn
can be proved.
15.7.2 Theorem
Let the assumptions of Definition 15.7.1 be given,
where in particular any finite number of the r.v. Xi, i ∈ N,
are independent. Let F be the distribution function of
P.
Then we have

P({ω̄ ∈ Ω | lim_{n→∞} Fn(t, ω̄) = F(t)}) = 1,    t ∈ R,

i.e. for almost all ω̄ ∈ Ω, with the exception of a
P-null set, Fn(t, ω̄) converges to F(t).
This fact will be used in Experiment 15.1 to evaluate
the quality of (0;1)-realizations.
For (0;1)-realizations ui displayed on the vertical axis,
realizations xi := F inv(ui) are determined according
to the explanations of 15.6. Following Theorem 8.5,
the xi are realizations of independent r.v. distributed
according to P if the ui are realizations of independent
r.v. distributed according to the uniform distribution
over (0; 1); this can be examined on the basis of the
pointwise convergence of Fn(ti, ω̄) to F(ti) for selected
values of ti.