ONE RANDOM VARIABLE
The Cumulative Distribution Function (cdf)
The cdf is defined as the probability of the event $\{X \le x\}$:
$$F_X(x) = P[X \le x], \qquad -\infty < x < \infty$$
It applies to discrete as well as continuous RVs.
Example: three tosses of a fair coin; X is the number of heads.
$$F_X(x) = \begin{cases} 0, & x < 0 \\[2pt] \tfrac{1}{8}, & 0 \le x < 1 \\[2pt] \tfrac{1}{8} + \tfrac{3}{8} = \tfrac{1}{2}, & 1 \le x < 2 \\[2pt] \tfrac{1}{8} + \tfrac{3}{8} + \tfrac{3}{8} = \tfrac{7}{8}, & 2 \le x < 3 \\[2pt] \tfrac{1}{8} + \tfrac{3}{8} + \tfrac{3}{8} + \tfrac{1}{8} = 1, & x \ge 3 \end{cases}$$
The cdf can be written compactly in terms of the unit step function:
$$F_X(x) = \tfrac{1}{8}\,u(x) + \tfrac{3}{8}\,u(x-1) + \tfrac{3}{8}\,u(x-2) + \tfrac{1}{8}\,u(x-3)$$
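As a quick numerical check of the step-function form, the staircase values above can be reproduced directly; a minimal sketch (the helper names `u` and `cdf_steps` are illustrative, not from the notes):

```python
def u(x):
    """Unit step: u(x) = 1 for x >= 0, else 0."""
    return 1.0 if x >= 0 else 0.0

def cdf_steps(x):
    """F_X(x) = (1/8)u(x) + (3/8)u(x-1) + (3/8)u(x-2) + (1/8)u(x-3)."""
    return 0.125*u(x) + 0.375*u(x - 1) + 0.375*u(x - 2) + 0.125*u(x - 3)

# Spot-check the staircase values quoted in the example.
values = [cdf_steps(x) for x in (-1.0, 0.5, 1.5, 2.5, 3.5)]
# values == [0.0, 0.125, 0.5, 0.875, 1.0]
```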
Example: The uniform RV in the unit interval [0, 1]:
$$F_X(x) = \begin{cases} 0, & x < 0 \\ x, & 0 \le x \le 1 \\ 1, & x > 1 \end{cases}$$
The unit step function:
$$u(x) = \begin{cases} 0, & x < 0 \\ 1, & x \ge 0 \end{cases}$$
Example: The waiting time X of a customer at a taxi stand is zero if the customer finds a taxi
parked at the stand, and a uniformly distributed random length of time in the interval [0, 1] (in
hours) if no taxi is found upon arrival. The probability that a taxi is at the stand when the
customer arrives is p. Find the cdf of X.
Using the theorem on total probability,
$$F_X(x) = P[X \le x] = P[X \le x \mid \text{find taxi}]\,p + P[X \le x \mid \text{no taxi}]\,(1-p)$$
$$P[X \le x \mid \text{find taxi}] = \begin{cases} 0, & x < 0 \\ 1, & x \ge 0 \end{cases}$$
$$P[X \le x \mid \text{no taxi}] = \begin{cases} 0, & x < 0 \\ x, & 0 \le x \le 1 \\ 1, & x > 1 \end{cases}$$
$$F_X(x) = \begin{cases} 0, & x < 0 \\ p + (1-p)x, & 0 \le x < 1 \\ 1, & x \ge 1 \end{cases}$$
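The mixed cdf above can be verified by simulation; a sketch with an illustrative value of p (the parameter choice and helper names are not from the notes):

```python
import random

# Monte Carlo sketch of the taxi-stand cdf: with probability p the wait
# is 0, otherwise uniform on [0, 1], so F_X(x) = p + (1 - p) x on [0, 1).
random.seed(0)
p = 0.3
n = 200_000
samples = [0.0 if random.random() < p else random.random() for _ in range(n)]

def empirical_cdf(x):
    return sum(s <= x for s in samples) / n

x = 0.5
exact = p + (1 - p) * x          # 0.65
approx = empirical_cdf(x)
err = abs(approx - exact)        # should be small (roughly 1/sqrt(n))
```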
Basic properties of the cdf:
(i) $0 \le F_X(x) \le 1$.
(ii) $\lim_{x \to \infty} F_X(x) = 1$.
(iii) $\lim_{x \to -\infty} F_X(x) = 0$.
(iv) $F_X(x)$ is a nondecreasing function of x: if $a < b$, then $F_X(a) \le F_X(b)$.
(v) $F_X(x)$ is continuous from the right: for $h > 0$, $F_X(b) = \lim_{h \to 0} F_X(b+h) = F_X(b^+)$.
(vi) $P[a < X \le b] = F_X(b) - F_X(a)$.
(vii) $P[X = b] = F_X(b) - F_X(b^-)$ (the height of the jump of $F_X(x)$ at $x = b$, if any).
(viii) $P[X > x] = 1 - F_X(x)$.
The Three Types of Random Variables
Discrete random variables: have a cdf that is a right-continuous staircase function of x, with jumps at a countable set of points $x_0, x_1, x_2, \dots$:
$$F_X(x) = \sum_{x_k \le x} p_X(x_k) = \sum_k p_X(x_k)\,u(x - x_k)$$
Continuous random variables: the cdf is continuous everywhere and sufficiently smooth that it can be written as an integral of some nonnegative function $f(x)$:
$$F_X(x) = \int_{-\infty}^{x} f(t)\,dt, \qquad P[X = x] = 0$$
Random variable of mixed type: a random variable with a cdf that has jumps on a countable set of points $x_0, x_1, x_2, \dots$ but that also increases continuously over at least one interval of values of x.
Cdf:
$$F_X(x) = p\,F_1(x) + (1-p)\,F_2(x), \qquad 0 < p < 1,$$
where $F_1(x)$ is the cdf of a discrete RV and $F_2(x)$ is the cdf of a continuous RV.
The Probability Density Function (pdf)
The probability density function of X (pdf):
$$f_X(x) = \frac{dF_X(x)}{dx}$$
The pdf represents the "density" of probability at the point x:
$$P[x < X \le x+h] = F_X(x+h) - F_X(x) = \frac{F_X(x+h) - F_X(x)}{h}\,h \approx f_X(x)\,h$$
Properties:
(i) $f_X(x) \ge 0$ (since $F_X(x)$ is a nondecreasing function of x).
(ii) $P[a \le X \le b] = \int_a^b f_X(x)\,dx$ (the probability of an interval [a, b]).
(iii) $F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt$.
(iv) $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.
pdf of Discrete Random Variables
Using delta functions,
$$f_X(x) = \frac{dF_X(x)}{dx} = \sum_k p_X(x_k)\,\delta(x - x_k)$$
Conditional cdf’s and pdf’s
The conditional cdf of X given C (with $P[C] > 0$):
$$F_X(x \mid C) = \frac{P[\{X \le x\} \cap C]}{P[C]}$$
The conditional pdf of X given C:
$$f_X(x \mid C) = \frac{dF_X(x \mid C)}{dx}$$
Example:
The lifetime X of a machine has a continuous cdf $F_X(x)$. Find the conditional cdf and pdf given the event C = {X > t} (machine still working at time t).
Conditional cdf:
$$F_X(x \mid X > t) = \frac{P[\{X \le x\} \cap \{X > t\}]}{P[X > t]}$$
The intersection is empty when $x \le t$ and equals $\{t < X \le x\}$ when $x > t$, so
$$P[\{X \le x\} \cap \{X > t\}] = \begin{cases} 0, & x \le t \\ P[t < X \le x] = F_X(x) - F_X(t), & x > t \end{cases}$$
and $P[X > t] = 1 - P[X \le t] = 1 - F_X(t)$. Therefore
$$F_X(x \mid X > t) = \begin{cases} 0, & x \le t \\[4pt] \dfrac{F_X(x) - F_X(t)}{1 - F_X(t)}, & x > t \end{cases}$$
Conditional pdf:
$$f_X(x \mid X > t) = \begin{cases} 0, & x \le t \\[4pt] \dfrac{f_X(x)}{1 - F_X(t)}, & x > t \end{cases}$$
The cdf of X in terms of conditional cdf’s: let $B_1, B_2, \dots, B_n$ be a partition of S. Then
$$F_X(x) = P[X \le x] = \sum_{i=1}^{n} P[X \le x \mid B_i]\,P[B_i] = \sum_{i=1}^{n} F_X(x \mid B_i)\,P[B_i]$$
$$f_X(x) = \sum_{i=1}^{n} f_X(x \mid B_i)\,P[B_i]$$
Example:
A binary transmission system sends a “0” bit by transmitting a −v voltage signal, and a “1” bit by transmitting +v. The received signal is corrupted by Gaussian noise:
$$Y = X + N$$
where X is the transmitted signal and N is a noise voltage with pdf $f_N(x)$.
Assume P[“1”] = p = 1 − P[“0”]. Find the pdf of Y.
Let $B_0$ = {“0” is transmitted} and $B_1$ = {“1” is transmitted}. Then
$$F_Y(x) = F_Y(x \mid B_0)\,P[B_0] + F_Y(x \mid B_1)\,P[B_1] = P[Y \le x \mid X = -v]\,(1-p) + P[Y \le x \mid X = +v]\,p$$
$$P[Y \le x \mid X = -v] = P[N \le x + v] = F_N(x+v)$$
$$P[Y \le x \mid X = +v] = P[N \le x - v] = F_N(x-v)$$
$$F_Y(x) = F_N(x+v)(1-p) + F_N(x-v)\,p$$
$$f_Y(x) = \frac{dF_Y(x)}{dx} = f_N(x+v)(1-p) + f_N(x-v)\,p$$
For Gaussian noise with $f_N(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma}\,e^{-x^2/2\sigma^2}$,
$$f_Y(x) = \frac{1-p}{\sqrt{2\pi}\,\sigma}\,e^{-(x+v)^2/2\sigma^2} + \frac{p}{\sqrt{2\pi}\,\sigma}\,e^{-(x-v)^2/2\sigma^2}$$
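The two-term mixture density can be checked by simulation; a sketch with illustrative parameter values v, σ, p (not fixed by the notes):

```python
import math, random

# Monte Carlo sketch: Y = X + N with X = +/-v and Gaussian N; compare an
# empirical bin probability with the integral of the two-term mixture pdf.
random.seed(1)
v, sigma, p = 1.0, 0.5, 0.6
n = 200_000

def f_Y(x):
    gauss = lambda z: math.exp(-z*z / (2*sigma*sigma)) / (math.sqrt(2*math.pi)*sigma)
    return (1 - p) * gauss(x + v) + p * gauss(x - v)

samples = [(v if random.random() < p else -v) + random.gauss(0, sigma)
           for _ in range(n)]

# P[-0.2 < Y <= 0.2] empirically vs. a midpoint Riemann sum of f_Y.
emp = sum(-0.2 < s <= 0.2 for s in samples) / n
width = 0.4 / 1000
theory = sum(f_Y(-0.2 + (k + 0.5)*width) * width for k in range(1000))
err = abs(emp - theory)
```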
THE EXPECTED VALUE
The expected value or mean of a random variable X:
$$E[X] = \int_{-\infty}^{\infty} t\,f_X(t)\,dt$$
It exists if $E[|X|] = \int_{-\infty}^{\infty} |t|\,f_X(t)\,dt < \infty$.
Example:
Mean of a Uniform Random Variable
$$E[X] = \int_a^b t\,\frac{1}{b-a}\,dt = \frac{1}{b-a}\cdot\frac{b^2 - a^2}{2} = \frac{a+b}{2}$$
If the pdf is symmetric about a point m, i.e., $f_X(m - x) = f_X(m + x)$, then $E[X] = m$.
Example: Mean of an Exponential Random Variable
$$f_X(x) = \lambda e^{-\lambda x}, \qquad x \ge 0$$
Integrating by parts,
$$E[X] = \int_0^{\infty} x\,\lambda e^{-\lambda x}\,dx = \left[-x e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\,dx = \frac{1}{\lambda}$$
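The result E[X] = 1/λ is easy to confirm by simulation; a minimal sketch (the rate λ = 2 is an arbitrary illustrative choice):

```python
import random

# Sketch: check E[X] = 1/lambda for the exponential RV by simulation.
random.seed(2)
lam = 2.0
n = 500_000
mean_est = sum(random.expovariate(lam) for _ in range(n)) / n
err = abs(mean_est - 1 / lam)   # expect roughly sigma/sqrt(n), i.e. small
```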
Expected value of Y = g(X):
$$E[Y] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx$$
Example: Expected Values of a Sinusoid with Random Phase
$$Y = a\cos(\omega t + \Theta)$$
where a, ω, and t are constants and Θ is uniform in [0, 2π].
Find the expected value of Y and the expected value of the power of Y, $Y^2$:
$$E[Y] = E[a\cos(\omega t + \Theta)] = \int_0^{2\pi} a\cos(\omega t + \theta)\,\frac{d\theta}{2\pi} = \left.-\frac{a\sin(\omega t + \theta)}{2\pi}\right|_0^{2\pi} = 0$$
$$E[Y^2] = E[a^2\cos^2(\omega t + \Theta)] = E\!\left[\frac{a^2}{2}\bigl(1 + \cos(2\omega t + 2\Theta)\bigr)\right] = \frac{a^2}{2} + \frac{a^2}{2}\int_0^{2\pi}\cos(2\omega t + 2\theta)\,\frac{d\theta}{2\pi} = \frac{a^2}{2}$$
Some properties:
(i) $E[c] = c$, where c is a constant.
(ii) $E[cX] = c\,E[X]$.
(iii) $E\!\left[\sum_{k=1}^{n} g_k(X)\right] = \sum_{k=1}^{n} E[g_k(X)]$.
Variance of X:
$$\mathrm{VAR}[X] = E\!\left[(X - m_X)^2\right] = E[X^2] - m_X^2, \qquad m_X = E[X]$$
Standard deviation:
$$\mathrm{STD}[X] = \sqrt{\mathrm{VAR}[X]}$$
Example: Variance of the uniform RV
With $m_X = \dfrac{a+b}{2}$,
$$E\!\left[(X - m_X)^2\right] = \frac{1}{b-a}\int_a^b \left(x - \frac{a+b}{2}\right)^2 dx = \frac{(b-a)^2}{12}$$
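The value (b − a)²/12 can be confirmed by numerically integrating (x − m)²/(b − a) over [a, b]; a sketch with illustrative endpoints:

```python
# Sketch: verify VAR[X] = (b - a)^2 / 12 for a uniform RV by a midpoint
# Riemann sum of (x - m)^2 / (b - a) over [a, b].
a, b = 2.0, 7.0
m = (a + b) / 2
N = 100_000
h = (b - a) / N
var_num = sum((a + (k + 0.5)*h - m)**2 / (b - a) * h for k in range(N))
var_exact = (b - a)**2 / 12      # = 25/12 for these endpoints
err = abs(var_num - var_exact)
```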
Example: Variance of the Gaussian RV
Start from the normalization of the pdf:
$$\int_{-\infty}^{\infty} f_X(x)\,dx = \frac{1}{\sqrt{2\pi}\,\sigma}\int_{-\infty}^{\infty} e^{-(x-m)^2/2\sigma^2}\,dx = 1 \;\;\Longrightarrow\;\; \int_{-\infty}^{\infty} e^{-(x-m)^2/2\sigma^2}\,dx = \sqrt{2\pi}\,\sigma$$
Differentiate both sides with respect to σ:
$$\int_{-\infty}^{\infty} \frac{(x-m)^2}{\sigma^3}\,e^{-(x-m)^2/2\sigma^2}\,dx = \sqrt{2\pi}$$
Rearranging,
$$\frac{1}{\sqrt{2\pi}\,\sigma}\int_{-\infty}^{\infty} (x-m)^2\,e^{-(x-m)^2/2\sigma^2}\,dx = \sigma^2,$$
that is, $\mathrm{VAR}[X] = \sigma^2$.
The nth moment of the random variable X:
$$E[X^n] = \int_{-\infty}^{\infty} x^n\,f_X(x)\,dx$$
IMPORTANT RVs
The uniform RV
Pdf:
$$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\[4pt] 0, & x < a \text{ or } x > b \end{cases}$$
Cdf:
$$F_X(x) = \begin{cases} 0, & x < a \\[2pt] \dfrac{x-a}{b-a}, & a \le x \le b \\[4pt] 1, & x > b \end{cases}$$
Mean and variance:
$$E[X] = \frac{a+b}{2}, \qquad \mathrm{VAR}[X] = \frac{(b-a)^2}{12}$$
The exponential RV
Pdf:
$$f_X(x) = \begin{cases} 0, & x < 0 \\ \lambda e^{-\lambda x}, & x \ge 0 \end{cases}$$
Cdf:
$$F_X(x) = \begin{cases} 0, & x < 0 \\ 1 - e^{-\lambda x}, & x \ge 0 \end{cases}$$
Mean and variance:
$$E[X] = \frac{1}{\lambda}, \qquad \mathrm{VAR}[X] = \frac{1}{\lambda^2}$$
Memoryless property:
$$P[X > t + h \mid X > t] = P[X > h]$$
Proof:
$$P[X > t + h \mid X > t] = \frac{P[\{X > t + h\} \cap \{X > t\}]}{P[X > t]} = \frac{P[X > t + h]}{P[X > t]} = \frac{e^{-\lambda(t+h)}}{e^{-\lambda t}} = e^{-\lambda h} = P[X > h]$$
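The memoryless property is also easy to see empirically; a simulation sketch with illustrative λ, t, h:

```python
import random

# Sketch: check P[X > t + h | X > t] = P[X > h] for the exponential RV.
random.seed(3)
lam, t, h = 1.5, 0.4, 0.7
n = 400_000
samples = [random.expovariate(lam) for _ in range(n)]

survivors = [s for s in samples if s > t]
cond = sum(s > t + h for s in survivors) / len(survivors)
uncond = sum(s > h for s in samples) / n
err = abs(cond - uncond)        # both estimate e^{-lam*h}
```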
The Gaussian (Normal) Random Variable
Pdf:
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\,e^{-(x-m)^2/2\sigma^2}, \qquad -\infty < x < \infty$$
Cdf:
$$F_X(x) = P[X \le x] = \frac{1}{\sqrt{2\pi}\,\sigma}\int_{-\infty}^{x} e^{-(u-m)^2/2\sigma^2}\,du$$
With the change of variable $t = (u - m)/\sigma$,
$$F_X(x) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{(x-m)/\sigma} e^{-t^2/2}\,dt = \Phi\!\left(\frac{x-m}{\sigma}\right)$$
where
$$\Phi(x) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{x} e^{-t^2/2}\,dt$$
is the cdf of a Gaussian RV with m = 0 and σ = 1.
Example: Show that the Gaussian pdf integrates to one.
$$\left[\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-x^2/2}\,dx\right]^2 = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-x^2/2}\,dx\int_{-\infty}^{\infty} e^{-y^2/2}\,dy = \frac{1}{2\pi}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)/2}\,dx\,dy$$
Change to polar coordinates, $x = r\cos\theta$, $y = r\sin\theta$, so that $x^2 + y^2 = r^2$ and $dx\,dy = r\,dr\,d\theta$:
$$\frac{1}{2\pi}\int_0^{2\pi}\!\!\int_0^{\infty} e^{-r^2/2}\,r\,dr\,d\theta = \int_0^{\infty} e^{-r^2/2}\,r\,dr = 1$$
The Q-function:
$$Q(x) = 1 - \Phi(x) = \frac{1}{\sqrt{2\pi}}\int_x^{\infty} e^{-t^2/2}\,dt$$
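In practice Q(x) can be evaluated without numerical integration via the complementary error function, since Q(x) = ½ erfc(x/√2); a sketch comparing this identity against the defining integral:

```python
import math

# Q(x) = 0.5 * erfc(x / sqrt(2)): convenient closed-form evaluation
# of the Gaussian tail probability.
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Compare against a midpoint Riemann sum of the defining integral,
# truncated at x + 9 (the neglected tail is negligible).
def Q_numeric(x, upper=10.0, N=50_000):
    h = (upper - x) / N
    return sum(math.exp(-(x + (k + 0.5)*h)**2 / 2) * h
               for k in range(N)) / math.sqrt(2*math.pi)

err = abs(Q(1.0) - Q_numeric(1.0))   # Q(1) ~ 0.1587
```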
The Gamma Random Variable
Pdf: for $\alpha > 0$, $\lambda > 0$,
$$f_X(x) = \frac{\lambda(\lambda x)^{\alpha-1}\,e^{-\lambda x}}{\Gamma(\alpha)}, \qquad 0 < x < \infty$$
where
$$\Gamma(z) = \int_0^{\infty} x^{z-1} e^{-x}\,dx, \qquad z > 0$$
is the Gamma function. Its properties:
$$\Gamma(\tfrac{1}{2}) = \sqrt{\pi}$$
$$\Gamma(z+1) = z\,\Gamma(z), \qquad z > 0$$
$$\Gamma(m+1) = m!, \qquad m \text{ a nonnegative integer}$$
When $\alpha = m$, a positive integer, X is an m-Erlang RV.
Check that the pdf integrates to one: with the change of variable $y = \lambda x$,
$$\int_0^{\infty} f_X(x)\,dx = \frac{1}{\Gamma(\alpha)}\int_0^{\infty} (\lambda x)^{\alpha-1} e^{-\lambda x}\,\lambda\,dx = \frac{1}{\Gamma(\alpha)}\int_0^{\infty} y^{\alpha-1} e^{-y}\,dy = 1$$
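The normalization can also be checked numerically, using `math.gamma` for the constant; a sketch with illustrative α, λ:

```python
import math

# Sketch: numerically confirm that the Gamma pdf integrates to one.
alpha, lam = 2.5, 1.7

def f_X(x):
    return lam * (lam*x)**(alpha - 1) * math.exp(-lam*x) / math.gamma(alpha)

# Midpoint Riemann sum on (0, 50]; the tail beyond 50 is negligible.
N = 100_000
h = 50.0 / N
total = sum(f_X((k + 0.5) * h) * h for k in range(N))
err = abs(total - 1.0)
```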
The Cauchy Random Variable
$$f_X(x) = \frac{1/\pi}{1 + x^2}, \qquad -\infty < x < \infty$$
X has no moments because the defining integrals do not converge.
FUNCTIONS OF A RANDOM VARIABLE
Given Y = g(X), the problem is to find the pdf of Y in terms of the pdf of X.
Example: Y = aX + b
For $a > 0$:
$$F_Y(y) = P[Y \le y] = P[aX + b \le y] = P\!\left[X \le \frac{y-b}{a}\right] = F_X\!\left(\frac{y-b}{a}\right), \qquad f_Y(y) = \frac{1}{a}\,f_X\!\left(\frac{y-b}{a}\right)$$
For $a < 0$:
$$F_Y(y) = P\!\left[X \ge \frac{y-b}{a}\right] = 1 - F_X\!\left(\frac{y-b}{a}\right), \qquad f_Y(y) = -\frac{1}{a}\,f_X\!\left(\frac{y-b}{a}\right)$$
Combining the two cases:
$$f_Y(y) = \frac{1}{|a|}\,f_X\!\left(\frac{y-b}{a}\right)$$
Example: Y = X²
$$F_Y(y) = P[Y \le y] = \begin{cases} 0, & y < 0 \\ P[-\sqrt{y} \le X \le \sqrt{y}] = F_X(\sqrt{y}) - F_X(-\sqrt{y}), & y \ge 0 \end{cases}$$
Differentiating with respect to y (chain rule, with $\frac{d}{dy}\sqrt{y} = \frac{1}{2\sqrt{y}}$):
$$f_Y(y) = \frac{f_X(\sqrt{y})}{2\sqrt{y}} + \frac{f_X(-\sqrt{y})}{2\sqrt{y}}, \qquad y > 0$$
If X is Gaussian with mean m = 0 and SD σ = 1,
$$f_X(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2} \;\;\Longrightarrow\;\; f_Y(y) = \frac{1}{\sqrt{2\pi y}}\,e^{-y/2}, \qquad y > 0$$
(a chi-square RV with one degree of freedom)
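The chi-square(1) result can be checked by squaring simulated standard normals and comparing a cdf value against F_Y(y) = Φ(√y) − Φ(−√y); a minimal sketch:

```python
import math, random

# Sketch: squaring standard normals should reproduce the chi-square(1)
# cdf; compare P[Y <= 1] empirically against Phi(1) - Phi(-1).
random.seed(4)
n = 300_000
emp = sum(random.gauss(0, 1)**2 <= 1.0 for _ in range(n)) / n

Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
theory = Phi(1.0) - Phi(-1.0)    # ~ 0.6827
err = abs(emp - theory)
```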
General Case
Denote the solutions of $y_0 = g(x)$ by $x_1, x_2, \dots, x_n$. For example, with n = 3,
$$P[y < Y \le y + dy] = P[x_1 < X \le x_1 + dx_1] + P[x_2 + dx_2 < X \le x_2] + P[x_3 < X \le x_3 + dx_3]$$
$$f_Y(y)\,|dy| = f_X(x_1)\,|dx_1| + f_X(x_2)\,|dx_2| + f_X(x_3)\,|dx_3|$$
$$f_Y(y) = \frac{f_X(x_1)}{|dy/dx|_{x=x_1}} + \frac{f_X(x_2)}{|dy/dx|_{x=x_2}} + \frac{f_X(x_3)}{|dy/dx|_{x=x_3}}$$
In general,
$$f_Y(y) = \sum_{k=1}^{n} \frac{f_X(x_k)}{|g'(x_k)|}$$
Previous example: Y = X². For $y > 0$, $y = x^2$ has two solutions, $x_1 = \sqrt{y}$ and $x_2 = -\sqrt{y}$, and $g'(x) = 2x$, so
$$|g'(x_1)| = |g'(x_2)| = 2\sqrt{y}$$
For the standard Gaussian X of the previous example,
$$f_Y(y) = \frac{f_X(x_1)}{|g'(x_1)|} + \frac{f_X(x_2)}{|g'(x_2)|} = \frac{\frac{1}{\sqrt{2\pi}}e^{-y/2}}{2\sqrt{y}} + \frac{\frac{1}{\sqrt{2\pi}}e^{-y/2}}{2\sqrt{y}} = \frac{1}{\sqrt{2\pi y}}\,e^{-y/2}$$
Example: Y = cos(X), where X is uniformly distributed in [0, 2π], i.e., $f_X(x) = \frac{1}{2\pi}$, $0 \le x \le 2\pi$.
For $-1 < y < 1$, $y = \cos(x)$ has two solutions in [0, 2π]:
$$x_1 = \cos^{-1}(y), \qquad x_2 = 2\pi - \cos^{-1}(y)$$
With $g'(x) = -\sin(x)$ and, for $\theta = \cos^{-1}(y)$, $\sin(\theta) = \sqrt{1 - \cos^2(\theta)} = \sqrt{1 - y^2}$,
$$|g'(x_1)| = |g'(x_2)| = \sqrt{1 - y^2}$$
$$f_Y(y) = \frac{1/2\pi}{\sqrt{1 - y^2}} + \frac{1/2\pi}{\sqrt{1 - y^2}} = \frac{1}{\pi\sqrt{1 - y^2}}, \qquad -1 < y < 1$$
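Integrating this density gives the closed-form cdf P[Y ≤ y] = 1 − arccos(y)/π, which a quick simulation confirms; a sketch:

```python
import math, random

# Sketch: check f_Y(y) = 1 / (pi * sqrt(1 - y^2)) for Y = cos(X),
# X uniform on [0, 2*pi], via the cdf P[Y <= y] = 1 - arccos(y)/pi.
random.seed(5)
n = 300_000
emp = sum(math.cos(random.uniform(0, 2*math.pi)) <= 0.5 for _ in range(n)) / n

theory = 1 - math.acos(0.5) / math.pi    # = 1 - 1/3 = 2/3
err = abs(emp - theory)
```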
THE MARKOV AND CHEBYSHEV INEQUALITIES
Markov inequality: for a nonnegative RV X,
$$P[X \ge a] \le \frac{E[X]}{a}$$
This bound is useful if we have no information about the RV except its mean value.
Proof:
$$E[X] = \int_0^{\infty} t\,f_X(t)\,dt = \int_0^{a} t\,f_X(t)\,dt + \int_a^{\infty} t\,f_X(t)\,dt \ge \int_a^{\infty} t\,f_X(t)\,dt \ge a\int_a^{\infty} f_X(t)\,dt = a\,P[X \ge a]$$
If only the mean and variance are known, the Chebyshev inequality gives a tighter bound than Markov:
$$P[|X - m| \ge a] \le \frac{\sigma^2}{a^2}$$
Proof: let $D^2 = (X - m)^2$ and apply the Markov inequality to the nonnegative RV $D^2$:
$$P[|X - m| \ge a] = P[D^2 \ge a^2] \le \frac{E[D^2]}{a^2} = \frac{\sigma^2}{a^2}$$
Example: Multi-user computer system with mean response time m = 15 s and standard deviation σ = 3 s. Estimate the probability that the response time is more than 5 seconds from the mean.
By the Chebyshev inequality with a = 5 s,
$$P[|X - 15| \ge 5] \le \frac{9}{25} = 0.36$$
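To see how conservative the bound is, compare it with the exact tail probability under an added distributional assumption (not made in the notes), say a Gaussian response time; a sketch:

```python
import math

# Chebyshev bound for m = 15, sigma = 3, a = 5 vs. the exact probability
# IF the response time were Gaussian (illustrative assumption):
# P[|X - 15| >= 5] = 2*Q(5/3), which is far below the bound 9/25.
m, sigma, a = 15.0, 3.0, 5.0
cheb = sigma**2 / a**2                             # 0.36
Q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))
gauss_exact = 2 * Q(a / sigma)                     # ~ 0.096
```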
The Chernoff bound: let A = {t ≥ a}, with indicator function
$$I_A(t) = \begin{cases} 1, & t \ge a \\ 0, & t < a \end{cases}$$
so that
$$P[X \ge a] = \int_0^{\infty} I_A(t)\,f_X(t)\,dt$$
Choosing the bound $I_A(t) \le t/a$ for all $t \ge 0$ recovers the Markov inequality:
$$\int_0^{\infty} I_A(t)\,f_X(t)\,dt \le \int_0^{\infty} \frac{t}{a}\,f_X(t)\,dt = \frac{E[X]}{a}$$
Instead, let the bound be
$$I_A(t) \le e^{s(t-a)}, \qquad s \ge 0$$
Then
$$\int_0^{\infty} I_A(t)\,f_X(t)\,dt \le \int_0^{\infty} e^{s(t-a)}\,f_X(t)\,dt = e^{-sa}\int_0^{\infty} e^{st}\,f_X(t)\,dt = e^{-sa}\,E[e^{sX}]$$
$$P[X \ge a] \le e^{-sa}\,E[e^{sX}] \qquad \text{(the Chernoff bound)}$$
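Since the bound holds for every s ≥ 0, it can be minimized over s. For a standard Gaussian (an illustrative choice, not from the notes) E[e^{sX}] = e^{s²/2}, so the bound e^{−sa+s²/2} is minimized at s = a, giving e^{−a²/2}; a sketch:

```python
import math

# Chernoff bound for a standard Gaussian: e^{-s*a + s^2/2}, minimized
# over s >= 0 at s = a, giving e^{-a^2/2}.  The exact tail is Q(a).
a = 3.0
bounds = [math.exp(-s*a + s*s/2) for s in (1.0, 2.0, a, 4.0)]
best = min(bounds)                                  # attained at s = a
chernoff = math.exp(-a*a/2)                         # ~ 0.0111
Q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))
ok = Q(a) <= chernoff                               # the bound holds
```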
The Characteristic Function:
Continuous RVs:
$$\Phi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} f_X(x)\,e^{j\omega x}\,dx$$
(the Fourier transform of the pdf), with inverse Fourier transform
$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \Phi_X(\omega)\,e^{-j\omega x}\,d\omega$$
Discrete RVs:
$$\Phi_X(\omega) = \sum_k p_X(x_k)\,e^{j\omega x_k}$$
For integer-valued RVs:
$$\Phi_X(\omega) = \sum_k p_X(k)\,e^{j\omega k}, \qquad p_X(k) = \frac{1}{2\pi}\int_0^{2\pi} \Phi_X(\omega)\,e^{-j\omega k}\,d\omega$$
Example: Exponential RV
$$\Phi_X(\omega) = \int_0^{\infty} \lambda e^{-\lambda x}\,e^{j\omega x}\,dx = \lambda\int_0^{\infty} e^{-(\lambda - j\omega)x}\,dx = \frac{\lambda}{\lambda - j\omega}$$
Example: Geometric RV
$$\Phi_X(\omega) = \sum_{k=0}^{\infty} p\,q^k e^{j\omega k} = p\sum_{k=0}^{\infty} (q e^{j\omega})^k = \frac{p}{1 - q e^{j\omega}}$$
The Moment Theorem:
$$E[X^n] = \frac{1}{j^n}\left.\frac{d^n}{d\omega^n}\Phi_X(\omega)\right|_{\omega=0}$$
Proof: the Taylor series expansion of $e^{j\omega x}$ is
$$e^{j\omega x} = 1 + j\omega x + \frac{(j\omega x)^2}{2!} + \cdots$$
so
$$\Phi_X(\omega) = \int_{-\infty}^{\infty} f_X(x)\,e^{j\omega x}\,dx = \int_{-\infty}^{\infty} f_X(x)\left[1 + j\omega x + \frac{(j\omega x)^2}{2!} + \cdots\right]dx$$
$$= 1 + j\omega\,E[X] + \frac{(j\omega)^2}{2!}\,E[X^2] + \cdots + \frac{(j\omega)^n}{n!}\,E[X^n] + \cdots$$
Compare with the Taylor series expansion of $\Phi_X(\omega)$ about ω = 0, in which the coefficient of $\omega^n$ is $\Phi_X^{(n)}(0)/n!$:
$$\frac{\Phi_X^{(n)}(0)}{n!} = \frac{(j)^n}{n!}\,E[X^n] \;\;\Longrightarrow\;\; E[X^n] = \frac{1}{j^n}\,\Phi_X^{(n)}(0) = \frac{1}{j^n}\left.\frac{d^n}{d\omega^n}\Phi_X(\omega)\right|_{\omega=0}$$
Example: mean and variance of an exponentially distributed random variable.
$$\Phi_X(\omega) = \frac{\lambda}{\lambda - j\omega}, \qquad \Phi_X'(\omega) = \frac{j\lambda}{(\lambda - j\omega)^2}, \qquad \Phi_X''(\omega) = \frac{2j^2\lambda}{(\lambda - j\omega)^3}$$
$$E[X] = \frac{1}{j}\,\Phi_X'(0) = \frac{1}{j}\cdot\frac{j\lambda}{\lambda^2} = \frac{1}{\lambda}$$
$$E[X^2] = \frac{1}{j^2}\,\Phi_X''(0) = \frac{1}{j^2}\cdot\frac{2j^2\lambda}{\lambda^3} = \frac{2}{\lambda^2}$$
$$\mathrm{VAR}[X] = E[X^2] - E[X]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$$
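The moment theorem can be sanity-checked numerically: differentiate Φ_X(ω) = λ/(λ − jω) at ω = 0 by a central difference and divide by j; a sketch with an illustrative λ:

```python
# Sketch: estimate E[X] for the exponential RV from a numerical
# derivative of Phi_X(w) = lam / (lam - j*w) at w = 0, and compare
# with the moment-theorem result 1/lam.
lam = 2.0
j = 1j
Phi = lambda w: lam / (lam - j*w)

h = 1e-5
deriv = (Phi(h) - Phi(-h)) / (2*h)    # central difference, ~ j/lam
mean_est = (deriv / j).real
err = abs(mean_est - 1/lam)
```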
RELIABILITY
The reliability at time t is defined as the probability that the component, subsystem, or system is still functioning at time t:
$$R(t) = P[T > t]$$
where T is the lifetime of the system. In terms of the cdf $F_T(t)$ of T,
$$R(t) = 1 - P[T \le t] = 1 - F_T(t), \qquad R'(t) = -f_T(t)$$
The mean time to failure (MTTF) is given by the expected value of T; integrating by parts,
$$E[T] = \int_0^{\infty} t\,f_T(t)\,dt = \left[-t\,R(t)\right]_0^{\infty} + \int_0^{\infty} R(t)\,dt = \int_0^{\infty} R(t)\,dt$$
The conditional cdf of T given that T > t:
$$F_T(x \mid T > t) = P[T \le x \mid T > t] = \begin{cases} 0, & x \le t \\[4pt] \dfrac{F_T(x) - F_T(t)}{1 - F_T(t)}, & x > t \end{cases}$$
$$f_T(x \mid T > t) = \frac{f_T(x)}{1 - F_T(t)}, \qquad x > t$$
The failure rate function r(t):
$$r(t) = f_T(t \mid T > t) = \frac{f_T(t)}{R(t)} = -\frac{R'(t)}{R(t)}$$
r(t) dt is the probability that a component that has functioned up to time t will fail in the next dt seconds.
Example: Exponential Failure Law
Assume a constant failure rate function, $r(t) = \lambda$. Then
$$-\frac{R'(t)}{R(t)} = \lambda \;\;\Longrightarrow\;\; \frac{dR}{R} = -\lambda\,dt \;\;\Longrightarrow\;\; \ln\frac{R(t)}{R(0)} = -\lambda t$$
With the initial condition R(0) = 1,
$$R(t) = e^{-\lambda t}, \qquad t \ge 0$$
$$f_T(t) = -R'(t) = \lambda e^{-\lambda t}, \qquad t \ge 0$$
That is, T is an exponentially distributed RV.
Reliability of Systems
Let $A_s$ be the event "system functioning at time t" and $A_j$ the event "jth component is functioning at time t".
Series connection: the system functions only if all components function. Assuming the components fail independently,
$$P[A_s] = P[A_1 \cap A_2 \cap \cdots \cap A_n] = P[A_1]\,P[A_2]\cdots P[A_n]$$
$$R(t) = R_1(t)\,R_2(t)\cdots R_n(t)$$
If the components have constant failure rates, $R_i(t) = e^{-\lambda_i t}$, then
$$R(t) = e^{-(\lambda_1 + \lambda_2 + \cdots + \lambda_n)t}$$
Parallel connection: the system will not be functioning if and only if all the components have failed, so
$$P[A_s^c] = P[A_1^c]\,P[A_2^c]\cdots P[A_n^c]$$
$$1 - R(t) = \bigl(1 - R_1(t)\bigr)\bigl(1 - R_2(t)\bigr)\cdots\bigl(1 - R_n(t)\bigr)$$
$$R(t) = 1 - \bigl(1 - R_1(t)\bigr)\bigl(1 - R_2(t)\bigr)\cdots\bigl(1 - R_n(t)\bigr)$$
Example: Compare the reliability of a single-unit system against that of a system that operates two units in parallel. Assume all units have exponentially distributed lifetimes with rate 1.
Single-unit system:
$$R_s(t) = e^{-t}$$
Two units in parallel:
$$R_p(t) = 1 - (1 - e^{-t})(1 - e^{-t}) = e^{-t}(2 - e^{-t}) \ge e^{-t} = R_s(t)$$
so $R_p(t) \ge R_s(t)$ for all t.
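The comparison extends to the MTTFs: integrating R(t) gives 1 for the single unit and ∫(2e^{−t} − e^{−2t})dt = 3/2 for the parallel pair. A sketch checking both the pointwise gain and the MTTFs numerically:

```python
import math

# Single unit vs. two units in parallel, both with unit failure rate:
# R_s(t) = e^{-t},  R_p(t) = 1 - (1 - e^{-t})^2 = e^{-t}(2 - e^{-t}).
Rs = lambda t: math.exp(-t)
Rp = lambda t: 1 - (1 - math.exp(-t))**2

gain_holds = all(Rp(t) >= Rs(t) for t in (0.0, 0.5, 1.0, 2.0, 5.0))

# MTTF = integral of R(t); midpoint Riemann sum on [0, 40] (tail negligible).
def mttf(R, upper=40.0, N=100_000):
    h = upper / N
    return sum(R((k + 0.5)*h) * h for k in range(N))

mttf_s = mttf(Rs)   # ~ 1.0
mttf_p = mttf(Rp)   # ~ 1.5
```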