Notes

Lecture 5
Multivariable Distributions
Let $X_1, X_2, \ldots, X_n$ be the outcomes of $n$ successive trials of an experiment.
Example: Toss a pair of dice.
Definition: Let $X$ and $Y$ be discrete random variables. The joint (or bivariate) probability function for $X$ and $Y$ is given by
$p(x, y) = P(X = x, Y = y)$.
Theorem: If $X$ and $Y$ are discrete random variables with joint probability function $p(x, y)$, then
(1) $p(x, y) \ge 0$ for all $x, y$;
(2) $\sum_x \sum_y p(x, y) = 1$.
For the example above:
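A hedged worked sketch for the dice example (assuming, as an illustration, that $X$ is the number showing on the first die and $Y$ the number on the second; the lecture's exact setup is not reproduced here): each of the 36 ordered pairs is equally likely, so
$p(x, y) = P(X = x, Y = y) = \tfrac{1}{36}, \qquad x, y \in \{1, 2, \ldots, 6\}.$
Both properties hold: $p(x, y) \ge 0$ and $\sum_{x=1}^{6} \sum_{y=1}^{6} \tfrac{1}{36} = 1$.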
Definition: For any random variables $X$ and $Y$, the joint (bivariate) distribution function is given by
$F(x, y) = P(X \le x, Y \le y)$.
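For instance, continuing the dice illustration above (same hedged assumptions),
$F(2, 3) = P(X \le 2, Y \le 3) = \tfrac{2 \cdot 3}{36} = \tfrac{1}{6}.$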
Definition: Let $X$ and $Y$ be continuous random variables with joint distribution function $F(x, y)$. If there exists a non-negative function $f(x, y)$ such that
$F(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(s, t)\, dt\, ds$,
then $X$ and $Y$ are called jointly continuous random variables. The function $f(x, y)$ is called the joint probability density function.
Properties of $f(x, y)$:
(1) $f(x, y) \ge 0$ for all $(x, y)$;
(2) $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dy\, dx = 1$.
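As a quick illustrative check (an assumed density, not one from the text): take $f(x, y) = 1$ for $0 \le x \le 1$, $0 \le y \le 1$ and $0$ elsewhere. Then $f \ge 0$, $\int_0^1 \int_0^1 1\, dy\, dx = 1$, and on the unit square $F(x, y) = \int_0^x \int_0^y 1\, dt\, ds = xy$.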
Example (#5.8): Let $X$ and $Y$ have a joint probability density function
$f(x, y) = \begin{cases} \ldots \end{cases}$
(a) Find $k$;
(b) Find $P(\ldots)$;
(c) Find $P(\ldots)$.
Solution:
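Since the density itself is not reproduced in these notes, here is a sketch of the usual approach to part (a) under an assumed density (purely illustrative, not necessarily the textbook's): if $f(x, y) = k\, x y$ on $0 \le x \le 1$, $0 \le y \le 1$, then property (2) forces
$1 = \int_0^1 \int_0^1 k\, x y\, dy\, dx = \tfrac{k}{4}, \qquad \text{so } k = 4.$
Parts (b) and (c) are then double integrals of $f$ over the stated regions.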
Example (#5.16):
$f(x, y) = \begin{cases} \ldots \end{cases}$
(a) Find (…);
(b) Find (…).
Solution:
Marginal and Conditional Distributions
Example: Toss two dice.
Definition: Let $X$ and $Y$ be jointly discrete (continuous) random variables with probability function $p(x, y)$ (density function $f(x, y)$). Then the marginal probability functions (marginal density functions) of $X$ and $Y$ are given by
$p_X(x) = \sum_y p(x, y)$ and $p_Y(y) = \sum_x p(x, y)$
$\left( f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy \ \text{ and } \ f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx \right).$
Example: Let
$f(x, y) = \begin{cases} \ldots \end{cases}$
Sketch $f(x, y)$ and find $f_X(x)$ and $f_Y(y)$.
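A hedged sketch of how such a marginal computation goes, under an assumed density (not necessarily the one in lecture): if $f(x, y) = x + y$ on $0 \le x \le 1$, $0 \le y \le 1$, then
$f_X(x) = \int_0^1 (x + y)\, dy = x + \tfrac{1}{2}, \quad 0 \le x \le 1,$
and by symmetry $f_Y(y) = y + \tfrac{1}{2}$ on $0 \le y \le 1$.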
Definition: If $X$ and $Y$ are jointly discrete random variables with joint probability function $p(x, y)$ and marginal probability functions $p_X(x)$ and $p_Y(y)$, then the conditional discrete probability function of $X$ given $Y$ is
$p(x \mid y) = P(X = x \mid Y = y) = \dfrac{p(x, y)}{p_Y(y)}, \quad \text{provided } p_Y(y) > 0.$
Definition: If $X$ and $Y$ are jointly continuous random variables with joint density function $f(x, y)$, then the conditional distribution function of $X$ given $Y = y$ is
$F(x \mid y) = P(X \le x \mid Y = y).$
Note: $F(x \mid y)$ is a function of $x$ for fixed $y$.
Definition: Let $X$ and $Y$ be jointly continuous random variables with joint density $f(x, y)$ and marginal densities $f_X(x)$ and $f_Y(y)$, respectively. For any $y$ such that $f_Y(y) > 0$, the conditional density of $X$ given $Y = y$ is
$f(x \mid y) = \dfrac{f(x, y)}{f_Y(y)}.$
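Continuing the assumed density $f(x, y) = x + y$ on the unit square from above (again an illustration, not the lecture's example): for $0 \le y \le 1$,
$f(x \mid y) = \dfrac{x + y}{y + \tfrac{1}{2}}, \quad 0 \le x \le 1,$
which depends on $y$, so conditioning genuinely changes the distribution of $X$ here.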
Example: (#5.26) Let $X$ and $Y$ have joint density
$f(x, y) = \begin{cases} \ldots \end{cases}$
Find
(a) (…) and (…);
(b) (…);
(c) (…);
(d) (…);
(e) (…).
Solution:
Independent Random Variables
Recall:
Definition: Let $X$ have distribution function $F_X(x)$, $Y$ have distribution function $F_Y(y)$, and $X$ and $Y$ have joint distribution function $F(x, y)$. Then $X$ and $Y$ are said to be independent if and only if
$F(x, y) = F_X(x)\, F_Y(y)$ for all $(x, y)$.
(Otherwise they are called dependent.)
Theorem: If $X$ and $Y$ are discrete (continuous) random variables with joint probability (joint density) function $p(x, y)$ ($f(x, y)$) and marginal probability (density) functions $p_X(x)$ and $p_Y(y)$ ($f_X(x)$ and $f_Y(y)$), respectively, then $X$ and $Y$ are independent if and only if
$p(x, y) = p_X(x)\, p_Y(y)$ for all $(x, y)$
$\bigl( f(x, y) = f_X(x)\, f_Y(y) \text{ for all } (x, y) \bigr).$
Example: (die tossing) Show that $X$ and $Y$ are independent.
Example: Let
$f(x, y) = \begin{cases} \ldots \end{cases}$
Show that $X$ and $Y$ are independent.
Solution:
Theorem: Let $X$ and $Y$ have a joint density function $f(x, y)$ that is positive if and only if $a \le x \le b$ and $c \le y \le d$, where a, b, c, d are constants (and is zero otherwise). Then $X$ and $Y$ are independent if and only if
$f(x, y) = g(x)\, h(y),$
where $g(x)$ and $h(y)$ are nonnegative functions of $x$ alone and $y$ alone, respectively.
Note: $g(x)$ and $h(y)$ need not be density functions.
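Two quick hedged illustrations (assumed densities, not from the text): if $f(x, y) = 6 x y^2$ on $0 \le x \le 1$, $0 \le y \le 1$, the support is a rectangle and $f$ factors as $g(x)\, h(y)$ with $g(x) = x$ and $h(y) = 6 y^2$, so $X$ and $Y$ are independent even though neither factor is itself a density. By contrast, for $f(x, y) = 2$ on $0 \le y \le x \le 1$ the support is not a rectangle, the theorem does not apply, and $X$ and $Y$ are in fact dependent.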
Example: (#5.52) Let
$f(x, y) = \begin{cases} \ldots \end{cases}$
Are $X$ and $Y$ independent?
Solution:
Expected Value of a Function of Random Variables
Definition: Let $g(X, Y)$ be a function of the discrete (continuous) random variables $X$ and $Y$ with probability function $p(x, y)$ (density function $f(x, y)$). Then
$E[g(X, Y)] = \sum_x \sum_y g(x, y)\, p(x, y)$
$\left( E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dy\, dx \right).$
Example: Let
$f(x, y) = \begin{cases} \ldots \end{cases}$
Find (…) and (…).
Solution:
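A hedged sketch of such a computation, under the assumed density $f(x, y) = 4xy$ on the unit square (illustrative only):
$E(X) = \int_0^1 \int_0^1 x \cdot 4xy\, dy\, dx = \tfrac{2}{3}, \qquad E(XY) = \int_0^1 \int_0^1 xy \cdot 4xy\, dy\, dx = \tfrac{4}{9}.$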
Theorem:
(1) $E(c) = c$, where $c$ is a constant;
(2) $E[c\, g(X, Y)] = c\, E[g(X, Y)]$, where $g(X, Y)$ is a function of $X$ and $Y$;
(3) $E[g_1(X, Y) + g_2(X, Y)] = E[g_1(X, Y)] + E[g_2(X, Y)]$.
Proof: exercise.
Theorem: Let $X$ and $Y$ be independent. Then
$E[g(X)\, h(Y)] = E[g(X)]\, E[h(Y)],$
where $g(X)$ and $h(Y)$ are functions of only $X$ and only $Y$, respectively.
Proof:
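A sketch of the continuous case (one way the blank proof might be filled in): by independence $f(x, y) = f_X(x)\, f_Y(y)$, so
$E[g(X) h(Y)] = \int \int g(x) h(y)\, f_X(x) f_Y(y)\, dy\, dx = \left( \int g(x) f_X(x)\, dx \right) \left( \int h(y) f_Y(y)\, dy \right) = E[g(X)]\, E[h(Y)].$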
Covariance
We shall consider two measures of dependence: the covariance
between two random variables and their correlation coefficient.
Definition: If $X$ and $Y$ are random variables with means $\mu_X$ and $\mu_Y$, respectively, the covariance of $X$ and $Y$ is
$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)].$
If $\sigma_X$ and $\sigma_Y$ are the standard deviations of $X$ and $Y$, then the correlation coefficient is given by
$\rho = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_X\, \sigma_Y}.$
Theorem: $\mathrm{Cov}(X, Y) = E(XY) - E(X)\, E(Y)$.
Proof:
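One way the proof goes (a standard expansion):
$\mathrm{Cov}(X, Y) = E[XY - \mu_Y X - \mu_X Y + \mu_X \mu_Y] = E(XY) - \mu_Y E(X) - \mu_X E(Y) + \mu_X \mu_Y = E(XY) - \mu_X \mu_Y.$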
Theorem: If $X$ and $Y$ are independent, then $\mathrm{Cov}(X, Y) = 0$.
Proof:
Note: The converse is not true in general.
Example:
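A standard counterexample showing the converse can fail (this may or may not be the one used in lecture): let $X$ take the values $-1, 0, 1$ each with probability $\tfrac{1}{3}$ and let $Y = X^2$. Then $E(X) = 0$ and $E(XY) = E(X^3) = 0$, so $\mathrm{Cov}(X, Y) = 0$, yet $Y$ is completely determined by $X$, so they are dependent.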
Expected Value and Variance of Linear Functions of
Random Variables
Theorem: Let $Y_1, \ldots, Y_n$ and $X_1, \ldots, X_m$ be random variables with $E(Y_i) = \mu_i$ and $E(X_j) = \xi_j$. Define
$U = \sum_{i=1}^{n} a_i Y_i$ and $W = \sum_{j=1}^{m} b_j X_j$
for constants $a_1, \ldots, a_n$ and $b_1, \ldots, b_m$. Then
(a) $E(U) = \sum_{i=1}^{n} a_i \mu_i$;
(b) $\mathrm{Var}(U) = \sum_{i=1}^{n} a_i^2\, \mathrm{Var}(Y_i) + 2 \sum_{i < j} a_i a_j\, \mathrm{Cov}(Y_i, Y_j)$;
(c) $\mathrm{Cov}(U, W) = \sum_{i=1}^{n} \sum_{j=1}^{m} a_i b_j\, \mathrm{Cov}(Y_i, X_j)$.
Proof:
Example: Let $Y_1, Y_2, \ldots$ be random variables with given means $E(Y_i)$, variances $\mathrm{Var}(Y_i)$, and covariances $\mathrm{Cov}(Y_i, Y_j)$. Given $U = \ldots$ and $W = \ldots$,
find E(U), Var(U), and Cov(U, W).
Solution:
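An illustrative computation under assumed values (not the lecture's): suppose $E(Y_i) = 1$ and $\mathrm{Var}(Y_i) = 2$ for $i = 1, 2, 3$, $\mathrm{Cov}(Y_1, Y_2) = 0.5$, all other covariances $0$, and $U = Y_1 + 2Y_2$, $W = Y_3$. Then
$E(U) = 1 + 2 = 3, \qquad \mathrm{Var}(U) = 1^2 \cdot 2 + 2^2 \cdot 2 + 2(1)(2)(0.5) = 12, \qquad \mathrm{Cov}(U, W) = \mathrm{Cov}(Y_1, Y_3) + 2\, \mathrm{Cov}(Y_2, Y_3) = 0.$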
Multinomial Probability Distribution
Definition: A multinomial experiment possesses the following
properties:
1. It consists of n identical trials.
2. The outcome of each trial falls into one of k classes.
3. The probability that the outcome of a single trial falls into class $i$ is $p_i$, $i = 1, 2, \ldots, k$.
4. The trials are independent.
5. The random variables of interest, $Y_1, Y_2, \ldots, Y_k$, represent the number of trials for which the outcome falls into class $i$.
The joint probability function
$p(y_1, y_2, \ldots, y_k) = \dfrac{n!}{y_1!\, y_2! \cdots y_k!}\, p_1^{y_1} p_2^{y_2} \cdots p_k^{y_k},$
where $\sum_{i=1}^{k} y_i = n$ and $\sum_{i=1}^{k} p_i = 1$, defines the multinomial distribution for $Y_1, Y_2, \ldots, Y_k$.
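A quick hedged numerical check (my own illustration, not from the notes): roll a fair die $n = 6$ times and let $Y_i$ count the number of times face $i$ appears. The probability of seeing each face exactly once is
$p(1, 1, 1, 1, 1, 1) = \dfrac{6!}{1!\,1!\,1!\,1!\,1!\,1!} \left( \tfrac{1}{6} \right)^6 = \dfrac{720}{46656} \approx 0.0154.$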
Theorem: If $Y_1, Y_2, \ldots, Y_k$ have a multinomial distribution with parameters $n$ and $p_1, p_2, \ldots, p_k$, then
(1) $E(Y_i) = n p_i$ and $\mathrm{Var}(Y_i) = n p_i (1 - p_i)$;
(2) $\mathrm{Cov}(Y_i, Y_j) = -n p_i p_j$ for $i \ne j$.
Proof: omitted.
Example: (#5.120) A sample of size $n$ is selected from a large lot of items in which a proportion $p_1$ contains exactly one defect and a proportion $p_2$ contains more than one defect (with $p_1 + p_2 < 1$). The cost of repairing the defective items is $C = \ldots$, where $Y_1$ is the number of items with one defect and $Y_2$ is the number of items with more than one defect.
Find E(C) and Var(C).
Solution:
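A hedged outline under an assumed cost function (the actual coefficients are not reproduced in these notes): if, say, $C = Y_1 + 3 Y_2$, then $(Y_1, Y_2)$ are multinomial counts, so by the theorems above
$E(C) = n p_1 + 3 n p_2,$
$\mathrm{Var}(C) = n p_1 (1 - p_1) + 9\, n p_2 (1 - p_2) + 2 \cdot 3 \cdot (-n p_1 p_2) = n p_1 (1 - p_1) + 9 n p_2 (1 - p_2) - 6 n p_1 p_2.$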
Example: (#5.122) The weights of a population of mice fed on a certain diet since birth are assumed to be normally distributed with mean $\mu$ and standard deviation $\sigma$ as given. Suppose that a random sample of $n = 4$ mice is taken. Find the probability that:
(a) Exactly two weigh between 80 and 100 grams, and exactly one weighs more than 100 grams;
(b) All four mice weigh more than 100 grams.
Solution:
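A hedged sketch under assumed parameters (illustrative only; the lecture's $\mu$ and $\sigma$ are not reproduced here): take $\mu = 100$ and $\sigma = 20$. Each mouse independently falls into one of three weight classes with
$p_1 = P(80 < W < 100) = \Phi(0) - \Phi(-1) \approx 0.3413, \quad p_2 = P(W > 100) = 0.5, \quad p_3 = 1 - p_1 - p_2 \approx 0.1587,$
so (a) is the multinomial probability $\dfrac{4!}{2!\,1!\,1!}\, p_1^2\, p_2\, p_3 \approx 0.111$, and (b) is $p_2^4 = 0.0625$.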