COSC 6221: Statistical Signal Processing Theory
Assignment #2: Transformation of Random Variables
Due Date: October 16, 2003

In the last few lectures, we defined the expectation operation for a random variable (RV) $X$ and showed that the expected value of any function of $X$, including higher-order moments, can be calculated directly from the probability density function $f_X(x)$ of $X$. Calculating the moments individually is a tedious process; however, the moment generating function $M_X(t)$, or alternatively the characteristic function $\Phi_X(\omega)$, can be used to simplify the process. Important bounds, including the Chernoff bound and the Chebyshev and Schwarz inequalities, which place upper limits on the probability that a RV deviates from its expected value, were introduced. Finally, we had our first taste of estimation by covering problems in linear regression and the minimum mean square error (MMSE) estimators for calculating the mean and variance of the observed process. We noted that an optimal estimator is usually unbiased and consistent. Next week, we will extend the theory of random variables to multiple random variables and stochastic processes. Please review Chapter 4 of the Woods text for the details about random variables.

1. (Expectation) Let $X$ be a Poisson RV with parameter $a$. Compute $E[Y]$, where $Y = X^2 + b$.

2. (Conditional Expectation) A random sample of 20 households shows the following numbers of children per household: 3, 2, 0, 1, 0, 0, 3, 2, 5, 0, 1, 1, 2, 2, 1, 0, 0, 0, 6, 3.
(a) For this set, calculate the average number of children per household.
(b) What is the average number of children per household, given that there is at least one child in the family?

3. (Conditional Expectation) A particular TV model is manufactured in three different plants, say A, B, and C, with $f_{XY}(x, y)$ given as
$$f_{XY}(x, y) = \frac{1}{3\pi} \exp\left\{-\frac{1}{2}\left[\left(\frac{x}{3}\right)^2 + \left(\frac{y}{2}\right)^2\right]\right\} U(x)\,U(y),$$
where $U(x)$ is the unit step function.
(a) Are $X$ and $Y$ independent RVs?
(b) Compute the probability $P[0 \le X \le 3,\ 0 \le Y \le 2]$.

4. Consider a communication channel corrupted by noise. Let $X$ be the value of the transmitted signal and $Y$ be the value of the received signal. Assume that the conditional pdf of $Y$ given $X = x$ is Gaussian, that is,
$$f_{Y|X}(y \mid x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[-\frac{(y - x)^2}{2\sigma^2}\right],$$
and that $X$ is uniformly distributed on $[-1, 1]$. What is the conditional pdf of $X$ given $Y$, i.e., $f_{X|Y}(x \mid y)$?

5. A U.S. defense radar scans the skies for unidentified flying objects (UFOs). Let $M$ be the event that a UFO is present and $M^c$ the event that a UFO is absent. Let the conditional pdf of the radar return signal $X$ when a UFO is actually present be given by
$$f_{X|M}(x \mid M) = \frac{1}{\sqrt{2\pi}} \exp\left(-0.5\,[x - r]^2\right),$$
and the conditional pdf of the radar return signal when a UFO is not present be given by
$$f_{X|M^c}(x \mid M^c) = \frac{1}{\sqrt{2\pi}} \exp\left(-0.5\,x^2\right).$$
To be specific, let $r = 1$ and let the alert level be $x_A = 0.5$, where $A$ denotes the event of an alert, that is, $X \ge x_A$. Compute $P[A \mid M]$, $P[A^c \mid M]$, $P[A \mid M^c]$, and $P[A^c \mid M^c]$.

6. Show that if $Y = X^2$, then
$$f_Y(y \mid X \ge 0) = \frac{U(y)\, f_X(\sqrt{y})}{2\sqrt{y}\,\left[1 - F_X(0)\right]}.$$

7. The RVs $X$ and $Y$ are independent with Rayleigh pdfs
$$f_X(x) = \frac{x}{\sigma^2} \exp\left(-\frac{x^2}{2\sigma^2}\right) U(x), \qquad f_Y(y) = \frac{y}{\sigma^2} \exp\left(-\frac{y^2}{2\sigma^2}\right) U(y).$$
Calculate the pdf of the RV $Z$ for the transformation $Z = X + Y$.

8. The RVs $X$ and $Y$ are independent with exponential pdfs
$$f_X(x) = \exp(-x)\, U(x), \qquad f_Y(y) = \exp(-y)\, U(y).$$
Find the densities of the following RVs:
(a) $Z = 2X + Y$,
(b) $Z = X - Y$,
(c) $Z = X / Y$,
(d) $Z = \max(X, Y)$, and
(e) $Z = \min(X, Y)$.
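For parts (d) and (e) of Problem 8, a quick Monte Carlo cross-check can be run against whatever closed forms you derive. The sketch below is only illustrative and assumes the unit-rate exponential densities written above; it relies on the standard facts that the minimum of two independent Exponential(1) RVs is Exponential(2) and that the CDF of the maximum is the product of the two individual CDFs. The sample size, seed, and checkpoints are arbitrary choices.

```python
# Monte Carlo sanity check for min(X, Y) and max(X, Y), assuming
# X, Y ~ Exponential(1), independent, as in the problem statement above.
# Reference CDFs used for comparison:
#   min(X, Y):  F(z) = 1 - exp(-2 z)        (Exponential with rate 2)
#   max(X, Y):  F(z) = (1 - exp(-z))**2     (product of the two CDFs)
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.exponential(scale=1.0, size=n)   # X ~ Exp(1)
y = rng.exponential(scale=1.0, size=n)   # Y ~ Exp(1)

z_min = np.minimum(x, y)
z_max = np.maximum(x, y)

for z in (0.25, 0.5, 1.0, 2.0):
    emp_min = np.mean(z_min <= z)         # empirical P[min(X, Y) <= z]
    emp_max = np.mean(z_max <= z)         # empirical P[max(X, Y) <= z]
    thy_min = 1.0 - np.exp(-2.0 * z)      # Exponential(2) CDF
    thy_max = (1.0 - np.exp(-z)) ** 2     # product of two Exp(1) CDFs
    print(f"z={z:4.2f}  min: emp={emp_min:.4f} theory={thy_min:.4f}"
          f"   max: emp={emp_max:.4f} theory={thy_max:.4f}")
```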
9. The objective is to generate numbers from the pdf $f_X(x)$ shown in the following figure.
[Figure: the target pdf $f_X(x)$ plotted versus $x$, with tick marks at $x = -1$ and $x = 1$.]
All that is available is a random number generator that generates numbers uniformly distributed in $(0, 1)$. Explain what procedure you would use to meet the objective.

10. Consider the transformation
$$Z = X \cos\theta + Y \sin\theta, \qquad W = -X \sin\theta + Y \cos\theta.$$
Compute the joint pdf of $Z$ and $W$ if the pdf of $X$ and $Y$ is jointly Gaussian, as follows:
$$f_{XY}(x, y) = \frac{1}{2\pi} \exp\left[-\frac{1}{2}\left(x^2 + y^2\right)\right].$$

11. Let $X$ and $Y$ be two independent RVs with Poisson distributions
$$P[X = k] = \frac{1}{k!}\, 2^k e^{-2}, \qquad P[Y = m] = \frac{1}{m!}\, 3^m e^{-3},$$
with $k, m \ge 0$. Compute $P[Z = 5]$, where $Z = X + Y$.
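Problem 11 lends itself to a simple numerical cross-check. The minimal sketch below assumes the quantity of interest is $P[Z = 5]$ (the relational symbol is reconstructed from the source) and evaluates it two ways: by the discrete convolution of the Poisson(2) and Poisson(3) pmfs, and from the standard result that the sum of independent Poisson RVs is Poisson with the summed parameter.

```python
# Cross-check of P[Z = 5] for Z = X + Y with X ~ Poisson(2), Y ~ Poisson(3),
# X and Y independent.  The convolution sum should match the Poisson(5) pmf.
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P[N = k] for N ~ Poisson(lam)."""
    return lam**k * exp(-lam) / factorial(k)

# P[Z = 5] by conditioning on X = k (total probability / discrete convolution).
p_conv = sum(poisson_pmf(k, 2.0) * poisson_pmf(5 - k, 3.0) for k in range(6))

# Same probability from the closed-form result Z ~ Poisson(2 + 3) = Poisson(5).
p_closed = poisson_pmf(5, 5.0)

print(f"convolution sum : {p_conv:.6f}")   # both print approximately 0.175467
print(f"Poisson(5) pmf  : {p_closed:.6f}")
```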