Section 9 – Functions and Transformations of Random Variables
Distribution of a transformation of continuous RV: X • Y = u(X) • Y is defined as a function “u” of X • v(u(x)) = x • Function “v” is the inverse of function “u” • Obtain v by solving the given y = u(x) for x • For monotone u, the density of Y is f_Y(y) = f_X(v(y)) · |v'(y)|
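As a sketch (not from the slides), the inverse-transformation recipe can be checked numerically for Y = u(X) = X² with X ~ Uniform(0, 1), where v(y) = √y:

```python
import numpy as np

# Y = u(X) = X^2 with X ~ Uniform(0, 1).
# Inverse: v(y) = sqrt(y), so F_Y(y) = F_X(v(y)) = sqrt(y) on (0, 1),
# and f_Y(y) = f_X(v(y)) * |v'(y)| = 1 / (2 * sqrt(y)).

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200_000)
y = x ** 2

# Analytic CDF at y = 0.25 vs. a Monte Carlo estimate.
analytic = np.sqrt(0.25)            # F_Y(0.25) = 0.5
simulated = np.mean(y <= 0.25)
print(analytic, round(simulated, 3))
```

The simulated proportion lands close to the analytic value 0.5, confirming the inverse-CDF reasoning.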
Distribution of a Sum of RV’s • If Y = X1 + X2 • E[Y] = E[X1] + E[X2] • Var[Y] = Var[X1] + Var[X2] + 2Cov[X1, X2] • Use the convolution method to find the distribution of a sum of independent RV’s • Note: X1 & X2 must be independent for the convolution method (the expectation formula holds regardless, and under independence the covariance term is zero)
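A minimal sketch of the convolution method for discrete RV’s, using two fair six-sided dice (an example not in the slides):

```python
import numpy as np

# Convolution of the pmfs of two independent fair dice gives the pmf of the sum.
die = np.full(6, 1 / 6)              # pmf of one die on values 1..6
pmf_sum = np.convolve(die, die)      # pmf of the sum on values 2..12

# P(X1 + X2 = 7) should be 6/36; index 0 of pmf_sum corresponds to sum = 2.
p7 = pmf_sum[7 - 2]
print(round(p7, 4))                  # ~ 0.1667

# The moment identity E[Y] = E[X1] + E[X2] = 3.5 + 3.5 = 7 also checks out.
values = np.arange(2, 13)
mean = np.sum(values * pmf_sum)
print(round(mean, 4))                # ~ 7.0
```

`np.convolve` does exactly the discrete convolution sum P(Y = k) = Σ P(X1 = j)·P(X2 = k − j), which is why independence is required.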
Central Limit Theorem • If asked for a probability involving a sum of a large number of independent RV’s, you usually need to use the normal approximation • X1, X2, … Xn are independent RV’s with the same distribution • Each Xi has mean μ and standard deviation σ • The sum Y = X1 + X2 + … + Xn is approximately normal with mean nμ and variance nσ²
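The normal approximation can be sketched for a sum of n iid Uniform(0, 1) RV’s (an illustrative choice, not from the slides); each Xi has mean 1/2 and variance 1/12, so the sum is approximately Normal(n/2, n/12):

```python
import math
import numpy as np

n = 48
mu, var = n / 2, n / 12              # mean 24, variance 4 -> sd 2

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# P(S > 26) by the CLT approximation: 1 - Phi((26 - 24) / 2) = 1 - Phi(1).
approx = 1.0 - norm_cdf((26 - mu) / math.sqrt(var))

# Monte Carlo estimate of the same probability.
rng = np.random.default_rng(1)
s = rng.uniform(size=(100_000, n)).sum(axis=1)
simulated = np.mean(s > 26)
print(round(approx, 4), round(simulated, 4))   # both ~ 0.159
```

Both numbers land near 1 − Φ(1) ≈ 0.1587, which is the kind of agreement the exam questions rely on.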
Distribution of Maximum or Minimum of a Collection of Independent RV’s • Suppose X1 and X2 are independent RV’s • U = max{X1, X2} • V = min{X1, X2} (trickier) • We know F1(x) and F2(x), where Fi(x) = P(Xi <= x) • F_U(u) = P(X1 <= u and X2 <= u) = F1(u)·F2(u) • F_V(v) = 1 − P(X1 > v and X2 > v) = 1 − [1 − F1(v)]·[1 − F2(v)]
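These max/min CDF identities can be checked with two independent Uniform(0, 1) RV’s (a hypothetical example), where F_U(0.5) = 0.5 · 0.5 = 0.25 and F_V(0.5) = 1 − 0.5 · 0.5 = 0.75:

```python
import numpy as np

# U = max{X1, X2}: F_U(x) = F1(x) * F2(x)               (both must be <= x)
# V = min{X1, X2}: F_V(x) = 1 - (1 - F1(x)) * (1 - F2(x))  (not both > x)
rng = np.random.default_rng(2)
x1 = rng.uniform(size=200_000)
x2 = rng.uniform(size=200_000)

p_max = np.mean(np.maximum(x1, x2) <= 0.5)   # ~ 0.25
p_min = np.mean(np.minimum(x1, x2) <= 0.5)   # ~ 0.75
print(round(p_max, 3), round(p_min, 3))
```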
Mixtures of RV’s • X1 and X2 are RV’s, each with its own (usually different) probability function • X is a mixture of X1 and X2 with probability function f(x) = a·f1(x) + (1 − a)·f2(x) • Where 0 &lt; a &lt; 1 • Weight “a” on X1 • Weight (1 − a) on X2 • Expectation, probabilities, and moments follow a “weighted-average” form: E[X] = a·E[X1] + (1 − a)·E[X2], F(x) = a·F1(x) + (1 − a)·F2(x) • Variance is NOT calculated from a “weighted-average” approach • Use the above approach to find E[X] & E[X^2], then Var[X] = E[X^2] − (E[X])^2
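A sketch with hypothetical numbers: a mixture with weight a = 0.3 on an Exponential with mean 1 and weight 0.7 on an Exponential with mean 5, comparing the correct variance (via weighted moments) with the incorrect weighted average of variances:

```python
import numpy as np

a = 0.3
m1, m2 = 1.0, 5.0                               # exponential means (hypothetical)

# Weighted-average moments (for an exponential with mean m: E[X^2] = 2*m^2).
ex  = a * m1 + (1 - a) * m2                     # E[X]  = 3.8
ex2 = a * (2 * m1**2) + (1 - a) * (2 * m2**2)   # E[X^2] = 35.6
var = ex2 - ex**2                               # Var[X] = 21.16

# The naive weighted average of variances is NOT the mixture variance
# (for an exponential with mean m, Var = m^2).
wrong_var = a * m1**2 + (1 - a) * m2**2         # 17.8

# Monte Carlo check: pick the component first, then sample from it.
rng = np.random.default_rng(3)
n = 300_000
pick = rng.uniform(size=n) < a
samples = np.where(pick, rng.exponential(m1, n), rng.exponential(m2, n))
print(round(var, 2), round(samples.var(), 2), round(wrong_var, 2))
```

The simulated variance matches E[X²] − (E[X])² = 21.16, not the weighted average 17.8; the gap is the between-component spread that the naive formula misses.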
Mixtures of RV’s • There is a difference between a “sum of RV’s” and a “mixture of RV’s” • Sum of RV’s: X = X1 + X2 • Mixture of RV’s: X is defined by the probability function f(x) = a·f1(x) + (1 − a)·f2(x) • So, don’t make the mistake of thinking X = a·X1 + (1 − a)·X2 (this is wrong)
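The distinction above can be made concrete with hypothetical components X1 ~ N(0, 1), X2 ~ N(4, 1) and a = 0.5: the weighted sum 0.5·X1 + 0.5·X2 is a narrow unimodal RV, while the true mixture is bimodal with a much larger variance:

```python
import numpy as np

# Weighted sum vs. mixture for X1 ~ N(0, 1), X2 ~ N(4, 1), a = 0.5.
#   weighted sum:  Var = 0.25*1 + 0.25*1 = 0.5 (independent components)
#   mixture:       Var = E[X^2] - E[X]^2 = 9 - 4 = 5
rng = np.random.default_rng(4)
n = 200_000
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(4.0, 1.0, n)

weighted_sum = 0.5 * x1 + 0.5 * x2      # combines both values in every draw
pick = rng.uniform(size=n) < 0.5
mixture = np.where(pick, x1, x2)        # chooses ONE component per draw

print(round(weighted_sum.var(), 2), round(mixture.var(), 2))
```

Both have mean 2, but the variances (≈ 0.5 vs. ≈ 5) are very different, which is exactly why X = a·X1 + (1 − a)·X2 is the wrong model for a mixture.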