
CS723 - Probability and Stochastic Processes



Presentation Transcript


  1. CS723 - Probability and Stochastic Processes

  2. Lecture No. 29

  3. In Previous Lectures: Finished the transformation of random variables and started the discussion of mean and variance of transformed random variables. fUV(u,v) can be used to find expected values of functions of U, V, or both. fXY(x,y) can be used to find expected values of functions of X, Y, or both. fXY(x,y) can also be used directly to find expected values of functions of U, V, or both, provided the transforming functions u = g(·,·) and v = h(·,·) are known.
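In formula form (a restatement of the rule above; the symbol w for an arbitrary function of the transformed pair is introduced here only for illustration):

```latex
% Expected value of a function w of the transformed pair (U, V), written two ways:
% directly against f_UV, or against f_XY via the transforming functions g and h.
\[
E[w(U,V)] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} w(u,v)\, f_{UV}(u,v)\, du\, dv
          = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} w\bigl(g(x,y),\, h(x,y)\bigr)\, f_{XY}(x,y)\, dx\, dy .
\]
```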

  4. Diagonalizing a Matrix: The eigenvalue-eigenvector relationships T a1 = λ1 a1 and T a2 = λ2 a2 can be written together as T A = A Λ, where A = [a1 a2] has the eigenvectors as its columns and Λ = diag(λ1, λ2). Pre-multiplying with A⁻¹ gives A⁻¹ T A = Λ, a diagonal matrix.
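A minimal numerical sketch of this step, using NumPy and a small symmetric matrix T whose entries are chosen purely for illustration:

```python
import numpy as np

# Illustrative symmetric matrix T to diagonalize (values are made up for the example).
T = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Eigen-decomposition: columns of A are the eigenvectors a1, a2; lam holds λ1, λ2.
lam, A = np.linalg.eigh(T)

# T A = A Λ, so pre-multiplying by A⁻¹ gives A⁻¹ T A = Λ, a diagonal matrix.
Lam = np.linalg.inv(A) @ T @ A
print(np.round(Lam, 6))   # off-diagonal entries are numerically zero
```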

  5. Diagonalizing KXY: If random variables X and Y are correlated, their covariance matrix KXY has non-zero off-diagonal elements. Find the eigenvalues and eigenvectors of KXY, and form the transformation matrix A whose columns are the eigenvectors of KXY. Under the linear transformation defined by A, the transformed random variables U and V are uncorrelated.
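A short simulation sketch of the procedure, assuming a hypothetical 2x2 covariance matrix and zero-mean Gaussian samples (the numbers are not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical covariance matrix K_XY with non-zero off-diagonal entries.
K_XY = np.array([[2.0, 0.8],
                 [0.8, 1.0]])

# Draw correlated (X, Y) samples from a zero-mean Gaussian with covariance K_XY.
xy = rng.multivariate_normal(mean=[0.0, 0.0], cov=K_XY, size=100_000)

# Columns of A are the eigenvectors of K_XY.
lam, A = np.linalg.eigh(K_XY)

# Transform each sample: (U, V) = A^T (X, Y); the sample covariance of (U, V)
# comes out approximately diagonal, i.e. U and V are uncorrelated.
uv = xy @ A
print(np.round(np.cov(uv.T), 3))
```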

  6. Vector Random Variables: A brief discussion of more than two random variables, after analyzing a single random variable and a pair of random variables. The observation is a vector x = (x1, x2, x3, …, xN), a realization of the random vector X = (X1, X2, X3, …, XN). The joint PDF fX(x) and the joint CDF FX(x) are functions from RN to R1. All marginal densities for random vectors of lower dimension exist, and the PDF can be obtained from the CDF and vice versa.
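The PDF/CDF relationship and the marginal densities mentioned above can be stated as follows (standard formulas, restated here for reference):

```latex
% Joint CDF as an N-fold integral of the joint PDF, and the PDF recovered
% from the CDF by mixed partial differentiation.
\[
F_{\mathbf{X}}(\mathbf{x}) = \int_{-\infty}^{x_1}\!\cdots\!\int_{-\infty}^{x_N}
  f_{\mathbf{X}}(t_1,\ldots,t_N)\, dt_N \cdots dt_1,
\qquad
f_{\mathbf{X}}(\mathbf{x}) = \frac{\partial^N F_{\mathbf{X}}(\mathbf{x})}{\partial x_1 \cdots \partial x_N}.
\]
% A lower-dimensional marginal follows by integrating out the unwanted components, e.g.
\[
f_{X_1}(x_1) = \int_{\mathbb{R}^{N-1}} f_{\mathbf{X}}(x_1, x_2, \ldots, x_N)\, dx_2 \cdots dx_N .
\]
```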

  7. Transforming Vector RV's: Transformation of a vector of random variables into another vector RV is possible. The joint PDF of the new vector can be obtained using the determinant of the Jacobian of the transformation. For a linear transformation of vector random variables, the covariance matrix of the transformed vector is easy to obtain. A linear transformation can be used to generate a vector of uncorrelated random variables from a vector of correlated RV's.
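A brief sketch of the linear case, assuming an illustrative 3-dimensional covariance matrix K_X and a fixed transformation matrix A (both made up for the example); it checks the covariance rule K_Y = A K_X Aᵀ against a simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative covariance of the original random vector X (values are made up).
K_X = np.array([[1.0, 0.3, 0.1],
                [0.3, 2.0, 0.5],
                [0.1, 0.5, 1.5]])

# A fixed linear transformation Y = A X (also chosen arbitrarily).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 1.0, 2.0]])

# Covariance of the transformed vector: K_Y = A K_X A^T.
K_Y = A @ K_X @ A.T

# Empirical check with zero-mean Gaussian samples of X.
x = rng.multivariate_normal(np.zeros(3), K_X, size=200_000)
y = x @ A.T
print(np.round(K_Y, 2))          # theoretical covariance
print(np.round(np.cov(y.T), 2))  # sample covariance, close to K_Y
```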

  8. Sum of Independent RV's: A random sample from a large population may or may not represent the population. Each observation is a random variable with its own sample space, which gets smaller and smaller as more samples are taken (sampling without replacement). The average value of the sample is a transformed random variable with a new PDF. This PDF can be found under the simplifying assumption that the population is VERY large compared to the size of the sample, so the observations are effectively independent.
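A small simulation sketch of the sample-mean idea, assuming a Uniform(0, 1) population (chosen only for illustration) and sampling with replacement to mimic a population that is very large compared to the sample:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 25            # sample size
trials = 100_000  # number of repeated samples

# Population: Uniform(0, 1) with mean 1/2 and variance 1/12 (illustrative choice).
samples = rng.uniform(0.0, 1.0, size=(trials, n))
sample_means = samples.mean(axis=1)

# The sample mean is itself a random variable: its mean matches the population
# mean and its variance is (population variance) / n.
print(sample_means.mean())   # ≈ 0.5
print(sample_means.var())    # ≈ (1/12) / 25 ≈ 0.00333
```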
