
Multivariate Analysis Review



Presentation Transcript


  1. Multivariate Analysis Review

  2. Multivariate distributions

  3. The multivariate Normal distribution

  4. [x1, x2, … , xp] is said to have a p-variate normal distribution with mean vector μ and covariance matrix Σ if its joint density is f(x) = (2π)^(−p/2) |Σ|^(−1/2) exp(−½ (x − μ)′ Σ⁻¹ (x − μ))

  5. Surface Plots of the bivariate Normal distribution

  6. Contour Plots of the bivariate Normal distribution

  7. Scatter Plots of data from the bivariate Normal distribution
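
The scatter plots on this slide can be reproduced numerically. A minimal sketch (using NumPy; the mean vector and covariance matrix below are illustrative values, not taken from the slides) draws a large bivariate normal sample and checks that the empirical moments match the chosen parameters:

```python
import numpy as np

# Draw a large sample from a bivariate normal and verify that the
# empirical mean and covariance approach the chosen parameters.
# mu and Sigma are illustrative, not from the slides.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

x = rng.multivariate_normal(mu, Sigma, size=100_000)
print(x.mean(axis=0))           # close to mu
print(np.cov(x, rowvar=False))  # close to Sigma
```

A scatter plot of the first few thousand rows of `x` would reproduce the elliptical cloud shown on the slide.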

  8. Trivariate Normal distribution – contour map (axes x1, x2, x3; mean vector marked)

  9. Trivariate Normal distribution (axes x1, x2, x3)

  10. Trivariate Normal distribution (axes x1, x2, x3)

  11. Trivariate Normal distribution (axes x1, x2, x3)

  12. Marginal and Conditional distributions

  13. Theorem (Marginal distributions for the Multivariate Normal distribution): Suppose x = [x1, x2] has a p-variate Normal distribution with mean vector μ = [μ1, μ2] and covariance matrix Σ partitioned into blocks Σ11, Σ12, Σ21, Σ22. Then the marginal distribution of xi is a qi-variate Normal distribution (q1 = q, q2 = p − q) with mean vector μi and covariance matrix Σii.
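
The theorem says a marginal of a multivariate normal is just the corresponding sub-vector of μ and sub-block of Σ. A Monte Carlo sketch (illustrative parameter values, assumed for the example) checks this for the first two of three coordinates:

```python
import numpy as np

# The marginal of (x1, x2) from a trivariate normal should be bivariate
# normal with mean mu[:2] and covariance Sigma[:2, :2].
rng = np.random.default_rng(5)
mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[1.0, 0.3, 0.1],
                  [0.3, 2.0, 0.4],
                  [0.1, 0.4, 1.5]])
x = rng.multivariate_normal(mu, Sigma, size=100_000)

x12 = x[:, :2]                    # sample from the marginal of (x1, x2)
print(x12.mean(axis=0))           # close to mu[:2]
print(np.cov(x12, rowvar=False))  # close to Sigma[:2, :2]
```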

  14. Theorem (Conditional distributions for the Multivariate Normal distribution): Suppose x = [x1, x2] has a p-variate Normal distribution with mean vector μ = [μ1, μ2] and covariance matrix Σ partitioned as above. Then the conditional distribution of xi given xj is a qi-variate Normal distribution with mean vector μi + Σij Σjj⁻¹ (xj − μj) and covariance matrix Σii − Σij Σjj⁻¹ Σji.

  15. The conditional distribution of x2 given x1 is Normal with mean vector μ2 + Σ21 Σ11⁻¹ (x1 − μ1) and covariance matrix Σ22 − Σ21 Σ11⁻¹ Σ12.

  16. Σ22·1 = Σ22 − Σ21 Σ11⁻¹ Σ12 is called the matrix of partial variances and covariances. Its entry σij·1…q is called the partial covariance (variance if i = j) between xi and xj given x1, … , xq. ρij·1…q = σij·1…q / √(σii·1…q σjj·1…q) is called the partial correlation between xi and xj given x1, … , xq.

  17. B = Σ21 Σ11⁻¹ is called the matrix of regression coefficients for predicting xq+1, xq+2, … , xp from x1, … , xq. The mean vector of xq+1, xq+2, … , xp given x1, … , xq is μ2 + Σ21 Σ11⁻¹ (x1 − μ1).
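
The conditional-mean, partial-covariance, and regression-coefficient formulas can be evaluated directly. A small sketch with illustrative block values (the partition sizes q = 1, p − q = 2 and all numbers are assumptions for the example):

```python
import numpy as np

# Conditional distribution of x2 given x1 for a partitioned normal:
# mean = mu2 + S21 S11^-1 (x1 - mu1), cov = S22 - S21 S11^-1 S12.
mu1 = np.array([0.0])
mu2 = np.array([1.0, 2.0])
S11 = np.array([[2.0]])
S12 = np.array([[0.5, 0.3]])
S21 = S12.T
S22 = np.array([[1.0, 0.2],
                [0.2, 1.5]])

B = S21 @ np.linalg.inv(S11)        # matrix of regression coefficients
x1 = np.array([1.5])                # an observed value of x1
cond_mean = mu2 + B @ (x1 - mu1)    # conditional mean of x2 given x1
cond_cov = S22 - S21 @ np.linalg.inv(S11) @ S12  # partial (co)variances

# Partial correlation between the two conditioned components:
rho_partial = cond_cov[0, 1] / np.sqrt(cond_cov[0, 0] * cond_cov[1, 1])
print(cond_mean)   # [1.375 2.225]
print(cond_cov)    # [[0.875 0.125], [0.125 1.455]]
```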

  18. Independence

  19. Note: two vectors x1 and x2 are independent if f(x1, x2) = f1(x1) f2(x2). In that case the conditional distribution of x1 given x2 is equal to the marginal distribution of x1. If x = [x1, x2] is multivariate Normal with mean vector μ and covariance matrix Σ, then the two vectors x1 and x2 are independent if Σ12 = 0.

  20. The components of the vector x are independent if σij = 0 for all i and j (i ≠ j), i.e. Σ is a diagonal matrix.

  21. Transformations

  22. Theorem: Let x1, x2, … , xn denote random variables with joint probability density function f(x1, x2, … , xn). Let u1 = h1(x1, x2, … , xn), u2 = h2(x1, x2, … , xn), … , un = hn(x1, x2, … , xn) define an invertible transformation from the x's to the u's.

  23. Then the joint probability density function of u1, u2, … , un is given by g(u1, … , un) = f(x1, … , xn) |J|⁻¹, where J = det[∂ui/∂xj] is the Jacobian of the transformation.

  24. Theorem: Let x1, x2, … , xn denote random variables with joint probability density function f(x1, x2, … , xn). Let u1 = a11x1 + a12x2 + … + a1nxn + c1, u2 = a21x1 + a22x2 + … + a2nxn + c2, … , un = an1x1 + an2x2 + … + annxn + cn define an invertible linear transformation from the x's to the u's.

  25. Then the joint probability density function of u1, u2, … , un is given by g(u1, … , un) = f(x1, … , xn) |det A|⁻¹, where A = [aij] is the matrix of coefficients and x = A⁻¹(u − c).

  26. Theorem: Suppose that the random vector x = [x1, x2, … , xp] has a p-variate normal distribution with mean vector μ and covariance matrix Σ. Then u = Ax + c (with A an invertible p × p matrix) has a p-variate normal distribution with mean vector Aμ + c and covariance matrix AΣA′.
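
The change-of-variables formula and this theorem must agree: transforming the density of x directly, or plugging Aμ + c and AΣA′ into the normal density, should give the same value at any point. A numerical sketch (all matrices and the evaluation point are illustrative assumptions):

```python
import numpy as np

# Check that g(u) = f(A^-1 (u - c)) / |det A| equals the density of a
# normal with mean A mu + c and covariance A Sigma A' at a test point.
def mvn_pdf(x, mu, Sigma):
    # Density of a p-variate normal evaluated at x.
    p = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Sigma, d)
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** p
                                         * np.linalg.det(Sigma))

mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])
A = np.array([[1.0, 1.0], [1.0, -1.0]])   # invertible
c = np.array([0.5, -0.5])

u = np.array([2.0, 0.0])                  # arbitrary evaluation point
x = np.linalg.solve(A, u - c)             # the preimage of u
g1 = mvn_pdf(x, mu, Sigma) / abs(np.linalg.det(A))
g2 = mvn_pdf(u, A @ mu + c, A @ Sigma @ A.T)
print(g1, g2)  # the two values agree
```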

  27. Theorem (Linear transformations of Normal RV's): Suppose that the random vector x has a p-variate normal distribution with mean vector μ and covariance matrix Σ. Let A be a q × p matrix of rank q ≤ p. Then u = Ax has a q-variate normal distribution with mean vector Aμ and covariance matrix AΣA′.

  28. Maximum Likelihood Estimation Multivariate Normal distribution

  29. The Method of Maximum Likelihood: Suppose that the data x1, … , xn has joint density function f(x1, … , xn; θ1, … , θp), where θ = (θ1, … , θp) are unknown parameters assumed to lie in Ω (a subset of p-dimensional space). We want to estimate the parameters θ1, … , θp.

  30. Definition (the Likelihood function): Suppose that the data x1, … , xn has joint density function f(x1, … , xn; θ1, … , θp). Then, given the data, the Likelihood function is defined to be L(θ) = L(θ1, … , θp) = f(x1, … , xn; θ1, … , θp). Note: the domain of L(θ1, … , θp) is the set Ω.

  31. Definition (Maximum Likelihood Estimators): Suppose that the data x1, … , xn has joint density function f(x1, … , xn; θ1, … , θp). Then the Likelihood function is L(θ) = L(θ1, … , θp) = f(x1, … , xn; θ1, … , θp), and the Maximum Likelihood estimators of the parameters θ1, … , θp are the values that maximize L(θ1, … , θp).

  32. i.e. the Maximum Likelihood estimators of the parameters θ1, … , θp are the values θ̂1, … , θ̂p such that L(θ̂1, … , θ̂p) = max L(θ1, … , θp) over θ in Ω. Note: maximizing L is equivalent to maximizing the log-likelihood function l(θ) = ln L(θ).
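
The equivalence is easy to see numerically: for a univariate normal with known variance 1, the log-likelihood (up to an additive constant) is −½ Σ (xi − μ)², and a grid search over μ peaks at the sample mean. A sketch (simulated data and grid are illustrative assumptions):

```python
import numpy as np

# Grid-search the normal log-likelihood in mu (known variance 1);
# the maximizer should be the sample mean.
rng = np.random.default_rng(6)
x = rng.normal(loc=2.0, size=100)

def loglik(mu):
    # Log-likelihood up to an additive constant.
    return -0.5 * np.sum((x - mu) ** 2)

grid = np.linspace(0, 4, 401)                    # spacing 0.01
best = grid[np.argmax([loglik(m) for m in grid])]
print(best, x.mean())  # the grid maximizer sits at the sample mean
```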

  33. Maximum Likelihood Estimation Multivariate Normal distribution

  34. Summary: the Maximum Likelihood estimators of μ and Σ are μ̂ = x̄ (the sample mean vector) and Σ̂ = (1/n) ∑ (xi − x̄)(xi − x̄)′.
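
A minimal sketch of computing these estimators (the simulated data are illustrative; note that the covariance MLE divides by n, whereas NumPy's `np.cov` divides by n − 1 unless `bias=True` is passed):

```python
import numpy as np

# ML estimators for a multivariate normal sample: the sample mean,
# and the covariance estimate that divides by n.
rng = np.random.default_rng(1)
x = rng.normal(size=(500, 3))

mu_hat = x.mean(axis=0)
Sigma_hat = np.cov(x, rowvar=False, bias=True)   # divides by n: the MLE

# The same estimate written out explicitly:
d = x - mu_hat
Sigma_explicit = d.T @ d / len(x)
print(np.max(np.abs(Sigma_hat - Sigma_explicit)))  # ~0: they agree
```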

  35. Sampling distribution of the MLE’s

  36. Summary: the sampling distribution of x̄ is p-variate normal with mean vector μ and covariance matrix (1/n)Σ.

  37. The sampling distribution of the sample covariance matrix S: the matrix of sums of squares and cross products ∑ (xi − x̄)(xi − x̄)′ = (n − 1)S has the Wishart distribution, introduced next, with n − 1 degrees of freedom.

  38. The Wishart distribution: a multivariate generalization of the χ² distribution

  39. Definition (the p-variate Wishart distribution): Let z1, … , zk be k independent random p-vectors, each having a p-variate normal distribution with mean vector 0 and covariance matrix Σ. Let U = ∑ zi zi′. Then U is said to have the p-variate Wishart distribution with k degrees of freedom, written U ~ Wp(Σ, k).

  40. The density of the p-variate Wishart distribution: Suppose U ~ Wp(Σ, k) with k ≥ p. Then the joint density of U is f(U) = |U|^((k−p−1)/2) exp(−½ tr(Σ⁻¹U)) / (2^(kp/2) |Σ|^(k/2) Γp(k/2)), where Γp(·) is the multivariate gamma function. It can easily be checked that when p = 1 and Σ = 1 the Wishart distribution becomes the χ² distribution with k degrees of freedom.
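
The definition on slide 39 makes Wishart draws easy to simulate: sum k outer products of independent Np(0, Σ) vectors. A Monte Carlo sketch (Σ, k, and the replication count are illustrative assumptions) checks the known mean E[U] = kΣ:

```python
import numpy as np

# Simulate Wishart draws as U = sum_i z_i z_i' over k independent
# N_p(0, Sigma) vectors, and check E[U] = k * Sigma by Monte Carlo.
rng = np.random.default_rng(2)
p, k, reps = 2, 5, 20_000
Sigma = np.array([[1.0, 0.4],
                  [0.4, 1.0]])

z = rng.multivariate_normal(np.zeros(p), Sigma, size=(reps, k))
U = np.einsum('rki,rkj->rij', z, z)   # one Wishart draw per replicate
print(U.mean(axis=0))                 # close to k * Sigma
```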

  41. Theorem: Suppose … ; then … Corollary 1: … Corollary 2: …

  42. Theorem: Suppose … are independent; then … Theorem: Suppose … are independent and … ; then …

  43. Summary (sampling distribution of MLE's for the multivariate Normal distribution): Let x1, … , xn be a sample from the p-variate Normal distribution with mean vector μ and covariance matrix Σ. Then x̄ ~ Np(μ, (1/n)Σ) and nΣ̂ = ∑ (xi − x̄)(xi − x̄)′ ~ Wp(Σ, n − 1).

  44. Correlation

  45. The sample covariance matrix: S = [sij] = (1/(n − 1)) ∑ (xi − x̄)(xi − x̄)′, where x̄ = (1/n) ∑ xi.

  46. The sample correlation matrix: R = [rij], where rij = sij / √(sii sjj).
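
The formula rij = sij / √(sii sjj) amounts to standardizing S by its diagonal. A sketch (simulated data with illustrative parameters) builds R from S this way and compares it against NumPy's built-in correlation:

```python
import numpy as np

# Build the sample correlation matrix from the sample covariance
# matrix: R = D^(-1/2) S D^(-1/2) with D = diag(S).
rng = np.random.default_rng(3)
x = rng.multivariate_normal([0, 0, 0],
                            [[1.0, 0.5, 0.2],
                             [0.5, 1.0, 0.3],
                             [0.2, 0.3, 1.0]],
                            size=200)
S = np.cov(x, rowvar=False)
d = 1 / np.sqrt(np.diag(S))
R = S * np.outer(d, d)                # rij = sij / sqrt(sii * sjj)
print(np.allclose(R, np.corrcoef(x, rowvar=False)))  # True
```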

  47. Note: R = D^(−1/2) S D^(−1/2), where D = diag(s11, … , spp).

  48. Tests for Independence and Non-zero Correlation

  49. Tests for Independence: Test for zero correlation (independence between two variables). The test statistic is t = r √(n − 2) / √(1 − r²). If independence is true, the test statistic t has a t-distribution with ν = n − 2 degrees of freedom. The test is to reject independence if |t| exceeds the critical value tα/2 with n − 2 degrees of freedom.
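
A worked sketch of the t statistic (the five data pairs are illustrative values chosen so the arithmetic is easy to follow):

```python
import numpy as np

# t statistic for testing zero correlation:
# t = r * sqrt(n - 2) / sqrt(1 - r^2), with n - 2 degrees of freedom.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
n = len(x)

r = np.corrcoef(x, y)[0, 1]
t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
print(r, t)  # r = 0.8, t ~ 2.31; compare |t| to the t critical value
```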

  50. Test for non-zero correlation (H0: ρ = ρ0). The test statistic is z = √(n − 3) (zr − zρ0), where zr = ½ ln((1 + r)/(1 − r)) is Fisher's transform of r (and zρ0 the same transform of ρ0). If H0 is true, the test statistic z has approximately a Standard Normal distribution. We then reject H0 if |z| > zα/2.
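
A worked sketch of the Fisher z statistic (the sample size, observed r, and hypothesized ρ0 below are illustrative assumptions; `np.arctanh` computes ½ ln((1 + r)/(1 − r)) directly):

```python
import numpy as np

# Fisher z statistic for H0: rho = rho0. Under H0,
# z = (atanh(r) - atanh(rho0)) * sqrt(n - 3) is approximately N(0, 1).
n, r, rho0 = 100, 0.45, 0.30
z = (np.arctanh(r) - np.arctanh(rho0)) * np.sqrt(n - 3)
print(z)  # about 1.73, short of the two-sided 5% cutoff 1.96
```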
