
Multivariate Regression

Multivariate Regression. BMTRY 726, 3/25/14. Extending Univariate Regression: consider a regression problem where we have m outcomes for each of our n individuals and want to find the association with r predictors.



Presentation Transcript


  1. Multivariate Regression BMTRY 726 3/25/14

  2. Extending Univariate Regression Consider a regression problem now where we have m outcomes for each of our n individuals and want to find the association with r predictors. If we assume that each response follows its own regression model, we get:

  3. In Classical Regression Terms When we write the model this way, we can see it is equivalent to classic linear regression

  4. In Classical Regression Terms If we consider the ith response, we can estimate b. What about variance?

  5. Sum of Squares The form of the SSE follows from the univariate case. We want to minimize the SSE, which gives our solution for the ith outcome. The sum of squares decomposition for the model follows.
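The stacked least-squares fit and sum-of-squares decomposition can be sketched numerically. This is an illustrative example, not the lecture's dataset; all names, dimensions, and values here are made up.

```python
import numpy as np

# Illustrative sketch (assumed data, not the lecture's example):
# n observations, r = 2 predictors plus an intercept, m = 2 outcomes.
rng = np.random.default_rng(0)
n, r, m = 20, 2, 2
Z = np.column_stack([np.ones(n), rng.normal(size=(n, r))])  # n x (r+1) design
B_true = np.array([[1.0, 2.0],
                   [0.5, -1.0],
                   [3.0, 0.0]])                             # (r+1) x m
Y = Z @ B_true + rng.normal(scale=0.1, size=(n, m))

# Each outcome has its own univariate LS fit; stacking the m fits gives
# B_hat = (Z'Z)^(-1) Z'Y, the univariate estimator applied column by column.
B_hat = np.linalg.solve(Z.T @ Z, Z.T @ Y)                   # (r+1) x m
Y_hat = Z @ B_hat
E = Y - Y_hat                                               # residual matrix

# Sum-of-squares-and-cross-products decomposition: Y'Y = Yhat'Yhat + E'E,
# because the residual columns are orthogonal to the fitted columns.
```

The decomposition holds as a matrix identity, so it can be checked with `np.allclose(Y.T @ Y, Y_hat.T @ Y_hat + E.T @ E)`.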

  6. Example Let’s develop our regression equations for the following data

  7. Example Let’s develop our regression equations for the following data

  8. Example Let’s develop our regression equations for the following data

  9. Example Use all this information to find our sum of squares…

  10. Properties The same properties we had in univariate regression hold here

  11. Properties The same properties we had in univariate regression hold here

  12. Estimate of Σ
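A minimal sketch of the unbiased estimator of Σ from the residual matrix; the data and dimensions below are made up for illustration.

```python
import numpy as np

# Made-up fit purely to show the estimator of Sigma:
# Sigma_hat = E'E / (n - r - 1), where E is the n x m residual matrix
# from regressing m outcomes on r predictors plus an intercept.
rng = np.random.default_rng(1)
n, r, m = 30, 2, 3
Z = np.column_stack([np.ones(n), rng.normal(size=(n, r))])
Y = rng.normal(size=(n, m))
B_hat = np.linalg.lstsq(Z, Y, rcond=None)[0]
E = Y - Z @ B_hat
Sigma_hat = E.T @ E / (n - r - 1)   # divisor n - (r+1) makes it unbiased
```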

  13. So far we haven’t made any assumptions about the distribution of ε… what if normality holds?

  14. LRT What do we do with this information? Naturally, we can develop LRTs for our regression parameters
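A hedged sketch of one such large-sample LRT: testing whether the last q predictor columns have zero coefficients for all m outcomes, using the ratio of determinants of the MLE covariance estimates. The function name and the toy data are my own, not the lecture's.

```python
import numpy as np

# Large-sample LRT sketch: H0 says the last q predictor columns of Z have
# zero coefficients for all m outcomes. The statistic
# n * ln(|Sigma_hat_reduced| / |Sigma_hat_full|) is compared to a
# chi-square with m*q degrees of freedom.
def mv_lrt_stat(Z, Y, q):
    n, m = Y.shape
    def mle_cov(Zsub):
        B = np.linalg.lstsq(Zsub, Y, rcond=None)[0]
        E = Y - Zsub @ B
        return E.T @ E / n                 # MLE of Sigma uses divisor n
    full = mle_cov(Z)
    reduced = mle_cov(Z[:, :Z.shape[1] - q])
    stat = n * np.log(np.linalg.det(reduced) / np.linalg.det(full))
    return stat, m * q                     # statistic and its chi-square df

# Toy data: two outcomes that truly depend only on the intercept
rng = np.random.default_rng(2)
n = 50
Z = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
Y = Z[:, :1] @ np.array([[1.0, 2.0]]) + rng.normal(size=(n, 2))
stat, df = mv_lrt_stat(Z, Y, q=2)          # test both slope rows jointly
```

Since H0 is true in this toy data, the statistic should look like an ordinary chi-square draw with df = 4 rather than a large value.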

  15. Predictions from Multivariate Regression What do we do with our model once we have it?

  16. Predictions from Multivariate Regression We can use this information to construct 100(1 − α)% confidence regions

  17. Predictions from Multivariate Regression We can also construct 100(1 − α)% prediction regions

  18. Predictions from Multivariate Regression We can also construct 100(1 − α)% confidence intervals and prediction intervals

  19. Concept of Linear Regression Up until now, we have been focusing on fixed covariates. Suppose instead that the response and all covariates are random with some joint distribution. What if we want to predict Y using Z?

  20. Concept of Linear Regression We select b0 and b to minimize the MSE. The MSE is minimized over b0 and b when the predictor takes the form b0 + b′Z. [Figure: Y plotted against Z, with the line b0 + b′Z and the vertical error Y − b0 − b′Z.]

  21. We select b0 and b to minimize the MSE. The MSE is minimized over b0 and b when b = Σ_ZZ^(−1) σ_ZY and b0 = μ_Y − b′μ_Z, where σ_ZY = Cov(Z, Y) and Σ_ZZ = Cov(Z)

  22. So how do we use this? Useful if we want to use Z to interpret Y…

  23. We’ve made no distributional assumptions so far. If a general form of f(Z) is used to approximate Y such that E[(Y − f(Z))²] is minimized, what will f(Z) be? Special case, Y and Z are jointly normal:

  24. Example Find the MLE of the regression function for a single response

  25. Example Find the best linear predictor, its mean square error, and the multiple correlation coefficient
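A sketch of the computation this example asks for, using a made-up mean vector and covariance matrix (not the lecture's numbers): the best linear predictor, its mean squared error, and the multiple correlation coefficient.

```python
import numpy as np

# Assumed joint moments for (Y, Z1, Z2) -- illustrative values only.
mu = np.array([2.0, 0.0, 1.0])              # (mu_Y, mu_Z1, mu_Z2)
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 2.0, 0.3],
                  [0.5, 0.3, 1.0]])

sigma_ZY = Sigma[1:, 0]                     # Cov(Z, Y)
Sigma_ZZ = Sigma[1:, 1:]                    # Cov(Z)

# Best linear predictor b0 + b'Z of Y:
b = np.linalg.solve(Sigma_ZZ, sigma_ZY)     # b  = Sigma_ZZ^(-1) sigma_ZY
b0 = mu[0] - b @ mu[1:]                     # b0 = mu_Y - b' mu_Z

# Mean squared error of the best linear predictor:
mse = Sigma[0, 0] - b @ sigma_ZY            # sigma_YY - b' sigma_ZY

# Multiple correlation coefficient (proportion of variance explained):
rho = np.sqrt(b @ sigma_ZY / Sigma[0, 0])
```

As a consistency check, the MSE equals σ_YY(1 − ρ²), which ties the two quantities together.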

  26. Prediction of Several Variables What if we are considering more than a single response? Consider responses Y1, Y2, …, Ym (assumed MVN). It is easy to see that the regression equation takes the form

  27. Prediction of Several Variables The maximum likelihood estimators look very similar to the single response case…

  28. Example Find the MLE of the regression function for a single response

  29. Example Find the MLE of the regression function for a single response

  30. Partial Correlation We may also be interested in determining the association between the Y’s after removing the effect of Z. We can define a partial correlation between the Y’s removing the effect of Z as follows. The corresponding sample partial correlation coefficient is:
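A sketch with a made-up correlation matrix (ordered Y1, Y2, Z): the partial correlation of Y1 and Y2 given a single Z can be computed from the closed form, or equivalently by conditioning the covariance matrix.

```python
import numpy as np

# Assumed correlation matrix for (Y1, Y2, Z) -- illustrative values only.
R = np.array([[1.0, 0.5, 0.6],
              [0.5, 1.0, 0.4],
              [0.6, 0.4, 1.0]])

# Closed form for a single conditioning variable:
# r_{12.3} = (r12 - r13*r23) / sqrt((1 - r13^2)(1 - r23^2))
r12, r13, r23 = R[0, 1], R[0, 2], R[1, 2]
r12_3 = (r12 - r13 * r23) / np.sqrt((1 - r13**2) * (1 - r23**2))

# Equivalent route: condition the joint covariance,
# Sigma_11.2 = S11 - S12 S22^(-1) S21, then rescale to a correlation.
S11, S12, S22 = R[:2, :2], R[:2, 2:], R[2:, 2:]
cond = S11 - S12 @ np.linalg.solve(S22, S12.T)
alt = cond[0, 1] / np.sqrt(cond[0, 0] * cond[1, 1])
```

Both routes give the same number, which is a useful sanity check when Z has more than one component (where only the conditioning route applies directly).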

  31. Testing Correlations May be interested in determining whether all correlations are 0

  32. Testing Correlations We then consider the −2 log likelihood (using a large-sample approximation), with the Bartlett correction:
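A sketch of the Bartlett-corrected statistic for testing that all correlations are zero; the correlation matrix and sample size below are made up, not the lecture's example.

```python
import numpy as np

# Bartlett-corrected test that all off-diagonal correlations are zero:
# stat = -(n - 1 - (2p + 5)/6) * ln|R|, approximately chi-square with
# p(p - 1)/2 degrees of freedom under H0.
def bartlett_all_zero(R, n):
    p = R.shape[0]
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return stat, df

# Assumed sample correlation matrix and n, purely for illustration.
R = np.array([[1.0, 0.3, 0.2],
              [0.3, 1.0, 0.1],
              [0.2, 0.1, 1.0]])
stat, df = bartlett_all_zero(R, n=50)
```

Since |R| ≤ 1 with equality only at the identity, the statistic is nonnegative and grows as the correlations move away from zero.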

  33. Testing Correlations: Example Say we have an estimated correlation matrix and want to test if all correlations are 0

  34. Inference for Individual Correlations Now what if we are interested in testing whether individual or partial correlations are 0? Using the sample covariance matrix, we can compute a t-test

  35. Inference for Individual Correlations We can also find an approximate 100(1 − α)% CI for a correlation:

  36. Example From our earlier correlation matrix, r13 = 0.24:
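A sketch of the two computations from slides 34–35, using r13 = 0.24 from the example; the sample size n = 25 is an assumption for illustration, since the lecture's n is not in this transcript.

```python
import numpy as np

# r13 = 0.24 comes from the slide; n = 25 is an assumed sample size.
r, n = 0.24, 25

# t-test of H0: rho = 0, compared to a t distribution with n - 2 df:
t = r * np.sqrt((n - 2) / (1 - r**2))

# Approximate 95% CI via Fisher's z transform:
# tanh(atanh(r) +/- z_crit / sqrt(n - 3)), with z_crit = 1.96.
z = np.arctanh(r)
half = 1.96 / np.sqrt(n - 3)
lo, hi = np.tanh(z - half), np.tanh(z + half)
```

With this assumed n, the interval straddles zero, consistent with a small t statistic: a correlation of 0.24 is not distinguishable from 0 at this sample size.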
