
AAEC 4302 ADVANCED STATISTICAL METHODS IN AGRICULTURAL RESEARCH

Presentation Transcript


  1. AAEC 4302 ADVANCED STATISTICAL METHODS IN AGRICULTURAL RESEARCH Chapter 13.3 Multicollinearity

  2. Multicollinearity • Multicollinearity occurs when two or more independent variables in a regression model are highly correlated with one another • The standard error of an OLS parameter estimate is larger the more highly its independent variable is correlated with the other independent variables in the model
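
A standard result makes this dependence explicit (a sketch, not shown on the slide): if Rj2 denotes the R2 from regressing Xj on the other independent variables and SSTj is the total sum of squares of Xj, then

$$\operatorname{Var}(\hat{\beta}_j) = \frac{\sigma^2}{\mathrm{SST}_j\,(1 - R_j^2)}$$

so the standard error of the estimate of Bj grows without bound as Rj2 approaches 1.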

  3. Multicollinearity • As a result, independent variables may show no statistical significance when conducting the basic significance tests • Multicollinearity is not a mistake in the model specification; it is due to the nature of the data at hand

  4. Perfect Multicollinearity • Perfect multicollinearity occurs when there is an exact linear relationship between two or more independent variables • It also occurs when an independent variable takes a constant value in all observations (which makes it perfectly collinear with the intercept)
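
A minimal numerical sketch of perfect multicollinearity (illustrative only; the variable names x1, x2, x3 are hypothetical): when one regressor is an exact linear combination of the others, the X'X matrix is rank deficient, so the OLS normal equations have no unique solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2.0 * x1 + 3.0 * x2            # exact linear combination of x1 and x2

X = np.column_stack([np.ones(n), x1, x2, x3])   # design matrix with an intercept

print(np.linalg.matrix_rank(X))     # 3 rather than 4: one column is redundant
print(np.linalg.cond(X.T @ X))      # astronomically large condition number, so
                                    # (X'X) cannot be reliably inverted for OLS
```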

  5. Severe Multicollinearity • Under perfect multicollinearity, the OLS method cannot produce parameter estimates • A certain degree of correlation (multicollinearity) between the independent variables is normal and expected in most cases • Severe multicollinearity, by contrast, produces the symptoms described in the next slides

  6. Symptoms of Multicollinearity • The symptoms of a multicollinearity problem include: • independent variable(s) considered critical in explaining the model’s dependent variable are not statistically significant according to the usual t-tests

  7. Symptoms of Multicollinearity • A high R2 and a highly significant F-test, but few or no statistically significant t-tests • Parameter estimates change drastically in value, and may become statistically significant, when some independent variables are excluded from the regression
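
These symptoms are easy to reproduce in a small simulation (an illustrative sketch using statsmodels; the data and variable names are made up). Two nearly identical regressors jointly explain y well, yet each individual t-test can come out insignificant.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)     # x2 is almost identical to x1
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()

print(res.rsquared)      # high R2
print(res.f_pvalue)      # overall F-test: highly significant
print(res.pvalues[1:])   # individual t-tests on x1 and x2: typically not significant
```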

  8. Detecting Multicollinearity • A simple test for multicollinearity is to conduct “artificial” regressions between each independent variable (as the “dependent” variable) and the remaining independent variables • Variance Inflation Factors (VIFj) are calculated from these regressions as VIFj = 1 / (1 − Rj2), where Rj2 is the R2 of the artificial regression with Xj as the “dependent” variable

  9. Detecting Multicollinearity • VIFj = 2, for example, means that the variance of the parameter estimate is twice what it would be if Xj were not affected by multicollinearity • A VIFj > 10 is generally taken as clear evidence that the estimation of Bj is being affected by multicollinearity
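
The “artificial” regressions and the resulting VIFs can be written out directly; below is a sketch (hypothetical variable names), computing VIFj = 1 / (1 − Rj2) for each column of a design matrix.

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of X (X should not include a constant column).

    For each column j, run the "artificial" regression of X_j on an intercept and the
    remaining columns, take that regression's R-squared, and return 1 / (1 - R_j^2).
    """
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        yj = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, yj, rcond=None)
        resid = yj - others @ coef
        r2 = 1.0 - np.sum(resid ** 2) / np.sum((yj - yj.mean()) ** 2)
        out[j] = 1.0 / (1.0 - r2)
    return out

# Hypothetical example: two highly correlated regressors and one unrelated one.
rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.1, size=100)
x3 = rng.normal(size=100)
print(vif(np.column_stack([x1, x2, x3])))    # VIFs for x1 and x2 are large; x3 is near 1
```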

  10. Addressing Multicollinearity • Although it is useful to be aware of the presence of multicollinearity, it is not easy to remedy severe (non-perfect) multicollinearity • If possible, adding observations or taking a new sample might help lessen multicollinearity

  11. Addressing Multicollinearity • Exclude the independent variables that appear to be causing the problem • Modifying the model specification sometimes helps, for example: • using real instead of nominal economic data • using a reciprocal instead of a polynomial specification on a given independent variable
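
A rough illustration of the first remedy (continuing the hypothetical simulated data from the earlier sketch): dropping one of two nearly identical regressors sharply reduces the standard error on the one that is kept, although the retained coefficient then also absorbs the effect of the excluded variable.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
reduced = sm.OLS(y, sm.add_constant(x1)).fit()

print(full.bse[1], reduced.bse[1])   # standard error on x1: large in the full model,
                                     # much smaller once the collinear x2 is excluded
```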
