
Chi-Square and F Distributions


Presentation Transcript


  1. Chapter 11 Chi-Square and F Distributions Understandable Statistics Ninth Edition By Brase and Brase Prepared by Yixun Shi Bloomsburg University of Pennsylvania

  2. The Chi-Square Distribution • The χ2 distribution has the following features: • All possible values are positive. • The distribution is determined solely by the degrees of freedom. • The graph of the distribution is skewed right, but as the degrees of freedom increase, the distribution becomes more bell-shaped. • The mode (high point) of the distribution is at n – 2, where n is the number of degrees of freedom (for n ≥ 3).

  3. The Chi-Square Distribution

  4. Table 7 in Appendix II • The table gives critical values for the area that falls to the right of the critical value.
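
Software can reproduce the right-tail critical values that Table 7 lists. Below is a minimal sketch in Python, assuming SciPy is available; the degrees of freedom and tail areas are illustrative values, not entries copied from the table.

```python
from scipy.stats import chi2

# Right-tail critical value: the chi-square value with the given
# area to its right, for the given degrees of freedom.
df = 9          # degrees of freedom (illustrative)
alpha = 0.05    # right-tail area

critical_value = chi2.ppf(1 - alpha, df)   # value with area alpha to its right
print(f"critical value (df={df}, right-tail area={alpha}): {critical_value:.3f}")

# Going the other way: the right-tail area beyond an observed chi-square value.
observed = 14.2
p_right = chi2.sf(observed, df)            # survival function = area to the right
print(f"area to the right of {observed}: {p_right:.4f}")
```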

  5. Chi-Square: Test of Independence • The goal of the test is to determine if one qualitative variable is independent of another qualitative variable. • The hypotheses of the test: H0: The variables are independent. H1: The variables are not independent.

  6. Chi-Square: Test of Independence • The data will be presented in a contingency table in which the rows will represent one variable and the columns will represent the other variable.

  7. Chi-Square: Test of Independence • First, we need to compute the expected frequency, E, in each cell: E = (Row total)(Column total) / Sample size.

  8. Chi-Square: Test of Independence • The sample statistic for the test will be χ² = Σ (O – E)² / E, summed over all cells, where O = the observed frequency in each cell, E = the expected frequency in each cell, and n = the total sample size.

  9. Chi-Square: Test of Independence • To use Table 7 to estimate the P-value of the test, the degrees of freedom are d.f. = (R – 1)(C – 1), where R = number of rows and C = number of columns in the contingency table.
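
Slides 5 through 9 describe the full procedure: expected frequencies, the χ² statistic, and the degrees of freedom. As a rough illustration, here is a sketch in Python using SciPy's chi2_contingency on a made-up contingency table; the counts are hypothetical, not data from the text.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: one qualitative variable; columns: the other (hypothetical counts).
observed = np.array([
    [30, 20, 10],
    [25, 25, 20],
])

# chi2_contingency computes E = (row total)(column total) / n for each cell,
# the statistic chi2 = sum((O - E)^2 / E), the P-value, and d.f. = (R-1)(C-1).
chi2_stat, p_value, dof, expected = chi2_contingency(observed)

print("expected frequencies:\n", expected)
print(f"chi-square statistic: {chi2_stat:.3f}")
print(f"degrees of freedom:   {dof}")
print(f"P-value:              {p_value:.4f}")
# Reject H0 (independence) when the P-value is at or below the chosen alpha.
```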

  10. Review of Chi-Square Test for Independence

  11. Review of Chi-Square Test for Independence

  12. Review of Chi-Square Test for Independence

  13. Test of Homogeneity • A test of homogeneity tests the claim that different populations share the same proportions of specified characteristics. • The test of homogeneity is also conducted using contingency tables and the chi-square distribution.

  14. Test of Homogeneity • Obtain random samples from each of the populations. For each population, determine the number of members that share each distinct specified characteristic. Make a contingency table with the different populations as the rows (or columns) and the characteristics as the columns (or rows); the values recorded in the cells of the table are the observed values O taken from the samples. • 1. Set the level of significance and use the hypotheses H0: The proportion of each population sharing specified characteristics is the same for all populations; H1: The proportion of each population sharing specified characteristics is not the same for all populations. 2. Follow steps 2-5 of the procedure used to test for independence.
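
The computation for a test of homogeneity is the same contingency-table calculation; only the sampling design and the wording of the hypotheses change. A brief sketch with hypothetical counts, reusing chi2_contingency:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Each row is an independent random sample from one population;
# each column counts one specified characteristic (hypothetical data).
samples = np.array([
    [40, 35, 25],   # population 1
    [50, 30, 20],   # population 2
    [45, 40, 15],   # population 3
])

# The arithmetic is identical to the test of independence;
# H0 now states that all populations share the same proportions.
chi2_stat, p_value, dof, _ = chi2_contingency(samples)
print(f"chi2 = {chi2_stat:.3f}, d.f. = {dof}, P-value = {p_value:.4f}")
```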

  15. Chi-Square: Goodness of Fit • We will test whether a given data set “fits” a given distribution. H0: The population fits the given distribution. H1: The population has a different distribution.

  16. Chi-Square: Goodness of Fit

  17. Chi-Square: Goodness of Fit

  18. Chi-Square: Goodness of Fit
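
For the goodness-of-fit test, the expected counts come from the distribution claimed under H0 (E = n·p for each category), and the degrees of freedom are the number of categories minus 1. A small sketch in Python with made-up counts and claimed proportions:

```python
import numpy as np
from scipy.stats import chisquare

# Observed counts for each category (hypothetical data).
observed = np.array([18, 22, 39, 21])
n = observed.sum()

# Hypothesized distribution: the proportions claimed under H0.
claimed_proportions = np.array([0.20, 0.20, 0.40, 0.20])
expected = n * claimed_proportions   # E = n * p for each category

# chi2 = sum((O - E)^2 / E) with d.f. = (number of categories) - 1
chi2_stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {chi2_stat:.3f}, P-value = {p_value:.4f}")
# Reject H0 (the population fits the claimed distribution) for small P-values.
```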

  19. Testing σ2 • If we have a normal population with variance σ² and a random sample of n measurements with sample variance s², then χ² = (n – 1)s²/σ² has a chi-square distribution with n – 1 degrees of freedom.

  20. Working with the Chi-Square Distribution • Use Table 7 in Appendix II.

  21. P-Values for Chi-Square Tests

  22. P-Values for Chi-Square Tests

  23. P-Values for Chi-Square Tests

  24. Test Procedure for σ2

  25. Test Procedure for σ2
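
Putting slides 19 through 25 together, a test about σ² computes χ² = (n – 1)s²/σ₀² and reads a tail area from the chi-square distribution with n – 1 degrees of freedom. A sketch with hypothetical numbers; the hypothesized variance, sample variance, and sample size below are made up.

```python
from scipy.stats import chi2

# Hypothetical setup: H0: sigma^2 = 42, H1: sigma^2 > 42 (right-tailed test).
sigma0_sq = 42.0   # hypothesized population variance under H0
s_sq = 65.0        # sample variance
n = 23             # sample size
df = n - 1

# Test statistic: chi2 = (n - 1) s^2 / sigma0^2
chi2_stat = df * s_sq / sigma0_sq

# Right-tailed P-value: area under the chi-square curve to the right of the statistic.
p_value = chi2.sf(chi2_stat, df)
print(f"chi2 = {chi2_stat:.3f}, d.f. = {df}, P-value = {p_value:.4f}")
# For a left-tailed test use chi2.cdf; for a two-tailed test,
# double the smaller of the two tail areas.
```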

  26. Confidence Intervals for σ2 • Confidence Intervals for σ

  27. Confidence Intervals for σ2 or σ

  28. Using the Chi-Square Distribution for C.I.
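
A sketch of the confidence-interval calculation for σ² and σ, assuming SciPy and hypothetical values of n, s², and the confidence level. The interval uses the chi-square critical values that cut off equal areas in the two tails: ((n – 1)s²/χ²_upper, (n – 1)s²/χ²_lower).

```python
import math
from scipy.stats import chi2

# Hypothetical sample: n measurements with sample variance s^2.
n = 25
s_sq = 12.5
c = 0.95                     # confidence level
df = n - 1

# Critical values cutting off area (1 - c)/2 in each tail.
chi2_upper = chi2.ppf(1 - (1 - c) / 2, df)   # larger critical value
chi2_lower = chi2.ppf((1 - c) / 2, df)       # smaller critical value

# Confidence interval for sigma^2: ((n-1)s^2/chi2_upper, (n-1)s^2/chi2_lower)
lower_var = df * s_sq / chi2_upper
upper_var = df * s_sq / chi2_lower
print(f"{c:.0%} CI for sigma^2: ({lower_var:.3f}, {upper_var:.3f})")

# Take square roots of the endpoints to get a CI for sigma.
print(f"{c:.0%} CI for sigma:   ({math.sqrt(lower_var):.3f}, {math.sqrt(upper_var):.3f})")
```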

  29. Testing Two Variances • Assumptions: • The two populations are independent of each other. • Both populations have a normal distribution (not necessarily the same).

  30. Notation • We choose population 1 to have the larger sample variance, i.e., s₁² ≥ s₂².

  31. More Notation

  32. Compute the Test Statistic • The test statistic is F = s₁²/s₂². We will compare the test statistic to an F distribution, found in Table 8 of Appendix II.
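
A sketch of the two-variance test in Python with made-up samples; it relabels the samples so the larger sample variance goes in the numerator, then reads the right-tail area from the F distribution.

```python
import numpy as np
from scipy.stats import f

# Two independent samples from normal populations (hypothetical data).
sample1 = np.array([23.1, 25.4, 22.8, 27.0, 24.5, 26.2, 21.9, 25.0])
sample2 = np.array([24.0, 24.3, 23.8, 24.9, 24.4, 23.6, 24.7])

# Label the sample with the larger variance as sample 1 so that F >= 1.
var1, var2 = np.var(sample1, ddof=1), np.var(sample2, ddof=1)
if var1 < var2:
    var1, var2 = var2, var1
    sample1, sample2 = sample2, sample1

F_stat = var1 / var2                 # test statistic F = s1^2 / s2^2
dfn = len(sample1) - 1               # numerator degrees of freedom
dfd = len(sample2) - 1               # denominator degrees of freedom

# Right-tailed P-value (for H1: sigma1^2 > sigma2^2); double it for a two-tailed test.
p_value = f.sf(F_stat, dfn, dfd)
print(f"F = {F_stat:.3f}, d.f.N = {dfn}, d.f.D = {dfd}, P-value = {p_value:.4f}")
```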

  33. A Typical F Distribution

  34. Properties of the F Distribution

  35. Testing Two Variances Using F Distribution • Use the F distribution and the type of test to find or estimate the P-value with Table 8 of Appendix II. • Conclude the test. If P-value ≤ α, then reject H0. Otherwise do not reject H0. • Interpret your conclusion in the context of the application.

  36. Using Table 8 Estimate the P-Value for F = 55.2 with d.f.N = 3 and d.f.D = 2
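
Software gives this P-value directly instead of bracketing it between table columns. Checking the same numbers with SciPy (the printed value is approximate):

```python
from scipy.stats import f

# Right-tail area beyond F = 55.2 with 3 numerator and 2 denominator degrees of freedom.
p_value = f.sf(55.2, 3, 2)
print(f"P-value = {p_value:.4f}")   # roughly 0.018, between 0.010 and 0.025
```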

  37. One-Way ANOVA • ANOVA is a method of comparing the means of multiple populations at once instead of completing a series of 2-population tests.

  38. ANOVA Assumptions

  39. Establishing the Hypotheses in ANOVA • In ANOVA, there are k groups and k group means. The general problem is to determine if there exists a difference among the group means. • The hypotheses are H0: μ1 = μ2 = ⋯ = μk and H1: not all the means are equal.

  40. One-Way ANOVA Procedure

  41. One-Way ANOVA Procedure

  42. One-Way ANOVA Procedure

  43. One-Way ANOVA Procedure

  44. One-Way ANOVA Procedure

  45. One-Way ANOVA Procedure
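
The one-way ANOVA computations outlined above are automated by SciPy's f_oneway, which forms the between-groups and within-groups mean squares and their ratio F. A sketch with hypothetical measurements from three groups:

```python
from scipy.stats import f_oneway

# Hypothetical measurements from k = 3 groups.
group1 = [6.9, 7.2, 6.8, 7.5, 7.1]
group2 = [7.8, 8.1, 7.9, 8.4, 8.0]
group3 = [7.0, 7.3, 6.7, 7.2, 7.4]

# H0: mu1 = mu2 = mu3   H1: not all the means are equal
# f_oneway computes F = MS_between / MS_within and its P-value.
F_stat, p_value = f_oneway(group1, group2, group3)
print(f"F = {F_stat:.3f}, P-value = {p_value:.4f}")

# Degrees of freedom for the F distribution used:
k = 3                                         # number of groups
N = len(group1) + len(group2) + len(group3)   # total number of observations
print(f"d.f. between = {k - 1}, d.f. within = {N - k}")
```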

  46. Two-Way ANOVA • Two-Way ANOVA is a statistical technique to study two variables simultaneously. • Each variable is called a factor. • Each factor can have multiple levels. • We can study the different means of the factors as well as the interaction between the factors.

  47. Two-Way ANOVA Assumptions

  48. Steps For Two-Way ANOVA • Establish the hypotheses. • Compute the Sums of Squares Values. • Compute the Mean Squares Values. • Compute the F Statistic for each factor and the interaction. • Conclude the test.

  49. Two-Way ANOVA Hypotheses

  50. Compute the Mean Squares
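
A sketch of the two-way ANOVA steps in Python using statsmodels (an assumption on my part; the text itself works from tables), with hypothetical data for two factors and their interaction. The ANOVA table it prints contains the sums of squares, degrees of freedom, F statistics, and P-values for each factor and for the interaction.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical data: two factors, each with two levels, two replicates per cell.
data = pd.DataFrame({
    "factor_a": ["low", "low", "low", "low", "high", "high", "high", "high"],
    "factor_b": ["x", "x", "y", "y", "x", "x", "y", "y"],
    "response": [5.1, 4.9, 6.3, 6.0, 7.2, 7.5, 6.8, 7.0],
})

# Fit a model with both main effects and the interaction term,
# then produce the two-way ANOVA table.
model = ols("response ~ C(factor_a) * C(factor_b)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)
```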
