
Stats Lunch: Day 7 One-Way ANOVA


Presentation Transcript


  1. Stats Lunch: Day 7 One-Way ANOVA

  2. Basic Steps of Calculating an ANOVA
  (Example group means: M = 3, M = 6, M = 10)
  Remember, there are 2 ways to estimate population variance in ANOVA:
  1. Within Subjects: Estimate variance using the data from each group
  - This is a measure of "error": how much subjects differ within a group
  2. Between Groups: Estimate variance using the means of each group
  - This estimate is affected by BOTH error and the impact of our independent variable (drug type)

  3. Step 1: Estimate population variance from variation within each group
  - Estimate from each group the same way we did with t tests: S²group = SS/df
  - At this point, we are dealing with equal N's in each group
  - So we don't have to weight the estimates we get from different sample sizes (as we did with independent t tests)
  - Instead, we can directly pool (average) the estimates from each group: (S²₁ + S²₂ + S²₃)/number of groups
  Ex: (1 + .5 + 1)/3 = .83
  This is S²within, or as it's more commonly known:
  - "Mean Squares Within" = MSwithin
  - or "Mean Squares Error" = MSerror
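The pooling in Step 1 takes only a few lines; here is a minimal Python sketch using the group variances (1, .5, 1) from the slide's example:

```python
# Step 1 sketch: with equal n per group, MSwithin is simply the average
# of the per-group sample variances (values from the slide's example).
group_variances = [1.0, 0.5, 1.0]
ms_within = sum(group_variances) / len(group_variances)
print(round(ms_within, 2))  # 0.83
```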

  4. Step 2: Estimate population variance from variation between groups
  - Get the variance of the means of each group. This has two parts:
  Part A) Estimate the variance of the distribution of means:
  - Squared deviation of each mean from the average of all the means, divided by the df for the between-groups estimate
  - Grand Mean (GM): mean of all the means. Ex: (3 + 6 + 10)/3 = 19/3 = 6.33
  - dfbetween: number of groups − 1
  S²M = Σ(M − GM)²/dfbetween
  = ((3 − 6.33)² + (6 − 6.33)² + (10 − 6.33)²)/2
  = (11.09 + .11 + 13.47)/2 ≈ 12.33

  5. Part B: Figure the estimated variance of the population of individual scores
  - What we've done so far is to estimate the distribution of means from a small number of means… now we need to estimate the distribution of individual scores this was based on
  - This is essentially the opposite of what we've done before
  - We do this by reversing the procedure we've used in the past: instead of dividing by n (as when going from the variance of scores to the variance of means), we multiply by n
  - S²Between (aka MSbetween) = S²M × n = 12.33 × 5 ≈ 61.67
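Steps 2A and 2B can be checked with a short Python sketch (means 3, 6, 10 and n = 5 per group from the slide's example; exact arithmetic is used, so results may differ slightly from hand-rounded values):

```python
# Step 2 sketch: variance of the group means (Part A), then scaled up by
# n per group (Part B) to estimate the variance of individual scores.
means = [3, 6, 10]
n = 5                                    # scores per group (equal n)
grand_mean = sum(means) / len(means)     # 6.33...
df_between = len(means) - 1              # 2
s2_m = sum((m - grand_mean) ** 2 for m in means) / df_between
ms_between = s2_m * n
print(round(s2_m, 2), round(ms_between, 2))  # 12.33 61.67
```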

  6. Step 3: Calculate the F Ratio
  - F ratio = what we can explain / what we can't
  - F ratio = Between-Groups Variance / Within-Groups Variance
  - F ratio = MSbetween/MSwithin
  Ex: F = 61.67/.83 ≈ 74.0
  Step 4: Decide if you should reject the null
  - If F > 1, we should start to think about rejecting the null
  - Of course, we need to quantify this somehow: we compare our F ratio to an F distribution
  - As with t tests, if our F ratio exceeds a cutoff point (based on our alpha level), we reject the null
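Putting Steps 1 through 3 together, a minimal sketch of the whole F computation (group variances, means, and n from the slide's example; exact arithmetic throughout, so hand-rounded intermediate values may differ slightly):

```python
# Full F-ratio sketch for the slide's example (3 groups, n = 5 each).
ms_within = (1.0 + 0.5 + 1.0) / 3                # Step 1: pooled error
means, n = [3, 6, 10], 5
grand_mean = sum(means) / len(means)
ms_between = sum((m - grand_mean) ** 2 for m in means) / (len(means) - 1) * n
f_ratio = ms_between / ms_within                 # Step 3
print(round(f_ratio, 1))  # 74.0
```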

  7. Setting Up a One-Way Between-Subjects ANOVA in SPSS
  • Click on Analyze…
  • Choose "Compare Means," then One-Way ANOVA…
  Or better yet:
  • Click on Analyze…
  • Choose "G.L.M." (General Linear Model), then Univariate…
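If you'd rather script the omnibus test than click through SPSS, the same computation can be run from raw scores. The data below are hypothetical, chosen only to reproduce the slide's summary statistics (means 3, 6, 10; variances 1, .5, 1; n = 5 per group):

```python
from statistics import mean, variance

# Hypothetical raw scores matching the slide's summary statistics.
groups = [
    [2, 2, 3, 4, 4],     # New Drug:  M = 3,  S^2 = 1
    [5, 6, 6, 6, 7],     # Old Drug:  M = 6,  S^2 = .5
    [9, 9, 10, 11, 11],  # Placebo:   M = 10, S^2 = 1
]
ms_within = mean(variance(g) for g in groups)
group_means = [mean(g) for g in groups]
grand_mean = mean(group_means)
n = len(groups[0])
ms_between = n * sum((m - grand_mean) ** 2 for m in group_means) / (len(groups) - 1)
print(round(ms_between / ms_within, 1))  # 74.0
```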

  8. Setting Up a One-Way Between-Subjects ANOVA in SPSS
  3) Select your D.V.
  4) Add your I.V. to the "Fixed Factors" window (you can run a bunch of individual ANOVAs by adding more IVs)
  5) Click on "Options"
  6) I like to display means…
  7) You'll probably want to click here to get your simple comparisons (more on this in a second)
  8) Get your effect sizes, change your α level, etc.
  9) Click on "Continue," then "OK"

  9. Setting Up a One-Way Between-Subjects ANOVA in SPSS

  10. Setting Up a One-Way Between-Subjects ANOVA in SPSS: But what does this output tell us?

  11. The ANOVA is often called the "omnibus" or "overall" test:
  - It tests whether there is a difference anywhere among the groups
  - It controls familywise (FW) error, protecting against Type I error
  - However, if we were doing this study for real, we'd also want to figure out WHERE the differences are:
  - Is the New Drug better than the Placebo? Better than the Old Drug?
  - We do this with "planned comparisons": specific comparisons between individual means, decided in advance
  - AKA "planned contrasts" and "a priori comparisons"
  - We compute planned comparisons the same way as the ANOVA: what we know (between) / what we don't (within)

  12. What we don’t know = w/in subjects variance -We can just recycle this from the omnibus F What we know = between groups variance: -Same idea as before, but we are only interested in the variance between a pair of means: -To figure B/twn groups variance, we do the same two steps as before A) Estimate variance of Distribution of Means B) Estimate variance of population of individual scores So, say we want to make two comparisons: New Drug (3) vs. Placebo (10) New Drug (3) vs. Old Drug (6)

  13. Problems with Multiple Comparisons (again)
  - Doing multiple planned comparisons can lead to the same problems as doing a bunch of t tests
  - Hence, people often use the "Bonferroni correction" to control for the problems associated with multiple comparisons
  - It adjusts the alpha level for each comparison so that FW error does not exceed .05 (or whatever your accepted cutoff happens to be)
  - Bonferroni correction: familywise significance level / number of comparisons
  Ex: .05/2 = .025, so we'd use an alpha of .025 for each comparison
  - However, it is generally accepted that you get 2 (or even 3) "free" planned comparisons, where you don't have to adjust alpha
  - This balances Type I and Type II errors
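The correction itself is a single division; a tiny sketch with the slide's numbers:

```python
# Bonferroni sketch: split the familywise alpha across the comparisons.
alpha_fw = 0.05
n_comparisons = 2
alpha_each = alpha_fw / n_comparisons
print(alpha_each)  # 0.025
```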

  14. Setting Up Planned Comparisons in SPSS
  …Do the same thing as before, and then:
  7) Click here to get your simple comparisons
  8) Choose "LSD" (least significant difference) for no adjustment, or choose Bonferroni…
  Note the p values in the LSD vs. Bonferroni output…

  15. Other Issues: Orthogonality (Example of Orthogonal Comparisons)
  • Not all planned comparisons are created equal…
  • To maintain statistical integrity, a set of comparisons SHOULD be orthogonal to one another
  • Orthogonal: comparisons reflect non-overlapping (independent) information; the outcome of one comparison gives no information about the outcome of another
  • We know two comparisons are orthogonal if the sum of the products of their contrast weights = 0
  • You can have (number of levels − 1) orthogonal comparisons
  • The SS of a full set of orthogonal comparisons = SSbetween
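The sum-of-products check can be coded directly; the contrast weights below are hypothetical examples for a 3-group design like the one in the slides:

```python
# Orthogonality sketch: two contrasts are orthogonal when the sum of the
# products of their weights is zero (hypothetical 3-group weights).
def is_orthogonal(c1, c2):
    return sum(a * b for a, b in zip(c1, c2)) == 0

c1 = [1, -1, 0]   # e.g., New Drug vs. Old Drug
c2 = [1, 1, -2]   # e.g., both drugs vs. Placebo
c3 = [1, 0, -1]   # e.g., New Drug vs. Placebo
print(is_orthogonal(c1, c2))  # True
print(is_orthogonal(c1, c3))  # False -- overlapping information
```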

  16. Other Issues: Orthogonality (Example of Non-Orthogonal Comparisons)
  • Problem: orthogonal comparisons might not be what you WANT to know (e.g., they're not psychologically, psychiatrically, or experimentally important)
  • What to do about it? Depends who you ask…
  • However, most people would say that the meaningfulness of a set of contrasts is what matters most
  • You can also do "post-hoc" tests…
