
Presentation Transcript


  1. What Works In Teaching Science: A Meta-Analysis of Current Research Carolyn Schroeder, Ph.D. Center for Math & Science Education Texas A&M University

  2. Texas A&M University Project Staff • Timothy P. Scott, Ph.D., Project Director • Carolyn Schroeder, Ph.D., Senior Research Associate • Homer Tolson, Ph.D., Senior Analyst • Yi-Hsuan Lee, Ph.D., Analyst • Tse-Yang Huang, Ph.D., Analyst

  3. Advisory Board • Carol L. Fletcher, Ph.D., Texas Regional Collaboratives, UT Austin • Ginny Heilman, Region VI ESC • Anna McClane, Region IV ESC • Sandra S. West, Ph.D., Texas State University • Jo Ann Wheeler, Region IV ESC

  4. What teaching strategies have been shown to improve student achievement in science?

  5. Criteria for Selection of Studies • Dates: 01/01/1980 – 12/31/2004 • Dealt with K-12 science education in the U.S. • Used student achievement (success, performance, etc.) as dependent variable • Used science education teaching strategies as independent variables • Was experimental or quasi-experimental • Reported effect size (ES) or statistics necessary to calculate it • Could not be totally correlational • Could not deal exclusively with special populations • Could not be included more than once (e.g., same study reported in a dissertation and journal article)
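The selection criteria amount to a screening checklist applied to each candidate study. As a minimal sketch, the screen can be expressed as a single predicate over a coded study record; the Study fields below are hypothetical names, not the project's actual coding sheet.

```python
# Sketch of the inclusion screen; field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Study:
    year: int                      # publication year
    us_k12_science: bool           # K-12 science education in the U.S.
    achievement_dv: bool           # student achievement as dependent variable
    strategy_iv: bool              # teaching strategy as independent variable
    design: str                    # "experimental", "quasi-experimental", or "correlational"
    reports_es_or_stats: bool      # reports ES or the statistics needed to calculate it
    special_population_only: bool  # deals exclusively with special populations

def meets_criteria(s: Study) -> bool:
    """True if a study passes every inclusion criterion."""
    return (1980 <= s.year <= 2004
            and s.us_k12_science
            and s.achievement_dv
            and s.strategy_iv
            and s.design in ("experimental", "quasi-experimental")
            and s.reports_es_or_stats
            and not s.special_population_only)
```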

  6. Acquisition of Studies • Broad search conducted • Over 400 potential sources identified • Journal articles • Conference papers • Books • Dissertations • Government reports • Unpublished papers

  7. Search Methods • Electronic searches • Web of Science • ERIC (EBSCO, First Search, CSA) • Academic Search Premier • PsycInfo • ProQuest Dissertations and Theses • Reference lists from previous meta-analyses, books & other articles, electronic sources (e.g., government sites) • Request to the NARST listserv • Requests to specific developers of instructional packages for product studies

  8. Coding of Studies • Study attributes coded: • Citation • Publication type (refereed journal, dissertation, etc.) • Study type (experimental, quasi-experimental, correlational) • Dependent variable (describe test used to measure achievement) • Independent variable (describe treatment & control or alternate treatment) • Length of treatment/study • Setting & characteristics • Schools (#, how selected, public/private, rural/urban, size, % free lunch) • Students (#, how selected, how assigned, gender, grade, ethnicity, SES) • Teachers (#, how selected, experience, gender, certification) • Study results (ES, p, t, F, eta squared, omega squared)

  9. Intercoder Objectivity • 3 randomly selected articles were coded independently by the senior analyst and 2 researchers • Degree of objectivity was 90% for two articles • Third article was identified as correlational and therefore was not coded • Senior analyst read & coded all articles and resolved any differences in coding values
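A 90% "degree of objectivity" is most naturally read as percent agreement between coders across coded fields; whether it was computed per field or per article is not stated, so the sketch below is one plausible reading, with made-up field values.

```python
def percent_agreement(coder_a: dict, coder_b: dict) -> float:
    """Share of coding fields on which two coders recorded the same value."""
    fields = coder_a.keys() & coder_b.keys()
    return sum(coder_a[f] == coder_b[f] for f in fields) / len(fields)

# Hypothetical records from two coders for the same article:
a = {"design": "quasi-experimental", "grade": "6-8", "treatment": "inquiry"}
b = {"design": "quasi-experimental", "grade": "6-8", "treatment": "questioning"}
print(percent_agreement(a, b))  # 0.67 (2 of 3 fields agree)
```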

  10. Study Design Classification • True random assignment of schools/students to treatment and control groups • Quasi-experimental with match of schools/students to achievement and demographics of comparison school/group • Quasi-experimental with covariate adjustment for prior achievement differences • Quasi-experimental comparison of schools/subjects based on a claim of “similarity” • Quasi-experimental comparison of schools/subjects to region, state, or national data • Quasi-experimental single group pre-post comparison • Quasi-experimental treatment vs. control pre-posttest • Quasi-experimental multiple group ANOVA

  11. Treatment Category Classification • Modified from Wise, 1996 • Questioning strategies • Manipulation strategies • Enhanced materials strategies • Testing strategies (changed to Assessment strategies) • Inquiry strategies • Enhanced context strategies • Instructional media strategies (changed to Instructional technology strategies) • Focusing strategies (not used) • Collaborative learning strategies (added)

  12. Table 1. Frequencies of Characteristics of Included Studies

  13. Table 1. Frequencies of Characteristics of Included Studies

  14. Table 2. Dependent Variable (Test Type)

  15. Effect Sizes • Obtained or calculated for all studies that met criteria • n = 62 • one removed later as an extreme outlier • Internal & external validity influences on effect sizes calculated • Regression analysis for moderator variables & dependent variable effect sizes (n = 61) • Failsafe N calculated for all categories
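Recovering an effect size from the statistics a study did report follows standard meta-analytic conversions; the formulas below are the conventional ones for two independent groups, not quoted from the presentation.

```python
import math

def d_from_means(m1: float, m2: float, sd_pooled: float) -> float:
    """Cohen's d from group means and a pooled standard deviation."""
    return (m1 - m2) / sd_pooled

def d_from_t(t: float, n1: int, n2: int) -> float:
    """Cohen's d from an independent-samples t statistic."""
    return t * math.sqrt(1 / n1 + 1 / n2)

def d_from_f(f: float, n1: int, n2: int) -> float:
    """Cohen's d from a two-group, one-way F statistic (where F = t**2)."""
    return d_from_t(math.sqrt(f), n1, n2)
```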

  16. Table 3. Failsafe N for Total Data and Treatment Description Categories
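Failsafe N is Rosenthal's file-drawer statistic: the number of averaged-null (z = 0) studies that would have to exist, unretrieved, to pull a category's combined effect below significance. A sketch under the usual one-tailed α = .05 formulation; this is the conventional formula, assumed rather than taken from Table 3.

```python
def failsafe_n(z_scores: list[float], z_crit: float = 1.645) -> float:
    """Rosenthal's failsafe N for a set of per-study z scores.

    Solves sum(z) / sqrt(k + N) = z_crit for N, i.e. how many z = 0
    studies would drop the combined Stouffer z to the critical value.
    """
    k = len(z_scores)
    return (sum(z_scores) / z_crit) ** 2 - k

print(failsafe_n([2.1, 2.8, 1.9, 3.2]))  # ~33 hidden null studies needed
```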

  17. Analysis of Effect Size • Comprehensive Meta-Analysis® software from BioStat • Outputs • Cohen’s d • Hedges’s g • Q value • confidence intervals • fixed and random effects • heterogeneity testing results
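Two of those outputs are easy to state exactly: Hedges’s g is Cohen’s d with a small-sample bias correction, and the Q value is Cochran’s heterogeneity statistic. The sketch below uses the textbook fixed-effect formulas, which is an assumption about, not a quotation of, what the CMA software computes.

```python
def hedges_g(d: float, n1: int, n2: int) -> float:
    """Hedges's g: Cohen's d scaled by the small-sample correction J."""
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

def q_statistic(effects: list[float], variances: list[float]) -> float:
    """Cochran's Q: inverse-variance-weighted squared deviations of
    study effects around the fixed-effect mean."""
    weights = [1 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return sum(w * (e - mean) ** 2 for w, e in zip(weights, effects))
```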

  18. Figure 1. Mean Effect Sizes for Treatment Categories and Total Data. C1 = Questioning, C2 = Manipulation, C3 = Enhanced Material, C4 = Assessment, C5 = Inquiry, C6 = Enhanced Context, C7 = Instructional Technology, C8 = Collaborative Learning

  19. Conclusions

  20. What teaching strategies have been shown to improve student achievement in science? • All of the innovative strategies have a positive influence on student achievement. • Innovative science instruction is a mixture of teaching strategies. • Teaching strategies are tools, and the right tool must be selected for the job at hand.

  21. Table 4. Ranking of Teaching Strategies

  22. Most Powerful – Enhanced Context Strategies • Make learning relevant to students • Use real-world examples and problems • Problem-based learning • Case-based learning • Use technology to bring the real world into the classroom • Take students out of the classroom into the real world • Use multiple contexts to teach a concept

  23. Future Research – Meta-Analysis • Examine studies included in the meta-analysis to determine how many of them meet the “strong” or “possible” evidence of effectiveness standards of the U.S. Department of Education’s Institute of Education Sciences (see Identifying and Implementing Educational Practices Supported by Rigorous Evidence, available at http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf) • Broaden scope of meta-analysis to include: • International studies • Correlational studies (data on two variables collected and summarized, showing the relationship between the variables) • Studies dealing with attitudinal and motivational changes in students and teachers • Studies dealing with special populations (English-language learners, special education, under-represented populations, etc.) • Studies dealing with teacher professional development

  24. Products Based on Results of Meta-Analysis

  25. Products • Research-based Teaching Strategies for Effective Science Instruction • Rubric for Analyzing Science Products • Combined in booklet – Effective K-12 Science Instruction: Elements of Research-based Science Education

  26. Rubric Design Based on Meta-Analysis • Science content (accuracy and alignment; safety) • Organization and structure (format of materials; coherency) • Meaningful assessment (alignment; formative, summative, and metacognitive assessment) • Effective instructional practices (enhanced context, inquiry, instructional technology, collaborative learning, manipulation, and questioning strategies) • Equity and practicality (equity; practicality)

  27. Rubric Development • Draft created using criteria • Sent to advisory board and stakeholders for comment • Revision • Discussion with science teachers/ supervisors • Further revisions, clarifications, & weighting of categories • Field test • Statistical validation (Interrater reliability = .945 using Cronbach’s alpha)
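The reported interrater reliability of .945 uses Cronbach’s alpha, which here treats the raters as items: alpha = k/(k−1) · (1 − sum of per-rater variances / variance of rater totals). A sketch of that standard computation, assuming scores arranged as a products-by-raters matrix:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (products x raters) score matrix."""
    k = scores.shape[1]                      # number of raters
    rater_vars = scores.var(axis=0, ddof=1)  # variance of each rater's scores
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - rater_vars.sum() / total_var)
```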

  28. Questions or Comments?

  29. Dr. Carolyn Schroeder 979-458-4450 cschroeder@science.tamu.edu Booklets may be ordered for $1.50 each + shipping • Texas A&M University • Center for Mathematics and Science Education • 3257 TAMU • College Station, Texas 77843-3257 • http://www.science.tamu.edu/cmse/tsi
