
Evaluating the Impacts of Classroom Assessment Initiatives: Benefits and Potential Approaches

Evaluating the Impacts of Classroom Assessment Initiatives: Benefits and Potential Approaches. Third Annual Black Sea Conference, September 2014, Batumi, Georgia. Mathematica Policy Research: Ira Nichols-Barrer, Matt Sloan. Millennium Challenge Corporation: Ryan Moore, Jenny Heintz.


Presentation Transcript


  1. Evaluating the Impacts of Classroom Assessment Initiatives: Benefits and Potential Approaches Third Annual Black Sea Conference, September 2014, Batumi, Georgia Mathematica Policy Research: Ira Nichols-Barrer, Matt Sloan Millennium Challenge Corporation: Ryan Moore, Jenny Heintz

  2. Agenda • MCC & Education • MCC’s Approach to Evaluation • Context for the Current Evaluation • Impact Evaluations to Inform Education Policy • Evaluation Evidence Informing Assessment Programs and Policies • Next Steps and Points for Discussion

  3. MCC Portfolio • 26 Total Compacts • 13 Active • 3 Signed • 8 Closed • 2 Terminated

  4. MCC’s Results in Education • $454 million invested in education • 745 educational facilities constructed or rehabilitated • 4,160 instructors trained or certified • 228,847 student participants in MCC-supported educational activities • 61,848 graduates of MCC-supported educational facilities • Results as of March 2014

  5. Rationale for Impact Evaluation and MCC’s Approach • MCC is committed to assessing the impacts of its programs through rigorous impact evaluations conducted by independent research consultants • MCC impact evaluations contribute to a better understanding of the success and cost-effectiveness of its programs • Countries around the world seek opportunities to evaluate their programs to support evidence-based decisions; impact evaluation can help inform education policy in Georgia

  6. Context for the Current Evaluation • MCC has hired Mathematica to design and implement an impact evaluation for the Improving General Education Quality Project • Improved Learning Environment Infrastructure • Training Teachers for Excellence • Education Assessment Support • Mathematica is currently in the early design phase of the study

  7. Value of Evaluation Evidence to Policymakers • Evaluations can systematically inform policy: • U.S. Dept. of Education’s What Works Clearinghouse

  8. Potential for Evaluation Research to Inform Assessment-Related Policies • Opportunity to develop a pilot program with a rigorous impact evaluation focused on learning outcomes • Impact evaluations can target highly specific research questions related to assessments • The goal would be to align the program implementation schedule with expected changes in student learning • Many potential ideas to explore, such as: • Support additional schools in using standardized exams • Train teachers to use diagnostic assessments more regularly and consistently • Inform students or parents about exam performance • Encourage community engagement in understanding exam results and activities related to student learning

  9. Potential Research Questions for Assessment Policy (1) • Example: does greater use of assessment data for instructional feedback improve student learning? • A Yeh (2007) meta-analysis examined the impacts of requiring rapid formative assessment of students (2-5 times per week) with principal feedback • The study found that the intervention improved learning at low cost among disadvantaged students in the U.S. • Similar results from a line of research on “data-driven instruction” at U.S. charter schools • Potential study approach for this research question: random assignment of an intervention, activity, or program to classrooms or schools

  10. Potential Research Questions for Assessment Policy (2) • Example: does informing community stakeholders about local learning outcomes improve school performance? • Mixed results from past research: • An Andrabi et al. (2009) RCT in Pakistan found that distributing school report cards improved learning outcomes by 0.10 SD, with greater benefits in low-performing schools • Lieberman et al. (2012) and Banerjee et al. (2010) RCTs in Kenya and India found that publicizing reading test results did not change citizen engagement, oversight of schools, or learning outcomes • Research approach: random assignment of a program to communities

  11. Potential Research Questions for Assessment Policy (3) • Example: does support for greater community engagement in learning improve student outcomes? • Some promising results from past research: • A Banerjee et al. (2010) RCT in India found that disseminating exam results combined with training volunteer tutors significantly improved learning • A Barr et al. (2012) RCT in Uganda trained school management committees to develop their own indicators and objectives for school performance and to monitor results; the intervention improved student test scores • Research approach: random assignment of an intervention or program to communities
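  [Note: The study approaches on slides 9-11 all rely on random assignment of an intervention to clusters (classrooms, schools, or communities). As an illustration only, and not part of any proposed evaluation design, a minimal Python sketch of seeded cluster-level random assignment, using hypothetical school identifiers:]

```python
import random

def assign_clusters(cluster_ids, treatment_share=0.5, seed=2014):
    """Randomly assign clusters (e.g., schools) to treatment or control.

    A fixed seed makes the assignment reproducible, which matters for
    documenting and auditing the randomization in an impact evaluation.
    """
    rng = random.Random(seed)
    shuffled = list(cluster_ids)
    rng.shuffle(shuffled)
    n_treated = round(len(shuffled) * treatment_share)
    treated = set(shuffled[:n_treated])
    return {c: ("treatment" if c in treated else "control")
            for c in cluster_ids}

# Hypothetical frame of 40 schools, assigned 50/50.
schools = [f"school_{i:03d}" for i in range(1, 41)]
assignment = assign_clusters(schools)
```

  [In practice an evaluation would typically stratify (e.g., by region or baseline performance) before randomizing, but the core step is the same.]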

  12. Education Assessment Support – Discussion Questions and Evaluation Ideas • Questions for discussion: • What type of intervention would be of greatest interest for a pilot study? • Teacher use of classroom assessment data • Impacts of providing various types of individualized assessment information to students and parents • Role of community engagement in improving learning outcomes • Other ideas?

  13. For More Information • Please contact: • Matt Sloan • msloan@mathematica-mpr.com • Ira Nichols-Barrer • inichols-barrer@mathematica-mpr.com • Ryan Moore • moorera@mcc.gov • Jenny Heintz • heintzja@mcc.gov
