The National i3 Evaluation of Diplomas Now

Presentation Transcript


  1. The National i3 Evaluation of Diplomas Now DN Summer Institute – July 9, 2013

  2. Topics for this session • Refresher on the evaluation • Evaluation team • i3 context • Study design • Overview of data collection • Student and staff surveys 2013-14 • Fidelity of Implementation • Wave 1 Schools – Spring 2012

  3. Partner Organizations – DN Study • MDRC • A 40-year-old nonprofit, nonpartisan education and social policy research organization dedicated to learning what works to improve programs and policies that affect low-income individuals and communities • ICF International • A 40-year-old research and consulting firm that seeks to provide solutions and services that address challenging policy issues

  4. Goals of the i3 Validation Grant Program • Identify some of the most promising school improvement initiatives • Provide support for them to scale up nationally • Research their effectiveness using the most rigorous methodologies available • Document lessons learned about implementation during the scale-up process • Publicize study results to influence national and state policy

  5. Overview of Diplomas Now Study • Overall goal: Validate effectiveness of the Diplomas Now model • Research Questions: • What is the impact of Diplomas Now on students’ outcomes, particularly with regard to attendance, in-school behavior, and course performance? • What lessons can be learned about implementation of the model during national expansion?

  6. Study Design: Random Assignment • A random assignment design uses a lottery to assign participating schools to one of two groups • DN schools (implementing the DN program) • Non-DN schools (implementing any other school reform program)
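
To illustrate the lottery described on slide 6, here is a minimal sketch of a school-level random assignment in Python (hypothetical code, not the study's actual procedure; the real study likely randomized within districts or blocks, which would explain the 32/30 split on slide 7 rather than an even one):

```python
import random

def assign_schools(schools, seed=2013):
    """Randomly split participating schools into DN and Non-DN groups.

    A minimal sketch of a school-level lottery; the study's actual
    procedure (e.g., blocking within districts) is not described in
    these slides.
    """
    rng = random.Random(seed)       # fixed seed keeps the lottery reproducible
    shuffled = list(schools)        # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]   # (DN schools, Non-DN schools)

dn, non_dn = assign_schools([f"School {i}" for i in range(1, 63)])  # 62 schools
print(f"{len(dn)} DN schools, {len(non_dn)} Non-DN schools")
```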

  7. National sample • Currently, 62 secondary schools in 11 districts across the country are participating in the study • Study will compare student outcomes in the 32 middle and high schools that implement DN to those in the 30 schools that do not • Study will document implementation in the 32 DN schools, and also investigate how it compares to any school improvement efforts in the 30 Non-DN schools

  8. Data Collection [table of data collection activities] * For their participation, Non-DN schools will receive $10,000 in compensation each year

  9. Surveying: Spring 2014

  10. Questions and Answers

  11. DRAFT - NOT FOR DISTRIBUTION Diplomas Now Evaluation Fidelity of Implementation: Spring 2012

  12. Implementation Fidelity Data Sources • Fidelity of Implementation data come from the following sources: • Diplomas Now Implementation Support Team (DNIST) Survey • School Transformation Facilitator (STF) Survey • City Year Program Manager (CYPM) Survey • Communities In Schools (CIS) Site Coordinator (SC) Survey • Communities In Schools (CIS) Site Records DRAFT - NOT FOR DISTRIBUTION

  13. DN Fidelity of Implementation • Fidelity of implementation is based on the DN Logic Model and measured using the Fidelity of Implementation Matrix. • The matrix is built on 111 separate components, 62 of which were identified as critical to adequate implementation. • These components sort into 9 inputs, 6 of which were identified as critical to adequate implementation. DRAFT - NOT FOR DISTRIBUTION

  14. DN Fidelity of Implementation • That is… [figure] DRAFT - NOT FOR DISTRIBUTION

  15. DN Fidelity of Implementation • And… [figure] DRAFT - NOT FOR DISTRIBUTION

  16. [figure] DRAFT - NOT FOR DISTRIBUTION

  17. Fidelity Matrix: Inputs • Program Staff Training and Professional Development • 18 individual components, 15 of which are critical • Integrated On-Site Support (Critical Input) • 11 individual components, 9 of which are critical • Family and Community Involvement • 6 individual components, 1 of which is critical DRAFT - NOT FOR DISTRIBUTION

  18. Fidelity Matrix: Inputs (cont.) • Tiered Intervention Model (Critical Input) • 3 individual components, 2 of which are critical • Strong Learning Environments (Critical Input) • 6 individual components, 4 of which are critical • Professional Development and Peer Coaching (Critical Input) • 5 individual components, 2 of which are critical DRAFT - NOT FOR DISTRIBUTION

  19. Fidelity Matrix: Inputs (cont.) • Curriculum for College Readiness • 24 individual components, 4 of which are critical • Student Supports (Critical Input) • 24 individual components, 19 of which are critical • Student Case Management (Critical Input) • 14 individual components, 5 of which are critical DRAFT - NOT FOR DISTRIBUTION
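
To make the matrix structure on slides 13 and 17-19 concrete, the component counts could be encoded as follows (a hypothetical sketch; the dictionary layout and field names are assumptions, not the evaluation team's actual data format):

```python
# Component counts per input, taken from slides 17-19.
# input name: (total components, critical components, critical input?)
FIDELITY_INPUTS = {
    "Program Staff Training and Professional Development": (18, 15, False),
    "Integrated On-Site Support":                          (11,  9, True),
    "Family and Community Involvement":                    ( 6,  1, False),
    "Tiered Intervention Model":                           ( 3,  2, True),
    "Strong Learning Environments":                        ( 6,  4, True),
    "Professional Development and Peer Coaching":          ( 5,  2, True),
    "Curriculum for College Readiness":                    (24,  4, False),
    "Student Supports":                                    (24, 19, True),
    "Student Case Management":                             (14,  5, True),
}

assert sum(total for total, _, _ in FIDELITY_INPUTS.values()) == 111  # slide 13
assert sum(crit for _, _, crit in FIDELITY_INPUTS.values()) == 6      # 6 critical inputs
# Note: the per-input critical-component counts above sum to 61,
# while slide 13 cites 62.
```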

  20. DN Fidelity of Implementation • Fidelity is measured in two ways: a categorical rating and a continuous score • Implementation Rating (categorical measure): focused on critical components • Implementation Score (continuous measure): inclusive of all components DRAFT - NOT FOR DISTRIBUTION

  21. Implementation Rating • The rating focuses on “critical” components and “critical” inputs. How well did a site implement aspects of the model hypothesized to be most important to improving student outcomes? • Each input (e.g., program staff professional development) of the DN model was rated as either: • “Successful” - met implementation thresholds for all “critical” components • “Developing” - did not meet threshold for one or more critical components DRAFT - NOT FOR DISTRIBUTION

  22. Implementation Rating (cont.) • Individual input ratings served as the basis for the site-level fidelity rating, which falls into four categories: • Low: successful on fewer than 3 critical inputs • Moderate: successful on at least 3 critical inputs • Solid: successful on at least 5 critical inputs • High: successful on 8 or more inputs, including 5 critical inputs DRAFT - NOT FOR DISTRIBUTION
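
Read literally, the thresholds on slides 21-22 imply decision logic like the following (a sketch under those stated rules; function names and any unstated edge cases are assumptions):

```python
def rate_input(critical_met, critical_total):
    """Slide 21: an input is "Successful" only if the site met the
    implementation threshold for every one of its critical components."""
    return "Successful" if critical_met == critical_total else "Developing"

def site_rating(successful_inputs, successful_critical_inputs):
    """Slide 22: roll the 9 input ratings up to a site-level rating.

    successful_inputs          -- inputs (of 9) rated "Successful"
    successful_critical_inputs -- critical inputs (of 6) rated "Successful"
    """
    if successful_inputs >= 8 and successful_critical_inputs >= 5:
        return "High"
    if successful_critical_inputs >= 5:
        return "Solid"
    if successful_critical_inputs >= 3:
        return "Moderate"
    return "Low"

print(site_rating(successful_inputs=6, successful_critical_inputs=4))  # Moderate
```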

  23. Implementation Score • The score measures implementation of all aspects of the DN model, going beyond just the “critical” aspects of the model. How well did a site implement the model overall? • Each input is scored based on how well every one of its components was implemented. • The average of the 9 input scores provides the site-level implementation score (0-1 scale: the proportion of the entire model implemented by a site) DRAFT - NOT FOR DISTRIBUTION
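
As a worked sketch of the continuous measure on slide 23 (the example input scores below are hypothetical, not study data):

```python
def site_implementation_score(input_scores):
    """Slide 23: the site-level score is the unweighted average of the
    9 input scores, each the proportion of that input's components
    implemented, so the result lies on a 0-1 scale."""
    assert len(input_scores) == 9
    return sum(input_scores) / len(input_scores)

# Hypothetical input scores for a single site:
example = [0.60, 0.50, 0.70, 0.60, 0.55, 0.65, 0.50, 0.60, 0.65]
print(round(site_implementation_score(example), 2))  # 0.59
```

Presumably, a similar average taken across the 12 Cohort 1 sites underlies the 0.59 cross-site score reported on slide 24.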

  24. Wave 1 Schools - Year 1 Preliminary Findings Cohort 1: 12 DN Sites • 5 High Schools • 7 Middle Schools Cross-Site Implementation Rating • % of DN sites with Solid Implementation Rating: 0% • % of DN sites with Moderate Implementation Rating: 42% • % of DN sites with Low Implementation Rating: 58% Cross-Site Implementation Score • Overall Implementation Score: 0.59 DRAFT - NOT FOR DISTRIBUTION

  25. Highs and Lows: Critical Components by Input

  26. Highs and Lows: Critical Components by Input (cont.)

  27. Critical Components Met by < 50% of Sites Program Staff Training and Professional Development DRAFT - NOT FOR DISTRIBUTION

  28. Critical Components Met by < 50% of Sites Curriculum for College Readiness DRAFT - NOT FOR DISTRIBUTION

  29. Critical Components Met by < 50% of Sites Student Supports Student Case Management DRAFT - NOT FOR DISTRIBUTION

  30. Cohort 1 - Year 1 Preliminary Findings by H.S. DRAFT - NOT FOR DISTRIBUTION

  31. Cohort 1 - Year 1 Preliminary Findings by M.S. DRAFT - NOT FOR DISTRIBUTION

  32. National Evaluation Contacts • MDRC Project Director: William Corrin, Deputy Director, K-12 Education, (212) 340-8840, william.corrin@mdrc.org • ICF Project Manager: Aracelis Gray, Senior Manager, Health, Education and Social Programs, (703) 225-2290, agray@icfi.com
