
Presentation Transcript


  1. Agenda: • Block Watch outcome map • Program Theory overview • Evaluation theory overview • Mentoring Evaluation • Assignment 1 Evaluation Debrief

  2. Block Watch Exercise: Sketch out an outcome map for Block Watch. • Strategies • Short-term outcomes • Intermediate outcomes • Long-term outcomes • Goals
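One way to make the exercise concrete: the sketch below represents an outcome map as a plain Python dictionary, ordered from strategies to goals. The specific strategies and outcomes listed are illustrative assumptions, not the assignment's answer key.

    # Illustrative only: a Block Watch outcome map as a simple dict.
    # Every entry below is an assumed example, not the official program theory.
    block_watch_map = {
        "strategies": [
            "recruit block captains",
            "hold neighborhood meetings",
            "post Block Watch signage",
        ],
        "short_term_outcomes": [
            "residents know their neighbors",
            "suspicious activity gets reported",
        ],
        "intermediate_outcomes": [
            "more informal surveillance of the block",
            "faster notification of police",
        ],
        "long_term_outcomes": [
            "reduced burglary and street crime",
        ],
        "goals": [
            "safer, more cohesive neighborhoods",
        ],
    }

    # Print the map in causal order, strategies through goals.
    for layer in ("strategies", "short_term_outcomes", "intermediate_outcomes",
                  "long_term_outcomes", "goals"):
        print(layer, "->", "; ".join(block_watch_map[layer]))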

  3. Program Theory and Outcome Maps • You must know what you are evaluating before you can evaluate it • Outcome maps, program theories, and other tools generate information on what the program is and how it works • Program theory may have been borrowed (replicating an intervention), developed, implicit, or absent • Find the theory through written materials, interviews, and participatory processes, then verify or update it • Assess the program theory: • Is the logic of the model plausible? • Observation of the actual versus the intended program • Comparison to similar programs • Use of research related to the linkages within the model
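Continuing the toy example above, one way to operationalize "use of research related to the linkages" is to attach evidence to each adjacent pair of layers in the outcome map and flag any linkage with no support. The evidence entries here are placeholders, not real citations.

    # Illustrative only: flag outcome-map linkages that lack supporting evidence.
    # The evidence strings are placeholders, not real references.
    linkage_evidence = {
        ("strategies", "short_term_outcomes"): ["community-organizing studies"],
        ("short_term_outcomes", "intermediate_outcomes"): [],  # nothing found yet
        ("intermediate_outcomes", "long_term_outcomes"): ["informal-surveillance research"],
        ("long_term_outcomes", "goals"): ["crime-and-cohesion literature"],
    }

    for (upstream, downstream), evidence in linkage_evidence.items():
        status = "supported" if evidence else "NEEDS VERIFICATION"
        print(f"{upstream} -> {downstream}: {status}")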

  4. Evaluation questions and purposes: • Evaluation goals should follow directly from the components in the outcome map • Ask stakeholders and the evaluation sponsor what program decisions will be made, and by whom (à la Patton) • Questions will usually address: • Who are the targeted clients? What are their needs? • What activities are used, and how are they delivered? Who is getting what, and when? • What are the short-term and longer-term impacts? • Are these impacts cost-effective? • Be gentle with how questions are asked

  5. Participatory Processes: • Must be tailored to context and timing to be effective • Can change staff and organizational climate through empowerment, learning, collaboration, mutual appreciation • Can change clients through learning about program, self-learning, or empowerment • Can change relationships of program to community or other stakeholders

  6. Mentoring evaluation: • Logic model • Evaluation questions • Design of evaluation • Comparisons for impact evaluation • Multi-site programs / cluster evaluations

  7. Chapter 1 gives the logic model (visual) (p. 4) • Program goals follow from the legislation • Shows the flow from activities to short- and longer-term outcomes • Activities are common across sites, but there is variation, and that may affect outcomes • Multiple impacts for students; more specific than the goals

  8. Evaluation questions (pp. 7-8): • Primary research questions address program impact and variation in impacts by subgroup • Secondary questions address variation across sites in activities and the impact of that variation

  9. Design of Evaluation: • Impact study: • Chose 32 programs (of 255 grantees for 2005 and 2006) • Randomly assigned 2,573 students within those 32 sites to program or control • Assesses process and impact (overall and by site) • Grantee study: • Compares the 32 impact-study programs to 100 randomly selected programs (with overlap) • Characteristics of programs and students
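A minimal sketch of the within-site random assignment described above, assuming a simple 50/50 split inside each site; the site labels and student IDs are fabricated for illustration.

    import random

    # Illustrative only: randomize students to treatment or control
    # *within* each program site, as in the impact study design.
    random.seed(42)  # fixed seed so the assignment is reproducible

    # Hypothetical roster: site label -> list of student IDs.
    roster = {
        "site_A": list(range(1, 81)),
        "site_B": list(range(81, 161)),
    }

    assignment = {}
    for site, students in roster.items():
        shuffled = students[:]
        random.shuffle(shuffled)
        half = len(shuffled) // 2
        for sid in shuffled[:half]:
            assignment[(site, sid)] = "treatment"
        for sid in shuffled[half:]:
            assignment[(site, sid)] = "control"

    # Quick check: balance of arms within each site.
    for site, students in roster.items():
        treated = sum(assignment[(site, sid)] == "treatment" for sid in students)
        print(site, "treatment:", treated, "control:", len(students) - treated)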

  10. Surveys of Programs and Mentors

  11. Surveys and records for students

  12. Impact Estimates (p. xxi) • Table shows unadjusted and regression-adjusted impacts • Study adjusts the critical value (to below .05) when testing many outcomes, using the Benjamini-Hochberg (B-H) adjustment

  13. Impact Estimates, annotated (p. xxi) • Same table of unadjusted and regression-adjusted impacts, with the B-H adjustment of the critical value • Annotations on the slide label the regression's site dummy (the "fixed effect") and the treatment dummy variable
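A minimal sketch of what a regression-adjusted impact with site fixed effects and a B-H correction could look like, assuming pandas/statsmodels and a fabricated toy dataset; the variable names (score, treat, site) are illustrative, not the study's.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.multitest import multipletests

    # Illustrative only: toy data mimicking the design, not the study's data.
    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "site": rng.choice(["A", "B", "C", "D"], size=n),
        "treat": rng.integers(0, 2, size=n),
    })
    df["score"] = 50 + 2.0 * df["treat"] + rng.normal(0, 10, size=n)

    # Regression-adjusted impact: treatment dummy plus site dummies (fixed effects).
    model = smf.ols("score ~ treat + C(site)", data=df).fit()
    print("adjusted impact:", round(model.params["treat"], 2),
          "p =", round(model.pvalues["treat"], 4))

    # With many outcomes, the Benjamini-Hochberg step-up procedure pushes the
    # effective per-test critical value below .05.
    p_values = [0.01, 0.04, 0.20, 0.03]  # pretend p-values for four outcomes
    reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
    print("B-H adjusted p-values:", np.round(p_adj, 3), "reject:", reject)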

  14. Multi-site Evaluations

  15. Results: • Most students in the treatment group received mentoring; some in the control group did as well • The program mostly served students as expected • No significant impacts of the program on student outcomes overall (after accounting for multiple comparisons), but impacts for some subgroups • Some significant effects of program characteristics on impacts

  16. Assignment 1 Debrief: • What were the evaluation questions? • How did the program theory inform the evaluation goals? • What were the strengths and limitations of your evaluation? • Each group comes back with general questions and insights AND one example you want to present
