
From Logic Models to Evaluation Planning and Implementation



Presentation Transcript


  1. From Logic Models to Evaluation Planning and Implementation Workshop: July 19, 2011 Susan Grantham, Ph.D. Naomi Clemmons, MPH Nancy Kasen, MS John Snow, Inc.

  2. OBJECTIVES: Participants will: • Understand the importance of evaluation for making informed program decisions • Understand the importance of evaluation for communicating to other stakeholders • Understand how the logic model leads into an evaluation plan • Know and practice steps involved in developing an evaluation plan • Know how to proceed with the development of an evaluation plan • Know how to work with an outside evaluator

  3. Overview of Workshop Review logic model development Discuss technical review findings Transition from the logic model to evaluation Walk through steps of developing an evaluation plan Conduct two small group sessions to practice key components of developing your evaluation plan

  4. What is a Logic Model? • Picture/graphic of how an initiative, project, or program works • Systematic way to show the connections among parts of a project • Underlying logic – reflects implicit assumptions about how change occurs • (IF – THEN logic: if we do X, then Y will happen) • Makes explicit the “theory of change” in local context

  5. How Can You Use a Logic Model? • For Project Planning • Understanding & specifying project elements • Involving stakeholders and partners to reach agreement • For Project Management • Monitoring & improving project implementation • Communicating & building consensus • For Project Evaluation • Showing assumptions in line of reasoning • Suggesting measures needed for evaluation • For Communications • Writing proposals for further funding → sustainability

  6. Resources Activities Outputs Outcomes Impact (Planned Work → Intended Results) • Certain resources are needed to operate your program • If you have access to these resources, then you can use them to accomplish your planned activities • If you accomplish your planned activities, then you will deliver the amount of product and/or service that you intended • If you accomplish your planned activities to the extent you intended, then your participants will benefit in certain ways • If these benefits to participants are achieved, then certain changes in organizations, communities, or systems might be expected to occur
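To make the IF-THEN reading concrete, here is a minimal sketch (Python, purely illustrative; only the five column names come from the slide, the example content is invented) that walks a logic model from left to right:

```python
# A logic model as an ordered mapping from column name to example content.
# The column names come from the slide; the example entries are invented.
logic_model = {
    "Resources":  "staff, families/consumers, partners, grant funding, in-kind contributions",
    "Activities": "trainings, learning collaboratives, website, resource dissemination",
    "Outputs":    "counts of trainings held, providers reached, resources distributed",
    "Outcomes":   "more families reporting a medical home, more practices screening early",
    "Impact":     "organizational, community, or system change tied to the MCHB core outcomes",
}

# Walk the IF-THEN chain: each column, if realized, sets up the next one.
columns = list(logic_model)  # dicts preserve insertion order in Python 3.7+
for earlier, later in zip(columns, columns[1:]):
    print(f"IF the {earlier} column is realized, THEN results in the {later} column can be expected.")
```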

  7. Activities Outputs Outcomes Impact Resources • Examples: • Staff • Families/Consumers • Partners • Grant funding • In-kind contributions • Resources are whatever the agency needs to deliver the program • Usually are nouns • Also called “inputs”

  8. Activities Outputs Outcomes Impact Resources • Examples: • Conduct trainings • Develop statewide provider inventory • Conduct learning collaboratives • Engage consumers in Advisory Board • Develop website • Disseminate resources to providers • Disseminate resources to families/consumers • Activities are actions an organization takes to conduct its program or project • Usually expressed as verbs • How resources will be used

  9. Outputs Resources Activities Outcomes Impact • Outputs are direct results of activities undertaken • Quantify • Think in terms of target audience & how many

  10. Outcomes Resources Activities Outputs Impact • Increase the number of children, youth and families who report that they have a medical home • Increase the number of pediatric practices engaging in early and continuous screening opportunities to identify children with special health care needs • Increase the number of families that indicate that they are included as partners at all levels of decision making in their child’s health care • Outcomes are how the target audience changes or benefits after participating in the program’s activities • Short term: Immediate actions desired among participants • Measures are usually expressed quantitatively
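Since short-term outcome measures are usually expressed quantitatively, a minimal sketch of how one such measure might be tallied (entirely hypothetical survey responses, not program data) could look like this:

```python
# Hypothetical yes/no survey responses to "does your child have a medical home?"
# (True = yes). These are invented values, not data from any program.
baseline_responses  = [True, False, True, False, False, True, False, True]
follow_up_responses = [True, True, True, False, True, True, False, True]

def pct_yes(responses):
    """Percent of respondents answering 'yes'."""
    return 100.0 * sum(responses) / len(responses)

baseline, follow_up = pct_yes(baseline_responses), pct_yes(follow_up_responses)
print(f"Baseline: {baseline:.0f}%, follow-up: {follow_up:.0f}%, "
      f"change: {follow_up - baseline:+.0f} percentage points")
```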

  11. Resources Activities Outputs Outcomes Impact • Family/professional partnership at all levels of decision-making; • Access to coordinated ongoing comprehensive care within a medical home; • Access to adequate financing and private and/or public insurance to pay for needed services; • Early and continuous screening for special health care needs; • Organization of community services for easy use; • Youth transition to adult health care, work and independence. • Impact is organizational, community, or system changes expected over the long-term • Derived from MCHB’s six core outcomes • Tailor to what your program/project aims to achieve over the life of the grant

  12. Technical Review Key Findings • Inputs • Legislation/policy • Point for leveraging/accelerating implementation • Diversity of inputs/resources • Comprehensive? Strong? • Think about sustainability • 4-legged stool: Medicaid, Title V, family, and providers (e.g., AAP)

  13. Technical Review Key Findings • Activities • Macro/overall project • Micro/specific project component • Key activities well defined • Outputs • Outputs vs. outcomes • Review activities • Think about intended results

  14. Technical Review Key Findings • Outcomes • SMART language • How will you know when you get there? • Think audience • Who will benefit? • Who will be interested? • Impact • Six core performance outcomes

  15. Moving from logic model to evaluation

  16. Purpose of Evaluation • Formative – “to improve”: Provides information to help the team improve the program • Information/data must be provided and shared quickly and on a routine basis • Drawn from the activities, outputs, and short-term outcomes columns of the logic model • Aka: process evaluation • Summative – “to prove”: Generates information that demonstrates the results of your program • Information/data collected throughout the project, but the purpose is to prove results • Drawn from the intermediate/long-term outcomes and impact columns of the logic model • Aka: outcome or impact evaluation

  17. [Diagram: logic model columns Resources, Activities, Outputs, Outcomes, Impact, with formative/process evaluation spanning the earlier columns and summative/outcomes evaluation spanning Outcomes and Impact]
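Assuming the split shown in the diagram (formative/process evaluation drawing on the left-hand columns, summative/outcomes evaluation on Outcomes and Impact), an illustrative lookup might be:

```python
# Assumed mapping based on the diagram above; per slide 16, short-term outcomes
# may also inform formative evaluation.
EVALUATION_FOCUS = {
    "formative (process) evaluation":  ["Resources", "Activities", "Outputs"],
    "summative (outcomes) evaluation": ["Outcomes", "Impact"],
}

def evaluation_type(column: str) -> str:
    """Return which type of evaluation a logic model column chiefly informs."""
    for eval_type, columns in EVALUATION_FOCUS.items():
        if column in columns:
            return eval_type
    raise ValueError(f"Unknown logic model column: {column}")

print(evaluation_type("Outputs"))  # -> formative (process) evaluation
print(evaluation_type("Impact"))   # -> summative (outcomes) evaluation
```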

  18. What Level of Evaluation Bronze - Process evaluation (focused on activities and outputs columns of logic model) Silver - Process evaluation and results of short-term outcomes Gold - Process evaluation and summative evaluation

  19. Evaluation Planning Steps Complete logic model with team Assess audiences for evaluation and questions they would like addressed Develop measures (using SMART language) Identify data needs to fulfill measures Identify who will collect data and timeframe for doing so

  20. Sample Evaluation Planning Matrix
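The matrix itself is not reproduced in this transcript. The sketch below only illustrates what one row of an evaluation planning matrix could capture; the column headings and the example row are assumptions drawn from the planning steps above, not the workshop's actual template:

```python
import csv
import io

# Assumed column headings for an evaluation planning matrix; the example row is invented.
FIELDS = ["evaluation_question", "audience", "logic_model_column",
          "smart_measure", "data_source", "who_collects", "timeframe"]

rows = [{
    "evaluation_question": "Are pediatric practices adopting early and continuous screening?",
    "audience": "HRSA project officer",
    "logic_model_column": "Outcomes",
    "smart_measure": "By June 2012, 10 additional practices screen at every well-child visit",
    "data_source": "Practice self-report survey",
    "who_collects": "Project evaluator",
    "timeframe": "Annually",
}]

# Write the matrix as CSV so it can be shared with the evaluation team.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```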

  21. Step 1: Complete logic model with team

  22. Step 2: Assess audiences for evaluation and questions they would like addressed

  23. Who are the audiences for evaluation? (From: W.K. Kellogg Foundation)

  24. Exercise #1: Think about the MCHB core outcomes that your project is working toward • Brainstorm various audiences who would be interested in the evaluation • List questions that 3 different target audiences would want answered about your project (the three audiences must be Medicaid, the state legislature funder, and HRSA or another current funder!) • Identify the column(s) of the logic model where answers to the questions are addressed • Identify whether the questions are formative or summative

  25. Step 3: Develop measures (using SMART language)

  26. SMART • S = SPECIFIC • M = MEASURABLE • A = ATTAINABLE/ACHIEVABLE • R = RELEVANT • T = TIMEBOUND
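A hypothetical example of rewording a loosely stated objective into SMART form and storing each element explicitly (every name, target, and date below is invented for illustration):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SmartObjective:
    specific: str    # S: what will change, and for whom
    measurable: str  # M: how progress will be counted
    target: float    # A: an attainable/achievable level to reach
    relevant: str    # R: which MCHB core outcome it supports
    deadline: date   # T: the time-bound end point

# Vague: "Increase family partnership in decision making."
# SMART rewording (hypothetical target and date):
objective = SmartObjective(
    specific="Families served by participating practices report being partners in care decisions",
    measurable="Percent answering 'yes' on the annual family survey",
    target=75.0,
    relevant="Family/professional partnership at all levels of decision-making",
    deadline=date(2012, 6, 30),
)
print(objective)
```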

  27. Rhode Island Example: Pediatric Practice Enhancement Project (PPEP) • Placement of Parent Consultants in select practices • Claims data analysis in collaboration with the State’s managed care organization for CYSHCN • Pre/post comparison of PPEP participants • Comparison of CYSHCN in PPEP practices to CYSHCN in non-PPEP practices

  28. Rhode Island Findings • Number of health care (HC) encounters was 21% higher for PPEP compared to non-PPEP • Pre/post comparison of the PPEP group revealed fewer HC encounters in the period following PPEP participation • Inpatient utilization was 34% lower for PPEP participants compared to non-PPEP • Annual HC costs were 27% lower for PPEP participants compared to non-PPEP
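Findings like "annual HC costs 27% lower" are relative differences between group means. The arithmetic, sketched here with invented numbers rather than the actual Rhode Island claims data:

```python
# Invented means only, NOT the Rhode Island claims data: how a relative
# difference such as "annual HC costs were 27% lower" is computed.
def relative_difference(group_mean, comparison_mean):
    """Percent by which group_mean differs from comparison_mean (negative = lower)."""
    return 100.0 * (group_mean - comparison_mean) / comparison_mean

mean_cost_ppep = 7_300.0       # hypothetical mean annual cost, PPEP participants
mean_cost_non_ppep = 10_000.0  # hypothetical mean annual cost, non-PPEP comparison

diff = relative_difference(mean_cost_ppep, mean_cost_non_ppep)
direction = "lower" if diff < 0 else "higher"
print(f"Annual HC costs were {abs(diff):.0f}% {direction} for PPEP participants")
```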

  29. Exercise #2 • Discuss the audience for each goal/outcome • Discuss how you will know if you succeeded • Reword in SMART language • For each SMARTly-worded objective, identify measures and data sources

  30. Planning for Evaluation Bring in partners Costs of evaluation vs. scope of evaluation Outside vs. inside evaluator

  31. Resources • http://www.wkkf.org – W.K. Kellogg Foundation Evaluation Handbook • Robert Wood Johnson Foundation. 2009. A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions. • Technical Assistance through National Centers and JSI

  32. Thank you.
