
Evaluating the Initiative


Presentation Transcript


  1. Evaluating the Initiative Community Tool Box Curriculum Module 12

  2. What have been your previous experiences in evaluation? Is your partnership or coalition currently undergoing any type of evaluation?

  3. Concepts and Attributes of Evaluation Evaluation: Systematic investigation of the merit, worth, or significance of an object or effort

  4. Concepts and Attributes of Evaluation Evaluation as: • A Process: ongoing, iterative (repeating) • A Product: information used for continuous improvement

  5. Why an Evaluation? • Help understand how efforts work • Ongoing feedback can improve community work • Gives community members voice and opportunity to improve efforts • Hold accountable those groups doing, supporting, and funding the work

  6. A Bill of Rights and Responsibilities for Community Evaluation • Contribute to understanding • Contribute to improvement • Encourage participation of all stakeholders • Respond to interests of different stakeholders • Capture the dynamic nature of the work • Provide clear and timely information • Be practical, with the benefits outweighing the costs • Do no harm to the community • Help clarify the initiative’s contribution to more distant outcomes • Strengthen capacity

  7. Concepts and Attributes of Evaluation DIALOGUE: Which of these attributes is particularly important to your community initiative? How will this attribute be assured in your evaluation?

  8. Overview of an Evaluation Plan • Identify stakeholders • Describe the program • Focus the evaluation design • Gather credible evidence • Make sense of data and justify conclusions • Use information to celebrate, make adjustments and communicate lessons learned

  9. “We should make things as simple as possible, but not simpler.” -- Albert Einstein

  10. Determining Who Cares and What They Care About Identify Who Cares – Stakeholders: • Those involved in operating the program or initiative • Those served or affected • Primary intended users of the evaluation (e.g., funders, staff, researchers)

  11. DIALOGUE: Who cares about your effort and its effects?

  12. Identify What They Care About: • Different stakeholders may have different perspectives. Ask. What evaluation questions or aspects are of interest to: • Community groups • Grantmakers and funders • Outside researchers

  13. Evaluation Stakeholders & Interests Program or Initiative: ______________________

  14. Developing Evaluation Questions • An evaluation question seeks information stakeholders want to know about the functioning of the program • Examples: • Did the intervention have the desired effect? • With whom? • Under what conditions?

  15. Categories of Evaluation Questions • Process • Planning and implementation issues • Outcome • Attainment of objectives • Impact on participants • Impact on the community

  16. Process Measures Measures for planning and implementation • How important are the goals? • How well was the effort planned? • How well was it implemented? • Who participated? How many? • Did those most affected contribute to planning, implementation, and evaluation? • How satisfied are stakeholders with the program?

  17. Outcome Measures Attainment of objectives • How well has the program met its stated objectives?

  18. Developing Outcome Measures Impact on participants • How much and what kind of difference has the program made for its targets (e.g., in knowledge, behaviors, outcomes)? • From the perspective of participants (and outside experts), how significant were the effects?

  19. Developing Outcome Measures Impact on the community • How much and what kind of difference has the effort made on the community (e.g., in community and systems changes, rates of behavior, population-level outcomes)? • From the perspective of participants (and outside experts), how significant were the effects? • Were there any unintended consequences? Positive? Negative? • Is the community’s capacity to address problems improved—across issues, over time?

  20. ACTIVITY: What evaluation questions related to process measures or outcome measures is your group interested in?

  21. Evaluation Questions Program or Initiative: _______________________

  22. Developing Meaningful Questions Consider: • What exactly was it that you originally hoped to accomplish? • How did you plan to do so? • What does your strategic plan suggest? • What does your logic model suggest—activities, outputs, and outcomes?

  23. Evaluating the Initiative “Wonder is the beginning of wisdom.” --Anonymous

  24. Gathering Evidence to Address the Evaluation Questions • Evidence—information that could be used to assess the merit or worth of a program • Gathering credible evidence—Overview: • Indicators of Success • Sources of Evidence • Quality of Evidence • Quantity of Evidence • Logistics for Gathering Information

  25. Indicators of Success Translate expected effects into specific measurable units • Examples include: • Program outputs (e.g., services delivered) • Participation rates • Levels of satisfaction • Intervention exposure, or dose • Changes in communities and systems • Changes in behaviors of participants, populations • Population-level outcomes
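
Slide 25's indicators only become useful once they are computed from actual records. Below is a minimal sketch, in Python, of turning attendance and satisfaction data into two of the listed indicators (participation rate and level of satisfaction); every number, list, and variable name here is an illustrative assumption, not part of the module.

```python
# Minimal sketch: translating expected effects into measurable indicators.
# All data below are hypothetical examples, not module content.

attendance = [42, 51, 48, 55]              # participants at each of four sessions
eligible_population = 400                  # estimated reachable population (assumed)
satisfaction = [4, 5, 3, 4, 5, 4, 4, 5]    # 1-5 survey ratings (assumed)

# Indicator: participation rate (average attendance / eligible population)
participation_rate = (sum(attendance) / len(attendance)) / eligible_population

# Indicator: level of satisfaction
avg_satisfaction = sum(satisfaction) / len(satisfaction)
pct_satisfied = sum(1 for s in satisfaction if s >= 4) / len(satisfaction)

print(f"Average participation rate: {participation_rate:.1%}")
print(f"Mean satisfaction (1-5): {avg_satisfaction:.2f}")
print(f"Share rating 4 or higher: {pct_satisfied:.1%}")
```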

  26. DIALOGUE: What are some indicators of success for the Austin Healthy STEPS Initiative?

  27. Sources of Evidence • People and Surveys • Documents (e.g., archival data, statistics) • Observation and Experimentation • More than one source helps make evidence more compelling

  28. DIALOGUE: What are some possible sources of evidence for the Austin Healthy STEPS Initiative?

  29. Quality of Evidence • Appropriateness and integrity of information • Reliability • Validity • Relationship to the evaluation questions • High quality data are both accurate and sensitive

  30. Quantity of Evidence • Sufficient data to draw conclusions • Enough to answer evaluation questions • Adequate for data analysis • Not more than is necessary
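
One common way to make "enough but not more than necessary" concrete is a standard sample-size calculation for estimating a proportion. The sketch below uses the textbook formula n = z^2 * p(1-p) / e^2; the confidence level and margin of error are illustrative assumptions, not figures from the module.

```python
# Minimal sketch: how many survey responses are "enough" to estimate a
# population proportion? Uses the standard formula n = z^2 * p(1-p) / e^2.
# Confidence level and margin of error below are illustrative assumptions.

import math

def sample_size(margin_of_error=0.05, z=1.96, expected_p=0.5):
    """Minimum responses to estimate a proportion within the given margin."""
    n = (z ** 2) * expected_p * (1 - expected_p) / margin_of_error ** 2
    return math.ceil(n)

# 95% confidence (z = 1.96), +/-5 percentage points, worst-case p = 0.5
print(sample_size())  # -> 385; collecting much more adds little precision
```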

  31. Logistics for Gathering Information • Methods • Timing • Infrastructure • Attention to cultural norms • Guarantee of confidentiality

  32. Selecting Evaluation Methods • Surveys about satisfaction and importance of the effort • Goal attainment reports • Behavioral surveys • Interviews with key participants • Archival records

  33. Selecting Evaluation Methods • Observations • Self-reports, logs or diaries • Documentation systems and analysis of contribution • Community-level indicators of impact • Case studies and experiments

  34. Ongoing Documentation and Evaluation • Document unfolding of intervention: • Community and systems changes (i.e., new or modified programs, policies and practices) • Analysis of contribution to population-level outcomes—Amount by: • Goal • Intensity of behavior change strategy • Duration • Penetration to Targets through Sectors in Places
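
A documentation system like the one slide 34 describes can be as simple as a running log of community and systems changes that is tallied along the listed dimensions. The sketch below is a minimal Python illustration; the records, field names, goals, and sectors are all hypothetical.

```python
# Minimal sketch of an ongoing documentation log: record community and
# systems changes as they unfold, then tally contribution by dimension.
# All records, goals, and sectors below are hypothetical.

from collections import Counter

changes = [
    {"type": "new program",       "goal": "physical activity", "sector": "schools",     "months": 12},
    {"type": "new policy",        "goal": "healthy eating",    "sector": "worksites",   "months": 6},
    {"type": "modified practice", "goal": "physical activity", "sector": "health orgs", "months": 9},
]

print("Changes by goal:  ", dict(Counter(c["goal"] for c in changes)))
print("Changes by sector:", dict(Counter(c["sector"] for c in changes)))
print("Average duration (months):", sum(c["months"] for c in changes) / len(changes))
```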

  35. Assembling Evidence Chart Program or Initiative: ______________________

  36. Gathering Evidence to Address the Evaluation Questions DIALOGUE: What types of evaluation methods would be useful to the STEPS initiative? DIALOGUE: Which of these evaluation questions would be of interest to key stakeholders for your initiative? What are indicators of success? What evaluation methods would you use?

  37. Evaluating the Initiative “The best test … is how well the modeler can answer the questions, ‘What do you know now that you did not know before?’ and ‘How can you find out if it is true?’” --James M. Brower

  38. Using Evaluation Data to Learn and Make Adjustments Making Sense of the Data and Justifying Conclusions • Standards • Analysis and Synthesis • Sense Making and Interpretation • Judgments • Recommendations
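
"Standards" in slide 38 means criteria agreed on before the analysis, against which results are judged. Here is a minimal sketch of that step, assuming hypothetical pre/post measurements and a stakeholder-agreed benchmark; none of these figures come from the module.

```python
# Minimal sketch of judging results against a pre-agreed standard.
# Baseline, follow-up, and benchmark values below are hypothetical.

baseline_rate = 0.22     # share meeting a behavior target before the effort
followup_rate = 0.29     # same measure afterward
benchmark = 0.05         # gain stakeholders agreed in advance counts as meaningful

change = followup_rate - baseline_rate
verdict = "meets" if change >= benchmark else "falls short of"
print(f"Observed change: {change:+.1%}; this {verdict} the agreed benchmark of {benchmark:.0%}.")
```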

  39. Using Evaluation Data to Learn and Make Adjustments DIALOGUE: How will the Austin Healthy STEPS group use the information to make judgments about the merit of the effort?

  40. Some Ways to Use the Information • Celebrate accomplishments • Make adjustments • Communicate lessons learned

  41. Using the Information • Plan for use • Consider the actual evaluation results • Encourage use of learning from the evaluation process

  42. Keeping Your “Eyes On The Prize” • The ultimate goal is to improve outcomes • Distant outcomes may take too long to appear to be useful for ongoing feedback • Crucial to document intermediate outcomes to support movement toward more distant outcomes

  43. Using Evaluation Data to Learn and Make Adjustments DIALOGUE: How might the STEPS initiative use its findings to learn and make adjustments? ACTIVITY: Who might use the evaluation findings to learn and make adjustments to your program? Use the following chart to map out the key questions you are going to ask, potential findings, and related recommendations.

  44. Using an Evaluation Information Chart • Evaluation Questions of Interest to Key Stakeholders (e.g., Did the program result in behavior change? Do people like the program?) • Actual (Potential) Findings and Key Conclusions (e.g., The results showed that there was only a slight increase in… The findings suggest that…) • Actual (Potential) Recommendations (e.g., Consistent with these findings, we recommend that… including…)

  45. Communicating the Findings to Relevant Audiences How, when, and what to communicate from an evaluation is a strategic decision.

  46. Sharing Lessons Learned Strategy Issues • Timing • Style • Tone • Message Source • Vehicle • Format of the information products

  47. Communicating the Findings to Relevant Audiences Broadening Your Dissemination • Formal evaluation • Methods for delivering evaluation data: • Press releases • Internet distribution, web pages, links • Storytelling • Presentations and publications • Consulting by group members or teams • Use multiple and diverse methods

  48. ACTIVITY: Use the chart on the next slide to identify the message, audience, plan and channel to communicate the findings of your evaluation.

  49. Communicating Results: A Communications Plan • THE MESSAGE - Actual (Potential) Findings Plus Recommendations (e.g., The results show… The findings suggest… We recommend…) • AUDIENCE - Who should receive this message? • SOURCE - Who should deliver it? • CHANNEL - How should it be delivered? (e.g., personal contact, written report, media, professional presentation or publication)
