Evaluation


Presentation Transcript


  1. Evaluation North Orange County Public Safety Task Force - CBO Capacity Building Workshop Series

  2. WORKSHOP OBJECTIVES • Reconsider your mindset about evaluation • Learn how to approach your work with an evaluative mindset • Take some of the mystery out of evaluation • Increase knowledge of what makes for a stronger evaluation • Increase confidence for those planning their first evaluation or strengthening their evaluation capacity

  3. Principles • We engage in evaluation every day. • Evaluation is not an activity but a way of doing business. • There is no single best way to do evaluation. • However, there are steps which, if taken, set your agency up for a more successful evaluation.

  4. Principles • Evaluation is fundamentally about ascribing value to some program, intervention, agency, etc. • When planning the evaluation, aim to measure no more than you need. • Evaluation is not about looking for 100% certainty that a program, agency, or intervention produced the outcomes – don’t let that belief stop you!

  5. EVALUATION IS SOMETHING WE ALL ENGAGE IN • When was the last time you purchased an expensive product? • When was the last time you conducted a job search? • When was the last time you had to choose between two or more curricula for a new program?

  6. Evaluative thinking • A cognitive process motivated by inquisitiveness and a belief in the value of evidence • Identifying assumptions • Posing thoughtful questions • Pursuing deeper understanding through reflection and perspective taking • Making informed decisions in preparation for action

  7. Mainstreaming evaluation Mainstreaming evaluation means making it central to an agency’s work, rather than an add-on, end-of-project mandate.

  8. What can evaluation do for your agency? • Measures the change your programs create. • Tells you where you are doing well and where improvements are needed. • Helps develop language that tells your story. • Holds the agency accountable to participants, community members, and funders. • Keeps agencies on track and offers ways to make strategic decisions when plans need to shift. • Helps determine whether the work you are doing is the most meaningful work and/or the right work for the right people at the right time. • Helps develop an evidence base for the agency’s work.

  9. THERE IS NO ONE BEST WAY TO CONDUCT EVALUATION • Purpose • Evaluative questions • Methodology • Inclusiveness • Level of confidence required

  10. Considerations that make for a more successful evaluation • When do you plan the evaluation? • Who should be included in evaluation planning / implementation? • What is the context of your evaluation?

  11. WHO TO INCLUDE IN EVALUATION PLANNING/IMPLEMENTATION? • Who is impacted by the program and evaluation? • Who makes decisions about, or can impact, how the program or the evaluation is implemented? • Whose voices are most in need of amplification during evaluation planning? • To whom will the evaluation need to speak? To whom do you need or want to tell the story of your work? • What areas of expertise do you need? Who can you draw on for that expertise?

  12. Context of evaluation • What external factors impact the way your evaluation will be carried out and the way the data will or won’t be used? • What resources are available to support evaluation efforts? • What characteristics of your target audience might have an impact on your evaluation? • What funder constraints or guidelines do you need to follow or meet?

  13. How to do evaluation (simplified) • Ask an important question about your work • Collect information that helps you answer that question • Make sense of that information (form an evaluative conclusion) • Act on that information

  14. Elements of a good evaluation • Inclusion of a rich program description • Clearly stated purpose • Meaningful evaluation questions • Explicit description of the criteria to be used to judge the value of the program • A mix of evidence • An evaluative conclusion

  15. PROGRAM DESCRIPTION Logic Models: a way of depicting the program by specifying inputs, activities, outputs, and outcomes. • Must be sequential. • Must be logical. Theory of Change: explains how the program produces desired outcomes. • Must be explanatory: it must explain why and how the activities produce the outcomes.
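
To make the four components concrete, a logic model can be sketched as a simple data structure. A minimal sketch in Python; the truancy-reduction program and every entry in it are hypothetical, invented for illustration:

```python
# A minimal sketch of a logic model as a plain dictionary.
# The program and all entries below are hypothetical examples.
logic_model = {
    "inputs":     ["2 case managers", "school attendance data", "grant funding"],
    "activities": ["weekly mentoring sessions", "parent outreach calls"],
    "outputs":    ["40 students mentored", "120 outreach calls completed"],
    "outcomes":   ["former truants regularly attend school"],
}

# The chain must read sequentially and logically: inputs enable activities,
# activities produce outputs, and outputs lead to outcomes.
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage.upper()}: {', '.join(logic_model[stage])}")
```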

  16. Where evaluative thinking comes in handy

  17. DIFFERENT PURPOSES of evaluation • To determine overall quality or value of something • To find areas of improvement • Both of the above

  18. Types of evaluation questions • Questions about the absolute value of something (summative) • Questions about the relative value of something (formative)

  19. Purpose and questions

  20. Selecting criteria (outcomes) • Outcomes are clear statements of targeted change. • Community / social norms • School/community climate • Individual attitudes, beliefs, or behavior • Relationship dynamics • Organizational operations and practices

  21. Selecting indicators • An indicator is a measurement of the outcome • Number of children in foster care who are safely reunited with family of origin • Number of unemployed who become employed • Number of former truants who regularly attend school • Score on an instrument that measures self-esteem

  22. A note on selecting indicators • Choose indicators that are meaningful, easy to collect, and closest to your questions. • Resist collecting data simply out of curiosity. • How will you know if …. • Consider what data will be credible to your audience • “No numbers without stories; no stories without numbers” (Patton, 2014)

  23. WHEN TO COLLECT DATA? • Pre/Post • Retrospective Pre/Post • Post only (with comparison) • Ongoing and integrated
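
For example, a simple pre/post design collects the same measure before and after the program and compares the two. A minimal sketch in Python, assuming hypothetical self-esteem scores (one of the indicators named earlier) for ten participants:

```python
# A minimal sketch of a pre/post comparison using a paired t test.
# The scores below are invented for illustration only.
from scipy import stats

pre  = [12, 15, 11, 14, 10, 13, 12, 16, 11, 14]  # scores at program intake
post = [15, 17, 14, 15, 13, 16, 14, 18, 12, 17]  # same participants at exit

mean_change = sum(post) / len(post) - sum(pre) / len(pre)
t_stat, p_value = stats.ttest_rel(post, pre)     # paired: same people, two time points
print(f"mean change = {mean_change:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```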

  24. How to collect data • Qualitative, quantitative, or both • Observational data • Focus groups / interviews • Questionnaires • Existing data/documents • Creative materials

  25. Analyzing data • Qualitative data • Certain methods are relatively easy to learn and implement. • With the use of rubrics, qualitative data can be quantified. • Quantitative data • Most often used are simple descriptive statistics: frequencies, percentages, means, modes, and medians. • Inferential statistics: t test, ANOVA, chi square.
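
A minimal sketch of the quantitative side, assuming invented survey data; the descriptive statistics use Python's standard library, and the chi-square test uses scipy:

```python
# Descriptive statistics on hypothetical exit-survey ratings, then a
# chi-square test on a hypothetical 2x2 table. All data is invented.
from collections import Counter
from statistics import mean, median, mode
from scipy.stats import chi2_contingency

ratings = [4, 5, 3, 4, 5, 5, 2, 4, 4, 3, 5, 4]   # survey responses on a 1-5 scale

for value, count in sorted(Counter(ratings).items()):                 # frequencies
    print(f"rating {value}: n={count} ({count / len(ratings):.0%})")  # percentages
print(f"mean={mean(ratings):.2f}, median={median(ratings)}, mode={mode(ratings)}")

# Chi square: is program completion associated with regular attendance?
# Rows: completed / did not complete; columns: attends regularly / does not.
table = [[30, 10],
         [18, 22]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```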

  26. The causality issue • Short of implementing a randomized controlled trial, there are ways to infer causation: • Ask observers • Check if the content of the evaluand matches the outcome • Look for other telltale patterns that suggest one cause or another • Check if the timing of outcomes makes sense • Check if dose is related logically to the indicators • Make comparisons with a control or comparison group • Control statistically for extraneous variables • Identify and check underlying causal mechanisms
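
Two of these strategies, comparing against a comparison group and controlling statistically for an extraneous variable, can be sketched together. A minimal sketch with simulated data; the variable names and numbers are hypothetical, and statsmodels stands in for whatever analysis tool an agency actually uses:

```python
# A minimal sketch: compare program vs. comparison group, then control
# for a baseline covariate with ordinary least squares. Invented data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
treated  = np.repeat([1, 0], n // 2)   # 1 = program group, 0 = comparison group
baseline = rng.normal(50, 10, n)       # extraneous variable, e.g. prior attendance
outcome  = 0.5 * baseline + 5 * treated + rng.normal(0, 5, n)

# Naive group comparison (no statistical control):
print("raw difference:", outcome[treated == 1].mean() - outcome[treated == 0].mean())

# Regression that controls for the baseline covariate:
X = sm.add_constant(np.column_stack([treated, baseline]))
fit = sm.OLS(outcome, X).fit()
print("adjusted program effect:", fit.params[1])  # coefficient on `treated`
```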

  27. Pulling it all together

  28. Q & A

  29. Thank you!
