
Evaluation 101


Presentation Transcript


  1. Evaluation 101 Laura Pejsa Goff Pejsa & Associates MESI 2014

  2. Objectives • Gain a greater understanding of evaluation and evaluative thinking • Learn about some practical approaches & get familiar with some tools to use • Have an opportunity to apply your learning directly to a real world case

  3. Session Outline • Introductions / Intro to the day • Grounding definitions & terms • Understanding “programs” (purpose & logic) • Evaluative thinking and the evaluation process • Strategies for making evaluation desirable & usable • Debrief, questions, & close

  4. Metaphors: Your Ideas about Evaluation • Think of one object that represents your ideas and/or feelings about evaluation • Prepare to explain your choice • Share yours with the person sitting next to you and notice common themes • Prepare to share your common themes with the group

  5. E-VALU-ation • "Value" is the root word of evaluation • Evaluation involves making value judgments, according to many in the field

  6. Traditional definition: Michael Scriven (from Michael Scriven, 1967, and the Program Evaluation Standards): "The systematic determination of the merit, worth (or value) of an object"

  7. Important concepts in this definition • SYSTEMATIC means that evaluators use explicit rules and procedures to make determinations • MERIT is the absolute or intrinsic value of an object • WORTH is the relative or extrinsic value of an object in a given context

  8. An Alternative Definition: Michael Quinn Patton Systematic collection of information about the activities, characteristics, and results of programs to (1) make judgments about the program, (2) improve or further develop program effectiveness, (3) inform decisions, and/or (4) increase understanding. Done for and with specific intended primary users for specific, intended uses.

  9. Commonalities among definitions • Evaluation is a systematic process • Evaluation involves collecting data • Evaluation is a process for enhancing knowledge and decision making • Evaluation use is implicit or explicit Russ-Eft & Preskill (2009, p. 4)

  10. Discussion: Why Do Evaluation? • What are the things we might gain from engaging in evaluation/an evaluative process? • Why is it in our interest to do it? • Why is it in the interest of the people we serve to do it? • What are the benefits?

  11. From the textbooks… evaluation purposes • Accreditation • Accountability • Goal attainment • Consumer protection • Needs assessment • Object improvement • Understanding or support • Social change • Decision making

  12. One basic distinction… Internal vs. External • INTERNAL evaluation • Conducted by program employees • Plus side: Knowledge of program • Minus side: Potential bias and influence

  13. EXTERNAL evaluation • Conducted by outsiders, often for a fee • Plus side: Less visible bias • Minus side: Outsiders have to gain entrée and have less first-hand knowledge of the program

  14. Scriven's classic terms: FORMATIVE evaluation • Conducted during the development or delivery of a program • Feedback for program improvement

  15. Scriven's classic terms: SUMMATIVE evaluation • Typically done at the end of a project or project period • Often done for other users or for accountability purposes

  16. A new(er) term from Patton: DEVELOPMENTAL evaluation • Helps develop a program or intervention • Evaluators are part of the program design team • Uses systematically collected data

  17. What is the evaluation process? Every evaluation shares similar procedures

  18. Patton’s Basics of Evaluation: • What? • So what? • Now what?

  19. General Phases of evaluation planning

  20. What? • Words? • Pictures? The key is understanding…

  21. A word about logic models and theories of change… one way to understand a program. "We build the road, and the road builds us." (Sri Lankan saying)

  22. Simplest form of a logic model: INPUTS → OUTPUTS → OUTCOMES (results-oriented planning)

  23. A bit more detail… INPUTS (program investments: what we invest) → OUTPUTS (activities: what we do; participation: who we reach) → OUTCOMES (short-, medium-, and long-term: what results?) SO WHAT? What is the VALUE?

  24. A simplistic example… (example logic model diagram: inputs, outputs, and outcomes)

  25. What does a logic model look like?

  26. Regardless of format, what do logic models and theories of change have in common? • They show activities linked to outcomes • They show relationships/connections that make sense (are logical). Arrows are used to show the connections (the “if-then” relationships) • They are (hopefully) understandable • They do not and cannot explain everything about a program!

  27. The Case

  28. The Case: Logic and/or Theory Draw a Picture… • Inputs (what goes into the program to make it possible?) • Outputs (Activities: what do they do? Participation: counts) • Outcomes (what do they think will happen?) • Short, medium, and long term

  29. What can we evaluate? • Context • Input(s) • Process(es) • Product(s) (Daniel Stufflebeam's CIPP model)

  30. The basic inquiry tasks (BIT) • Framing questions • Determining an appropriate design • Identifying a sample • Collecting data • Analyzing data and presenting results • Interpreting results • “Reporting”

  31. Back to the Case: What are our questions?

  32. Back to the Case: What do we need to know, and where can we find it?

  33. Possible ways to collect data • Quantitative: • Surveys • Participant Assessments • Cost-benefit Analysis • Statistical Analysis of existing program data • Some kinds of record and document review • Qualitative: • Focus Groups • Interviews • Observations • Appreciative inquiry • Some kinds of record and document review

  34. What are the best methods for your evaluation? • It all goes back to your question(s)… • Some data collection methods are better than others at answering your questions • Some tools are more appropriate for the audience you need to collect information from or report findings to • Each method of collecting data has its advantages and disadvantages (e.g., cost, availability of information, expertise required)

  35. Back to the Case: How will we find out?

  36. Reminder: Importance of Context

  37. Desire & Use • How do we make this process palatable, even desirable? • What can we do to make information USE more likely? • Ways of sharing and reporting

  38. Debrief & Questions • What are the most important take-aways from today’s session? • What can you apply in your own work? • What questions remain for you?
