EVAL 6000: Foundations of Evaluation


Presentation Transcript


  1. EVAL 6000: Foundations of Evaluation Final lecture!

  2. (Semi) In-Depth Examination of Five Evaluation Approaches • Utilization-focused evaluation • Participatory evaluation • Theory-driven/theory-based evaluation • CIPP model for evaluation • Consumer-oriented evaluation (Scriven’s Key Evaluation Checklist approach)

  3. To facilitate a clearer understanding of these evaluation approaches, we will use Heifer Project International (HPI) as a case example to provide context and to discuss how each approach might be applied in practice.

  4. Heifer Project International (HPI) • Aim is to reduce poverty, hunger, and social inequities through strategies that create self-reliance rather than provide short-term relief • “Passing on the gift” is one of the unique attributes that sets Heifer apart from other international development initiatives

  5. Utilization-Focused Evaluation (UFE) • Evaluation done for and with specific intended primary users for specific, intended uses • Premised on the assertion that evaluations should be judged by their utility and actual use

  6. Utilization-Focused Evaluation (UFE) • Evaluator is charged with giving careful consideration to how everything that is done, from beginning to end, will affect use • Is personal and situational, with strong emphasis on the “personal factor”

  7. Utilization-Focused Evaluation (UFE) • Does not give primacy to any specific method, model, approach, or ideological orientation (with the exception of an emphasis on use) • Does emphasize The Program Evaluation Standards as a basis for accountability and quality assurance

  8. Utilization-Focused Evaluation (UFE) • Advance organizers • What decisions, if any, are the evaluation findings expected to influence? • When will decisions be made? By whom? When, then, must the evaluation findings be presented to be timely and influential? • What is at stake in the decisions? For whom? What controversies or issues surround the decision? • What is the history and context of the decision-making process? • What other factors (values, politics, personalities, promises already made) will affect the decision making?

  9. Utilization-Focused Evaluation (UFE) • Advance organizers, continued • How much influence do you expect the evaluation to have—realistically? • To what extent has the outcome of the decision already been determined? • What data and findings are needed to support decision making? • What needs to be done to achieve that level of influence? • How will we know afterward if the evaluation was used as intended?

  10. Participatory Evaluation • An extension of the more restrictive stakeholder-based approach (with elements of UFE) • Emphasis on increasing use through participation • Includes aspects of organizational learning and capacity building through stakeholder participation

  11. Participatory Evaluation • Evaluator is a coordinator and responsible for technical support, training, and quality control • Ultimately, the evaluator works collaboratively/in partnership with a select group of intended users

  12. Participatory Evaluation • Two primary forms • Practical participatory evaluation (PPE) • Utilization-oriented (with an emphasis on formative evaluation) • Transformative participatory evaluation (TPE) • Democratic, emancipatory, empowerment-oriented

  13. Participatory Evaluation • Three dimensions of participation • Who controls? Control of technical decision making (evaluator vs. stakeholders) • Who participates? Stakeholder selection for participation (diverse vs. limited) • How deep? Depth of stakeholder participation (involved in all aspects of inquiry vs. consulted as a source only)

  14. Original dimensions of PPE

  15. Modified dimensions of PPE (Cullen, 2010)

  16. Theory-Driven/Based Evaluation • Any evaluation strategy or approach that explicitly integrates and uses stakeholder theory, social science theory, some combination of these, or other types of theory in conceptualizing, designing, conducting, interpreting, and applying an evaluation

  17. Theory-Driven/Based Evaluation • Sometimes referred to as program-theory evaluation, theory-based evaluation, theory-guided evaluation, theory-of-action, theory-of-change, program logic, logical frameworks, outcomes hierarchies, realist or realistic evaluation, and program theory-driven evaluation science

  18. Theory-Driven/Based Evaluation • All, in some form or another, aim to determine how, why, when, and for whom a program works and under what conditions (i.e., causal explanation)

  19. CIPP Model for Evaluation • The model’s core concepts are denoted by the acronym CIPP, which stands for evaluations of an entity’s context, inputs, processes, and products • Generally targeted toward program managers and other decision makers http://www.wmich.edu/evalctr/archive_checklists/cippchecklist_mar07.pdf

  20. CIPP Model for Evaluation • Context evaluations are applied to assess needs, problems, assets, and opportunities, plus relevant contextual conditions and dynamics to help decision makers define goals and priorities and to help the broader group of users judge goals, priorities, and outcomes • Input evaluations serve program planning by helping identify and then assess alternative approaches, competing action plans, staffing plans, and budgets for their feasibility and potential cost-effectiveness to meet targeted needs and achieve defined goals

  21. CIPP Model for Evaluation • Process evaluations are used to assess the implementation of plans to help staff carry out activities and later to help the broad group of users judge program implementation and expenditures and also interpret outcomes • Product evaluations are used to identify and assess costs and outcomes (intended and unintended, short-term and long-term) and may be divided into assessments of impact, effectiveness, sustainability, and transportability

  22. Consumer-Oriented Evaluation • Predicated on “values” and “valuing” • Values (i.e., criteria and standards) brought to bear are derived from multiple sources (e.g., definitional, needs of the impacted population, legal, ethical, functional/logical) • Targeted toward those affected by programs (i.e., consumers) http://www.wmich.edu/evalctr/archive_checklists/kec_feb07.pdf

  23. Consumer-Oriented Evaluation • Requires evaluators to investigate values in terms of process, outcomes, costs, comparisons, and generalizability under the “Subevaluations” checkpoints in the Key Evaluation Checklist (KEC) • Explicit integration of empirical “facts” with values (i.e., the fact-value synthesis) as well as the integration of multiple values (i.e., the value synthesis)

  24. Consumer-Oriented Evaluation • Organized around 15 checkpoints • Preliminaries • Executive summary • Preface • Methodology • Foundations • Background and context • Descriptions and definitions • Consumers (impactees) • Resources (a.k.a., “strengths assessment”) • Values

  25. Consumer-Oriented Evaluation • Checkpoints, continued • Subevaluations • Process • Outcomes • Costs • Comparisons • Generalizability • Conclusions & Implications • Synthesis • (possible) Recommendations & Explanations • (possible) Responsibility & Justification • Report & Support • Metaevaluation
