
Introduction to Program Evaluation (Ex-Post Policy Analysis)


Presentation Transcript


  1. Introduction to Program Evaluation (Ex-Post Policy Analysis) • Why It’s NOT Ex-Ante Policy Analysis • Why It’s NOT Research • Current Jumble of Approaches

  2. Before and After • What we do before a policy is passed is generally referred to as “ex-ante” policy analysis • What we do after a policy is passed and programs are established and operating can be referred to as “ex-post” policy analysis, but it is most often called “program evaluation”

  3. Ex-ante policy analysis "for policy making" • Analyzing policies, programs, and projects BEFORE they are implemented • Commonly referred to as policy analysis • Policy formulation • Research for policy making

  4. Ex-post policy analysis "of policy" • Analyzing policies, programs, and projects AFTER they have been implemented • Often called program evaluation • Research during and after policy implementation

  5. Ex-ante policy analysis • Ex-ante seems straightforward, doesn’t it? • Figure out what you want to do • Figure out ways to do it • Compare the ways • Choose the best one • But it’s not – It is incredibly chaotic
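As a sketch only: the “compare the ways, choose the best one” step is often operationalized as a weighted-criteria score. Everything below (the alternatives, criteria, weights, and ratings) is hypothetical, invented purely to show the mechanics:

```python
# Hypothetical ex-ante comparison: score policy alternatives against
# weighted criteria and pick the highest-scoring one.
# All names, weights, and ratings are invented for illustration.

weights = {"effectiveness": 0.5, "cost": 0.3, "feasibility": 0.2}

alternatives = {
    "job training": {"effectiveness": 7, "cost": 4, "feasibility": 8},
    "wage subsidy": {"effectiveness": 6, "cost": 6, "feasibility": 9},
    "public works": {"effectiveness": 8, "cost": 3, "feasibility": 5},
}

def weighted_score(ratings):
    """Weighted sum of criterion ratings (0-10 scale)."""
    return sum(weights[criterion] * rating for criterion, rating in ratings.items())

for name, ratings in alternatives.items():
    print(f"{name}: {weighted_score(ratings):.2f}")
print("Choose:", max(alternatives, key=lambda a: weighted_score(alternatives[a])))
```

The slide’s real point survives the sketch: deciding which criteria belong in the weights, and how heavily each counts, is exactly where the chaos begins.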

  6. Policy Tower of Babel [slide graphic: a jumble of model names surrounding “Policy”: process, rational, institutional, choice, incremental, implementation, group, elite, agenda, evaluation] • The policy analysis field is currently home to a babble of tongues • Dozens of “approaches,” “methodologies” and “frameworks” are discussed throughout the literature • Almost always without reference to the other half-dozen nearly identical “approaches”

  7. No unification of thought coming “The policy field is currently marked by an extraordinary variety of technical approaches, reflecting the variety of research traditions in contemporary social science. That variety is likely to persist for the foreseeable future, for the reductionist dream of a unified social science under a single theoretical banner is dead.” Davis Bobrow and John Dryzek, 1987

  8. Frames • While there are literally dozens (if not hundreds) of “methods” to conduct “Policy Analysis,” they can be summarized into some larger headings based on their underlying assumptions (beliefs about truth) • Of course, many policy professors have attempted just that, and now we have many different “frameworks” of “methods”

  9. Frames • The one I find most conceptually clear is that of Bobrow and Dryzek, who see all of these different approaches as falling into one of five overarching frameworks: • Welfare Economics • Public Choice • Social Structure • Information Processing • Political Philosophy

  10. Frames • Welfare economics • has the greatest number of policy field practitioners and is manifested in such familiar techniques as cost-benefit analysis and cost-effectiveness analysis
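A minimal sketch of the arithmetic behind cost-benefit analysis: discount each year’s benefits and costs to present value, then compare. The cash flows and the 5% discount rate are invented, not taken from the lecture:

```python
# Toy cost-benefit analysis: discount each year's benefits and costs
# to present value, then compare. All numbers are illustrative.

discount_rate = 0.05
benefits = [0, 40_000, 50_000, 60_000]       # benefits in years 0-3
costs = [100_000, 10_000, 10_000, 10_000]    # costs in years 0-3

def present_value(flows, r):
    """Sum of flows discounted to year 0: sum of f_t / (1 + r)**t."""
    return sum(f / (1 + r) ** t for t, f in enumerate(flows))

pv_benefits = present_value(benefits, discount_rate)
pv_costs = present_value(costs, discount_rate)

print(f"PV of benefits:     {pv_benefits:,.0f}")
print(f"PV of costs:        {pv_costs:,.0f}")
print(f"Net present value:  {pv_benefits - pv_costs:,.0f}")
print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")  # > 1 favors the program
```

Cost-effectiveness analysis follows the same logic but divides costs by a non-monetary outcome measure instead of monetizing the benefits.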

  11. Frames • Public Choice • Straddles the disciplines of microeconomics, political science, and public administration and concerns itself mostly with the analysis and design of decision structures

  12. Frames • Social structure • Rooted in sociology, has some crucial subdivisions, most notably those focusing on individual endowments vs. those focusing on group endowments

  13. Frames • Information processing • Mostly focuses on the limits inherent in any participant in the policy process (although some optimistic practitioners see a chance for change through recognition of the situation)

  14. Frames • Political philosophy • Practitioners focus on applying moral reasoning to the content of policy and the process of policy making • A key idea to remember is that policy analyses conducted within these frames, by and large, are not directly concerned with the final results in terms of program outcomes

  15. So why is the policy analysis field so chaotic? • What do you think? • What does policy result in? • Given that, is there any way that a completely “rational” view can be achieved?

  16. Wamsley and Zald • If we think of every program, organization, network, etc., as having an external/internal dimension and a political/economic dimension, the difference between policy analysis and program evaluation becomes clearer

  17. Wamsley & Zald Remapped [slide diagram, a 2x2 grid] • External Economy: Economic • Internal Economy: Technical • Internal Polity: Social • External Polity: Political

  18. Wamsley & Zald Remapped • External to the program itself is a political and economic environment trying to decide WHAT TO DO!

  19. Wamsley & Zald Remapped • Internal to the program itself is a social and technical environment trying to decide IF WE DID IT WELL!

  20. One Policy provides the environment for many Programs [slide diagram: a single political/economic policy environment surrounding many programs, each with its own technical/social core; Policy Analysis addresses the outer layer, Program Evaluation the inner]

  21. What is Program Evaluation (Ex-post Policy Analysis)? • Early definition • “…determining the worth or merit of something.” Scriven (1967) • Contemporary definition • “…the identification, clarification, and application of defensible criteria to determine an evaluation object’s value (worth or merit) in relation to those criteria” Fitzpatrick, et al. (2004) • What’s the difference?

  22. What is Program Evaluation (Ex-post Policy Analysis)? • One educator may like a new reading curriculum because of the love of reading it instills • Another educator may not like the same curriculum because it doesn’t move the child along as rapidly as other curricula in terms of letter interpretation, word interpretation, or sentence meaning • They are looking at the same program using different “criteria”
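A toy sketch of that idea, with hypothetical ratings and weights: the same curriculum earns opposite judgments once the two educators’ criteria are weighted differently:

```python
# One curriculum, two educators, two criteria sets. Ratings (0-10)
# and weights are hypothetical.

curriculum = {
    "love_of_reading": 9,
    "letter_interpretation": 4,
    "word_interpretation": 5,
    "sentence_meaning": 4,
}

# Educator A prizes the love of reading the curriculum instills.
weights_a = {"love_of_reading": 0.7, "letter_interpretation": 0.1,
             "word_interpretation": 0.1, "sentence_meaning": 0.1}

# Educator B prizes rapid progress in decoding and comprehension.
weights_b = {"love_of_reading": 0.1, "letter_interpretation": 0.3,
             "word_interpretation": 0.3, "sentence_meaning": 0.3}

def judge(weights, threshold=6.0):
    """Apply one educator's criteria; return a score and a verdict."""
    score = sum(weights[c] * curriculum[c] for c in curriculum)
    return score, ("favorable" if score >= threshold else "unfavorable")

print("Educator A:", judge(weights_a))  # high score: favorable
print("Educator B:", judge(weights_b))  # lower score: unfavorable
```

Nothing about the program changed between the two runs; only the criteria did, which is the contemporary definition’s point.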

  23. Program Evaluation is not Research • Research and Evaluation differ in their purposes and, as a result, in the roles of the researcher and evaluator in their work, their preparation, the generalizability of their results, and the criteria used to judge their work.

  24. Program Evaluation is not Research
  • Purpose: Research develops knowledge/theory; Evaluation helps make judgments/decisions
  • Who sets the agenda: the Researcher (research) vs. the Stakeholders (evaluation)
  • Generalizability of results: Widespread (research) vs. Specific to the evaluation object (evaluation)
  • Criteria for judging the work: Internal validity (causality) & external validity (generalizability) vs. Accuracy, utility, feasibility, propriety
  • Preparation: One discipline vs. Interdisciplinary
  Which approach seems more amenable to the likely roles of the public administrator? Why?

  25. “Research seeks to prove, evaluation seeks to improve…” M.Q. Patton

  26. Formal vs. Informal Evaluation • Evaluation is not new! • Neanderthals used it in determining which saplings made the best spears

  27. Formal vs. Informal Evaluation • English yeomen abandoned their own crossbows in favor of the Welsh longbow • No GAO report has been found, but we assume an informal evaluation was conducted at some point • Result: they clobbered the French, who tried the longbow but went back to the crossbow (BAD EVALUATION)

  28. Formal vs. Informal Evaluation • As humans we informally evaluate things every day • Administrators make quick judgments on personnel, programs, budgets, etc. These judgments lead to decisions • A policy maker may make a judgment leading to a voting decision on a policy based on a single speech

  29. Formal vs. Informal Evaluation • Informal evaluation may result in poor or wise decisions • The point is that they are characterized by an absence of breadth and depth because they lack systematic procedures and formally collected evidence • Program evaluation is about “formalizing” our approaches in forming judgments and making decisions

  30. Formal Evaluation Process • Determine standards for judging quality • Collect relevant information • Apply the standards to determine value, quality, utility, effectiveness or significance • Identify recommendations to optimize evaluation object (program)
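A minimal sketch of those four steps as a pipeline. The standards, observed values, and thresholds below are assumptions made up for illustration:

```python
# The slide's four-step formal evaluation process, sketched as a small
# pipeline. Standards and observed values are invented for illustration.

# Step 1: determine standards for judging quality.
standards = {"completion_rate": 0.80, "satisfaction": 4.0, "cost_per_client": 500}

# Step 2: collect relevant information (here, made-up observations).
observed = {"completion_rate": 0.72, "satisfaction": 4.3, "cost_per_client": 540}

# Step 3: apply the standards to determine quality.
def meets(name):
    if name == "cost_per_client":              # lower is better
        return observed[name] <= standards[name]
    return observed[name] >= standards[name]   # higher is better

# Step 4: identify recommendations where the program falls short.
for name in standards:
    verdict = "meets standard" if meets(name) else "below standard: recommend review"
    print(f"{name}: observed {observed[name]} vs. standard {standards[name]} ({verdict})")
```

What separates this from informal evaluation is not the code but the discipline: the standards are stated before the evidence is collected, and the same standards are applied to all of it.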

  31. Evaluation’s Purposes • Typical Purpose • determine merit or worth of something; render judgments about the value of whatever is being evaluated • Alternative purposes • Serve political functions • Facilitate learning • Social betterment • Foster deliberative democracy

  32. Why Evaluate Programs? • To gain insight about a program and its operations – to see where we are going and where we are coming from, and to find out what works and what doesn’t • To improve practice – to modify or adapt practice to enhance the success of activities • To assess effects – to see how well we are meeting objectives and goals, how the program benefits the community, and to provide evidence of effectiveness • To build capacity – increase funding, enhance skills, strengthen accountability

  33. What Can Be Evaluated? • Direct service interventions • Community mobilization efforts • Research initiatives • Surveillance systems • Policy development activities • Outbreak investigations • Laboratory diagnostics • Communication campaigns • Infrastructure-building projects • Training and educational services • Administrative systems

  34. When to Conduct Evaluation? [slide timeline running from Conception to Completion] • Planning a NEW program • Assessing a DEVELOPING program • Assessing a STABLE, MATURE program • Assessing a program after it has ENDED • The stage of program development influences the reason for program evaluation.

  35. Two basic types of Evaluation • Formative (Process) • Provide information for program improvement, typically to judge the merit and worth of a part of a program • Audience is generally the people delivering the program or those close to it. • Typically qualitative in nature

  36. Two basic types of Evaluation • Summative (Impact or Outcomes) • Summative evaluation is a process of identifying larger patterns and trends in performance and judging these summary statements against criteria to obtain performance ratings • Provide information for making decisions about program adoption, continuation, or expansion • Audience is generally potential consumers (students, teachers, employees, managers, etc.) • Mostly quantitative in nature
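As a hedged illustration of summative evaluation’s quantitative flavor, here is a toy outcome comparison between program participants and a comparison group. All scores are invented, and a real impact study would also need a credible design (for example, random assignment):

```python
# Toy summative comparison: mean outcomes for program participants
# vs. a comparison group. All scores are invented.
from statistics import mean, stdev

participants = [72, 75, 80, 68, 77, 74, 81, 70]
comparison = [65, 70, 66, 72, 64, 69, 71, 67]

print(f"Participant mean: {mean(participants):.1f} (sd {stdev(participants):.1f})")
print(f"Comparison mean:  {mean(comparison):.1f} (sd {stdev(comparison):.1f})")
print(f"Estimated difference: {mean(participants) - mean(comparison):.1f} points")
```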

  37. Balance between Formative and Summative

  38. Three subtypes: Needs Assessment, Process, and Outcome Evaluations • Needs assessment • Does a problem/need exist? • Recommend ways to reduce the problem • Process/Monitoring • Description of program delivery • Outcome • Descriptions of changes in recipients or other secondary audiences based on program delivery
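One common way to make the needs-assessment question (“does a problem/need exist?”) concrete is to quantify the gap between a desired and an observed condition; the sketch below uses invented literacy numbers:

```python
# Toy needs assessment: quantify the gap between a desired condition
# and the observed condition. Both rates are hypothetical.

desired_literacy_rate = 0.95
observed_literacy_rate = 0.78

gap = desired_literacy_rate - observed_literacy_rate
print(f"Need (gap): {gap:.0%} below the desired condition")
if gap > 0.05:  # illustrative threshold for recommending action
    print("Recommendation: a targeted literacy program to reduce the gap")
```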

  39. A Typology of Evaluation Studies

  40. So that’s it? • No way! • The way an evaluator looks at Truth vs. truth creates another dimension! • Each one of those study types can be conducted in a manner focusing on replication with a lot of data or focusing on deep understanding of very little data

  41. Objectivist vs. Subjectivist Epistemology Objectivism • Requires an evaluation study to utilize data collection and analysis techniques that yield results that are reproducible and verifiable by other competent persons using the same techniques. Subjectivism • Bases its validity claims on “an appeal to experience rather than to the scientific method”

  42. A Typology of Evaluation Studies [slide shows an example evaluation question from the typology] • This question could be answered by a survey of all employees (quantitative-objective) or by convening a panel of “experts” in the field (qualitative-subjective)

  43. Evaluation Approaches • The Objective-Subjective Dimension creates a broader set of “approaches” • Any of the “types” of evaluation could fall within any of these “approaches” • It all depends on the methodologies employed (how you collect and analyze your data)

  44. Evaluation Approaches [slide arrays the approaches along a dimension from Objectivism (mostly quantitative) to Subjectivism (mostly qualitative)]
  • Objectives-Oriented Approaches: focus on specifying goals and objectives and determining the extent to which they have been attained
  • Management-Oriented Approaches: the central concern is identifying and meeting the information needs of managerial decision makers
  • Consumer-Oriented Approaches: the central issue is developing evaluative information on “products” and accountability for consumers
  • Expertise-Oriented Approaches: depend on the direct application of professional expertise to judge the quality of whatever is being evaluated
  • Participant-Oriented Approaches: involvement of participants (primarily stakeholders) is central in determining the values, criteria, needs, data, and conclusions for the evaluation
