
What is Impact Evaluation … and How Do We Use It ?




Presentation Transcript


  1. What is Impact Evaluation … and How Do We Use It? Deon Filmer Development Research Group, The World Bank Evidence-Based Decision-Making in Education Workshop Africa Program for Education Impact Evaluation (APEIE) Accra, Ghana May 10-14 2010

  2. Some examples • Should my government distribute free textbooks to students to promote learning? • Should we distribute scholarships to poor children to promote attendance? • Should teachers be rewarded for learning improvements of their students? • Should management decisions be devolved to the school level? → Impact evaluation is a way to start answering those questions using rigorous evidence

  3. What is Impact Evaluation? Example • You would like to distribute textbooks to students as a way of improving learning outcomes • Your intuition tells you that textbooks should matter. • But what is that intuition based on? • Own experience • “Common sense” • Observing children in schools • Comparisons between children in schools with textbooks and in those without • Impact evaluation, in this situation, would aim at providing • rigorous evidence, • based on actual experience, • of what the actual impact of providing textbooks is.

  4. What is Impact Evaluation? • How would impact evaluation achieve this aim? • By establishing the causal impact of textbooks on learning outcomes • This is the ultimate goal of impact evaluation. • This workshop will be about • What makes for a good estimate • How to estimate that impact • How to interpret the estimate

  5. Why do we use impact evaluation? • Understand if policies work • Might an intervention work (“proof of concept”)? • Can an intervention be done on a large scale? • What are alternative interventions to achieve a particular goal, and how do they compare?

  6. Why do we use impact evaluation? • Understand the net benefits of the program, and cost-effectiveness of alternatives • Requires good cost and benefit data • Understand the distribution of gains and losses • Budget constraints force selectivity • Bad policies and programs are wasteful and can be hurtful
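The cost-effectiveness comparison mentioned on this slide reduces to simple arithmetic once impact and cost estimates exist: divide cost per beneficiary by the estimated impact to get a cost per unit of outcome, and compare across interventions. A minimal sketch (all figures invented for illustration; the deck gives no numbers):

```python
# Hypothetical cost-effectiveness comparison: cost per additional
# year of schooling for two alternative interventions.
# All figures are invented for illustration only.

interventions = {
    # name: (cost per child in $, impact in additional years of schooling)
    "textbooks":    (10.0, 0.05),
    "scholarships": (60.0, 0.50),
}

for name, (cost, impact) in interventions.items():
    cost_per_year = cost / impact  # $ per additional year of schooling
    print(f"{name}: ${cost_per_year:.0f} per additional year of schooling")
```

With these made-up numbers the cheaper intervention per child (textbooks, at $200 per additional year) is actually less cost-effective than the more expensive one (scholarships, at $120), which is why good cost and benefit data matter.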

  7. Why do we use impact evaluation? • Demonstrate to politicians, population, donors that a program is effective • This can be key to sustainability • Informs beliefs and expectations

  8. Putting impact evaluation in context • Impact evaluation: Analytical efforts to relate cause and effect. Key part is establishing "what would have happened in the absence of the intervention" • Evaluation: Analytical efforts to answer specific questions about the performance of a program/activities • Monitoring: Regular collection and reporting of information to track whether actual results are being achieved as planned

  9. Monitoring, Evaluation, and Impact Evaluation Monitoring: Regular collection and reporting of information to track whether actual results are being achieved as planned • Periodically collect data on the indicators and compare actual results with targets • To identify bottlenecks and red flags (time lags, fund flows) • Point to what should be further investigated

  10. Monitoring, Evaluation, and Impact Evaluation Evaluation: Analytical efforts to answer specific questions about the performance of a program/activities • Analyzes why intended results were or were not achieved • Explores targeting effectiveness • Explores unintended results • Provides lessons learned and recommendations for improvement

  11. Monitoring, Evaluation, and Impact Evaluation Impact evaluation: Analytical efforts to relate cause and effect. Key part is establishing "what would have happened in the absence of the intervention" • What is the effect of the program on outcomes? • How much better off are beneficiaries because of the intervention? • How would outcomes change under alternative program designs? • Does the program impact people differently (e.g. females, the poor, minorities)? • Is the program cost-effective?

  12. The central problem in Impact Evaluation Analysis: The counterfactual • In order to establish the impact of the program, we need to know what would have happened in the absence of the program • Not in general, but specifically for the people who actually received the program

  13. The central problem in Impact Evaluation Analysis: The counterfactual • What is the effect of a scholarship on school enrollment? • We want to observe the units of treatment in two states • What's wrong with this picture? [Images: Elizabeth on 1 July 2010 with the scholarship; Elizabeth on 1 July 2010 without the scholarship]

  14. The central problem in Impact Evaluation Analysis: The counterfactual • This is impossible! • we never observe the same individual with and without program at same point in time • The counterfactual is never actually observed • It needs to be estimated • Impact Evaluation Analysis is all about alternative approaches to estimating the counterfactual
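The point on this slide is often stated in potential-outcomes notation (a standard formalization in the impact-evaluation literature, not shown in the original deck). Writing $Y_i(1)$ for individual $i$'s outcome with the program and $Y_i(0)$ for the outcome without it, the impact on $i$ is

```latex
\Delta_i = Y_i(1) - Y_i(0)
```

Only one of $Y_i(1)$ and $Y_i(0)$ is ever observed for any individual at a given point in time. The average impact on participants (those with treatment indicator $D_i = 1$) is

```latex
\mathrm{ATT} = E\big[Y(1) - Y(0) \mid D = 1\big]
             = E\big[Y(1) \mid D = 1\big] - E\big[Y(0) \mid D = 1\big]
```

where the second term, $E[Y(0) \mid D = 1]$, is exactly the unobserved counterfactual that the alternative evaluation approaches try to estimate.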

  15. Why is the counterfactual important? • The next session will discuss the counterfactual in more detail • Here, just one illustration

  16. Illustration of the importance of the counterfactual • Question: What is the best estimate of the impact of the program on enrollment? [Chart: enrollment over time, before (2008) and after (2010) the program; observed points A (before) and B (after), with the before–after gap labeled "Impact?"]

  17. Illustration of the importance of the counterfactual • Question: What is the best estimate of the impact of the program on enrollment? [Chart: the same enrollment series with observed points A (2008) and B (2010), plus candidate counterfactual points C? and D? in 2010; the estimated impact depends on which counterfactual is assumed]
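The ambiguity in the chart above can be made concrete with a toy calculation (all numbers hypothetical, not from the deck): if enrollment was already trending upward, a naive before-after comparison attributes the whole observed change, trend included, to the program.

```python
# Toy illustration of why the counterfactual matters.
# All numbers are hypothetical.

enroll_2008 = 70.0   # enrollment rate before the program (%)
trend = 5.0          # change that would have happened anyway (%)
true_impact = 3.0    # actual effect of the program (%)

enroll_2010 = enroll_2008 + trend + true_impact  # observed "after" value
counterfactual_2010 = enroll_2008 + trend        # what must be estimated

naive_estimate = enroll_2010 - enroll_2008       # before-after comparison
correct_estimate = enroll_2010 - counterfactual_2010

print(naive_estimate)    # 8.0 — overstates the impact by the trend
print(correct_estimate)  # 3.0 — equals the true impact
```

In the chart's terms, the naive estimate measures A-to-B, while the true impact is the gap between B and the (unobserved) counterfactual point.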

  18. What is impact evaluation? • Impact is … the difference between outcomes with the program and without it • Impact evaluation involves … estimating the counterfactual so that changes in outcomes can be attributed to the program

  19. What is involved in implementing an impact evaluation? • Determine why an IE is called for • Understand the program and its results chain • Determine what to measure • Determine the methodology • Carry out • Data collection (baseline, follow-up) • Program implementation • Analysis and reporting • Adjust policy

  20. Determine why the evaluation is called for • Specific intervention • Cash transfer to specific students • Specific teacher training program with a particular curriculum • School grant program with a particular structure • Alternative interventions/complementary interventions • Teacher training versus school grants as a way to improve outcomes • Information and school grants as a way to boost school accountability and performance • Entire program/cluster of activities • Reform program → What is the audience (policymakers, technocrats, the public at large)?

  21. Understand the program using the results chain: Inputs → Activities (what the program does) → Outputs (goods & services) → Outcomes → Long-term Results • Inputs: teachers, textbooks, grants • Activities/Outputs: teachers' training, school councils established, conditional cash transfers, salary incentives • Outcomes: increased enrollment, lower teacher absenteeism, higher primary education completion rates, higher student learning achievement • Long-term Results: lower unemployment, poverty reduction, better income distribution This results chain provides guidance on what to measure

  22. Determine what to measure • Based on the results chain • Choose indicators for the evaluation • Carefully defined indicators • Can be measured in a precise way • Are expected to be affected by the program… • … within the timeframe of the evaluation • What are the important sub-populations • E.g. age, gender, urban/rural, SES…

  23. Determine indicators using the results chain • Indicators related to the implementation of a program cover the left of the chain: inputs (teachers, textbooks, grants) and activities/outputs (teachers' training, school councils established, conditional cash transfers, salary incentives)

  24. Determine indicators using the results chain • Indicators related to the results of a program cover the right of the chain: outcomes (increased enrollment, lower teacher absenteeism, higher primary education completion rates, higher student learning achievement) and long-term results (lower unemployment, poverty reduction, better income distribution)

  25. Determine the methodology • We’ll be talking a lot more about this • Experimental methods • Quasi-experimental methods • Some principles: • Prefer method that complements program best • Prefer method that does not alter program design or implementation substantively • Prefer method that does not deny anyone benefits
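As a preview of the experimental methods listed above (treated in detail later in the workshop), the simplest estimator under random assignment is a difference in group means: randomization makes the control group's average outcome a valid estimate of the treated group's counterfactual. A minimal sketch on simulated data (all parameters invented):

```python
import random

random.seed(0)

# Simulate a randomized experiment: the program raises each child's
# probability of enrollment by 10 percentage points.
# All parameters are invented for illustration.
def simulate_child(treated):
    base = 0.70
    effect = 0.10 if treated else 0.0
    return 1 if random.random() < base + effect else 0

n = 5000
treatment = [simulate_child(True) for _ in range(n)]
control = [simulate_child(False) for _ in range(n)]

# Under random assignment the two groups are comparable, so the
# difference in enrollment rates estimates the program's impact.
impact = sum(treatment) / n - sum(control) / n
print(f"estimated impact: {impact:.3f}")  # close to the true 0.10
```

Quasi-experimental methods pursue the same goal, estimating the counterfactual, when randomization is not feasible.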

  26. The IE "cycle": Design phase → Baseline data collection → Program implementation → Follow-up data collection → Data analysis and reporting → Program adjustments → (back to design)

  27. The goal of impact evaluation • To improve policies • For example, to find out how to turn this teacher…

  28. The goal of impact evaluation • To improve policies • …into this teacher

  29. Thank you!
