Programme Evaluation for Policy Analysis

Presentation Transcript


  1. Programme Evaluation for Policy Analysis Mike Brewer, 19 October 2011 www.pepa.ac.uk

  2. Programme Evaluation for Policy Analysis: overview • Part of the ESRC-funded National Centre for Research Methods • PEPA is about ways to do, and ways to get the most out of, “programme evaluation”: estimating the causal impact of government policies (although the methods can often generalise)

  3. Programme Evaluation for Policy Analysis: overview • Part of the ESRC-funded National Centre for Research Methods • PEPA is about ways to do, and ways to get the most out of, “programme evaluation” • Estimating the counterfactual (see the sketch below) • Characterising the uncertainty • Generalising and synthesising • Beneficiaries • those who do programme evaluation • those who commission or design evaluations, or make decisions based on the results of evaluations
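
To make “estimating the counterfactual” concrete, here is a minimal sketch of one standard device, a two-period difference-in-differences on made-up numbers. The figures are purely illustrative, and the slide does not tie PEPA to this particular estimator.

    # Two-period difference-in-differences on purely illustrative numbers.
    # The counterfactual for the treated group is its pre-policy outcome plus
    # the change seen in a comparison group not exposed to the policy.
    treated_before, treated_after = 10.0, 14.0
    control_before, control_after = 9.0, 11.0

    counterfactual_after = treated_before + (control_after - control_before)
    estimated_impact = treated_after - counterfactual_after
    print(estimated_impact)  # 2.0: the change in outcomes not explained by the common trend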

  4. 1. Step change in conduct of programme evaluation • Training courses, workshops, on-line resources • Research • doing inference more accurately • social networks and policy interventions • estimating bounds on the true impact (see the sketch below) • Substantive research projects as exemplars
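
One reading of “estimating bounds on the true impact” is Manski-style worst-case bounds, which rely only on the outcome lying in a known range. This is an assumed illustration (the function name and data are hypothetical), not necessarily the bounding approach used in the PEPA projects.

    import numpy as np

    def worst_case_ate_bounds(y, d, y_lo=0.0, y_hi=1.0):
        """Worst-case bounds on E[Y(1)] - E[Y(0)] for an outcome in [y_lo, y_hi].

        y: observed outcomes; d: 0/1 treatment indicator. The unobserved
        potential outcomes are replaced by the worst and best values the
        outcome could take, so nothing is assumed about how treatment
        was assigned.
        """
        y, d = np.asarray(y, dtype=float), np.asarray(d, dtype=float)
        p = d.mean()                        # share treated
        m1 = y[d == 1].mean()               # mean outcome among the treated
        m0 = y[d == 0].mean()               # mean outcome among the untreated

        ey1_lo, ey1_hi = p * m1 + (1 - p) * y_lo, p * m1 + (1 - p) * y_hi
        ey0_lo, ey0_hi = (1 - p) * m0 + p * y_lo, (1 - p) * m0 + p * y_hi
        return ey1_lo - ey0_hi, ey1_hi - ey0_lo

    # Illustrative call with a binary outcome (e.g. “in work a year later”)
    rng = np.random.default_rng(1)
    d = rng.integers(0, 2, size=1000)
    y = rng.integers(0, 2, size=1000)
    print(worst_case_ate_bounds(y, d))      # interval of width y_hi - y_lo = 1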

  5. 2. Maximise the value of programme evaluation • Research • Combining behavioural models with the results of evaluations • Comparing RCTs with non-experimental approaches (see the sketch below) • Synthesising results • Training courses, especially for • those who commission evaluations and interpret the results of evaluations (link with the Cross-Government Evaluation Group)
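
As a toy comparison of an RCT with a non-experimental approach, the simulation below uses an assumed data-generating process (nothing here comes from the PEPA projects): with randomised assignment a simple difference in means recovers the true effect, while with selection on a confounder it is biased until the analysis adjusts for that confounder.

    import numpy as np

    rng = np.random.default_rng(0)
    n, true_effect = 100_000, 2.0
    x = rng.normal(size=n)                       # a confounder

    # Non-experimental assignment: take-up depends on x (selection bias)
    d_obs = (x + rng.normal(size=n) > 0).astype(float)
    y_obs = true_effect * d_obs + 3.0 * x + rng.normal(size=n)

    # RCT: assignment is randomised, so it is independent of x
    d_rct = rng.integers(0, 2, size=n).astype(float)
    y_rct = true_effect * d_rct + 3.0 * x + rng.normal(size=n)

    diff_rct = y_rct[d_rct == 1].mean() - y_rct[d_rct == 0].mean()
    diff_obs = y_obs[d_obs == 1].mean() - y_obs[d_obs == 0].mean()

    # Regression adjustment for x recovers the effect in the observational data
    X = np.column_stack([np.ones(n), d_obs, x])
    beta = np.linalg.lstsq(X, y_obs, rcond=None)[0]

    print(f"RCT difference in means:       {diff_rct:.2f}")   # close to 2.0
    print(f"Observational difference:      {diff_obs:.2f}")   # biased upward
    print(f"Observational, adjusted for x: {beta[1]:.2f}")    # close to 2.0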

  6. PEPA: who we are • Professor Richard Blundell, UCL & IFS • Professor Mike Brewer, University of Essex & IFS (Director) • Professor Andrew Chesher, UCL & IFS • Dr Monica Costa Dias, IFS (Deputy Director) • Dr Thomas Crossley, Cambridge & IFS • Professor Lorraine Dearden, Institute of Education & IFS • Dr Hamish Low, Cambridge & IFS • Professor Imran Rasul, UCL & IFS • Dr Barbara Sianesi, IFS • The Department for Work and Pensions is a partner • www.pepa.ac.uk

  7. Thoughts on evidence provision • Pilots are often not designed with a focus on answering “what works?” • Funding “what works?” research is seen as government’s responsibility • Data • Limitations of “what works” evidence

  8. Spare slides on PEPA

  9. PEPA: overview

  10. PEPA: research questions

  11. PEPA: training and capacity building
