
The Tools of the Trade: An Overview of Diagnostic and Assessment Instruments


Presentation Transcript


  1. The Tools of the Trade: An Overview of Diagnostic and Assessment Instruments April 21 Francesca Recanatini, WBI www.worldbank.org/wbi/governance

  2. Outline of the Session • A working framework to select among tools • Conceptual design • Empirical tools • Implementation process • Sampling and Field work • Analysis and use of the data • A few country-specific illustrations

  3. Objective How to choose among governance tools? • What are the key elements of a governance assessment? • Which empirical tools and approaches are already available? • How can we select among them? • How can such assessments be used for policy purposes?

  4. Governance assessment: one or many approaches? The characteristics of a governance assessment are a function of the objective of the assessment

  5. Key starting points 1. What is the purpose of the assessment? • Research and analysis • Awareness raising • Policy and Action planning • Capacity building • Monitoring

  6. Key starting points 2. What is the focus of the assessment? • Governance as a whole • Corruption • Performance of a specific agency/sector • Quality of a specific public service delivered

  7. Suppose we have determined…. • The final purpose of the assessment • The focus of the assessment What next?

  8. An example – Peru 2002 • Issue: the government wanted to monitor progress in terms of • Transparency of public administration activities • Civil society “participation” and voice • Quality of public services

  9. Peru 2002, cont. • Purpose of assessment: monitoring • Focus of the assessment: • Transparency • Citizens’ “Participation” and Voice • Quality of public services What next?

  10. Existing Empirical Tools • BEEPS • IGR • Public official surveys • PET • QSDS • Score cards • Investment Climate Surveys • EC Audits • PER • CFAA • CPAR • GAC Case Studies • HIPC Expenditure Tracking • ROSC www.worldbank.org/wbi/governance/assessing

  11. What are the key elements of a Governance Assessment? Four dimensions: • Conceptual • Empirical • Process / Capacity Building • Analytical and Policy

  12. Conceptual dimension • Clear definition of the variable we focus on and its manifestations • Translation of the definition into observable and measurable components • Selection of methodological approach • Understanding of the links between governance and • Performance outcomes • Development outcomes

  13. Linking the Tools to the Blueprint (diagram mapping the empirical tools – PER, HIPC Expenditure Tracking, ROSC, CPAR, EC Audits, public official surveys, IGR, GAC, governance cross-country indicators, CFAA, QSDS, score cards, PETs, BEEPS, Investment Climate Surveys – onto the conceptual blueprint)

  14. Conceptual dimension, cont. • Finding answers may require single or multiple methods and data forms • The methodological approach can be a combination of different methods (for example, qualitative, quantitative or mixed) • Each method has a corresponding set of empirical tools that we can use • Data can also be qualitative and/or quantitative For more information on alternative methods, see www.worldbank.org/wbi/governance/assessing

  15. Examples of Existing Empirical Tools • Qualitative methods: budget use monitoring, video observations, judicial investigations, IGR • Quantitative methods: Investment Climate Surveys, QSDS, public official surveys, PETs • Mixed methods: Governance Diagnostic Surveys, score card approach, PER

  16. Empirical dimension • Focus on institutions vs. individuals • Experiential vs. perception data • One vs. many types of respondents • Standard vs. customized empirical tools • Definition of sample and field work details • Open-ended vs. closed-ended questions
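The sampling step mentioned in slide 16 can be made concrete with a minimal sketch. Everything below is a hypothetical assumption (the regions, the household identifiers, and the overall sample size); it simply illustrates proportional allocation of a stratified random sample, one common way the field work for a household survey is planned.

```python
# Hypothetical sketch: proportional stratified sampling for a household survey.
# The strata, household IDs, and sample size below are illustrative only.
import random

# Hypothetical sampling frame: household IDs grouped by region (the strata).
frame = {
    "Region A": [f"A-{i:04d}" for i in range(5000)],
    "Region B": [f"B-{i:04d}" for i in range(2000)],
    "Region C": [f"C-{i:04d}" for i in range(1500)],
}

TOTAL_SAMPLE = 850                      # assumed overall number of interviews
total_households = sum(len(h) for h in frame.values())

random.seed(42)                         # fixed seed -> reproducible field lists
sample = {}
for region, households in frame.items():
    # Allocate interviews in proportion to the stratum's share of the frame.
    n = round(TOTAL_SAMPLE * len(households) / total_households)
    sample[region] = random.sample(households, n)

for region, ids in sample.items():
    print(region, len(ids), ids[:3])
```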

  17. Linking the Tools to the Respondents (diagram mapping the tools – score cards, GAC, IGR, PET, QSDS, PER, CFAA, CPAR, BEEPS, Investment Climate Surveys – to respondent groups: citizens and civil society, enterprises and the private sector, and government officials and the state)

  18. Process / Capacity Building dimension To increase impact and sustainability: • Consultative and participatory approach to discuss purpose, use and features of the assessment • Engage local NGOs and academic institutions to adapt/revise tools • Public dissemination of results • Joint design of policy recommendations

  19. Stages for Development of the National Anti-Corruption Strategy (NAS) Process – An illustration • 1. Establishment of Steering Committee (key partnership: Government + Civil Society) • 2. Diagnostic surveys + analysis • 3. Draft of the NAS • 4. Public dissemination + discussion • 5. Revision of the NAS • 6. Implementation by Government • 7. Monitoring and Evaluation of the NAS (in the original flowchart the earlier stages are marked as WBI technical assistance and the later ones as country implemented)

  20. Analytical and Policy dimension • Distill key links between manifestations of governance and: • Quality of services • Growth • Specific characteristics of the public sector • Results could be used as one input for policy purposes

  21. The power of diagnostic data and key dimensions for analysis • Identify both weak institutions (in need of reform) and strong institutions (example of good governance) • Unbundle corruption by type – administrative, capture of the state, bidding, theft of goods and public resources, purchase of licenses and regulations

  22. Key dimensions… Cont. • Assess the cost of each type of corruption on different groups of stakeholders • Identify key determinants of good governance • Develop policy recommendations
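The kind of analysis slides 20–22 describe (linking governance indicators to outcomes such as service quality, and identifying determinants of good governance) can be sketched as a simple regression. The snippet below uses synthetic data and hypothetical variable names; it only shows the mechanics, not the actual analysis behind these diagnostics.

```python
# Hypothetical sketch: regressing an outcome (service quality) on survey-based
# governance indicators. The data below are synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_agencies = 60

# Hypothetical agency-level indicators on a 0-100 scale.
transparency = rng.uniform(20, 90, n_agencies)
bribery_incidence = rng.uniform(0, 40, n_agencies)

# Synthetic outcome: better with transparency, worse with reported bribery.
service_quality = (30 + 0.5 * transparency - 0.8 * bribery_incidence
                   + rng.normal(0, 5, n_agencies))

# Ordinary least squares: service_quality ~ transparency + bribery + constant.
X = np.column_stack([transparency, bribery_incidence, np.ones(n_agencies)])
coef, *_ = np.linalg.lstsq(X, service_quality, rcond=None)
print(dict(zip(["transparency", "bribery_incidence", "intercept"],
               coef.round(2))))
```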

  23. In sum, how to select among instruments? EC Audits? HIPC Expenditure Tracking? ROSC? CFAA? Public official surveys? GAC? QSDS? PER? CPAR? Score cards? PET? BEEPS? IGR? Case study? …?

  24. Use a working framework… (diagram: the governance assessment combines the conceptual dimension / analytical framework, the empirical tools & sample, the implementation process, and the analysis & use of the data)

  25. Peru 2002 • Purpose of assessment: monitoring • Final users: government and civil society • Key features: • Comparability across time • Ability to identify progress • Type of information needed: agency-specific • Approach: objective, and based on citizens’ feedback

  26. Peru 2002 • Conceptual dimension • Transparency in the management of resources • Quality of basic health and education services • Quality of complaint and feedback mechanisms • Empirical Tool • Score card/Questionnaire to households • Focus on agency-specific information • Objective, experiential data • Closed-ended questions

  27. Peru 2002 • Process/Capacity building: • Partnership between WBI and the National Statistical Office on methodological issues • Data and results publicly available • Analytical dimension • Monitoring of indices’ performance over time • Link between indices of performance and measures of poverty

  28. Peru 2002 – Decisions taken • To develop the following yearly indicators: • Index of transparency and civil society participation • Index of quality of public services • To focus on households/users only • To promote a partnership between the National Statistical Agency and citizens
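For the yearly indicators decided on in Peru, the aggregation step can be illustrated with a small sketch. The question names, response coding, and agencies below are hypothetical assumptions, not the actual Peruvian instrument; the snippet only shows one way closed-ended household responses could be rolled up into a 0-100 agency-level index.

```python
# Hypothetical sketch: aggregating closed-ended household responses (coded 1-4)
# into a 0-100 transparency index per agency. All names and data are invented.
from statistics import mean

responses = [
    {"agency": "Health",    "q_info_published": 3, "q_budget_posted": 2, "q_complaints": 4},
    {"agency": "Health",    "q_info_published": 2, "q_budget_posted": 2, "q_complaints": 3},
    {"agency": "Education", "q_info_published": 4, "q_budget_posted": 3, "q_complaints": 4},
    {"agency": "Education", "q_info_published": 3, "q_budget_posted": 4, "q_complaints": 3},
]

def rescale(score, low=1, high=4):
    """Map a 1-4 response onto a 0-100 scale."""
    return 100 * (score - low) / (high - low)

# Average the rescaled items per respondent, then average within each agency.
by_agency = {}
for r in responses:
    items = [v for k, v in r.items() if k.startswith("q_")]
    by_agency.setdefault(r["agency"], []).append(mean(rescale(s) for s in items))

transparency_index = {agency: round(mean(scores), 1)
                      for agency, scores in by_agency.items()}
print(transparency_index)   # {'Health': 55.6, 'Education': 83.3}
```

Repeating the same aggregation on each survey round is what would give the comparability across time that the monitoring purpose requires.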
