
Evaluation and Monitoring Methodologies


Presentation Transcript


  1. Evaluation and Monitoring Methodologies Strengthening the Legislature – Challenges and Techniques K. Scott Hubli, NDI

  2. Overview • General Comments on Monitoring and Evaluation • Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Practical Tips and Considerations

  3. General Comments on Monitoring and Evaluation • Evaluation (and Baseline Assessments) --Use to develop program design; use for major course corrections --More costly and less frequent than monitoring (every two to three years) --Typically done at the beginning and the end of a program, but often also done after a major change in the political landscape (e.g., regime change, ethnic conflict settlement, etc.) --Used for accountability to partners, donors, stakeholders, not for ongoing project management

  4. General Comments on Monitoring and Evaluation • Performance Monitoring --Ongoing monitoring; used to manage performance of implementation --Tracks changes (but with less analysis) --Informed by the baseline assessment and, if well designed, can reduce future evaluation costs --May indicate a need for an evaluation or updated baseline --Focus on low-cost, regular data collection (workshop evaluations, information available from parliament, regular focus groups, etc.)

  5. General Comments on Monitoring and Evaluation • Always distinguish among: --Inputs (e.g., consultants, computers, etc.) --Outputs (e.g., 40 people trained in a workshop on oversight techniques) --Outcomes (e.g., increased knowledge of oversight investigation techniques) --Objectives (e.g., increased oversight hearings) --Goals (e.g., increased government accountability)
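A minimal sketch of the distinction on slide 5, assuming a hypothetical "Oversight strengthening" program and illustrative indicators (none of these names or figures come from the presentation): each indicator is tagged with its level in the results chain so that reports never mix outputs with outcomes, objectives, or goals.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Indicator:
    level: str                        # "input", "output", "outcome", "objective", or "goal"
    description: str
    baseline: Optional[float] = None  # value at the baseline assessment, if measured
    current: Optional[float] = None   # most recent monitoring value, if measured

@dataclass
class ResultsChain:
    program: str
    indicators: List[Indicator] = field(default_factory=list)

    def by_level(self, level: str) -> List[Indicator]:
        """Return only the indicators recorded at the requested level."""
        return [i for i in self.indicators if i.level == level]

# Hypothetical example mirroring the slide's own examples.
chain = ResultsChain("Oversight strengthening")
chain.indicators += [
    Indicator("output", "MPs trained in workshop on oversight techniques", baseline=0, current=40),
    Indicator("outcome", "Knowledge of oversight investigation techniques"),
    Indicator("objective", "Oversight hearings held per session"),
    Indicator("goal", "Perceived government accountability"),
]
print([i.description for i in chain.by_level("output")])
```

Keeping the level explicit in the data makes it harder to report output counts as if they were progress toward outcomes or goals.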

  6. How are legislative strengthening programs different from other programs with respect to monitoring and evaluation?

  7. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Legislatures are highly complex institutions --They involve multiple actors seeking to achieve multiple goals simultaneously --Where possible, disaggregate data (by gender, party, region, etc.) --Identify clear goals and targeted groups; watch for unintended consequences
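A minimal sketch of the disaggregation point above, using made-up attendance records (the MPs, parties, and regions are hypothetical): the same indicator is broken down by party, gender, or region rather than reported as a single total.

```python
from collections import defaultdict

# Hypothetical attendance records for a single workshop.
records = [
    {"mp": "A", "party": "X", "gender": "F", "region": "North", "attended": True},
    {"mp": "B", "party": "X", "gender": "M", "region": "South", "attended": True},
    {"mp": "C", "party": "Y", "gender": "F", "region": "North", "attended": False},
]

def disaggregate(rows, key):
    """Count attendees broken down by one attribute (party, gender, or region)."""
    counts = defaultdict(int)
    for row in rows:
        if row["attended"]:
            counts[row[key]] += 1
    return dict(counts)

print(disaggregate(records, "party"))   # {'X': 2}
print(disaggregate(records, "gender"))  # {'F': 1, 'M': 1}
```

Disaggregated counts make it visible when a program is reaching only one party, gender, or region, which a single total would hide.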

  8. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Long-term goals, short-term programs --Resist the tendency to monitor outputs rather than progress in achieving desired outcomes, objectives and goals --Find ways to measure small changes in large goals, or outcomes that can be affected within the project time frame

  9. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Programs focus on process, not outputs --Example: number of laws passed --Emphasize qualitative over quantitative information --Use detailed process descriptions in establishing baselines --Use monitoring and evaluation to help strengthen this process and to teach results-based management, where possible (“Monitoring and evaluation should be managed as joint exercises with development partners.”)

  10. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Monitoring and evaluation is often highly political --Involving partners can sometimes further politicize evaluation and monitoring; use caution and judgment --Can be hard to get necessary information --Politics may cause people to be less than fully honest --Results can be used as a political weapon

  11. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Legislatures have natural cycles --Elections, post-election learning curves, legislative floor periods, recesses, budget processes, etc. --Example: constituency relations --Expect uneven development in performance monitoring, but try to attribute fluctuations in the data to their causes --Time evaluations carefully – look for “normal” periods
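One low-cost way to handle these cycles, sketched below with hypothetical quarterly figures: tag each monitoring observation with the phase of the legislative cycle in which it was collected, and compare like with like before treating a dip or spike as a program effect.

```python
# Hypothetical quarterly observations of one constituency-relations indicator.
observations = [
    {"quarter": "2023-Q1", "phase": "floor period", "constituent_meetings": 45},
    {"quarter": "2023-Q2", "phase": "budget process", "constituent_meetings": 12},
    {"quarter": "2023-Q3", "phase": "recess", "constituent_meetings": 60},
    {"quarter": "2023-Q4", "phase": "floor period", "constituent_meetings": 48},
]

def average_by_phase(rows, indicator):
    """Average the indicator within each cycle phase so comparisons are like-for-like."""
    totals, counts = {}, {}
    for row in rows:
        phase = row["phase"]
        totals[phase] = totals.get(phase, 0) + row[indicator]
        counts[phase] = counts.get(phase, 0) + 1
    return {phase: totals[phase] / counts[phase] for phase in totals}

print(average_by_phase(observations, "constituent_meetings"))
```

Seen this way, a drop during the budget quarter reads as a cycle effect rather than a decline in performance, which is the kind of attribution the slide calls for.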

  12. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Many intervening variables --Economic conditions, geopolitical developments, ethnic conflict, death of a key politician, etc. --No substitute for nuanced political analysis --Measure outcomes, objectives, and goals – not just outputs; this can help identify these intervening variables

  13. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Perceptions matter --Importance of qualitative over quantitative indicators --Use of focus groups, opinion polls, etc. --Even anecdotal evidence is useful if it captures a political mood or issue

  14. Special Considerations in Monitoring and Evaluating Legislative Strengthening Programs • Difficulty of comparative benchmarking --Only one national legislature; cross-country comparisons are of limited utility --Comparisons across time are more important; use of thorough baselines --Implications for setting goals and targets – use of reasonable/consensus expectations
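Since the slide's point is that comparison across time matters more than comparison across countries, here is a minimal sketch (indicator names and figures are invented) of reporting each indicator against its own baseline rather than against another legislature.

```python
# Hypothetical baseline and latest values for two indicators.
baseline = {"oversight_hearings_per_session": 4, "bills_receiving_committee_report": 10}
latest   = {"oversight_hearings_per_session": 7, "bills_receiving_committee_report": 12}

def change_from_baseline(base, current):
    """Report absolute and percentage change for each indicator since the baseline."""
    report = {}
    for name, base_value in base.items():
        delta = current[name] - base_value
        pct = (delta / base_value * 100) if base_value else None
        report[name] = {"baseline": base_value, "current": current[name],
                        "change": delta, "percent_change": pct}
    return report

print(change_from_baseline(baseline, latest))
```

Targets can then be framed as reasonable movements from the institution's own starting point, in line with the slide's note on consensus expectations.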

  15. What are some practical strategies for dealing with these unique aspects of monitoring and evaluating legislative strengthening programs?

  16. Practical Tips & Considerations • General Issues --Be pragmatic in designing an evaluation or monitoring plan; tie evaluation and monitoring to the purpose or objectives. Avoid evaluation for evaluation’s sake. Consider: --resource availability for evaluation --novelty of the program --confidence in program design or implementation --needs of the funder --Budget sufficient resources (costs for legislative strengthening evaluation may exceed those for other program types – soft assistance, new field, etc.)

  17. Practical Tips & Considerations • Issues in Doing a Baseline --Limit scope to allow for detailed coverage of program areas --Protect against biases of person(s) doing the baseline by: --Using teams --Using clear, detailed terms of reference --Incorporating documentary evidence --Seeking consistency in future assessments

  18. Practical Tips & Considerations • Issues in Doing a Baseline (cont.) --Pick timing carefully; describe any special circumstances --Prepare carefully for the baseline assessment team --Cover the range of stakeholders --Get out of the capital --Consider focus groups or creative methods for documenting perceptions and processes (e.g., a sample of 10 legislators tracked periodically every three years) --Pay attention to protocol; build good will.

  19. Practical Tips & Considerations • Using outside evaluators --Outside evaluators can provide not only objectivity but also insulation from the political consequences of an evaluation --Combine multiple backgrounds (academic or legislative strengthening specialists and MPs or staff from similar systems) --Recognize the value of “time in the trenches” --Designate a lead person with responsibility for producing the document --Get a sufficient time commitment

  20. Practical Tips & Considerations • Issues in Performance Monitoring --Draw on baseline and prior evaluations --Design performance monitoring plan up front; adjust it as project evolves: --Imposes discipline; keeps program on track --Provides clarity of expectations to partners --Keep it current, modify as needed --Make these changes explicit

  21. Practical Tips & Considerations • Issues in Performance Monitoring (cont.) --Tie to likely performance issues --Draw on low-cost existing information sources; may be more quantitative, with less analysis --May focus on outcome level, rather than objective or goal level --Consider quarterly or semi-annual monitoring --Expect, but explain, fluctuations --When you can’t explain repeated fluctuations, consider updating the baseline to try to identify issues --Often done, in part, by those implementing the program
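As a small illustration of the last two points (the threshold and figures are assumptions, not guidance from the presentation): a monitoring routine can flag repeated quarter-to-quarter swings beyond some threshold as a prompt to explain them or to consider an updated baseline.

```python
# Hypothetical quarterly values of one indicator.
series = [14, 15, 6, 16, 5, 15]
SWING_THRESHOLD = 0.40  # assumed: flag quarter-to-quarter changes larger than 40%

def unexplained_swings(values, threshold):
    """Return the positions of quarter-to-quarter changes larger than the threshold."""
    flags = []
    for i in range(1, len(values)):
        prev = values[i - 1]
        if prev and abs(values[i] - prev) / prev > threshold:
            flags.append(i)
    return flags

flags = unexplained_swings(series, SWING_THRESHOLD)
if len(flags) >= 2:
    print("Repeated fluctuations at quarters", flags,
          "- explain them or consider an updated baseline")
```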

  22. Final Thoughts • Be creative: legislative strengthening is an art, not a science • Be willing to accept criticism; fight the structural bias toward “spinning” results • Share lessons learned, both internally and externally
