
National Evaluation and Results Management System – Sinergia –




  1. National Evaluation and Results Management System – Sinergia – Two decades of lessons and experiences. Directorate of Monitoring and Evaluation of Public Policy, November 2013.

  2. The Sinergia Model (El Modelo Sinergia). Our model: MONITORING, EVALUATION, TERRITORIAL, ACCOUNTABILITY, and INNOVATION AND RESEARCH, producing evidence for the decision-making process.

  3. THE VALUE CHAIN: OUR CONCEPTUAL BASIS
  • Sinergia's model is based on the value chain and is oriented to identifying bottlenecks in each link of the public policy process (evaluation types shown along the chain: executive, processes, institutional, outcomes, impacts).
  • Our portfolio includes different types of evaluations in order to respond to bottlenecks identified in each link of the value chain.
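A minimal Python sketch of this idea: the link names follow the usual results-chain terminology, while the link-to-evaluation-type assignment is an illustrative assumption, since the slide lists the types but not an exact mapping.

```python
# Illustrative only: the value chain as an ordered list of links, with an
# ASSUMED mapping from each link to the evaluation type that probes it.
VALUE_CHAIN = ["inputs", "processes", "outputs", "outcomes", "impacts"]

EVALUATION_TYPES = {
    "inputs": "executive evaluation",
    "processes": "process evaluation",
    "outputs": "institutional evaluation",
    "outcomes": "outcome evaluation",
    "impacts": "impact evaluation",
}

def pick_evaluation(bottleneck_link: str) -> str:
    """Suggest an evaluation type for the link where a bottleneck was found."""
    if bottleneck_link not in VALUE_CHAIN:
        raise ValueError(f"unknown link: {bottleneck_link}")
    return EVALUATION_TYPES[bottleneck_link]

print(pick_evaluation("outcomes"))  # -> outcome evaluation
```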

  4. The evaluation process. Our process brings about transparency and consistency. In order to be effective, evaluations need to:
  • Be the result of a standardized process
  • Include the participation of all stakeholders
  • Answer decision makers' questions
  • Be in line with the government agenda
  Process stages (diagram: Government Area): Selection of policies to be evaluated, Evaluation Design, Procurement, Evaluation Development, Implementing Results.
  Evaluation schedule: Design, 3 months; Procurement, 3 months; Development, 8 months; Use of Results, 6 months; TOTAL: 20 months.
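The schedule is simple enough to check mechanically; a minimal sketch, with stage names and durations taken from the slide and strictly sequential stages assumed:

```python
# Stage durations in months, as listed on the slide.
SCHEDULE = [
    ("Design", 3),
    ("Procurement", 3),
    ("Development", 8),
    ("Use of Results", 6),
]

total = sum(months for _, months in SCHEDULE)
print(f"TOTAL: {total} months")  # -> TOTAL: 20 months

# Month-resolution timeline, assuming each stage starts when the prior ends.
start = 0
for stage, duration in SCHEDULE:
    print(f"{stage}: months {start + 1}-{start + duration}")
    start += duration
```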

  5. The system's evolution. Through these years there have been changes and lessons learned. (Charts: types of evaluation by year of implementation; evaluations by sector.)
  • We are working in different sectors, going beyond the social inclusion area
  • We have a wide evaluation portfolio
  • We have published methodological guidelines for evaluation
  • Our evaluations are public on the internet
  • Our process is part of the NDP quality management system

  6. We still face new challenges (1). Spread of the evaluation culture:
  • A high-level champion is necessary, one who is aware of the importance of doing evaluations and has the capacity to disseminate their attributes within the executive level.
  • An adequate legal framework is required, but first it is important to know: what should its scope be, and what should it regulate?
  • It is important to develop the evaluation culture across different levels of government, as well as to improve knowledge of M&E concepts.
  • It is vital to involve citizens in the evaluation process, so they can use it for social control.

  7. We still face new challenges (2). Use of evaluations.
  Externally, for decision-making processes:
  • Evaluated entities should be more committed to using evaluation results and to agenda setting.
  • Each evaluation must have a plan to transfer and implement recommendations, designed jointly by Sinergia, the evaluator, and the evaluated entity.
  • The databases should be public and simple to search.
  • A monitoring scheme is needed for the implementation of evaluation results.
  Internally, for more influence:
  • Replicate evaluations in order to contrast results and evaluate evaluators.
  • Improve the quality of evaluations through meta-evaluation.
  • Do systematic reviews in order to define new lines of action based on evaluations already done.

  8. We still face new challenges (3). Quality of the evaluations:
  • Working with universities: universities should play a critical role replicating evaluations in order to contrast and compare results. It would also be important to exchange knowledge and experiences.
  • High-level advisory: improve the evaluation process with the support of a technical expert during the design, implementation, and use of results. In house or peer reviewer?
  • It's not just about quantity: it doesn't matter if we have a limited number of evaluations being done at the same time; it's very important to have an adequate number in order to guarantee quality and rigor.
  • Regular training: of the evaluation team, in order to implement new methodologies and improve the quality of the existing ones.

  9. We still face new challenges (4). Improving the evaluators market:
  • Dialogue with consulting firms in order to improve the procurement process.
  • Training for better proposal presentation.
  • Prioritization of technical quality when qualifying proposals.
  • Improved strategies for evaluation costing.
  • Promotion of the development of small consulting firms.

  10. Proposed questions

  11. Question 1: Is it possible to observe any change in the quality of evaluations? How to ensure program evaluators are impartial and consistent? How to evaluate evaluators?
  • Each evaluation has a Monitoring Committee in order to guarantee impartiality and consistency.
  • Evaluation managers must be technically strong to guarantee the accuracy of the evaluator.
  • NDP technical offices and representatives of the evaluated entity should act as a quality filter for the evaluation.
  • The evaluations should also be evaluated through: evaluation replicas done by universities; meta-evaluation (contrasting evaluation results); and peer reviewers.

  12. Question 2: Is there enough transparency in evaluations? How are evaluations assigned? Are evaluations public?
  (Process diagram, as in slide 4: Government Area; Selection of policies to be evaluated; Procurement; Evaluation Development; Implementing Results.)
  • External procurement guarantees impartiality.
  • Evaluators are chosen by scoring proposals against specific criteria.
  • Evaluations are public on the official website: sinergia.dnp.gov.co
  • The consistency of the evaluation is guaranteed by its technical design.
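The slide does not spell out the criteria or weights; below is a minimal sketch of criteria-based scoring. The criterion names, weights, and firm names are hypothetical, with technical quality weighted highest in line with the prioritization stated on slide 9.

```python
# Hypothetical criteria and weights (must sum to 1.0); each criterion
# is rated 0-100 by the procurement committee.
WEIGHTS = {"technical_quality": 0.6, "team_experience": 0.25, "price": 0.15}

def score(proposal: dict) -> float:
    """Weighted score of a proposal across all criteria."""
    return sum(WEIGHTS[c] * proposal[c] for c in WEIGHTS)

proposals = {
    "firm_a": {"technical_quality": 90, "team_experience": 70, "price": 60},
    "firm_b": {"technical_quality": 75, "team_experience": 85, "price": 90},
}

winner = max(proposals, key=lambda name: score(proposals[name]))
print(f"{winner}: {score(proposals[winner]):.1f}")  # -> firm_a: 80.5
```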

  13. Question 3: Is results-based budgeting still on the table, or are M&E systems limited to recommending actions for improvement? In Sinergia we are working on it:
  • Based on the monitoring process, determine the inputs required to accomplish the planned outcomes of a public intervention (costing of inputs).
  • Identify and link the planned outcomes of the public interventions with the outputs. In this way, coordination is attained between the design of public programs and budgeting.
  • The evaluations allow the validation of causal relations between the links of the value chain.
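A minimal sketch of the linkage described above, with an entirely hypothetical intervention and figures: costed inputs are tied to outputs, and outputs to the planned outcome, so budget questions can be asked against the plan.

```python
# Hypothetical intervention: each output carries a unit cost (costing of
# inputs) and is linked to the planned outcome it is expected to produce.
intervention = {
    "planned_outcome": "reduce school dropout by 2 points",
    "outputs": [
        {"name": "scholarships delivered", "target": 10_000, "unit_cost_usd": 120},
        {"name": "tutoring hours", "target": 40_000, "unit_cost_usd": 8},
    ],
}

# Required inputs (budget) implied by the planned outputs.
budget = sum(o["target"] * o["unit_cost_usd"] for o in intervention["outputs"])
print(f"Inputs required for '{intervention['planned_outcome']}': USD {budget:,}")
# -> USD 1,520,000
```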

  14. Question 3 (continued): Is results-based budgeting still on the table, or are M&E systems limited to recommending actions for improvement? (Diagram: the results chain and its performance relations. Needs and the socio-economic situation inform objectives; objectives lead to inputs, activities, and outputs, which produce outcomes and impacts, subject to external factors. Relations named on the diagram: economy of expenditures, productivity, efficiency, efficacy, effectiveness, and cost-effectiveness.)
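The diagram names these relations but not their formulas; the sketch below uses common textbook definitions, which are an assumption rather than something given on the slide, along with hypothetical numbers.

```python
# ASSUMED textbook-style indicator definitions over the results chain.
def efficiency(outputs: float, inputs: float) -> float:
    """Outputs obtained per unit of input."""
    return outputs / inputs

def efficacy(outputs: float, planned_outputs: float) -> float:
    """Share of planned outputs actually delivered."""
    return outputs / planned_outputs

def effectiveness(outcomes: float, objectives: float) -> float:
    """Outcomes achieved relative to the objectives set."""
    return outcomes / objectives

def cost_effectiveness(expenditure: float, impacts: float) -> float:
    """Expenditure per unit of impact achieved."""
    return expenditure / impacts

# Hypothetical numbers, for illustration only.
print(efficacy(outputs=9_000, planned_outputs=10_000))  # -> 0.9
```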

  15. Question 3 (continued). (Table: how performance information is used at each level.) Actors: executive entities at the national and subnational levels, and coordinating entities at the national, regional, and subnational levels. Results chain: objectives, inputs (costs), activities, outputs (costing), immediate results, intermediate results, final results; the productive process runs from inputs through outputs.
  • Operative use (entities' objectives): information on the actions needed to implement the public interventions. This allows good practices to be developed during the productive process.
  • Management use (sectoral objectives): information on the operation and the partial results of the public interventions. This allows the implementation of public policies to be designed or re-designed, budgeting decisions to be made, and population groups to be prioritized.
  • Political use (national objectives): information on the delivery of goods and services and the generation of strategic results. This allows budgeting decisions to be made, the continuity of public interventions to be approved or disapproved, and the adoption of recommendations resulting from evaluations to be influenced.

  16. Question 4: How can evaluation results be used in the decision-making process? How can the use of evaluations be promoted?
  Externally, for decision-making processes:
  • Evaluated entities should be more committed to using evaluation results and to agenda setting.
  • Each evaluation must have a plan to transfer and implement recommendations, designed jointly by Sinergia, the evaluator, and the evaluated entity.
  • The databases should be public and simple to search.
  • A monitoring scheme is needed for the implementation of evaluation results.
  Internally, for more influence:
  • Replicate evaluations in order to contrast results and evaluate evaluators.
  • Improve the quality of evaluations through meta-evaluation.
  • Do systematic reviews in order to define new lines of action based on evaluations already done.

  17. Question 5: Is there a positive cost-benefit ratio in doing evaluations? (Chart: financial resources invested in evaluations, 2010-2012; numbers in USD.)
  Use of evaluations:
  • To design public policy (CONPES).
  • To improve existing interventions.
  • To improve procurement processes.
  It would be worthwhile to quantify the benefits of evaluations for the public sector.
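Quantifying that would amount to a benefit-cost ratio; a trivial sketch with hypothetical figures, since the slide charts the investment but gives no benefit estimates:

```python
# Hypothetical figures in USD; a ratio above 1 indicates positive returns.
invested_in_evaluations = 5_000_000   # e.g., cost of the evaluation portfolio
estimated_benefits = 12_000_000       # e.g., savings from redesigned programs

bc_ratio = estimated_benefits / invested_in_evaluations
print(f"Benefit-cost ratio: {bc_ratio:.1f}")  # -> Benefit-cost ratio: 2.4
```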

  18. Thank you. www.dnp.gov.co | www.sinergia.dnp.gov.co/portaldnp/ | @Sinergia_DNP | PBX: 3815000
