1. Republic of South Africa, The Presidency, Department of Performance Monitoring and Evaluation
Dr Sean Phillips, Director General
M&E in South Africa: Presentation to Kenya National M&E Week

2. Roles and Responsibilities for Planning and M&E in SA

Auditor-General:
• Independent monitoring of compliance
• Auditing of performance information
• Reporting to Parliament

National Treasury:
• Regulate departmental 5-year and annual plans and reporting
• Receive quarterly performance information
• Expenditure reviews

Presidency:
• National Planning Commission (NPC): produce the long-term plan (20 years)
• Department of Performance Monitoring and Evaluation (DPME):
  • Produce government-wide M&E frameworks
  • Facilitate government 5-year plans for priorities
  • Monitor and evaluate plans for priorities
  • Monitor management practices of government
  • Monitor front-line services

Public Service Commission:
• Independent monitoring and evaluation of the public service
• Focus on adherence to the public service principles in the Constitution
• Reporting to Parliament

Cooperative Governance Dept (DCOG):
• Regulate local government planning
• Monitor performance of local government
• Intervention powers over local government

Public Service Dept (DPSA):
• Monitor the national and provincial public service
• Regulate service delivery improvement

Line depts (national/provincial):
• Monitor sectors

3. Focus of DPME to date
• M&E of national priorities
  • Plans for the 12 priority outcomes (delivery agreements)
  • Monitoring (i.e. tracking) progress against the plans
  • Evaluating to see how to improve programmes, policies and plans (8 evaluations in 2012/13, then 15 per year)
• Management performance M&E
  • Assessing the quality of management practices in individual departments at national, provincial and local level
  • Moderated self-assessment and continuous improvement
• M&E of front-line service delivery
  • Monitoring the experience of citizens when obtaining services (jointly with the provinces)
  • Presidential Hotline: analysing responses and follow-up
  • Citizen-based monitoring
• Government-Wide M&E System
  • Guidelines for M&E across government
  • Data quality, structure of M&E units, capacity development
  • National Evaluation System
  • National M&E Policy Framework

4. Planning and M&E for priority outcomes
• Performance agreements between the President and Ministers
• Cross-departmental and inter-sphere plans (delivery agreements)
• Results-based: a logic chain / theory of change with indicators and targets, from activities through to outcomes
• Close links with the budgeting process
• Coordinated by cross-government Implementation Forums (including provinces where appropriate)
• Quarterly monitoring reports to Cabinet ("traffic lights") by coordinating Ministers (a rough sketch of the traffic-light idea follows below)
• Briefing notes to Cabinet from DPME
• Public progress reporting through the Programme of Action website
• Mid-term Review: identifying successes and challenges, made public
• Not yet enough problem solving
• The President has held performance reviews with some Ministers, but not yet systematically
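
As a rough illustration only (the deck does not define the cut-offs), the "traffic light" convention in these quarterly reports can be read as mapping an indicator's progress against its target to a red/amber/green status. A minimal Python sketch, with assumed 90% and 60% thresholds:

```python
# Illustrative sketch only: maps an indicator's progress against its target
# to a red/amber/green "traffic light". The 90%/60% cut-offs are
# assumptions for illustration, not DPME's published methodology.

def traffic_light(actual: float, target: float) -> str:
    if target <= 0:
        raise ValueError("target must be positive")
    ratio = actual / target
    if ratio >= 0.9:
        return "green"  # on track
    if ratio >= 0.6:
        return "amber"  # at risk
    return "red"        # off track

# Example: an indicator at 450 against a quarterly target of 600 is amber.
print(traffic_light(450, 600))  # amber
```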

5. National evaluation system
• National evaluation policy framework
• Focus on evaluation of programmes related to strategic priorities, through 3-year rolling national and provincial evaluation plans
• National Evaluation Plan:
  • 2012/13: 7 evaluations
  • 2013/14: 15 evaluations
  • 2014/15: 15 evaluations
• 5 evaluations completed, 18 evaluations currently underway
• 2 provinces and 3 departments have evaluation plans; working to extend this
• Departments encouraged to undertake other evaluations, in addition to those in the national and provincial evaluation plans
• Cross-government Evaluation Technical Working Group to support the system
• Building demand for evaluation: emphasising learning, reports to Cabinet, working with Parliament, training DGs in the usefulness of evidence, publicising reports

6. Key aspects of the SA national evaluation system
• Emphasis on use
• Departments submit proposals for interventions to evaluate (policies, programmes, projects), as they have to own the evaluation and implement the findings
• Selection by the cross-government Evaluation Technical Working Group, based on importance (by scale, or because strategic or innovative)
• Jointly funded by the department and DPME, in some cases with donors
• Evaluations must be made public unless there are security concerns
• All evaluation reports in the national system go to Cabinet (which approves the Plan) and then to Parliament
• If the evaluation identifies weaknesses, there must be an improvement plan, which is monitored

7. Independence and quality of evaluations
• To ensure independence:
  • Evaluations implemented as a partnership between the department(s) and DPME
  • A Steering Committee, not the department, makes decisions on the evaluation
  • External service providers undertake the evaluation, reporting to the Steering Committee
• To ensure quality:
  • Peer reviewers (normally 2) per evaluation
  • Evaluation panel, standards, guidelines, training
  • Quality assessment once completed

8. Different types of evaluations, related to questions around the outcome model
• Diagnostic evaluation: what is the underlying situation, and what are the root causes of the problem?
• Design evaluation: does the theory of change seem strong?
• Implementation evaluation: what is happening, and why?
• Impact evaluation: has the intervention had an impact at outcome and impact level, and why?
• Economic evaluation: what are the costs and benefits?

9. Building the national evaluation system
• Evaluation and Research Unit in DPME to drive the system (15 staff)
• More than 12 guidelines and templates produced, ranging from a guideline on TORs (terms of reference) to a guideline on improvement plans
• Standards for evaluations and competences drafted; the standards have guided a quality assessment tool
• 2 courses developed, over 250 government staff trained
• Evaluation panel developed with 42 organisations, which simplifies procurement; however, evaluation capacity is still limited, and it is a particular problem that few universities are bidding for evaluations

10. Evaluation challenges emerging
Overall the system is working well, but there are some challenges. These include:
• Often poor programme plans (for the government programmes being implemented), which makes them difficult to evaluate; hence the need for minimum standards for programme plans (DPME has issued a guideline on this)
• Some senior managers are wary of evaluation and don't see it as an opportunity to improve their performance
• Making sure the evaluations proposed are strategic ones and that key sectors are covered
• Departments not planning ahead; this is very important for impact evaluations in particular, where one needs to plan 3+ years ahead, and it also affects how rollout happens
• Reluctance to roll out in a carefully planned way that facilitates impact evaluation; to be clear on impact, one must compare with and without the intervention
• Departments fear critical reports; there is some resistance to taking evaluations forward when draft reports are very critical of programmes

11. Front-line service delivery monitoring
• Unannounced visits to front-line services, e.g. police stations, clinics, home affairs offices, social grant offices
• Joint programme between DPME and the provinces
• Interview citizens, front-line staff and management
• Focus on service standards, queues, attitude of staff, cleanliness of facilities, etc.
• Approximately 400 sites visited over the last two years; not "scientific" (no random sampling), but indicative
• Reports sent to facilities, management and Cabinet
• If 3 of the 7 "lights" are red, the visit is repeated (a minimal sketch of this rule follows below)
• Presidential Hotline: 180 000 calls, 94% resolved
• Piloting a system of citizen-based monitoring
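
The repeat-visit rule above is a simple threshold check: a site scoring red on 3 or more of its 7 assessment areas is flagged for a follow-up visit. A minimal sketch of that rule in Python; the slide names only four assessment areas ("etc."), so the last three area names here are assumptions:

```python
# Hypothetical illustration of the repeat-visit rule: a site is revisited
# if it scores "red" in 3 or more of the 7 assessment areas. The first
# four area names come from the slide; the last three are assumptions.

AREAS = [
    "service standards", "queues", "attitude of staff",
    "cleanliness of facilities", "safety", "signage", "complaints handling",
]

def needs_repeat_visit(ratings: dict[str, str], threshold: int = 3) -> bool:
    """Return True if the site scored 'red' in at least `threshold` areas."""
    reds = sum(1 for area in AREAS if ratings.get(area) == "red")
    return reds >= threshold

# Example: reds in queues, cleanliness and signage trigger a repeat visit.
site = {
    "service standards": "green", "queues": "red",
    "attitude of staff": "amber", "cleanliness of facilities": "red",
    "safety": "green", "signage": "red", "complaints handling": "amber",
}
print(needs_repeat_visit(site))  # True
```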

12. Monitoring management performance
• Monitoring management practices in all national and provincial government departments
• Joint initiative with the Offices of the Premier
• Standards developed collaboratively
• Standards based on legislation and regulations
• Assessment against 31 management standards in 17 management areas
• Self-assessment with moderation (a sketch of this scoring pattern follows below)
• Support offered to departments to develop and implement improvement plans to address key areas of weakness
• Now starting to monitor at local level as well, using a different tool
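
The "self-assessment with moderation" pattern can be sketched as one record per management standard, holding both the department's own rating and the moderators' adjusted rating, with the moderated score taking precedence. A minimal sketch, assuming a four-level rating scale (the deck's "assessment descriptors" slide is not reproduced in this transcript):

```python
# Illustrative sketch of moderated self-assessment: departments rate
# themselves against each standard, then moderators confirm or adjust.
# The level-1-to-4 scale is an assumption for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class StandardAssessment:
    standard: str                          # e.g. "Strategic Planning"
    self_score: int                        # department's own rating (1-4)
    moderated_score: Optional[int] = None  # set during moderation, if adjusted

    def final_score(self) -> int:
        """The moderated score, where present, overrides the self-assessment."""
        return self.moderated_score if self.moderated_score is not None else self.self_score

# Example: a self-assessed level 3 is adjusted down to level 2 in moderation.
a = StandardAssessment("Strategic Planning", self_score=3, moderated_score=2)
print(a.final_score())  # 2
```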

13. Monitoring management performance: management performance areas

Governance & Accountability:
• Service Delivery Improvement
• Management Structures
• Accountability
• Ethics
• Internal Audit
• Risk Management
• Delegations

Strategic Management:
• Strategic Planning
• Programme Management
• Monitoring & Evaluation

Financial Management:
• Supply Chain Management
• Asset Management
• Revenue Management
• Compensation to Employees
• Goods and Services
• Transfer Payments
• Liability Management

Human Resource and Systems Management:
• HR Strategy and Planning
• HR Practices and Administration
• Management of Performance
• Employee Relations
• IT Systems

14. Assessment descriptors [slide table not reproduced in this transcript]

15. % distribution of moderated scores for strategic management standards (2012/13 moderated scores, all 156 national and provincial departments) [chart not reproduced in this transcript]

16. What is the problem that PM&E aims to address?
• A culture of doing things the way they have always been done, as opposed to a culture of continuous improvement
• Focus on activities without assessing the results or impact of those activities
• Insufficient measurement, collection and analysis of data to inform improvements
• Monitoring and reporting for compliance rather than for improvement
• Poor programme planning, weaknesses in the setting of indicators and targets, weak logic models / theories of change
• Weaknesses in the design of data measurement and collection processes
• Lack of re-engineering of plans and business processes based on analysis of data
• Evidence-based planning and decision making not sufficiently valued
• Disappointing results: lack of positive change in key indicators, such as rural unemployment
• Implementation weaknesses
• Poor quality of service delivery
• Insufficient value for money, e.g. education and health results versus expenditure

17. Key challenges
• Duplication of reporting
• Improving the quality of programme planning
• Improving the quality of administrative data
• Management culture
• Monitoring and reporting for compliance rather than improvement
• Capacity (and demand) to use evidence to support policy and decision making, and to support improvements to implementation
• Making accountability more effective
• Ensuring that M&E leads to action to address identified weaknesses, and to improvement in government performance

18. Ke ya leboga. Ke a leboha. Kea leboga. Ngiyabonga. Ndiyabulela. Ngiyathokoza. Ngiyabonga. Inkomu. Ndi khou livhuha. Thank you. Dankie. (Thank you, in South Africa's official languages.)
Our website: http://www.thepresidency-dpme.gov.za/
