
Monitoring in Outcome Mapping: Principles, Design & Practice


Presentation Transcript


  1. Monitoring in Outcome Mapping: Principles, Design & Practice. Steff Deprez & Kaia Ambrose, OM Lab, Dar es Salaam, Tanzania, 22-23 Sept 2014

  2. Monitoring in Outcome Mapping: 1. Core principles of monitoring in Outcome Mapping; 2. Monitoring design issues: experiences from practice; 3. Monitoring practice: approaches, tools & instruments

  3. 1. Core principles of monitoring in Outcome Mapping Based on OM Manual

  4. (figure slide)

  5. Outcome Mapping monitoring: • Systematic collection of data on outcomes and performance • A regular learning & improvement cycle • Credits the program for its contribution to bringing about change • Encourages the program to challenge itself

  6. Outcome Mapping monitoring: • Flexibility • Participatory • Evaluative thinking • Organisational & social learning • Power of self-assessment • Regular face-to-face meetings

  7. (figure slide)

  8. (figure slide)

  9. Design boldly, within the broadest development context or sphere of interest; M&E modestly, within the program's sphere of influence

  10. Outcome Mapping offers a system/process to gather data & encourage reflection on: 1. The progress of external partners towards the achievement of outcomes (progress markers) 2. The internal performance of the program (strategy maps) 3. The program's functioning as an organisational unit (organisational practices)

  11. Monitoring in Outcome Mapping (diagram of the three spheres and the focus of M&E): Sphere of interest (indirect influence): beneficiaries. Sphere of influence (direct influence): boundary partners and their behavioural changes (outcomes), the focus of M&E. Sphere of control (direct control): the implementing team, its intervention strategies (efficiency & relevancy) and its organisational practices (viability).

  12. Three types of monitoring journals (diagram): the Outcome Journal tracks the behavioural changes (outcomes) of boundary partners in the sphere of influence, the focus of M&E; the Strategy Journal tracks the program's intervention strategies (efficiency & relevancy); the Performance Journal tracks the implementing team's organisational practices (viability) in the sphere of control.
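Not part of the original deck: a minimal, illustrative Python sketch of how the three journals from slide 12 could be captured as data structures. The class and field names (ProgressMarker, OutcomeJournal, rating, evidence, etc.) are assumptions chosen for readability; the official OM manual journal templates contain more fields than shown here.

from dataclasses import dataclass, field
from enum import Enum
from typing import List


class MarkerLevel(Enum):
    """OM groups progress markers by how ambitious the behaviour change is."""
    EXPECT_TO_SEE = "expect to see"
    LIKE_TO_SEE = "like to see"
    LOVE_TO_SEE = "love to see"


@dataclass
class ProgressMarker:
    """One graduated, observable behaviour change for a boundary partner."""
    description: str
    level: MarkerLevel
    rating: str = ""    # e.g. L/M/H or 1-4, per the scoring options in slide 45
    evidence: str = ""  # qualitative description of the observed change


@dataclass
class OutcomeJournal:
    """Tracks a boundary partner's progress towards its outcome challenge."""
    boundary_partner: str
    outcome_challenge: str
    period: str
    markers: List[ProgressMarker] = field(default_factory=list)


@dataclass
class StrategyJournal:
    """Tracks what the program did to support its boundary partners."""
    strategy: str
    activities: List[str] = field(default_factory=list)
    effectiveness_notes: str = ""
    follow_up: str = ""


@dataclass
class PerformanceJournal:
    """Tracks the implementing team's own organisational practices."""
    practice: str
    examples: List[str] = field(default_factory=list)
    lessons: str = ""


# Hypothetical entry for one monitoring period (all content invented).
journal = OutcomeJournal(
    boundary_partner="Farmer organisation X",
    outcome_challenge="Negotiates fair contracts with buyers on its own",
    period="2014 Q3",
    markers=[
        ProgressMarker(
            description="Attends chain-actor negotiation meetings",
            level=MarkerLevel.EXPECT_TO_SEE,
            rating="H",
            evidence="Attended all three meetings held this quarter.",
        ),
    ],
)
print(journal.boundary_partner, len(journal.markers))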

  13. Monitoring Plan (figure slide)

  14. Critical questions for the programme's response: • What should we keep doing? • What do we need to change in order to improve? • Are we still working with the right BPs (boundary partners)? • What strategies/practices do we need to add? • What strategies do we need to end? • What should be evaluated in more depth?

  15. 2. Monitoring design issues Experiences from practice

  16. Making learning explicit: use of spaces & rhythms (*). Conventional M&E design: information needs related to the programme framework (objectives, results, outcomes, … + indicators); data collection methods; reporting. Outcome Mapping (as presented in the OM manual), based on principles of Utilisation-Focused Evaluation: focus on monitoring priorities (Who will use it? For what purpose?); use of outcome, strategy and performance journals. (*) Introduced by Guijt & Ortiz (2008)

  17. Monitoring Plan (figure slide)

  18. Assumptions about monitoring (in OM): • Monitoring process = learning process > reflection and analysis happen automatically • An actor-centered design leads to a participatory monitoring process • M&E results will be used • Users have the capacity, time and willingness to participate or facilitate the monitoring process • Using outcome journals & strategy journals is enough to pave the way forward • The monitoring process is embedded in organisational or programme management cycles • …

  19. Making learning explicit: • For learning to happen > data is not the starting point • Start from the intended use • Start with defining the spaces that are crucial for debate, sharing, reflection and decision-making • Make the monitoring integral to the thinking and doing of the organisation and programme

  20. Learning-oriented monitoring (Source: Seeking Surprise, Guijt, 2008)

  21. Three core steps in the design of a learning-oriented monitoring system: 1. BE CLEAR ON PURPOSE, USES & USERS 2. DEFINE ORGANISATIONAL SPACES & RHYTHMS 3. DECIDE ON INFORMATION NEEDS • Which information is required, for whom, at what time/event, in what form, to do what?

  22. 1. BE CLEAR ON PURPOSE, USES & USERS. E.g. intended uses of the M&E process (Patton)

  23. 1. BE CLEAR ON PURPOSE, USES & USERS. E.g. Wheel of Learning Purposes (Guijt, 2008)

  24. 1. BE CLEAR ON PURPOSE, USES & USERS. E.g. the PLA system of VECO (diagram): purposes grouped around ACCOUNTABILITY, LEARNING and PLANNING for VECO & partners, covering programme improvement, knowledge creation, programmatic & financial accountability, negotiation & understanding among chain actors, upward & downward accountability, evidence building & upscaling, short-term planning, and long-term & strategic planning.

  25. 1. BE CLEAR ON PURPOSE, USES & USERS 2. DEFINE ORGANISATIONAL SPACES & RHYTHMS 3. DECIDE ON INFORMATION NEEDS • What are the spaces and rhythms central to planning, learning, accountability, debate, decision-making, …? (Guijt & Ortiz, 2007) • How can we ensure that monitoring is integral to the thinking and doing of the organisation and programme?

  26. 2. DEFINE ORGANISATIONAL SPACES & RHYTHMS. Organisational spaces: formal and informal meetings and events which bring organisations and programmes to life. Rhythms: patterns in time, the regular activities or processes which provide a structure-in-time through which an organisation can direct, mobilise and regulate its efforts, i.e. regular weekly, monthly and annual activities that characterise the tempo of organisational functioning. When do people interact, share information and make sense of what is happening?

  27. Description of the main spaces & rhythms > Group exercise

  28. 1. BE CLEAR ON PURPOSE, USES & USERS 2. DEFINE ORGANISATIONAL SPACES & RHYTHMS 3. DECIDE ON INFORMATION NEEDS • Which data & information is required? • What type of data/information? • From 'nice-to-know' to 'must-know' information

  29. (figure slide)

  30. Information needs linked to the main spaces & processes

  31. Masterlist of info needs
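Not part of the original deck: a small, invented illustration of how a master list of information needs could be held as structured data, so that each space or rhythm can pull out exactly the 'must-know' items it requires. All entries, field names, spaces and frequencies below are assumptions for the sketch, not drawn from the VECO programme or the OM manual.

# Each row links an information need to its users, the space/rhythm where it
# is used, its source, and its priority ('must-know' vs 'nice-to-know').
master_list = [
    {
        "information_need": "Progress of boundary partners against progress markers",
        "primary_users": ["programme team", "boundary partners"],
        "space_or_rhythm": "six-monthly outcome reflection workshop",
        "source": "outcome journals",
        "priority": "must-know",
    },
    {
        "information_need": "Effectiveness of support strategies",
        "primary_users": ["programme team"],
        "space_or_rhythm": "quarterly team meeting",
        "source": "strategy journals",
        "priority": "must-know",
    },
    {
        "information_need": "Stories of unexpected change",
        "primary_users": ["programme team", "donors"],
        "space_or_rhythm": "annual review",
        "source": "field notes, informal interviews",
        "priority": "nice-to-know",
    },
]

# Filter to what one space actually needs, e.g. when preparing a meeting.
quarterly = [row for row in master_list
             if row["space_or_rhythm"].startswith("quarterly")]
for row in quarterly:
    print(row["information_need"], "->", ", ".join(row["primary_users"]))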

  32. Plan for sensemaking. 1. BE CLEAR ON PURPOSE, USES & USERS 2. DEFINE ORGANISATIONAL SPACES & RHYTHMS 3. DECIDE ON INFORMATION NEEDS • Plan how data is used and analysed > make it usable for action • Focus on social interaction: sharing, reflection, debate, decision • Should be well-planned & facilitated > it will not happen by itself

  33. Institutionalising a learning-oriented M&E practice. How to make sure that your monitoring principles and design are translated into an effective monitoring practice? 1. Creating organisational conditions: motives, means & opportunities 2. The 'web of institutionalisation' > Should be reflected in the Organisational Practices

  34. Creating the motives, means & opportunities. • Creating motives: guiding ideas; support by management; develop a learning culture; provide incentives • Creating means: human capacities; specialist support; concepts, methods and tools; budget • Creating opportunities: integration in planning and management; clear M&E plans and responsibilities; responsive information management system; trust and respect (speak out, challenge, feedback) (Steff Deprez, 2009)

  35. 2. The web of institutionalisation (Levy, 2006). See: The Process of Institutionalising Gender in Policy and Planning: The 'Web' of Institutionalisation (Levy, 2006)

  36. 3. Monitoring Practice in OM Experiences from practice

  37. Monitoring Practice in OM • Working with progress markers, boundary partners and organizational practices • Sense-making with boundary partners • Ongoing challenges

  38. Thinking through the different aspects of monitoring • M&E plan (Performance Measurement Framework) – more narrative • Unpack different moments

  39. Working with progress markers • What is your purpose and use? • What is your monitoring culture? • What resources do you have for monitoring? • What qualitative and quantitative data needs do you have?

  40. GrOW program • Show matrix

  41. (figure slide: men?, FSP leaders)

  42. Outcome Mapping Logic Model

  43. Challenges • Qualitative data collection: informal interviews, observation (including looking for the unexpected, positive and negative) • Qualitative analysis: looking for patterns and trends • Critical analysis and sense-making: the need for facilitated, well-constructed (agenda, exercises) spaces and processes • Usage of information

  44. Evolving lessons • Monitoring beyond outputs • 'Good enough' (in terms of tools, capacity), and build from there • M&E - mande - evaluative thinking - explicit sense-making spaces

  45. Working with progress markers • Use progress markers as a checklist to track progression against pre-defined behavioural changes for a specific partner in a specific period of time; use of scoring (L/M/H, 1-2-3-4, colour) • Write a qualitative description of change (e.g. every 4-6 months) for each pre-defined PM for the respective period • Other monitoring tools, qualitative or quantitative, that are then cross-referenced with pre-defined PMs (with new ones added)
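Not part of the original deck: a brief, illustrative sketch of the checklist-plus-scoring option mentioned in the first bullet of slide 45, using the L/M/H scale. The marker texts, ratings and summary format are invented for the example.

from collections import Counter

# Hypothetical checklist for one boundary partner over one monitoring period.
# "L/M/H" follows the low/medium/high scoring option mentioned in the deck.
checklist = {
    "Attends multi-stakeholder platform meetings": "H",
    "Shares market information with members": "M",
    "Negotiates contracts with buyers independently": "L",
    "Lobbies local government on producer issues": "L",
}

def summarise(checklist):
    """Turn per-marker ratings into a one-line summary for the outcome journal."""
    counts = Counter(checklist.values())
    total = len(checklist)
    return ", ".join(f"{counts.get(level, 0)}/{total} rated {level}"
                     for level in ("H", "M", "L"))

print(summarise(checklist))
# -> 1/4 rated H, 1/4 rated M, 2/4 rated L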

  46. Working with progress markers: who? • BPs describe their own change, then send it to the implementing team • The implementing team describes change based on its own observations • Mutual reflection process with the team and BPs • An external evaluator judges progression in change

  47. Working with progress markers: what? • Every single PM monitored • Only PMs that are relevant for a specific period • PMs and/or OCs (outcome challenges) used to trigger discussion during the reflection process; key changes documented • Depth of analysis can vary: across different BPs (comparison); in combination with SMs (strategy maps) (effective intervention?)
