
Evaluation design and implementation





Presentation Transcript


  1. Evaluation design and implementation Puja Myles Puja.myles@nottingham.ac.uk

  2. Session outline -Evaluation frameworks -CDC framework for evaluation -Theory of change and logic models -RE-AIM framework -Maxwell’s quality assessment framework -Practical exercise: Using a logframe matrix and decision models for evaluation planning/design

  3. What is an evaluation framework? [Slide diagram: a framework shown as a sequence of steps, Step 1 to Step 4, with 'Deciding and measuring health outcomes' as one of the steps]

  4. CDC framework for evaluation Step 1: Engage stakeholders Step 2: Describe the program Step 3: Focus the evaluation design Step 4: Gather credible evidence Step 5: Justify conclusions Step 6: Ensure use and share lessons learned

  5. Step 1: Engage stakeholders Key stakeholders: • People involved in programme operations (funders, managers, administrators) • People served or affected by the programme (clients, family members, elected officials, sceptics) • Primary users of the evaluation (will be a subset of all the stakeholders identified; these are the people who can act on findings and bring change)

  6. Role of stakeholders • Clarify the programme objectives • Help you elucidate the underpinning theory of change • Help design and carry out the evaluation • Help frame recommendations for practice based on findings • Initiate change/act on recommendations i.e. ensure that the evaluation is meaningful

  7. Step 2: Describing the programme-1 • Mission and objectives of the programme • The problems addressed by the programme (nature and magnitude of the problem; populations affected) • How the programme intends to address the problem (theory of change) • Expected effects of the programme

  8. Step 2: Describing the programme-2 • Activities • Resources • Context (setting and environmental influences e.g. political/historical/social) • Logic model

  9. Theory of change • This approach sets out the series of outcomes expected to unfold from the various components of the intervention, as a basis for planning the evaluation strategy • Can be visualised as a sequential chain of ‘if-then’ statements (see the sketch below)
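To illustrate the ‘if-then’ chain, here is a minimal sketch in Python for a hypothetical smoking-cessation programme; the programme and every step name in it are invented for this example and are not taken from the slides.

```python
# Hypothetical if-then chain for a smoking-cessation programme (illustrative only).
theory_of_change = [
    ("GP practices are funded and trained", "brief cessation advice is offered at consultations"),
    ("brief cessation advice is offered at consultations", "more smokers set a quit date"),
    ("more smokers set a quit date", "more smokers are quit at 4 weeks"),
    ("more smokers are quit at 4 weeks", "smoking prevalence in the practice population falls"),
]

def print_chain(chain):
    """Print each link as an explicit 'if ... then ...' statement."""
    for condition, outcome in chain:
        print(f"If {condition}, then {outcome}.")

def check_sequential(chain):
    """Warn if the chain is not strictly sequential (each 'then' should feed the next 'if')."""
    for (_, outcome), (next_condition, _) in zip(chain, chain[1:]):
        if outcome != next_condition:
            print(f"Gap in the theory of change: '{outcome}' does not lead into '{next_condition}'.")

print_chain(theory_of_change)
check_sequential(theory_of_change)
```

Writing the chain out this way makes any missing link visible before the evaluation strategy is fixed.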

  10. Logic model/logframe matrix • A practical approach to understanding the theory of change for a given intervention • Can be used with stakeholders

  11. An example logframe matrix
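The matrix shown on this slide was not captured in the transcript. As a purely illustrative sketch, a logframe is typically laid out as rows for the objectives hierarchy (goal, purpose, outputs, activities) against columns for indicators, means of verification and assumptions; the hypothetical smoking-cessation entries below are invented for the example.

```python
# Illustrative logframe matrix for a hypothetical smoking-cessation programme.
# Rows: levels of the objectives hierarchy; columns: the standard logframe headings.
logframe = {
    "Goal": {"narrative": "Reduced smoking-related disease in the district",
             "indicators": "Smoking prevalence; hospital admissions for COPD",
             "verification": "Health survey; hospital episode statistics",
             "assumptions": "No major change in tobacco marketing or pricing"},
    "Purpose": {"narrative": "More smokers quit and stay quit",
                "indicators": "4-week and 52-week quit rates",
                "verification": "Service monitoring data; validated follow-up",
                "assumptions": "Quitters do not relapse at unusual rates"},
    "Outputs": {"narrative": "Cessation support delivered in primary care",
                "indicators": "Number of smokers receiving brief advice and referral",
                "verification": "GP practice records",
                "assumptions": "Practices record activity consistently"},
    "Activities": {"narrative": "Train staff; fund clinics; publicise the service",
                   "indicators": "Staff trained; clinics running; campaign reach",
                   "verification": "Training logs; clinic registers; media monitoring",
                   "assumptions": "Funding and staff are available on time"},
}

# Print the matrix row by row so it can be reviewed with stakeholders level by level.
for level, row in logframe.items():
    print(f"{level}: {row['narrative']}")
    print(f"  Indicators: {row['indicators']}")
    print(f"  Means of verification: {row['verification']}")
    print(f"  Assumptions: {row['assumptions']}")
```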

  12. Step 3: Focusing the evaluation design Things to consider: • Purpose of evaluation (feasibility, effectiveness, change, empowerment, sponsor requirement) • Evaluation questions (merit, cost-effectiveness, equity, quality) • Feasibility • Ethics

  13. Study designs Ovretveit (1998) outlined six basic evaluation designs: • Descriptive • Audit • Outcome (the before-after comparison; quasi-experimental design) • Comparative experimental • Randomised controlled experimental • Intervention to a service (impact on providers and patients)

  14. CDC framework: Steps 4-6 Step 4: Gather credible evidence (what outcomes will you measure and how) Step 5: Justify conclusions (attribution versus contribution; alternative explanations such as bias, chance, confounding) Step 6: Ensure use and share lessons learned (stakeholder involvement; participatory approaches)

  15. RE-AIM framework for measuring public health impact Glasgow et al (1999): • Reach (uptake; who benefits; who is left out) • Efficacy (include behaviour outcomes and participant-centred quality of life measures; consider both positive and negative outcomes) • Adoption (proportion & representativeness of settings): use direct observation, interviews, surveys • Implementation (the extent to which a programme is delivered as intended); audit • Maintenance: long-term maintenance of behaviour change (both clients and service providers)
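Reach and Adoption are usually reported as proportions (participants out of the eligible population; participating settings out of the settings approached). The short sketch below works this through with invented counts for a hypothetical cessation service; none of the numbers come from the slides.

```python
# Hypothetical RE-AIM worked example: all counts below are invented for illustration.
eligible_smokers = 12_000        # smokers in the target population
participants = 1_800             # smokers who actually used the cessation service
practices_approached = 40        # GP practices invited to deliver the programme
practices_adopting = 26          # practices that agreed and ran at least one clinic

reach = participants / eligible_smokers               # RE-AIM 'Reach': uptake among the eligible
adoption = practices_adopting / practices_approached  # RE-AIM 'Adoption': settings taking part

print(f"Reach: {reach:.1%} of eligible smokers used the service")
print(f"Adoption: {adoption:.1%} of approached practices delivered the programme")
```

Comparing who is counted in the numerator with who is missing from it (non-users, non-adopting practices) is what shows who is left out and how representative the settings are.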

  16. Assessing Quality Maxwell’s dimensions of health care quality: • Access to services • Relevance to need (for the whole community) • Effectiveness (for individual patients) • Equity (fairness) • Social acceptability • Efficiency and economy
