
Phase IV Implementation Studies: The Forgotten Finale to the Complex Intervention Methodology Framework
Hilary Pinnock1, Eleni Epiphaniou2, and Stephanie J. C. Taylor2


Presentation Transcript


  1. Phase IV Implementation Studies: The Forgotten Finale to the Complex Intervention Methodology Framework
  Hilary Pinnock1, Eleni Epiphaniou2, and Stephanie J. C. Taylor2
  1 Allergy and Respiratory Research Group, Centre for Population Health Sciences, The University of Edinburgh, Edinburgh, United Kingdom; 2 Centre for Primary Care and Public Health, Blizard Institute, Barts and the London School of Medicine and Dentistry, London, United Kingdom

  2. The complex intervention methodology?
  • A framework that defines the iterative process for developing and evaluating complex interventions in healthcare.
  • A systematic review of implementation studies identified significant problems with reporting standards:
    • Inconsistent terminology
    • Missing or unclear information
  • There is a need for standardised checklists for the reporting of implementation studies.

  3. Reporting standards: possible scope
  Key standards might include:
  • An explicit evidence base from an RCT or guideline recommendation;
  • Recruitment to the clinical service (not the research);
  • Inclusion of (at least some) population-level outcomes using routinely collected data;
  • A description of the setting and the process of implementing the service.

  4. Phase IV “cycle”: possible scope
  • Currently, the framework:
    • Illustrates a cycle of development and evaluation
    • Treats implementation as the final step
  • Proposition: the research underpinning implementation should be visualized as a second, inter-related cycle.
  • The “phase III cycle” includes the iterative steps of development and piloting; a similar process may be needed to translate the intervention into a practical service that can be tested in a phase IV implementation study.

  5. Evidence for healthcare
  • Healthcare systems globally need evidence on which to base their clinical, management, and policy decisions.
  • Many countries have organisations which review evidence and make objective recommendations about the comparative effectiveness of medical technologies.1 The focus may:
    • Be drugs (e.g., Australia2);
    • Be medical devices and procedures (e.g., Canada3);
    • Include public health interventions (e.g., the UK4).
  1. https://www.academyhealth.org; 2. http://www.pbs.gov.au/info/industry/listing/participants/pbac; 3. http://www.cadth.ca/en/products/healthtechnology-assessment/health-technology-assessments; 4. http://www.nice.org.uk/

  6. Evidence for healthcare
  • Complex interventions are those “made up of various interconnecting parts.”5
  • Complex interventions are beyond the scope of clinical effectiveness programs in most countries.1 HOWEVER:
  • Most healthcare interactions are “complex,” and the context in which a treatment is prescribed may have important consequences for adherence or appropriate use of medication.6
  1. https://www.academyhealth.org; 5. Campbell M, et al. BMJ 2000;321:694–696; 6. Moullec G, et al. Respir Med 2012;106:1211–1225.

  7. Evidence for healthcare
  • Randomized controlled trials (RCTs) are considered the gold-standard research design,7,8 and this is reflected in the process for grading the strength of guidelines and comparative effectiveness recommendations.9–12 YET:
  • More pragmatically designed trials may reveal different results from the phase III RCTs,13 even in the context of the (relatively) simple administration of a drug treatment.
  • There is an even greater need to implement and evaluate effectiveness in routine clinical practice in the context of complex interventions.
  7. Eccles M, et al. Qual Saf Health Care 2003;12:47–52; 8. Higgins JPT, et al. (the Cochrane Collaboration, 2011): http://www.cochrane-handbook.org; 9. GRADE: http://www.gradeworkinggroup.org/index.htm; 10. SIGN 50, 2011: http://www.sign.ac.uk/guidelines/fulltext/50; 11. The AGREE Research Trust, 2009: http://www.agreetrust.org; 12. NICE 2012: http://publications.nice.org.uk/theguidelines-manual-pmg6#nice-guidance; 13. Price D, et al. N Engl J Med 2011;364:1695–1707.

  8. Designing, Evaluating, and Implementing Complex Interventions
  • In 2000, the UK Medical Research Council described “a framework for the development and evaluation of RCTs for complex interventions.”14
  • Forward citations suggest widespread global adoption; it is the only methodological guidance for healthcare researchers developing and evaluating complex interventions.
  • The final step was to “establish the long-term and real-life effectiveness of the intervention,” but the description of phase IV studies was limited to the statement: “this stage is likely to involve an observational study.”14
  • A 2008 update aimed to address this limited guidance.
  14. Medical Research Council, 2000: http://www.mrc.ac.uk/Utilities/Documentrecord/index.htm?d=MRC003372

  9. Designing, Evaluating, and Implementing Complex Interventions
  • The framework describes a cycle in which an intervention is:
    • Developed
    • Piloted
    • Efficacy-tested in a phase III RCT.
  • The complex intervention framework illustrates a cycle of development and evaluation that includes implementation. It recommends RCT results be disseminated widely, with further research to assist the process of implementation.15
  • This implementation phase might be visualized as a second interrelated cycle, also including iterative steps of development and piloting, to help translate the intervention into a practical service that can be tested in a phase IV implementation study.
  • Context is crucial, so the intervention may be implemented in different ways in different settings.16
  15. Craig P, et al. Medical Research Council guidance; 2008: http://www.mrc.ac.uk/complexinterventionsguidance; 16. Hawe P, et al. BMJ 2004;328:1561–1563.

  10. Designing, Evaluating, and Implementing Complex Interventions
  • Phase III trials are classically delivered under tightly controlled conditions, involving:
    • Carefully selected patients
    • Highly motivated participants
    • Rigid research protocols to avoid the influence of confounding variables (although pragmatic trials relax some of these constraints17).
  • Healthcare professionals ALSO need information on:
    • Population-level effectiveness
    • The impact on time and resources
    • Training requirements
    • Workforce implications of implementing interventions in routine care.18,19
  17. Eldridge S, Kerry S. A practical guide to cluster randomised trials in health research. Chichester: Wiley; 2012; 18. Meredith LS, et al. Psychiatr Serv 2006;57:48–55; 19. Glasgow RE, et al. Am J Public Health 2003;93:1261–1267.

  11. Designing, Evaluating, and Implementing Complex Interventions
  • Phase IV studies aim to:
    • Evaluate the implementation by healthcare services of research findings
    • Translate interventions of proven efficacy in research settings into routine care
    • Compare the new procedure or service with the existing regimen
    • Explore whether the new service/procedure improves patient outcomes, quality of care, service delivery, and/or the health and social well-being of the population7
    • i.e., determine the intervention’s true population effect.20
  7. Eccles M, et al. Qual Saf Health Care 2003;12:47–52; 20. Kelly JA, et al. J Acquir Immune Defic Syndr 2008;47:S28–S33.

  12. Designing, Evaluating, and Implementing Complex Interventions
  • Effectiveness of an intervention is assessed in a heterogeneous, unselected population and real-life clinical settings, and examines outcomes relevant to the patient, provider, social, and healthcare contexts.20,22 The focus is on external validity and generalizability, and implementation studies are thus useful for informing policy.20
  • Even pragmatic trials with broad entry criteria recruit selected populations, and additional professional time and enthusiasm may be provided to deliver an intervention.
  • Effects are likely to be attenuated when an intervention competes with the demands of busy clinical practice, or may vary depending on the healthcare context.
  20. Kelly JA, et al. J Acquir Immune Defic Syndr 2008;47:S28–S33; 21. Glasgow RE, et al. Am J Public Health 2003;93:1261–1267; 22. Narayan KMV, et al. Diabetes Care 2000;23:1794–1798.

  13. Designing, Evaluating, and Implementing Complex Interventions
  • In an RCT, telephone consultations enabled 26% more people to have a routine review of their asthma.23 In our subsequent implementation study, the effect size was halved, reflecting the challenge of providing care as part of the normal practice workload for a population of about 33,000 patients, 1,800 of whom had active asthma.24 (A worked sketch of this attenuation follows.)
  • In an implementation study, a patient is offered the intervention as part of routine clinical care, with none of the selection imposed by requiring informed consent. Uptake of the intervention and attrition are important outcomes that will affect the effectiveness of the new service.
  23. Pinnock H, et al. BMJ 2003;326:477–479; 24. Pinnock H, et al. Br J Gen Pract 2007;57:714–722.
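  A back-of-the-envelope sketch in Python of the attenuation described above. The 26% increase (ref 23) and the halving of the effect (ref 24) come from the slides; treating the increase as an absolute proportion of the 1,800 active asthma patients is an illustrative assumption, not the published analysis.

```python
# Illustrative arithmetic only: how an RCT effect can shrink when the same
# service is delivered as part of routine care.

active_asthma = 1_800                   # patients with active asthma (ref 24)
rct_effect = 0.26                       # extra proportion reviewed in the RCT (ref 23)
implementation_effect = rct_effect / 2  # effect size roughly halved in phase IV (ref 24)

extra_reviews_rct = active_asthma * rct_effect
extra_reviews_phase_iv = active_asthma * implementation_effect

print(f"Extra reviews the RCT effect would predict: {extra_reviews_rct:.0f}")
print(f"Extra reviews under routine care:           {extra_reviews_phase_iv:.0f}")
print(f"Shortfall from real-world attenuation:      {extra_reviews_rct - extra_reviews_phase_iv:.0f}")
```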

  14. (Poor) Quality of Reporting of Implementation Studies
  • A recently conducted systematic review of implementation studies encountered significant challenges related to design and reporting:
  • Difficulty in identifying terms for the search strategy (due to a plethora of non-specific descriptors in use for implementation studies): “effectiveness trials,” “routine clinical care,” “implement*,” “real-world,” “phase IV,” and “pragmatic.”
  • Difficulty in ascertaining (from the full text) the recruitment route: via the clinical service (as in implementation studies) or via research (as in an RCT).
  • Although study aims and objectives were usually stated, many areas of information were unclear or missing: implementation strategies; professional training required; population descriptions (numbers and characteristics).
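  A minimal sketch of how the non-specific descriptors listed above might be combined into a boolean search string. The combination logic and the topic filter ("asthma") are hypothetical illustrations, not the review's actual strategy.

```python
# Toy boolean query assembled from the descriptors the review had to rely on.

descriptors = [
    '"effectiveness trial"',
    '"routine clinical care"',
    "implement*",
    '"real-world"',
    '"phase IV"',
    "pragmatic",
]

# Join with OR because any one descriptor may flag an implementation study;
# scope to a clinical topic with AND (topic is a hypothetical example).
query = "(" + " OR ".join(descriptors) + ") AND asthma"
print(query)
```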

  15. Examples of Phase III and Phase IV implementation study descriptions

  16. Standards for designing implementation research: examples
  • The study should be based explicitly on phase III evidence, e.g., an RCT and/or guideline recommendations.
  • Implementation studies adopting novel interventions should be considered phase III evaluations.
  • Eligibility and recruitment must be for the clinical service, not the research.
  • Patients should not have to participate in the research to receive the intervention, and this should be reported clearly.
  • The population of interest is the whole eligible population, which should be defined and its characteristics described.
  • This population may not be static: for example, there was a 20% turnover in patients on the asthma register during the course of the year in our implementation study.24 (See the sketch below.)
  24. Pinnock H, et al. Br J Gen Pract 2007;57:714–722.
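  A small sketch of why turnover matters when defining the eligible population. Only the ~20% turnover figure comes from the slides (ref 24); the register size and the assumption that inflow roughly equals outflow are hypothetical.

```python
# Dynamic denominator sketch: with annual turnover, the "whole eligible
# population" depends on the time window chosen for the analysis.

register_at_baseline = 1_800
turnover = 0.20                            # ~20% turnover over the year (ref 24)

left_register = round(register_at_baseline * turnover)
joined_register = left_register            # assume inflow ~ outflow (hypothetical)

ever_eligible = register_at_baseline + joined_register    # on the register at any point
year_round_cohort = register_at_baseline - left_register  # present for the whole year

print(f"Ever eligible during the year: {ever_eligible}")
print(f"Stable year-round cohort:      {year_round_cohort}")
```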

  17. Standards for designing implementation research: examples (II)
  • Outcomes, and ideally the primary outcome, should be at population level. Uptake of the intervention is a crucial outcome.
  • A subgroup may be recruited (e.g., to complete questionnaires), but they should be considered as a subgroup, and their characteristics compared with the whole population.
  • A comprehensive description of the intervention should include the setting and details of staff training, as well as the clinical intervention. In multicenter studies, the implementation process may vary in detail from site to site,16 but crucial components should be standardized and described.
  • It is helpful if authors describe the process of implementing the service and reflect on the barriers and facilitators, to inform readers wishing to replicate the service in their practice.
  16. Hawe P, et al. BMJ 2004;328:1561–1563.
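  A minimal sketch of the subgroup-versus-whole-population comparison recommended above. All counts and the single characteristic (proportion female) are hypothetical; a real report would tabulate several characteristics side by side.

```python
# Compare a recruited questionnaire subgroup with the whole eligible
# population on one characteristic to flag possible selection effects.

population_n, population_female = 1_800, 990   # hypothetical counts
subgroup_n, subgroup_female = 300, 190         # hypothetical counts

pop_rate = population_female / population_n
sub_rate = subgroup_female / subgroup_n

print(f"Whole population: {pop_rate:.1%} female (n={population_n})")
print(f"Subgroup:         {sub_rate:.1%} female (n={subgroup_n})")
print(f"Difference:       {sub_rate - pop_rate:+.1%}")  # a large gap suggests selection
```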

  18. Standardization: a must
  • Existing quality standards for observational studies and pragmatic trials (examples below) have some relevance, but none are wholly applicable to phase IV implementation studies:
    • Consolidated Standards of Reporting Trials (CONSORT)
    • Consolidated Criteria for Reporting Qualitative Research (COREQ)
    • Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklists25–27
    • Strengthening the Reporting of Observational Studies in Epidemiology (STROBE)28
  • WHAT’S REQUIRED? A checklist of quality standards for reporting phase IV implementation studies, which researchers can use to improve the completeness and transparency of their reporting, and editors can apply to assess the quality of a publication.29,30
  25. Schulz KF, et al. BMC Med 2010;8:18; 26. Altman DG, et al. Lancet 2008;371:1149–1150; 27. Moher D, et al. PLoS Med 2009;6:e1000097; 28. von Elm E, et al; STROBE Initiative. J Clin Epidemiol 2008;61:344–349; 29. Newhouse R, et al. Med Care 2013;51:S32–S40; 30. Rycroft-Malone J, et al. Worldviews Evid Based Nurs 2011;8:189–190.

  19. Conclusions
  • The updating of the complex intervention framework in 2008 signaled the recognition that phase IV implementation research is a crucial component of the evaluation of complex interventions in healthcare.
  • This message now needs to be adopted by researchers, funders, and guideline developers. No evidence base on a complex intervention can be considered complete until implementation in real-world practice has been evaluated in a phase IV study.
