
What Constitutes Good Evidence? Louise Scott, Head of Children and Families Analysis Scottish Government



  1. What Constitutes Good Evidence? Louise Scott, Head of Children and Families Analysis Scottish Government

  2. The evidence cycle for an EYC key change: • What’s the emerging issue/problem to fix? (EYC key change) • Why is this question important for the EYC? • What’s the existing evidence? How can we access it? (Extranet, IRISS, SG, Google etc.) • What’s shaped the research? • What does the research tell us about what works? (How, for whom, in what circumstances?) • Are the findings well grounded? • How do we expect to use the evidence? To identify what works • Can we actually use this evidence? • Do we need to collect new evidence? (PDSA) • Do people need help to understand and apply the evidence?

  3. Aim of Session • To help you learn from high-quality evidence and apply it to practice • To help you produce high-quality evidence to share with the EYC

  4. 1. Assessing Quality of Evidence • Methodology: good research design; sampling; data collection • Analysis and reporting: analysis; clear and coherent reporting • Findings and usability: clear link between the data, interpretation and conclusions; the context makes it usable

  5. Assessing Quality: Findings • Always look at the findings first, because this will help you assess features of the research process (e.g. quality of data collected, logic and transparency of analysis, etc.) • Is there a clear link between data, interpretation and conclusions? • QI: What do the findings say, and are they clear? • QI: The findings need to be credible and supported by the data/evidence; the findings/conclusions should make sense and have a coherent logic. • QI: The findings are resonant with other knowledge and experience. • QI: The study needs to address its original aims and purpose. You need to see a clear statement of study aims and objectives; findings need to link clearly to the purpose of the study; and the summary and conclusions should refer back to the aims of the study. Ideally the limitations of the study would also be included.

  6. Assessing Quality: Findings • Is the context of the work clearly described? • QI: You need to see a detailed description of the contexts in which the study was conducted. This will also allow you to assess applicability to other settings (see later): is the context sufficiently similar to yours? What are the key differences? Is it so different that you can’t apply it? • QI: A discussion of what can be generalised to the wider population from which the sample or case study has been drawn • QI: When was the study undertaken, and have things (e.g. the political, social and economic context) changed so much that it is no longer valid?

  7. Research: Methodology • Good research design: does it use the right method to address the questions posed? There is no such thing as a hierarchy of methods (RCTs etc.), just suitability of research design • QI: You should see a discussion of the rationale for the design of the study and be convinced that this approach meets the aims of the study. • Methods: survey/opinion poll • focus group/interview/action learning set/case studies • systematic review • longitudinal (or cohort) studies • randomised controlled trials • Different methods answer different questions: What? Where? How? Why? Who? What works and why does it matter?

  8. Assessing Quality: Methodology • Sampling: i.e. how well does the sample/selection of case studies or documents reflect the population you need to include? • QI: Studies should aim for maximum inclusion (representativeness) and go to great lengths to access the ‘hard to reach’. They should also document any missing coverage in achieved samples and its implications for the evidence. • What is representativeness? Representativeness means the sample reflects the make-up of the population. You need this in research so that you can take a small group and survey or interview them to get their personal responses but infer bigger trends. E.g. research to explore characteristics of the quality of nursery settings would need case studies from public, private and voluntary settings of variable sizes and locations, based on what you know about the total nursery estate.
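To make proportional representativeness concrete, here is a minimal sketch (not from the presentation) of proportionally stratified sampling. The `stratified_sample` helper and the 600/300/100 sector split of a hypothetical nursery estate are illustrative assumptions only.

```python
import random
from collections import Counter

def stratified_sample(population, strata_key, sample_size, seed=42):
    """Draw a proportionally stratified sample: each stratum's share of
    the sample mirrors its share of the population."""
    rng = random.Random(seed)
    strata = {}
    for item in population:
        strata.setdefault(strata_key(item), []).append(item)
    sample = []
    for members in strata.values():
        # Allocate sample slots in proportion to stratum size (rounded).
        n = round(sample_size * len(members) / len(population))
        sample.extend(rng.sample(members, min(n, len(members))))
    return sample

# Hypothetical estate: 600 public, 300 private, 100 voluntary settings.
estate = ([{"sector": "public"}] * 600 +
          [{"sector": "private"}] * 300 +
          [{"sector": "voluntary"}] * 100)
sample = stratified_sample(estate, lambda s: s["sector"], 50)
print(Counter(s["sector"] for s in sample))  # 30 public, 15 private, 5 voluntary
```

A simple random sample of 50 would usually land close to these proportions anyway, but stratifying guarantees that even the small voluntary sector is represented.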

  9. Assessing Quality: Methodology • Data collection: how well was the data collection carried out? • QI: A discussion of who conducted the data collection; what the procedures for collection or recording were (interviews, for example, should be audio or video recorded); how field notes were recorded; and how this may have influenced the data collected (e.g. a traffic survey recorded on a public holiday or during diversions)

  10. Assessing Quality: Analysis and Reporting • Analysis should report trends, outliers and all voices • QI: You should see an exploration of the diversity of perspectives in the analysis, but also a clear indication of the weight of evidence (i.e. are these the views of one single-parent father, or are you hearing from a range of voices or a representative body?). It should be clear when the analysis is referring to outliers or exceptions. Are there any gaps? Whose voice is missing? • QI: The piece should also discuss outliers, not just main patterns. • Clear and coherent reporting: how well can the route from data to conclusions be seen? • QI: Clear conceptual links from the presentation of the information to the analysis (i.e. is the graph going up or down, and does the text reflect this? Is 10% or 51% a ‘significant minority’?). • QI: The paper should be well structured, clearly written and accessible, with key messages highlighted or summarised.

  11. Assessing Quality of Evidence: Reflexivity • How clear are the assumptions and values that have shaped the evidence? • QI: Discussion of main assumptions; research team perspectives and values; consideration of how bias may have crept in; funding. • QI: Published in a reputable source/journal or by a reputable publisher • Example: ‘Research shows that coffee hydrates as much as water’ (Jan ‘14); research undertaken by the Coffee Growers Federation

  12. Assessing Quality: Check List • Findings are clear • Meets aims and objectives • Contexts of the study are clear • Date of evidence • Appropriate method • Representativeness of the sample • Data collection and recording appropriate • Analysis is clear and trends discussed are consistent with the evidence presented • Authors/sponsors of evidence • Reputable source

  13. Examples: Assessing Quality in Practice

  14. Growing up in Scotland (2011): Changes in child cognitive ability in the pre-school years http://www.scotland.gov.uk/Publications/2011/05/31085122/0 • Findings are clear: Yes • Meets aims and objectives: Yes • Contexts of the study are clear: Scotland-based; interviews with parents and teachers ensuring broad geographical coverage and different school settings, and on a single age cohort • Date of evidence: latest 2013; based on fieldwork in 2012 • Appropriate method: longitudinal study (high quality for looking at how individual cohorts develop). Also useful for cross-sectional analysis to assess changes over time (x% more breastfed in 2012 compared with 2009) • Representativeness of the sample: nationally representative (gold standard) • Data collection and recording appropriate: transcriptions, electronic recording of data, available for further analysis • Analysis is clear and trends discussed are consistent with the evidence presented: quotes, clear sub-population analysis (gender, age, SIMD, rural/urban) • Authors/sponsors of evidence: SG-sponsored, contracted out to and undertaken by ScotCen • Reputable source: SG publication (peer reviewed, accountable, transparent etc.)

  15. Effective Provision of Pre-School Education (EPPE): Sammons, P., Sylva, K., Melhuish, E.C., Siraj-Blatchford, I., Taggart, B., Elliot, K. and Marsh, A. (2004). The Continuing Effects of Pre-school Education at Age 7 Years. London: DfES / Institute of Education, University of London • Findings are clear: yes, well written • Meets aims and objectives: yes • Contexts of the study are clear: need to check the EPPE study website to know it’s England-based • Date of evidence: 2004 publication, fieldwork 2003, so dated, but we have very limited longitudinal data looking at the impact of pre-school education on long-term outcomes. With other studies we can back up the findings (Millennium Cohort) and anticipate trends (GUS), so we should ensure we ask the right questions so that we have this evidence in the Scottish context in future. Until then, this is as good as we have, and we need to caveat the context and date when we use the findings. • Appropriate method: yes: longitudinal and survey-based with assessment of children, so the progress of the same child can be checked; not cross-sectional (which could introduce some bias for comparisons) • Representativeness of the sample: main website shows a large-scale, nationally representative sample, analysable by SIMD and subnational contexts • Data collection and recording appropriate: not clear • Analysis is clear and trends discussed are consistent with the evidence presented: yes • Authors/sponsors of evidence: authors are the Institute of Education (reputable, with known expertise in longitudinal analysis), funded by DfES • Reputable source: Yes (IoE)

  16. Example 2: Barlow J, Coren E, Stewart-Brown S (2012) Meta-analysis of the effectiveness of parenting programmes in improving maternal psychosocial health. • Contexts of the study / Date of evidence / Appropriate method / Representativeness of the sample: Meta-analysis: We searched (13 specified) electronic databases on 12 November and 5 December 2012 for randomised controlled trials in which participants had been allocated to an experimental or a control group, and which reported results from at least one scientifically standardised measure of parental psychosocial health. We included a total of 48 studies that involved 4937 participants and covered three types of programme: behavioural, cognitive-behavioural and multimodal. Not clear what the geographical coverage is, but suspect it’s international given the database coverage. • Findings clear / Analysis is clear and trends discussed are consistent with the evidence presented: Overall, we found that group-based parenting programmes led to statistically significant short-term improvements in depression, anxiety, stress, anger, guilt, confidence and satisfaction with the partner relationship. However, only stress and confidence continued to be statistically significant at six-month follow-up, and none were significant at one year. There was no evidence of any effect on self-esteem. None of the trials reported on aggression or adverse effects. The limited data that explicitly focused on outcomes for fathers showed a statistically significant short-term improvement in paternal stress. We were unable to combine data for other outcomes, and individual study results were inconclusive in terms of any effect on depressive symptoms, confidence or partner satisfaction. • Authors/sponsors of evidence: Cochrane Collaboration • Reputable source: Cochrane Library

  17. IRISS: This is where it starts collection: Eight case studies on early years. (March 2013) http://www.iriss.org.uk/resources/where-it-starts-collection • Findings are clear: Yes • Meets aims and objectives: ‘The purpose of the case studies is to capture some of the experiential knowledge held by professionals working in the early years on what supports positive outcomes, to share this knowledge more widely across the sector and to provide inspiration to others’. (How is it presented to inspire? Subjective?) • Contexts of the study are clear: yes • Date of evidence: ? Assume 2012/13 given date of publication • Appropriate method: the aim is to capture practitioners’ knowledge and inspire, so case studies in nurseries are ideal to get depth of understanding of examples and issues • Representativeness of the sample: potential limitations: only statutory and third-sector case studies (not private sector?); Scotland-focused but mostly city-based, and most appear to be based in poorer areas • Data collection and recording appropriate: not clear • Analysis is clear and trends discussed are consistent with the evidence presented: not a great deal of material given the emphasis of the report, but enough to recognise this is sound analysis based on what works in specific contexts • Authors/sponsors of evidence: IRISS working for practitioners (bias of reporting style and emphasis), funded in part by SG • Reputable source: Yes: IRISS has a good reputation for high-quality research in a format that’s accessible for practitioners

  18. Evaluation of The Dundee Families Project - http://www.york.ac.uk/chp/hsa/papers/spring02/scott.pdf (project designed to support families made homeless due to antisocial behaviour) • Findings are clear: Yes, but wordy in places • Meets aims and objectives: yes, an evaluation of a project • Contexts of the study are clear: Yes: clear policy context of the policy position in Dundee for dealing with homelessness due to antisocial behaviour and what this policy will do. Confined solely to Dundee, so easy to consider transferability. • Date of evidence: paper delivered 2002, refers to policy context from 1998. Very dated, but the specific content is clear so applicability and transferability can be considered. • Appropriate method: yes: mixed methods, including interviews, surveys, desk-based admin data and caseload reviews, and vignettes (small case examples), appropriate for this client group. • Representativeness of the sample: range of people considered, including those on the programme and those who chose not to engage. Hard-to-reach groups, because homeless due to antisocial behaviour. • Data collection and recording appropriate: not presented • Analysis is clear and trends discussed are consistent with the evidence presented: yes, and clear when it’s a significant majority or a lone voice. Would have been helpful to see verbatim quotes, especially given the vignette approach. • Authors/sponsors of evidence: academic from the Department of Urban Studies, University of Glasgow (expect no bias and adherence to ethical standards and committees); independent of Dundee; jointly funded by the Scottish Executive, Dundee Council and NCH Action for Children Scotland • Reputable source: a conference paper is generally considered less reputable because it has not been peer reviewed, but it is based on published research so gains credibility

  19. 2. Can We Use This Evidence? • Did it work? • To what extent did it work? What was the scale of change? • In what context did it work? (and how different is that from my context?) • What were the (other) critical success factors (and can I also ensure we have these present at our site)? • Is it cost effective? Causality = having confidence that you know what made the change happen

  20. Did it Work and to What Extent? • Once you’re confident that the evaluation has been of good enough quality, you need to satisfy yourself that programme participants did better relative to non-participants on a comparable outcome. • This difference needs to be statistically significant. Statistical significance = a result is unlikely to have happened by chance alone

  21. Did it Work and to What Extent? Statistical significance = a result is unlikely to have happened by chance alone • To determine whether a result is statistically significant, a researcher calculates a p-value: the probability of observing an effect at least as large as the one seen, if the null hypothesis were true. A result is conventionally deemed statistically significant when the p-value is less than the 5% threshold, formally written as p < .05 (corresponding to a 95% confidence level). • Statistical significance is best reported alongside effect size. • Meta-analysis of Parenting Programmes: Overall, we found that group-based parenting programmes led to statistically significant short-term improvements in depression, anxiety, stress, anger, guilt, confidence and satisfaction with the partner relationship. However, only stress and confidence continued to be statistically significant at six-month follow-up, and none were significant at one year. There was no evidence of any effect on self-esteem.
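The ‘unlikely to have happened by chance alone’ idea can be illustrated with a permutation test (a sketch, not part of the presentation): shuffle the group labels many times and count how often the shuffled difference in means is at least as large as the observed one. The outcome scores below are invented for illustration.

```python
import random
from statistics import mean

def permutation_p_value(treatment, control, n_permutations=10_000, seed=1):
    """Two-sided permutation test: the p-value is the share of random
    label shufflings whose difference in means is at least as large as
    the observed difference."""
    observed = abs(mean(treatment) - mean(control))
    pooled = treatment + control
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(treatment)]) - mean(pooled[len(treatment):]))
        if diff >= observed:
            extreme += 1
    return extreme / n_permutations

# Hypothetical outcome scores: programme participants vs. non-participants.
participants = [14, 15, 16, 17, 18, 19, 20, 21]
non_participants = [10, 11, 12, 13, 14, 15, 16, 17]
p = permutation_p_value(participants, non_participants)
print(f"p = {p:.4f}; significant at the 5% level: {p < 0.05}")
```

With these invented scores the observed difference of 4 points is rarely matched by chance shufflings, so the test reports significance at the 5% level.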

  22. Did it Work and to What Extent? Effect size = a way of quantifying the size of the difference between two groups • It allows us to move beyond the simplistic ‘Does it work or not?’ to the far more sophisticated ‘How well does it work in a range of contexts?’ • By placing the emphasis on the size of the effect rather than its statistical significance (which conflates effect size and sample size), it enables us to better interpret effectiveness. • GUS: at age 5, children with a degree-educated parent have a vocabulary ability around 6 months ahead of the average level obtained by GUS children at 5 years, and around 18 months ahead of those whose parents have no qualifications (who are therefore around 12 months behind the average ability).
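One widely used effect-size measure is Cohen's d: the difference in group means divided by the pooled standard deviation. The slide does not name a specific measure, and the vocabulary scores below are invented, so this is an illustrative sketch only.

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard
    deviation. Roughly, 0.2 is a small effect, 0.5 medium, 0.8 large."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = sqrt(((n_a - 1) * stdev(group_a) ** 2 +
                      (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2))
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical vocabulary scores: children of degree-educated parents
# vs. children whose parents have no qualifications.
degree = [52, 55, 58, 60, 61, 63, 65, 66]
no_qual = [45, 47, 50, 51, 53, 54, 56, 58]
d = cohens_d(degree, no_qual)
print(f"Cohen's d = {d:.2f}")
```

Unlike a p-value, d does not shrink as samples grow: it answers "how big is the gap?" rather than "is the gap unlikely to be chance?".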

  23. In What Context Did it Work? Qualitatively assessing impact = what are the critical success factors? • IRISS: What Supports Positive Outcomes? • Quarriers Family Centre: links between home and nursery are vital. What works? • Integrate parents and practitioners • Peer support • Be a hub for other services (talks on money management, confidence building, antenatal appointments etc.) • Have an open-door policy and welcome all families and their children • Be a role model for healthy attachment • Widen understanding of what learning is (parents stay and play) • Different views on how best to support children and families in the early years. • It isn’t quantified, but the reporting demonstrates the strength of findings (order and weight given to a theme) • The full context is reported, to identify similarities with your setting. What’s different, and is it the context that is driving the success? • If outcomes are dependent upon the unique context (setting or people), then the likelihood of being able to transfer the idea is poor.

  24. What are the critical success factors? Effective Provision of Pre-School Education (EPPE): Sammons, P., Sylva, K., Melhuish, E.C., Siraj-Blatchford, I., Taggart, B., Elliot, K. and Marsh, A. (2004). The Continuing Effects of Pre-school Education at Age 7 Years. London: DfES / Institute of Education, University of London. The findings indicate pre-school has a positive impact on children’s progress over and above important family influences. The quality of the pre-school setting experience as well as the quantity (more months, but not necessarily more hours per day) are both influential. The results show that individual pre-school centres vary in their effectiveness in promoting intellectual progress over the pre-school period, and indicate that better outcomes are associated with certain forms of provision. Often leadership and people skills are cited as a critical success factor. Whether you can use a piece of evidence may therefore depend upon getting your site ready and having the human resources required to deliver the intervention.

  25. Is it Cost Effective? • DEFINITION: cost-benefit analysis translates a programme’s actual impact on outcomes into financial terms. The impact a programme or policy has on a particular outcome – say, a child’s ability to read – is translated into a financial benefit for the state, for the child themselves, and for society more widely. For example, better literacy is associated with higher lifetime earnings – and hence increased tax take – and a reduced likelihood of school exclusion and youth offending. • http://investinginchildren.eu/ Cost-benefit analysis: Incredible Years Parenting Programme and Triple P Positive Parenting Programme Level 4
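The arithmetic behind cost-benefit analysis can be sketched as follows: future benefits are discounted back to present value and compared with the upfront cost. All figures and the 3.5% discount rate below are invented for illustration, not taken from the programmes cited.

```python
def present_value(cash_flows, discount_rate):
    """Discount a list of (years_ahead, amount) cash flows to today's value."""
    return sum(amount / (1 + discount_rate) ** years
               for years, amount in cash_flows)

# Hypothetical parenting-programme figures: £2,000 cost per child today,
# with projected benefits (reduced exclusions and offending, higher tax take).
cost = 2000.0
benefits = [(5, 500.0), (10, 1500.0), (20, 4000.0)]  # (years ahead, £ benefit)

pv_benefits = present_value(benefits, discount_rate=0.035)
ratio = pv_benefits / cost
print(f"PV of benefits: £{pv_benefits:,.0f}; benefit-cost ratio: {ratio:.2f}")
```

A ratio above 1 suggests the programme returns more than it costs; note how sensitive the ratio is to the discount rate and to how far in the future the benefits fall.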

  26. Is it Good Quality? Can You Use it? • Good enough evidence • Pragmatism • Political decisions • What’s the risk?

  27. Judge how big your leap of faith will be
