
Methods and Approaches to investigate the UK Education System




  1. Methods and Approaches to investigate the UK Education System Sandra McNally, University of Surrey and Centre for Economic Performance, London School of Economics

  2. Overview • Some questions addressed in the CEP Education Programme • Data • Core methodological issues

  3. Some current themes in the CEP Education Research Programme • What works in schools… • Teachers • Peers, neighbours and compositional effects • Parental preferences and admissions • Higher education in the UK

  4. What works (or not) in schools: CEP research

  5. Most important data sets • The National Pupil Database (potentially linkable with UCAS and HE data) • Longitudinal data sets which can be linked with administrative data, e.g. Longitudinal Survey of Young People in England; Millennium Cohort Study

  6. Core methodological issues • What is an appropriate counterfactual – and therefore can we establish causality? • For whom is the causal effect identified? • Can we look at longer-term effects? • Can we do a cost-benefit analysis? • Can we extrapolate outside the study context?

  7. Approaches • Randomised Controlled Trials (Education Endowment Foundation) • Difference-in-difference approaches • Instrumental Variable approaches • Regression Discontinuity approaches • OLS/Propensity Score Matching

  8. Randomised Controlled Trials • ‘Treatment’ is randomly assigned amongst the target population • An example: Design of an ‘information campaign’ about the costs and benefits of staying on in education, targeted at Year 10 students in London schools (McGuigan, McNally, Wyness, 2012)
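A minimal sketch of why randomisation identifies a causal effect (invented numbers, not the study's data): because assignment is a coin flip, treatment is independent of everything else that drives the outcome, so a simple difference in means is an unbiased estimate of the average treatment effect even when an important driver (here, a hypothetical "ability" term) is unobserved.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

treated = rng.integers(0, 2, n)        # coin-flip assignment to the campaign
ability = rng.normal(0, 1, n)          # unobserved driver of the outcome
true_effect = 0.30                     # assumed effect, chosen for illustration
outcome = 1.0 + true_effect * treated + 0.5 * ability + rng.normal(0, 1, n)

# Difference in means between treated and control groups;
# random assignment makes this an unbiased estimate of true_effect.
ate = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"estimated ATE: {ate:.2f}")     # should be close to 0.30
```

The same logic carries over however the outcome is generated, as long as assignment is genuinely random and the sample is large enough for the noise to average out.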

  9. Difference-in-Difference Approaches • Compare the outcomes of a treatment and a control group before and after the policy. • An example: A structured way to teach reading (the ‘literacy hour’) was piloted in schools in some Local Authorities before being rolled out to the rest of the UK (Machin and McNally, 2008)
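A toy simulation of the before/after comparison (all numbers invented, not taken from the paper): pilot areas are allowed to start from a lower level, and both groups share a common time trend; differencing twice removes both, leaving the policy effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

pilot = rng.integers(0, 2, n)   # school in a pilot Local Authority?
post = rng.integers(0, 2, n)    # observed after the policy started?
true_effect = 2.0               # assumed gain in reading scores

# Pilot areas start 3 points lower; everyone gains 1.5 points over time;
# only pilot schools after the policy get the extra true_effect.
score = (50.0 - 3.0 * pilot + 1.5 * post
         + true_effect * pilot * post + rng.normal(0, 5, n))

def cell_mean(p, t):
    """Mean score for a pilot/post cell of the 2x2 design."""
    return score[(pilot == p) & (post == t)].mean()

# Difference-in-differences: pilot change minus control change.
did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
print(f"DiD estimate: {did:.2f}")
```

The key identifying assumption, not testable in the code itself, is that the control group's change over time is what the pilot group's change would have been without the policy ("parallel trends").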

  10. Instrumental Variable Approaches • Finding a variable (an ‘instrument’) that affects the ‘treatment’ variable but influences the outcome only through that treatment. • An example: Using changes in the minimum school-leaving age to estimate the impact of additional education on future earnings (Harmon and Walker, 1995).
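A sketch of the simplest IV calculation, the Wald estimator, with invented numbers (not the paper's data): a binary instrument (a cohort facing a higher leaving age) shifts schooling but, by assumption, affects wages only through schooling, so the reduced-form wage gap divided by the first-stage schooling gap recovers the return. Naive OLS is biased upward here because unobserved ability raises both schooling and wages.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

z = rng.integers(0, 2, n)                 # instrument: cohort with higher leaving age
ability = rng.normal(0, 1, n)             # unobserved; confounds OLS
schooling = 10 + 1.0 * z + 1.0 * ability + rng.normal(0, 1, n)
true_return = 0.08                        # assumed return to a year of schooling
log_wage = true_return * schooling + 0.1 * ability + rng.normal(0, 0.2, n)

# Wald/IV estimator: reduced-form effect over first-stage effect.
reduced_form = log_wage[z == 1].mean() - log_wage[z == 0].mean()
first_stage = schooling[z == 1].mean() - schooling[z == 0].mean()
iv_est = reduced_form / first_stage

# Naive OLS slope for comparison (biased upward by ability):
ols_est = np.cov(schooling, log_wage)[0, 1] / np.var(schooling, ddof=1)

print(f"IV estimate: {iv_est:.3f}, OLS estimate: {ols_est:.3f}")
```

This also illustrates the "for whom" question from slide 6: the IV estimate is driven by people whose schooling responds to the leaving-age change, not by everyone.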

  11. Regression Discontinuity Designs • Making use of a ‘cut-off’ for eligibility for a programme/policy: compare the outcomes of people on either side of this ‘cut-off’. • An example: Using administrative cut-offs in school admissions policies to identify the relationship between when children start school and their later outcomes (Crawford, Dearden and Greaves, 2013).
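A toy version of the school-entry design (invented numbers, not the study's data): outcomes vary smoothly with date of birth except for a jump at the admissions cut-off, and comparing means within a narrow bandwidth either side of the cut-off recovers that jump.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4000

# Running variable: day of birth relative to the admissions cut-off.
days = rng.uniform(-180, 180, n)
old_for_year = (days < 0).astype(float)   # born before cut-off -> starts school older
true_jump = 4.0                           # assumed test-score advantage

# Scores drift smoothly with birth date, plus a discrete jump at zero.
score = 60 + 0.01 * days + true_jump * old_for_year + rng.normal(0, 5, n)

# Local comparison: means within a narrow window either side of the cut-off.
bandwidth = 30
left = score[(days < 0) & (days > -bandwidth)].mean()
right = score[(days >= 0) & (days < bandwidth)].mean()
rd_est = left - right
print(f"RD estimate of the jump: {rd_est:.2f}")
```

In practice a narrower bandwidth reduces bias from the smooth trend but raises noise; published RD studies typically fit local regressions on each side rather than raw means.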

  12. OLS/Propensity Score Matching • Trying to measure the effect of a ‘treatment’ while controlling for all observable characteristics. • An example: Looking at whether the increase in ‘non-native’ English speakers has an influence on the educational attainment of native English speakers (Geay, McNally and Telhaj, 2012)
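A sketch of "controlling for observables" with invented numbers (not the paper's data or variable names): a hypothetical deprivation measure drives both the non-native share and attainment, so the raw regression is spuriously negative, while OLS with the observable control recovers the assumed zero effect. The approach only works when all confounders really are observed.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3000

# Hypothetical observable that drives both variables (e.g. local deprivation).
deprivation = rng.normal(0, 1, n)
nonnative_share = 0.2 + 0.05 * deprivation + rng.normal(0, 0.05, n)
true_effect = 0.0                          # assume no causal effect
attainment = (50 - 2.0 * deprivation
              + true_effect * nonnative_share + rng.normal(0, 3, n))

# OLS with the control included: coefficient on the share should be near zero.
X = np.column_stack([np.ones(n), nonnative_share, deprivation])
coef, *_ = np.linalg.lstsq(X, attainment, rcond=None)

# Without the control, the coefficient picks up the deprivation effect instead.
X0 = np.column_stack([np.ones(n), nonnative_share])
coef0, *_ = np.linalg.lstsq(X0, attainment, rcond=None)

print(f"with control: {coef[1]:.1f}, without control: {coef0[1]:.1f}")
```

Propensity score matching rests on the same "selection on observables" assumption; it differs in how the comparison group is constructed, not in what must be assumed.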
