Introduction to Alis


Presentation Transcript


  1. Introduction to Alis. Dr Robert Clark, ALIS Project Manager

  2. Ensuring Fairness

  3. Principles of Fair Analysis:
  • Compare ‘Like’ with ‘Like’
  • Appropriate Baseline
  • Reflect Statistical Uncertainty

  4. The Analysis

  5. Linear Least Squares Regression [chart: Subject X grades plotted against baseline score 0–8]

  6. Linear Least Squares Regression [chart: Subject X outcomes against baseline, with the regression line (also called the trend line or line of best fit) and residuals above the line (+ve VA) and below it (-ve VA)]. Outcome = gradient × baseline + intercept. Correlation coefficient ≈ 0.7.
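
To make the mechanics concrete, here is a minimal sketch of the regression described above. The baseline and outcome values are invented for illustration; the gradient, intercept and correlation are whatever these made-up numbers produce, not Alis figures.

```python
import numpy as np

# Illustrative (made-up) data: baseline = average GCSE, outcome = exam points.
baseline = np.array([4.5, 5.0, 5.5, 6.0, 6.5, 7.0, 7.5])
outcome = np.array([50.0, 55.0, 70.0, 65.0, 80.0, 90.0, 95.0])

# Fit outcome = gradient * baseline + intercept by linear least squares.
gradient, intercept = np.polyfit(baseline, outcome, deg=1)

# Residual = actual outcome minus the outcome predicted by the trend line.
# A positive residual is +ve value-added; a negative one is -ve value-added.
predicted = gradient * baseline + intercept
residuals = outcome - predicted

# Correlation coefficient between baseline and outcome (~0.7 nationally).
r = np.corrcoef(baseline, outcome)[0, 1]

print(f"gradient={gradient:.2f}, intercept={intercept:.2f}, r={r:.2f}")
print("residuals:", np.round(residuals, 1))
```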

  7. Measuring Value-Added: An Example [chart: result grades (A down to U) against baseline score, from low through average to high ability; the national trend line marks the ‘average’ student, with results above the line counting as +ve and below as -ve; example students Alf, Bob and Chris are plotted against trend lines for Subject A and Subject B]. The position of the national trend line is of critical importance.

  8. Some Subjects are More Equal than Others… [chart: grade (E up to A) against average GCSE (C up to A*) for Physics, Maths, Psychology, Sociology, Latin, Photography and English Lit; the gap between subject trend lines can exceed one grade]. Principle of Fair Analysis No. 1: Compare ‘Like’ with ‘Like’.

  9. Some Subjects are More Equal than Others… e.g. a student with Average GCSE = 6.0:

     Subject Choices                                  Predicted Grades
     Maths, Physics, Chemistry, Economics             C, C, C/D, C/D
     Sociology, Communication Studies, Drama, Media   B, B/C, B/C, B/C

  Performance varies between subjects, so analysing and predicting each subject individually is essential.
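
As an illustration of per-subject prediction, the sketch below uses hypothetical trend-line coefficients (the gradients, intercepts and points scale are invented for this example, not Alis values) to show how the same average GCSE of 6.0 can yield different predicted grades in different subjects.

```python
# Hypothetical per-subject trend lines; the gradients and intercepts below
# are invented for illustration, not Alis values.
SUBJECT_TRENDS = {
    "Maths": (20.0, -50.0),      # outcome = 20.0 * avg_gcse - 50.0
    "Sociology": (20.0, -20.0),  # outcome = 20.0 * avg_gcse - 20.0
}

GRADE_POINTS = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40}

def predict_points(subject: str, avg_gcse: float) -> float:
    """Predict a points score from this subject's own trend line."""
    gradient, intercept = SUBJECT_TRENDS[subject]
    return gradient * avg_gcse + intercept

def nearest_grade(points: float) -> str:
    """Convert a points score to the nearest whole grade."""
    return min(GRADE_POINTS, key=lambda g: abs(GRADE_POINTS[g] - points))

# The same student (average GCSE = 6.0) gets different predictions
# because each subject is analysed individually.
for subject in SUBJECT_TRENDS:
    pts = predict_points(subject, avg_gcse=6.0)
    print(f"{subject}: {pts:.0f} points ~ grade {nearest_grade(pts)}")
```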

  10. Standardisation of Residuals
  • (Raw) residuals can be used to examine an individual’s performance.
  • Standardised residuals are used to compare the performance of groups.
  • Standardised residuals are independent of year and qualification type.
  • For a class, subject, department or whole institution, the Average Standardised Residual is the ‘Value-Added Score’.
  • Standardised Residual = Residual / Standard Deviation (national sample).
  • For an individual subject, the standard error of the average standardised residual is 1/√N, where N = number of results in the group (for combinations of subjects consult the relevant project).
  • 95% Confidence Limit = 2.0 × Standard Error
  • 99% Confidence Limit = 2.6 × Standard Error
  • 99.7% Confidence Limit = 3.0 × Standard Error
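
A minimal sketch of these calculations, assuming an illustrative national standard deviation and a small set of made-up raw residuals for one teaching group:

```python
import math

# Made-up raw residuals for one teaching group, and an illustrative
# national standard deviation (in Alis this comes from the national sample).
raw_residuals = [12.0, -5.0, 8.0, 20.0, -3.0]
national_sd = 15.0

# Standardised Residual = Residual / Standard Deviation (national sample).
std_residuals = [r / national_sd for r in raw_residuals]

# The group's Value-Added Score is the average standardised residual.
n = len(std_residuals)
va_score = sum(std_residuals) / n

# Standardised residuals have a national SD of 1, so for a single subject
# the standard error of their mean is 1 / sqrt(N).
std_error = 1 / math.sqrt(n)

for multiplier, level in [(2.0, "95%"), (2.6, "99%"), (3.0, "99.7%")]:
    limit = multiplier * std_error
    verdict = "outside" if abs(va_score) > limit else "within"
    print(f"{level} confidence limit = ±{limit:.2f}; "
          f"VA score {va_score:+.2f} is {verdict}")
```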

  11. Subjects Covered…
  • A / AS Levels
  • Applied A / AS Levels (including dual award)
  • International Baccalaureate
  • BTec Nationals (Diploma, Certificate, Award)
  • CACHE DCE
  • OCR Nationals
  • ifs Diploma / Certificate in Financial Studies
  • Limited pool of Level 2 (BTec First)

  12. How to Administer the Project

  13. Submit a registration form (Y11, May onwards)
  • We need this before we can process any data.
  • We need this even if you are registering as part of a consortium.
  • Choose Basic or Full (Full = Basic + attitudinal surveys) and whether you wish to do the baseline test.
  Submit student details: the ‘Registration Spreadsheet’ (Y12, mid-September onwards)
  • This gives us students’ name details, GCSE scores and the subjects they are studying.
  • We always need this, even if the students are sitting a baseline test.
  • Send the spreadsheet once students are confirmed on courses (i.e. not on the first day of term).
  Organise baseline testing: the ‘Adaptive Test’ (end of Y11, June 15th onwards)
  • This can happen before, at the same time as, or after sending us the registration spreadsheet (point 2 above).
  • Student details appear in the ‘Check List’ on the website.
  • Early predictions are available for students with Adaptive Test scores as soon as they appear in the Check List. This function is removed once Alis has generated official predictions (PDF reports).
  • Don’t forget to click ‘Testing Complete’ once you have finished testing your students.

  14. Prediction Reports Generated
  • Prediction reports, Intake Profiles and Adaptive Test data (IPR).
  • Reports are created after receipt of the Registration Spreadsheet.
  • Guaranteed turnaround: 4 weeks. Normal turnaround: 2 weeks.
  • When the adaptive test data is ready (‘Testing Complete’ clicked), reports are updated.
  Maintain Data
  • Keep reports up to date by using the Subject Editor on the Alis+ secure website to add and remove students from subject registrations and request updated feedback.
  Submit Entries Data (Y12 & Y13, March / April)
  • For institutions offering A / AS options, submit EDI entries files to Alis.
  Entries Data Matched and Check Lists Issued (Y12 & Y13, May - July)
  • These need to be completed to ensure complete matching of candidate numbers to the names held by Alis, so that all EDI exam results are successfully processed in August.

  15. Results Collection (Y12 & Y13, August)
  • Submit A / AS results to Alis via EDI results files.
  • Submit other qualifications (IB, BTec etc.) to Alis using the results spreadsheet (you can opt to submit A / AS data in the spreadsheet as well, instead of EDI files).
  • Submit results as soon after results day as possible.
  Preliminary VA Feedback (beginning of September)
  • Preliminary feedback is generated by the 1st Monday in September. Prompt return of results in August leads to early feedback.
  • Trend data is not yet fixed; values may be subject to change.
  Definitive VA Feedback (end of September)
  • Trend data is locked and feedback generated. A letter & CD are sent to schools / colleges.
  Maintain Data
  • Update results data (missing grades, withdrawals, remarks, appeals etc.) using the Results Editor on the Alis+ secure website and request updated feedback.

  16. Typical Timeline [chart: month-by-month timeline from Y11 to Y14 (Sept to Aug each year) showing: registration form; CABT from June 15th with early predictions; registration spreadsheet; prediction reports (+Y13); entries collection & matching; matching checklists; results collection; and value-added feedback]

  17. Baseline Assessment

  18. Choice of Baseline:
  • Average GCSE Score
  • CABT (Computer Adaptive Baseline Test)
  Why 2 baselines?

  19. Why 2 baselines? Average GCSE correlates very well with A-level / IB etc., but by itself it is not sufficient:
  • What is a GCSE?
  • What about students without GCSEs?
  • What about years out between GCSE & A-level?
  • How reliable is GCSE?
  • What about prior value-added?
  Principle of Fair Analysis No. 2: Appropriate Baseline

  20. The Effect of Prior Value-Added [chart: three students, each with Average GCSE = 6, who performed beyond expectation (+ve value-added), in line with expectation (0 value-added) and below expectation (-ve value-added)]. Do these 3 students all have the same ability?

  21. Appropriate Baseline
  • Do students with the same GCSE score from feeder schools with differing value-added have the same ability?
  • How can you tell whether a student has underachieved at GCSE, and thus how can you maximise their potential?
  • Has a student got very good GCSE scores through the school’s effort rather than their ability alone?
  • How will this affect expectations of attainment in the Sixth Form?
  • Can you add value at every Key Stage?
  Baseline testing provides a measure of ability that is (to a large extent) independent of the effect of prior treatment.

  22. Computer Adaptive Baseline Test (CABT)
  • Test is performed online; results are automatically transmitted to CEM.
  • Minimal installation / setup required, if any.
  • Adaptive: the difficulty of the questions changes in relation to the ability of the student (see the sketch below).
  • Efficient: no time is wasted answering questions that are far too easy or far too difficult.
  • Covers a wider range of ability.
  • Less stressful for students; a more enjoyable experience than a paper test.
  • Less demanding invigilation.
  • Designed to be completed in 1 hour or less.
  • No materials to courier.
  In 2010 / 2011 over 68,000 students sat this test in Alis. To try it out: www.intuproject.org/demos
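
The following toy staircase illustrates the adaptive principle only; it is not CEM's actual algorithm, and the scale, step sizes and answer model are all invented for illustration.

```python
import random

def run_adaptive_test(true_ability: float, n_questions: int = 20) -> float:
    """Toy staircase: question difficulty chases the student's ability."""
    difficulty = 50.0  # start every student in the middle of the scale
    step = 16.0
    for _ in range(n_questions):
        # Simulate the answer: the further the difficulty sits below the
        # student's ability, the more likely a correct response.
        p_correct = 1 / (1 + 10 ** ((difficulty - true_ability) / 20))
        correct = random.random() < p_correct
        # Harder question after a correct answer, easier after a wrong one;
        # the shrinking step lets the estimate settle quickly.
        difficulty += step if correct else -step
        step = max(step * 0.8, 2.0)
    return difficulty  # final difficulty ~ an estimate of ability

print(f"estimated ability: {run_adaptive_test(true_ability=70.0):.1f}")
```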

  23. Understanding Your Students: Baseline & Predictive Feedback

  24. Intake Profiles

  25. Intake Profiles (Historical)

  26. [screenshot: Full Alis 2009, Demo School (999): Individual Pupil Record (IPR) for ‘Banana, Brian’, studying Maths, Physics, Chemistry and Biology]

  27. Prediction Reports [chart: probability of achieving each grade, with the expected grade marked]

  28. Which predicted grades are the most appropriate for this student?

  29. Predictions based on GCSE (7.0): B, B, C, B, B
  Predictions based on Test (106): C, B, D, B, C
  What is this student’s ability? What grades should we expect her to get? If she gets Cs instead of Bs, is this a problem?

  30. Why is the predicted grade not always equal to the highest bar? [chart: grade probability distribution with the predicted (‘expected’) grade and the most likely grade marked]
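
The distinction is that the expected grade summarises the whole probability distribution (its probability-weighted mean), while the highest bar is simply the single most likely grade (the mode); the two can differ when the distribution is skewed. A small sketch with invented probabilities:

```python
# Invented grade probabilities for one student in one subject.
GRADE_POINTS = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40, "U": 0}
chances = {"A": 0.05, "B": 0.40, "C": 0.25, "D": 0.15, "E": 0.10, "U": 0.05}

# Expected grade: the probability-weighted mean of the distribution,
# converted back to the nearest grade.
expected_points = sum(GRADE_POINTS[g] * p for g, p in chances.items())
expected_grade = min(GRADE_POINTS,
                     key=lambda g: abs(GRADE_POINTS[g] - expected_points))

# Most likely grade: the mode, i.e. the tallest bar on the chart.
most_likely = max(chances, key=chances.get)

print(f"expected points {expected_points:.0f} -> expected grade {expected_grade}")
print(f"most likely grade: {most_likely}")
```

With these numbers the tallest bar is B, but the long tail of lower grades drags the expected grade down to C, which is exactly the situation the slide's question describes.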

  31. Prediction Reports: Subject Report

  32. A2 vs AS predictions and the impact of the A* Grade

  33. Worked Examples: Baseline Data & Predictions

  34. Refer to the intake data on the next 2 slides. For each school, what deductions might you make? What implications are there (if any) for teaching & learning?

  35. School A

  36. School B

  37. Refer to the Y12 data on the next 2 slides. What impact might there be on the pupils’ learning? What subjects would you be worried about them studying? Note: the Non-Verbal section includes Perceptual Speed and Accuracy, Pattern Matching, Logical Reasoning and Dice Folding.

  38. Y12 – Pupil D

  39. Y12 – Pupil E

  40. Refer to the data on the next 3 slides. Does the data show any ‘warnings’ about future potential achievement? Based only on the information provided, what would be realistic subject targets for the students, and why?

  41. Student 1

  42. Student 2

  43. Student 3

  44. Worked Examples: Target Setting

  45. Basing Targets on Prior VA: One Methodology from an Alis School
  • Discuss previous value-added data with each HoD.
  • Start with an agreed, REALISTIC, representative figure based, if available, on previous value-added data (ideally 3 years).
  • Add this figure to each pupil’s prediction and convert it to a grade (i.e. built-in value-added).
  • Discuss with students and, using professional judgment and the chances graphs, adjust the target grade.
  • Calculate the department’s target grades from the sum of the individual pupils’ targets (a sketch of the arithmetic follows below).
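
A sketch of the arithmetic this methodology implies, with an invented departmental VA figure, invented predicted point scores and simplified grade boundaries (the student names are reused from the earlier example slide):

```python
# Invented departmental VA figure, predicted point scores and simplified
# grade boundaries; the arithmetic, not the numbers, is the point.
GRADE_BANDS = [("A", 110), ("B", 90), ("C", 70), ("D", 50), ("E", 30)]

def points_to_grade(points: float) -> str:
    """Round a points score down to the grade band it falls in."""
    for grade, lower_bound in GRADE_BANDS:
        if points >= lower_bound:
            return grade
    return "U"

dept_va = 6.0  # agreed realistic figure, e.g. from 3 years of VA data

predicted_points = {"Alf": 74.0, "Bob": 88.0, "Chris": 101.0}

# Built-in value-added: add the figure to each prediction, then convert.
target_grades = {name: points_to_grade(p + dept_va)
                 for name, p in predicted_points.items()}

# Department target: the sum of the individual (adjusted) predictions.
dept_target = sum(p + dept_va for p in predicted_points.values())

print(target_grades)                      # {'Alf': 'C', 'Bob': 'B', 'Chris': 'B'}
print(f"department target: {dept_target:.0f} points")
```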

  46. Discussion: Assess the merits of, and any concerns you may have with, this value-added model of target setting.
