
Local Assessment Validity Study





Presentation Transcript


  1. Local Assessment Validity Study Rayne A. Sperling Jonna M. Kulikowich The Penn State Research Team

  2. Regulation Guiding the Research “Students shall demonstrate proficiency in reading, writing and mathematics on either the State assessments administered in grade 11 or 12 or local assessment aligned with academic standards and State assessments under § 4.52 (relating to local assessment system) at the proficient level or better to graduate.”

  3. Purpose • For those students who failed to reach proficiency on the PSSA during the 11th grade administration or the 12th grade retake administration, what are the local assessments districts use to measure proficiency? • Objectives • To describe local assessments submitted by districts • To report the alignment of local assessments to proficiency standards • To describe districts’ reported local assessment practices as measures used to establish proficiency to the standards

  4. Collecting Local Assessments • Two requests were sent by PDE (7/28/08, 8/12/08) • Materials and practices submitted to PDE by October 7, 2008 were included in the study • PDE recorded date of receipt on submissions • Documents were boxed and shipped to Penn State, University Park • Stored by randomly generated code in Penn State suite

  5. Response Rates • Approximately 85% of districts responded to the PDE request for local assessments • Response rates were roughly equal across rural, urban, and suburban school districts • Characteristics of non-reporting districts were also examined

  6. Districts Overall *Although there are 501 school districts in PA, 4 do not have high schools and thus are not eligible for this investigation. **These numbers exclude Districts of the First Class
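As a quick arithmetic check, the counts above combine with figures from the other slides (501 districts, 4 without high schools, and the n = 418 districts that sent materials) to reproduce the response rate reported earlier. This is a minimal sketch for illustration only, not part of the study's analysis:

```python
# Sketch: reproduce the approximate response rate from counts
# given in the slides (501 PA districts, 4 with no high school,
# 418 districts that submitted materials).
total_districts = 501
no_high_school = 4                             # not eligible for the study
eligible = total_districts - no_high_school    # 497 eligible districts
responding = 418                               # districts that sent materials

response_rate = responding / eligible
print(f"{response_rate:.1%}")  # ~84.1%, consistent with "approximately 85%"
```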

  7. Reporting Districts

  8. Non-Reporting Districts

  9. Local Assessments: Materials • Districts reported use of a variety of materials and practices as their local assessment • There was considerable variance in the nature of the materials submitted by districts • Some districts submitted a cover letter describing their local assessment • Some districts submitted multiple binders or boxes of materials as their local assessment • Whatever the district submitted became their local assessment ‘folder’

  10. Materials Submitted • Materials included curricula, portfolio strategies, published and commercial assessments, district and teacher constructed assessments • Districts reported use of more than 60 different published or commercially-available assessments • Districts submitted assessments with varying types of item formats as local assessments

  11. Materials Submitted *Types of Materials Submitted by Districts Overall (n = 418 districts that sent materials). Districts may have submitted more than one type of material

  12. Expert Panels • Educators from Pennsylvania were invited to serve on an expert panel to review local assessments • Characteristics • The expert panel comprised 24 members • The panel was equally divided into two teams, a 12-member panel for Reading and a 12-member panel for Mathematics • Among the panelists, 18 are classroom teachers, six of whom are chairs in their respective departments; several teach at nearby colleges • Two panelists are administrators (a high school principal and a curriculum supervisor) • Two are PSSA coaches (a literacy coach and a Mathematics coach)

  13. Expert Panels (continued) • One is an instructional specialist • One is an educational assessment specialist • Experience • Twenty-two panelists reported having experience with school-level curriculum development • Half of the panel members reported direct experience with development of their districts’ local assessments • Eight panelists participated in development of the PSSA on tasks such as item analysis • Panelists’ overall years of experience in education ranged from 7 to 37 years • Together they brought nearly 450 years of experience to the coding task

  14. Expert Panels Procedure • Panels met in State College in October • Trained by national advisors to code district folder materials for alignment to proficiency standards • Established criteria to evaluate reported practices for determining proficiency

  15. Materials Alignment 0 = No content areas represented; no alignment of outcomes to standards. 1 = Some content areas represented; some outcomes are aligned. 2 = Many to most content areas represented; most outcomes are aligned. 3 = All content areas represented; all outcomes are aligned.
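The 0–3 scale above is qualitative ("some", "many to most"), so any numeric decision rule is an interpretation. As an illustration only, a coder's rule might be sketched like this, where the fractional thresholds are my assumptions and not part of the panel's actual criteria:

```python
def alignment_code(fraction_covered: float) -> int:
    """Map the fraction of content areas represented (with outcomes
    aligned) to the 0-3 alignment rubric. The numeric thresholds are
    illustrative assumptions; the panel's criteria were qualitative."""
    if fraction_covered == 0.0:
        return 0   # no content areas represented
    if fraction_covered < 0.5:
        return 1   # some content areas represented
    if fraction_covered < 1.0:
        return 2   # many to most content areas represented
    return 3       # all content areas represented

print(alignment_code(0.75))  # prints 2
```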

  16. Materials Alignment

  17. Materials Alignment

  18. Materials Alignment

  19. Materials Alignment

  20. Alignment Ratings Score range for all groups is 0 – 3. Ratings were given for the folder of district materials.

  21. Local Assessments • There was also considerable variance in the manner in which districts reported that they implement local assessments • Some districts reported they did not have a local assessment • Some districts reported that the local assessment was not necessary for graduation decisions • Some districts reported that their curriculum was aligned to standards and that passing courses was the local assessment

  22. Assessment Practices • Expert Panels examined local assessment practices • A survey of 42 representative practices was rated before exposure to local assessments and again after rating the alignment of local assessment materials • Consensus criteria were established for assigning codes for assessment practices
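The slides do not spell out the consensus criteria, so the following is only a hypothetical sketch of how consensus coding among raters is often operationalized: a code is assigned when a sufficient share of panelists agree on the modal rating. The agreement threshold and function name are my assumptions:

```python
from collections import Counter

def consensus_code(ratings, min_agreement=0.6):
    """Return the modal code if at least min_agreement of raters
    assigned it, else None (no consensus reached). Hypothetical rule;
    the panel's actual consensus criteria are not detailed here."""
    code, count = Counter(ratings).most_common(1)[0]
    return code if count / len(ratings) >= min_agreement else None

print(consensus_code([3, 3, 2, 3]))  # prints 3
print(consensus_code([0, 1, 2, 3])) # prints None (no code reaches 60%)
```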

  23. Assessment Practices

  24. Assessment Practices

  25. Assessment Practices

  26. Assessment Practices

  27. Results by Districts Score range for all groups is 0 – 3.

  28. Practices: Mathematics

  29. Practices: Reading

  30. Alignment: Mathematics & Reading

  31. Practices: Mathematics & Reading

  32. Conclusions There was considerable variance in both the nature of the materials and the practices reported by districts. As measured by the district folder, Mathematics assessments were more often aligned to standards than were Reading assessments. For Mathematics, thirty-one school districts (8.1%) had ratings of ‘3’ for both alignment and practice, while nineteen districts (5.1%) received ratings of ‘3’ for both alignment and practice in Reading.

  33. Conclusions (continued) • Based on criteria established by the panels, evidence of alignment to standards and of practices that could result in valid measures of proficiency was present for 5 percent of school districts statewide, given the information submitted and reported about both their Mathematics and Reading assessments.
