
Maryvale




Presentation Transcript


  1. Maryvale Regents Science Data Rollout

  2. Reviewing Current Assessment Data • Initial reactions • Where do you believe your students were MOST successful when compared to Erie 1? • Where do you believe they struggled the MOST when compared to Erie 1? • Top 5 (most successful): color green. Top 5 (most struggled): color red.

  3. Findings and Discussion • Look at the LIKE SCHOOL data and the test…the yellows as well… • Discussion: WHY green? Yellow? Red? • Perceptions vs. data? • Incorrect responses – distractors, non-exemplars, or guesses? • Findings regarding last year’s assessment:

  4. Trend and Gap Data • Based on our findings with last year’s assessment…now let’s look at the last couple of years and see if we can find trends in the questions they are asking from each PI and gaps in “trended” areas. • WHY? Look at our Curriculum (specifically for Gaps). Is it spiraled? Are we teaching what they need to learn? And then, HOW are we teaching it?

  5. Look at Trend and Gap Data • Pick 3–5 PIs where there are trend data, recent questions asked, AND gaps. • Use the Deconstruction Template to deconstruct the PI AND the questions for: Content – Standard 4; Skills – Standards 1, 2, 6, 7; Cognitive Load; HOTS (higher-order thinking skills).
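As an illustration of the tally behind this step, here is a minimal sketch of counting released-exam questions per Performance Indicator (PI) per year and flagging trends versus gaps. The PI codes, years, and counts are invented for illustration, not real Regents data.

```python
# Hypothetical tally: one (year, PI) entry per released-exam question.
# All values below are invented sample data.
from collections import defaultdict

questions = [
    (2006, "4.1.1"), (2006, "4.1.1"), (2006, "1.2.3"),
    (2007, "4.1.1"), (2007, "5.2.2"),
    (2008, "4.1.1"), (2008, "5.2.2"), (2008, "5.2.2"),
]

# counts[pi][year] = number of questions asked on that PI that year
counts = defaultdict(lambda: defaultdict(int))
for year, pi in questions:
    counts[pi][year] += 1

years = sorted({y for y, _ in questions})
for pi in sorted(counts):
    row = [counts[pi].get(y, 0) for y in years]
    # asked every year = a trend; missing in some year = a gap
    label = "TREND" if all(n > 0 for n in row) else "GAP"
    print(pi, row, label)
```

A PI printed with a GAP label in recent years would be a candidate for the deconstruction step above.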

  6. Trend and Gap Data Findings • Based on our findings with last year’s assessment AND trend and gap data, let’s come up with a plan. • Summary and Findings: • Action Plan:

  7. Day 2

  8. Defining a Purpose • Why would we use common formative assessments? • What are the benefits? • What are the potential pitfalls? • How do we approach beginning this work?

  9. The Power of Common Formative Assessments Why are we doing this?

  10. Common Formative Assessment allows teachers to assume full control over curriculum, instruction, and assessment decisions. These decisions are made by teams of teachers who come together as a professional learning community.

  11. Common Formative Assessments promote consistency across grade levels and provide for scaffolding between them. Teachers collaborate to build assessments that are valued by all and that serve students well.

  12. Using CFAs leads to real instructional change. Classrooms become laboratories, and the findings guide our decisions.

  13. Using CFAs provides educators with multiple measures of assessment. Relying on standardized test data is not enough. We must study trends, content, AND skills and respond in ways that make sense.

  14. Common Reasons for Implementation CFAs are used to… • align curriculum • align practices • measure progress • improve scores • enhance professional relationships and promote collaboration Which reason do you value most? Why?

  15. The Foundational Premise of Positive Change? People are more important than scores.

  16. Conversation and Collaboration? These are the most important elements of the work. Without these pieces, the initiative will fail.

  17. Professional Learning Communities: Using Data and CFAs to Inform Instruction • Explore and deconstruct longitudinal data to define what is most important and what students need. • Common Formative Assessment • Collaborative scoring/findings • Use multiple measures of formative assessment, or include them in the CFAs.

  18. Using Formative Assessments • Deconstruct Longitudinal Data to Define Needs. • Administer a COMMON FORMATIVE ASSESSMENT (diagnostic), which is aligned to data needs and looks, in construct, like the state assessment. • Collaboratively score and draw findings. • Teach accordingly, and USE MULTIPLE MEASURES OF FORMATIVE ASSESSMENT aligned to defined areas of need. • Collaboratively score and draw findings. • Teach accordingly. • Administer a COMMON FORMATIVE ASSESSMENT (post-test), which is aligned to data needs and looks, in construct, like the state assessment. • Collaboratively score and draw findings. Did scores improve pre/post?
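The final question above ("Did scores improve pre/post?") comes down to simple arithmetic on the two common assessments. A minimal sketch, with invented student names and scores, of computing per-student gains and the class average change:

```python
# Hypothetical pre-test and post-test CFA scores -- invented data.
pre  = {"Student A": 62, "Student B": 55, "Student C": 78}
post = {"Student A": 71, "Student B": 68, "Student C": 80}

# Per-student gain: post-test minus pre-test
gains = {s: post[s] - pre[s] for s in pre}

# Class averages before and after the instructional cycle
avg_pre  = sum(pre.values())  / len(pre)
avg_post = sum(post.values()) / len(post)

for s, g in gains.items():
    print(f"{s}: {pre[s]} -> {post[s]} ({g:+d})")
print(f"class average: {avg_pre:.1f} -> {avg_post:.1f}")
```

In practice a team would compute these gains per PI or per item cluster, not just overall, so the findings point back at specific areas of need.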

  19. “Inch by inch it’s a cinch; yard by yard it’s hard!”

  20. The Road Map to Writing Quality CFAs • 1. Deconstruct the test items – what vocabulary words and skills are in each item? • 2. What vocabulary and skills are written in the core curriculum’s major understandings? • 3. How is this concept being taught (steps, focus, activities)? • 4. Write the assessment questions ALIGNED to the rigor of BOTH the content AND the skills (use Bloom’s wheel). Assessment types may be combined if more than one skill or concept is being evaluated.

  21. Content/Concepts – the “what” in the curriculum: what teachers teach, what students learn, and the subjects or topics studied. Skills – how students demonstrate their understanding of the content; skills require action verbs such as analyze, dissect, interpret, perform, and evaluate. Assessments – the evaluation tools used to determine whether students learned the content and can use the skills taught in the classroom.

  22. [Diagram] Content – drawn from trends and gap data. Skills – designed ALIGNED to content AND skills, from assessments AND core vocabulary. Assessments – use the gap data document and the core verbs in both the core AND the assessment.

  23. How can they be written? • Selected response – multiple choice, true/false, matching • Constructed response – short answer, open response, extended response (rubric) • PERFORMANCE TASKS (rubric) Use pages 41–62 as a resource.

  24. Process not product – a “scientific experiment”

  25. REFERENCES • Ainsworth, Larry (2007). Formative Assessment Building. West Seneca, NY: Erie 1 BOCES. Sept. 2007. • Bernhardt, Victoria (1998). Data Analysis for Comprehensive Schoolwide Improvement. Larchmont, NY: Eye on Education. • English, Fenwick (2001). Deep Curriculum Alignment. Lanham, MD: Scarecrow Education. • Jacob, Jan (2005). Aligning the Curriculum: The Science. West Seneca, NY: Erie 1 BOCES. Oct. 2005. • Martin-Kniep, Giselle (2005). Assessment Liaisons Program. Albany, NY. • Nussbaum-Beach, Sheryl (2008). “9 Principles for Implementation: The Big Shift.” (Weblog entry.) 21st Century Learning Collaborative, 28 March 2008. http://21stcenturylearning.typepad.com/blog/2008/03/10-principles-f.html • Popham, W. James (2008). Transformative Assessment. Alexandria, VA: Association for Supervision and Curriculum Development.

  26. Day 2 • Discussion • What worked? • What NEEDS work? • Where are you? • Where do you see yourself by the end of today? Quarter 1? Semester 1? End of the year?

  27. Checking for Alignment • Does the question align to the skill, the content, or both? • Are any of our questions NOT aligned? • Does the assessment match or exceed the rigor of the gapped questions/standard? • Do the assessment items use the same terms that appear in the standards, as opposed to more “student-friendly” wording? • Do the assessment items align with or resemble the formatting of the state assessment? • Are we addressing too many gapped areas/vocabulary/power standards in this assessment? • What do we need to change or modify?

  28. More Thoughts and Implementation Plan • Is there a range of low-level and high-level thinking and problem solving skills? • When will we implement this? • How will we score this? RUBRIC? • How will we use the data to inform our instruction? • What other types of assessments might we consider using within the unit or as an additional CFA?
