



Presentation Transcript


  1. Initiative Integration: Getting the most out of CCSS and Educator Effectiveness 2013-2014 Webinar series October 30, 2013 3:30 – 4:30 p.m. This training is supported by a Statewide Longitudinal Data Systems grant from the U.S. Department of Education.

  2. Webinar Focus • Establish rationale for integrating initiatives. • Identify Direct Access to Achievement content and processes that support effective integration of CCSS implementation and educator effectiveness systems. • Demonstrate the use of Action Research as a framework for integration.

  3. When our efforts aren’t integrated and aligned…how likely is actual progress?

  4. The likelihood of progress increases when we integrate essential elements of new initiatives, using a systems approach to align our efforts.

  5. Discussion Question: To what extent do educators in your district/school view CCSS and educator effectiveness as mutually supportive elements of a comprehensive system geared to improving student outcomes? Not at all. To some extent. More than they don’t. To a great extent.

  6. How do we encourage integration and alignment of initiatives rather than layering? Use a framework that works across initiatives!

  7. DATA Project Flashback! Critical Questions That Guide the Work • What do we want our students to learn?  • How will we know what students have learned?  • How do we respond when they don't learn?  • How do we respond when they do learn?

  8. DATA Project Flashback! What Do We Want Students to Learn? • State Standards/Common Core Standards Identified • District Teams Determine Power Standards & Pacing • Standards Learning Targets • Where are they going? • How will they get there?

  9. DATA Project Flashback! How Will We Know What Students Have Learned? • Data-Driven Instruction • Monitor Learning Using CFAs (Common Formative Assessments) • SMART Goal: 80/80 • Teacher Assessment Goals, SB 290 • Analyze Instructional Strategies, Tools and Resources

  10. DATA Project Flashback! Common Formative Assessments CFAs ARE: • Thermometers • Quick & Painless • Defined by their use • Given Frequently What do they look like? • CFAs come from a variety of sources • Quick checks • Item analysis • Common means common! Both tool and administration CFAs ARE NOT: • Summative Assessments • OAKS

  11. DATA Project Flashback! How Do We Respond When They Don’t Learn? • Flexible groups based on specific needs for skill/concept • What do they need... • Intervention? • More practice? • Extension/Enrichment? • Who is most qualified to teach specific groups? • Tier III Intervention: The “Triple Dip” • Check out our POST INTERVENTION data!!

  12. DATA Project Flashback! How Do We Respond When They Do Learn? • Standards Focused • Go Deeper In Standard- Bloom’s • Examples: • Research Projects • Computer Lab Activities • Individualized Plans • Open Ended Problem Solving • Student Designed Learning

  13. DATA Project Flashback! Data Team Process • Collect and chart data • Analyze strengths and obstacles • Establish goals: set, review, revise • Select instructional strategies • Determine results indicators • Monitor and evaluate results

  14. DATA Project Flashback! Action Research—how might we use this framework for integration of data teams/PLCs, CCSS implementation, and educator effectiveness?

  15. DATA Project Flashback! Action research encourages an inquiry mindset. Educators can promote and model this!

  16. DATA Project Flashback!

  17. Connection to CCSS Implementation CCSS Implementation Challenges • Setting appropriate CCSS aligned goals for learning and growth. • Learner development and differences • Content and application of content • Planning instructional strategies and assessment • Engaging students in CCSS aligned instructional strategies. • Creating and/or selecting measures that assess students’ learning and growth relative to CCSS. • Analyzing results data to make meaning about student learning and progress. • Repeating the cycle.

  18. DATA Project Flashback! Teachers and leaders will need structures that provide support to meet the challenges

  19. For which of the following do you believe teachers will need the most support? • Collecting evidence to determine students’ learning needs for CCSS-based grade level curriculum. • Determining students’ learning needs based on the evidence. • Setting SMART goals for improving student achievement based on the evidence and CCSS. • Setting student learning objectives/goals. • Identifying, selecting or creating formative measures to assess student learning and growth for CCSS-based curriculum.

  20. Connect and Integrate Action Research + Data Teams/PLCs = system for integration and implementation support!

  21. Connect and Integrate Dedicated team time and effective team processes provide a supportive environment for using the action research framework. Data Teams/PLCs provide: • Structures—focused agenda, inquiry process (DATA Project Toolkit for Accountability: https://sites.google.com/site/oregontoolkit/) • Interpersonal processes—norms and roles for facilitating the inquiry process • Procedures—Action Research: observe/question, hypothesize, predict, test hypothesis, gather data, analyze and explain • Teachers use data/evidence to reflect on challenges, successes and resources they brought to a specific instructional task or situation.

  22. Connect and Integrate How do you support teams in doing action research as part of the teaming process? Use the Direct Access to Achievement Implementation Rubric and Team Observation Tool!

  23. DATA Project Flashback! Direct Access to Achievement Implementation Rubric Indicators are grouped as structures, processes and procedures, and professional development. Implementation Areas: • Leadership • Problem-solving through data analysis • Curriculum and instruction • Assessment • Positive school climate • Family and community partnering

  24. DATA Project Flashback! Team Observation Tool Indicators provided: descriptions of team actions that indicate Proficient or Exemplary behaviors. Team Steps: • Agenda and Minutes • Norms and Participation • Data Organization and Analysis • Analysis of Strengths and Obstacles • Goals • Instructional Strategies • Results Indicators • Meeting Self-Reflection

  25. Connect and Integrate Implementation Areas: Problem-solving through data analysis Curriculum and instruction Assessment Action Research Step: Observe and Question Teams within a junior high school have been given the task of assessing the alignment of the complexity of science texts/readings used at each grade level with expectations for text complexity as specified in CCSS.

  26. Connect and Integrate Observe & analyze data for trends, patterns and clues • Do recent assessment data reflect that students are able to comprehend text written at a grade-appropriate level of complexity? • 45% of students are comprehending texts at appropriate complexity for grade level. • The share of assigned science texts and other readings at appropriate complexity for grade level ranges from 35% to 75% across teams. • For which students? • Pre-Advanced Placement and Honors students tend to have more complex readings and a higher percentage of students comprehending. • Under what conditions?

  27. Connect and Integrate Hypothesize & Predict: • If we increase texts/readings of sufficient complexity to at least 60% for all assignments regardless of pre-AP or honors enrollment, and we scaffold students who need support, then our students will improve their ability to comprehend complex texts aligned to grade level. • Set a SMART goal for the improvement expected: • 57% of students will demonstrate proficiency with grade level science comprehension on the post-assessment science reading task as compared to 45% on the pre-assessment science reading task.

  28. Connect and Integrate Test Hypothesis • Grade level teams assess the text-based and/or other source reading selections for the curriculum unit and select readings that reflect alignment with CCSS grade level complexity. • Teams decide on instructional strategies, supports for students and timeframe for post-assessment. • Teams decide on indicators that will help them know whether they implemented with fidelity. • Teachers implement the planned actions.

  29. Connect and Integrate Gather Data • Teachers monitor their implementation by collecting data on adult actions. • Percent of science reading assignments at the appropriate complexity for grade level. • Teachers administer and score the student post-assessment following the unit of instruction.

  30. Connect and Integrate Analyze Data & Explain • Analyze data for patterns, trends and clues • Did students improve their scores on comprehension on the post assessment? • Who improved? Who didn’t? To what extent? • Was improvement or lack of improvement associated with certain conditions? (adult data) The team determines new questions or clarifications based on their action research.

  31. Continue the Cycle!

  32. Connection to Educator Effectiveness The Evaluation and Professional Growth Cycle supports an inquiry mindset. Evaluation and Professional Growth Cycle: • Self-reflection • Goal-setting • Observation/collection of evidence • Formative assessment/evaluation • Summative evaluation

  33. Connection to Educator Effectiveness How does Action Research connect to educator effectiveness standards? Model Core Teacher Standards: • The Learner and Learning (Learner Development, Learning Differences, Learning Environments) • Content (Content Knowledge & Application of Content) • Instructional Practice (Assessment, Planning for Instruction, & Instructional Strategies) • Professional Responsibility (Professional Learning & Ethical Practice, Leadership & Collaboration) Educational Leadership/Administrator Standards: • Visionary Leadership • Instructional Improvement • Effective Management • Inclusive Practice • Ethical Leadership • Socio-Political Context

  34. For which of the following do you believe leaders will need the most support? • Collecting evidence to determine teachers' needs for professional practice. • Determining teachers' needs for improvement based on the evidence. • Setting teacher goals for improving professional practice based on evidence of teachers' needs. • Identifying, selecting and/or approving formative measures to assess student learning and growth for teacher goal setting. • Setting appropriate teacher goals for student learning and growth.

  35. Integration Leaders can use Action Research as a framework for instructional improvement and implementation of CCSS. • Teacher goal setting for professional practice, professional responsibilities, and student learning and growth. • Identify/create/select formative measures to assess student learning and growth based on CCSS. • Start with relevant student learning data or artifacts. • Use the Action Research framework to consider student data and artifacts.

  36. In Summary • Action Research is a procedure that can be embedded into the work of data teams/PLCs to • Increase effectiveness and reduce fatigue during implementation, and • Ensure integration of efforts in implementing CCSS and educator effectiveness.

  37. Future Webinars Making it Stick: Going from Training to Implementation Practice • February 26, 2014 3:30 – 4:30 p.m. What Difference Is this Making? Evaluating Program Effectiveness and Fidelity of Implementation • April 23, 2014 3:30 – 4:30 p.m.
