
Data Driven Decision Making


Presentation Transcript


  1. Data Driven Decision Making How do data-driven decisions affect student achievement?

  2. Four Basic Questions for Instructional Decision-Making • What do I want my students to know and be able to do? • How will I know they know it and can do it? • How do I account for the students’ performance? • What am I going to do for the students who don’t get it? (...and those who do get it?)

  3. Accountability • Since we are more accountable today than in the past, how must we proceed? • What must be done with the content we teach and the methods used by teachers?

  4. Accountability • New education initiatives and calls for increased accountability have raised the demands on schools to develop more effective, integrated methods for assessing student achievement. • Districts are now being pressured to align those methods not only within the school system itself, but also with state and district standards, and local curriculum and instructional practices.

  5. Problematic Issues • What are some problematic issues associated with district-wide and student-based assessments?

  6. Problematic Issues • Recent waves of new or expanded state-mandated assessments have added to the already lengthy list of student assessments used by schools. (Many Assessments) • Many schools work diligently to integrate a multitude of assessments and must blend the resulting information into an overall picture of student achievement. (Interpreting Results)

  7. Problematic Issues • The task is a difficult one that calls for additional time, attention, and energy. (Time Constraints) • The sheer array of available information adds to the challenge of effectively analyzing and applying assessment results to decisions about instruction, the curriculum, or educational programs. (Applying the Information Collected)

  8. Common Results • What are some common results that occur after assessments have been completed and the data has been received?

  9. Common Results • Oftentimes, reams of computer-generated data about student performance may rest untouched on shelves. • Performance data generated by students may be logged into teacher grade books with no attempt to find common trends within classrooms or among various student demographic groups.

  10. Assessment as the Centerpiece of Reform • Most states have made student assessment the focus of their school reform and improvement efforts. Consider the following: • States have implemented a state student assessment system. • States have implemented assessments that measure student achievement relative to specified content and performance standards in core subjects such as reading, math, and science.

  11. Barriers to Implementing DDDM What are some barriers to implementing Data Driven Decision Making?

  12. Barriers to Implementing DDDM • Neither teachers nor administrators routinely receive formal training on how to assess students. • Neither teachers nor administrators routinely receive formal training on how to apply assessment information to instruction.

  13. Barriers to Implementing DDDM • Some administrators have difficulty making sense of reports from state assessments, instead relying heavily on traditional classroom quizzes, tests, and teacher observations for guidance. • Some administrators have teachers discuss student progress in various subject areas according to standardized rubrics, but they have not begun to analyze or use the data to enhance instruction.

  14. Keys to Implementing DDDM #1: Limit assessments to those with a specific purpose and those that contribute to a common “vision” for student achievement.

  15. Limit Assessments • Effective school-based student assessments consist of an organized set of assessment tools that are used for a clearly articulated purpose. (Why are we collecting this data?) • Ensure that time and resources are not wasted by “over-assessing” students or gathering redundant information that does not align with goals for student achievement. • Provide each stakeholder group with the best information possible to make the decisions they need to make.

  16. Keys to Implementing DDDM #2: Align assessment, standards, curriculum, and instruction

  17. Assessment Development • Third Grade Math Anchors & Related Content • Scope and Sequence • Bench Mark Assessment Grade 3 Quarter 1 • Compass Reports • Resource Database • Laptop Project • 1st Quarter Results 6th grade

  18. M3.A.1.1 Apply place-value concepts and numeration to counting, ordering, grouping and equivalency. • M3.A.1.1.1 Match the word name with the appropriate whole number (up through 9,999). • M3.A.1.1.2 Differentiate between an even and odd number. • M3.A.1.1.3 Compare two whole numbers using greater than (>), less than (<) or equal to (=) (up through 9,999). • M3.A.1.1.4 Order a set of whole numbers from least to greatest or greatest to least (up through 9,999; limit sets to no more than four numbers). • M3.A.1.1.5 Match a symbolic representation of numbers to appropriate whole numbers (e.g., place value blocks, 7 hundreds, 4 tens and 8 ones, etc.).

  19. Align assessment, standards, curriculum, and instruction • Alignment here refers to the degree of match between what students are expected to know and what information is gathered about what they do know (Webb, 1997). • When assessments are aligned with each other, the curriculum, or the standards prescribed by the state or district, there is a sense that they are working together to provide an overall picture of student achievement.

  20. Keys to Implementing DDDM #5: Define the purpose of assessment data.

  21. Why Bother with Assessment Data? • Data leads to a teacher being able to: • Reflect on own practices • Generate new strategies to reach students • Make practical educational decisions • Meet the needs of individual students' learning styles • Determine and reevaluate previous decisions for effectiveness • Ultimately, be a more engaged, effective, productive, confident, and happy educator • Data is only meaningful when it is linked to decisions about teaching. • Data is used to make decisions about individuals.

  22. How Do We Use the Data? • What is the student’s current level? • What is the student ready for next? • How can I support this? • Will I design an activity, have an interaction, offer a strategy, adapt the environment? • Data is used to make decisions about groups of students. • Planning is essential!

  23. From Where Do I Get the Data? • Informal Assessments • Teacher observations, conversations • Anecdotal records, portfolios • Formal Standardized Testing • Terra Nova results from last year, Screening Assessment • Formal Classroom Testing • Reading program generated tests • Teacher-generated tests • Running Records • Formal and Informal Student Surveys • Student Interest Surveys • Classroom discussions • Learning Profiles • Teacher-child interviews

  24. Looking at the data helps the teacher with… • Small Group Instruction • Determining groups • Determining needs • Determining interests • Determining support

  25. Using Data to Assign Flexible Groups • Achievement: Use performance on a reading measure. Students with similar scores are placed in the same group. • Mixed Achievement: Students with various scores are placed in the same group. • Interest: This group is based on interest inventories, student interviews, or attitude surveys. Assign students to a group based on interest or topic. • Skill or Instructional Need: Students lacking in a skill or strategy are grouped together.
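As a rough illustration of the "Achievement" grouping described above, the following minimal Python sketch sorts students by a reading-measure score and slices the ranked list into small groups of similar performers. The student names, scores, and group size are hypothetical examples, not data from the presentation.

    # Hypothetical sketch: achievement-based flexible grouping from
    # reading-measure scores; names, scores, and group size are invented.
    reading_scores = {
        "Ava": 62, "Ben": 88, "Cara": 71, "Dev": 90,
        "Ela": 59, "Finn": 74, "Gia": 85, "Hank": 66,
    }

    GROUP_SIZE = 4  # assumed target size for a small group

    # Sorting puts students with similar scores next to each other,
    # so consecutive slices of the ranked list form achievement groups.
    ranked = sorted(reading_scores, key=reading_scores.get)
    groups = [ranked[i:i + GROUP_SIZE] for i in range(0, len(ranked), GROUP_SIZE)]

    for number, members in enumerate(groups, start=1):
        print(f"Group {number}: {', '.join(members)}")

A mixed-achievement group could be formed the same way by shuffling the roster instead of sorting it; interest and skill groups would key on survey or observation data rather than scores.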

  26. Scenario #1 • While the students are in centers, Ms. Smith takes an anecdotal record on William. She notices that William is recognizing the names of some upper and lower case letters and can locate his own name. As she continues to observe, she also notes that William often incorrectly names letters that are easily reversed. She decides to have a group of four students meet so she can review the letters b, d, and p. What type of assessment was used? (Informal) What grouping technique did she use? (Skill or Instructional Need)

  27. Scenario #2 • While administering the district screening assessment, Ms. Cappello notices that four of her kindergarten students scored below the benchmark for identifying rhyming words. She decides to bring this group together to play several rhyming games. She thinks out loud for the students, and then encourages them to match pictures of rhyming words. What type of assessment was used? (Formal) What grouping technique did she use? (Skill or Instructional Need)

  28. Scenario #3 • In September, Mr. Jones uses the big book Mr. Grump during a shared reading experience conducted with the whole class. During this time, he notices that some readers seem to need additional instruction to strengthen their concepts about print, especially directionality and voice-print match. He decides to group these students together to provide this instruction. What type of assessment was used? (Informal) What grouping technique did he use? (Skill or Instructional Need)

  29. Scenario #4 Running records have enabled Mrs. Harris to see that her twenty-two first graders have diverse literacy needs. Six of the students read at the same level. She pulls the group together and selects six copies of a book that is at their instructional reading level as determined by the running records. She facilitates a discussion of the book’s main character. What type of assessment was used? (Formal) What grouping technique did she use? (Achievement)

  30. Scenario #5 Mr. Doman has taken a close look at the students’ ability to activate background knowledge and monitor comprehension. He observes that they all have a pretty good handle on this, but he wants to introduce a new strategy. He models the strategy for a group of mixed-ability students who have indicated an interest in sea life. He chooses a non-fiction text on sharks. He then asks the students to choose a book from a stack of multi-leveled books on sea life. The students are then given time to apply the new strategy and read their books. He includes the two students who are in another group and reading at the lowest level in the class. Mr. Doman assists as needed. What type of assessment was used? (Informal) What grouping technique did he use? (Interest and Mixed Achievement)

  31. Scenario #6 When Mrs. Anderson was working with her class on the story, “The Olympic Games: Where Heroes are Made,” she noticed some interesting behaviors by four of her students. While most of the class was able to find facts presented in the story, Javier, Julie, Drew, and Robin had difficulty finding facts they were asked to locate. For example, when Mrs. Anderson asked the class how many countries competed in the first modern Olympics, hands flew up, but Javier, Julie, Drew, and Robin kept flipping pages without finding the information. Mrs. Anderson has decided to take these students as a group and model how to locate information in a factual textbook format. What type of assessment was used? (Informal) What grouping technique did she use? (Skill)

  32. Scenario #7 When reading “Saguaro Cactus,” Mrs. Phillips noticed that Mary read the word “spiny” as “spinny.” When Mrs. Phillips asked what the word meant, Mary explained that it was when things spin really fast. Other students looked confused at Mary’s answer, but Joey and Adele nodded in agreement. Mrs. Phillips decided to review all of the vocabulary words for this story with this group. She made a game to help the students review the words, their pronunciation, and their meanings. What type of assessment was used? (Informal) What grouping technique did she use? (Skill)

  33. Monitor Student Progress Assessments that are administered and analyzed more frequently have been shown to be more effective in gauging student progress. Continuous monitoring of student progress means using data obtained from multiple sources—formal and/or informal—in ways that lead to sound decisions about how to flexibly tailor instruction to meet individual students’ needs.

  34. Identify Trends • Schools committed to measuring the rate of learning for individual students assess and analyze these data over long periods of time to begin to see trends. • These trends identify which children are learning, falling behind, or ready to move on to more challenging curricula, and they help teachers individualize their instruction according to the different needs of their students and facilitate flexible groupings within classes based on students’ skills.
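To make the "rate of learning" idea concrete, here is a minimal Python sketch that fits a least-squares line through each student's repeated benchmark scores and uses the slope to flag who is progressing, falling behind, or ready for more challenge. The scores, the growth cutoff, and the student names are illustrative assumptions, not data from the presentation (statistics.linear_regression requires Python 3.10 or later).

    # Hypothetical sketch: per-student learning trends from repeated
    # benchmark scores; all numbers below are invented for illustration.
    from statistics import linear_regression

    benchmark_scores = {
        "Ana":  [52, 55, 61, 68],   # steadily improving
        "Luis": [70, 69, 66, 64],   # drifting downward
        "Mia":  [78, 82, 87, 91],   # rapid growth
    }
    occasions = [1, 2, 3, 4]        # four assessment windows

    for student, scores in benchmark_scores.items():
        slope, _intercept = linear_regression(occasions, scores)
        if slope < 0:
            status = "falling behind"
        elif slope > 3:             # assumed cutoff for rapid growth
            status = "ready to move on"
        else:
            status = "progressing"
        print(f"{student}: {slope:+.1f} points per window ({status})")

Flagging students this way only supports, and never replaces, the teacher's judgment about regrouping or reteaching.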

  35. Conclusion • Assessment is a valuable tool in the education process and is often the first place reformers look to make changes. • High-quality school-based assessments let educators know what students have learned and what they have not, and what is being taught effectively and what needs to be taught better. • No real change can be made without an accurate, definitive picture of where the changes need to be made.
