
Creating a TAH Evaluation Plan



Presentation Transcript


  1. Creating a TAH (Teaching American History) Evaluation Plan: Using Logic Maps and Performance Indicators to Guide Program Evaluation

  2. Jeff Sun (jsun@sun-associates.com) and Zora Warren (zwarren@sun-associates.com)
  • www.sun-associates.com
  • www.sun-associates.com/taheval.htm
  • www.sun-associates.com/evalws
  • www.edtechevaluation.com/logicmaphow2.htm

  3. Our Basic Evaluation Model
  • Based on the authentic assessment component of project-based learning

  4. This Evaluation Process
  • Helps clarify project goals, processes, and products
  • Revolves around indicators of success written for this particular project’s goals
  • Is highly qualitative and formative
    • Qualitatively, are you achieving your goals?
    • What adjustments can be made to your project to realize greater success?
  • Makes use of a variety of data sources
  • Generates the necessary reports for the U.S. Department of Education

  5. The Basic Process
  • Develop the project logic/plan
    • A part of the proposal-writing process!
  • Identify evaluation questions
    • Derived from the RFP (request for proposals) and the stated goals in the proposal
    • Are we doing what we need to do to support the purposes for which we were funded?
  • Create performance rubrics
    • These allow for authentic, qualitative, and holistic evaluation
  • Conduct data collection
    • Tied to indicators in the rubrics
  • Report
    • Formatively and summatively

  6. What are Logic Maps?
  • A graphic organizer for cause and effect
  • More about linking concepts than process flow
    • Not really the same as a flow chart
  • Details how your project will…
    • organize resources
    • in response to needs
    • to fulfill its ultimate goal
  • But actually…not in that order
    • Needs → Responses → Goals
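Since the slide describes the Needs → Responses → Goals linkage only in prose, a minimal sketch in Python of what a logic map could look like as plain data follows; the need, activity, and goal named in it are hypothetical examples, not content from any actual project map.

    # A logic map links each response back to the need it addresses and
    # forward to the goal it serves (entries are hypothetical examples).
    logic_map = {
        "needs": ["Teachers lack depth in traditional American history content"],
        "goals": ["Strengthen teacher content knowledge in American history"],
        "responses": [
            {
                "activity": "Summer content institutes with historian partners",
                "addresses_need": 0,  # index into logic_map["needs"]
                "serves_goal": 0,     # index into logic_map["goals"]
            },
        ],
    }

    # Walk each cause-and-effect chain: need -> response -> goal
    for r in logic_map["responses"]:
        need = logic_map["needs"][r["addresses_need"]]
        goal = logic_map["goals"][r["serves_goal"]]
        print(f"{need} -> {r['activity']} -> {goal}")

Keeping the links explicit mirrors the slide’s point: the map organizes concepts (which response answers which need, in service of which goal) rather than a process flow.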

  7. Sample Project Objectives (aka “Goals”)
  • Strengthen teacher content knowledge in American history
  • Help teachers help students achieve Historical Thinking Standards
  • Create a collaboration between participating districts and content providers
  • These will be the things we evaluate, because these are the things that we do
    • Program evaluation is about evaluating the project’s work and progress
    • It is not about testing the underlying research hypothesis!

  8. Mapping Priorities - Objectives - Indicators

  9. Sample Evaluation Questions
  • These come from the basic indicators that were specified in the proposal…
    • To what extent has Our Project strengthened teachers’ knowledge of traditional American history?
    • To what extent has Our Project increased the capacity of high-need districts to provide high-quality American history instruction?

  10. Basic Performance Indicators
  • Teachers in project districts will demonstrate increased knowledge of traditional American history content
  • Participating districts will provide increased opportunities for students to participate in high-quality American history courses

  11. Basic Indicator - Q1
  • Teachers in project districts will demonstrate increased knowledge of traditional American history content
    • Teachers - particularly those from high-need districts - will show gains on pre/post tests of content knowledge
    • There is a connection between these gains and the particular professional development offered by the project’s consortium
    • An analysis of participant deliverables - the outputs of the professional development - shows increased teacher knowledge and skills
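The slides do not prescribe how pre/post gains would be computed. One common convention is the normalized gain, g = (post - pre) / (max - pre); the sketch below applies it to made-up scores.

    # Normalized gain summarizes pre/post content-test improvement relative
    # to the room available to improve. The scores here are illustrations,
    # not project data, and the metric itself is an assumption.
    def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
        if pre >= max_score:          # already at ceiling; no room to gain
            return 0.0
        return (post - pre) / (max_score - pre)

    scores = [(55, 72), (40, 68), (80, 85)]   # (pre, post) per teacher
    gains = [normalized_gain(pre, post) for pre, post in scores]
    print(f"mean normalized gain: {sum(gains) / len(gains):.2f}")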

  12. Question 1 - Level 4
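The Level-4 rubric on this slide is a graphic the transcript does not reproduce. Purely as a hypothetical sketch of the shape such a rubric takes - four levels, each with a descriptor that evidence is matched against - it might be encoded like this:

    # Hypothetical 4-level performance rubric for Question 1; the actual
    # rubric on this slide is a graphic not captured in the transcript.
    rubric_q1 = {
        4: "Clear pre/post gains traceable to project PD; deliverables show "
           "sophisticated use of new content knowledge",
        3: "Measurable gains; most deliverables reflect new content knowledge",
        2: "Mixed or small gains; deliverables show limited new knowledge",
        1: "No measurable gains; deliverables unchanged from baseline",
    }

    # An evaluator reads the evidence and matches it to the best-fitting level.
    observed_level = 3
    print(f"Q1 rating: level {observed_level} - {rubric_q1[observed_level]}")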

  13. Basic Indicator - Q2
  • Participating districts will provide increased opportunities for students to participate in high-quality American history courses
    • Increases in the demand for, and availability of, AP American history courses
    • Students (of participants) will show increased mastery of Historical Thinking Standards
    • Participants will increase their use of improved tools for learning, such as information technology
    • Participants will create lessons, courses, and units of study that support the development of student historical thinking skills

  14. Question 2 - Level 4

  15. Evidence - Question 1 Indicator
  • What evidence would we need to gather to prove that we’re seeing what is described in that indicator?
    • Increased interest in the program as a result of participant testimonies (recruitment for the 2nd year)
    • Increased collaboration between participants - sharing of documents, peer collaboration
    • Participants can refer to the specific standards and can use the language of these standards in high-level discussions with students and each other
    • Increased use of instructional technology
    • Wider variety of primary sources used, increased comfort level, increased familiarity
    • Types of questions that teachers ask in the classroom - showing that they’re analytical
    • Types of answers that teachers can give to student questions
    • Types of resources teachers can direct students toward
    • How engaged students are, how frequently they participate, etc.
    • Asking teachers how their evaluations of students will change after their PD experience
    • How students are able to transfer knowledge (access prior knowledge, etc.) - ask this of teachers as well
    • Looking at the teachers’ materials (their products)

  16. Evidence - Question 2 Indicator
  • What evidence would we need to gather to prove that we’re seeing what is described in that indicator?
    • Have teachers piqued student interest in the junior year so that there is greater demand for AP in the senior year?
    • Increased awareness of history among administrators, leading to more higher-level courses offered (could be long term)
    • Survey of student interest in history as a discipline - is there interest in more classes?
    • Increased enrollment in (HS) history electives
    • Much of what we’re looking for in Q1 applies here too
    • Participation in history-related after-school activities
    • Increase in the “value” given to history as a subject in districts (among teachers, administrators - scheduling - and parents?)

  17. Data
  • Needs to support/confirm the established indicators
  • Needs to be formative and qualitative
    • Can’t just be the results of a “test” at the end
  • Needs to draw from a wide variety of sources

  18. Next Steps?
  • Finalize the rubrics
  • Establish a data collection “schedule”
  • Establish a meeting schedule
  • Review performance against the rubrics
  • Reporting
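As an illustration of what a data collection “schedule” tied to indicators might look like, here is a minimal sketch; the sources, indicator labels, and collection points are hypothetical.

    # Hypothetical data collection schedule: each source is tied to the
    # evaluation question it supports, per the rubric-driven model above.
    schedule = [
        {"source": "pre/post content tests", "indicator": "Q1", "when": "fall/spring"},
        {"source": "participant deliverables review", "indicator": "Q1", "when": "quarterly"},
        {"source": "AP enrollment counts", "indicator": "Q2", "when": "annually"},
        {"source": "classroom observations", "indicator": "Q1/Q2", "when": "twice yearly"},
    ]

    for item in schedule:
        print(f"{item['when']:>12}: {item['source']} -> supports {item['indicator']}")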
