How’s it Working? Evaluating Your Program
MAAPS Conference, 7 May 2010
Debra Smith & Judah Leblang
Program Evaluation & Research Group, School of Education, Lesley University
PERG • Founded 1976 • Over 600 program evaluation and research studies in various educational settings • Also offers professional development and consultation
Session participants will: • Be introduced to the basics of program evaluation through an example • Define a question or questions about their own program • Identify methods for collecting data that would help answer their question(s) • Discuss next steps
What is program evaluation? • Applied research that systematically collects and analyzes data to answer questions about a program, or some aspect of a program, in order to inform decisions about it.
Purposes • Accountability • Program development • Generating knowledge
Formative vs. Summative • Formative evaluation offers feedback along the way to help improve a program. • Summative evaluation “sums up” the results of a program at the end of a period of development or implementation.
Audiences • Funders • Program leaders • Program participants • Organizational partners • Others
Evaluation process • A continuing cycle centered on the program: 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting
An example: Evolutions • After-school program begun in 2005 in connection with the Peabody Museum of Natural History at Yale University; initially involved approximately 40 low-SES/minority students
Evolutions program goals To provide opportunities for students to: • Prepare for post-secondary (college) education; • Learn about scientific and other careers; • Expand their knowledge of and interest in science (science literacy); • Develop transferable skills for the future; and • Learn about the Peabody Museum and museum careers.
Logic models • Map a coherent chain of connections between goals, resources, activities, and what you expect (in the short term), want (over an intermediate period), and hope (in the long term) will happen. • They also reflect your assumptions and theory of action or change.
Logic Model Key Concepts
Logic models may look different, but they typically connect a rationale and assumptions, resources, activities, and outputs to short-term, mid-term, and long-term outcomes in service of the goal.
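To make these components concrete, here is a simplified, illustrative sketch of how the EVO example in this session might map onto a logic model (a sketch for illustration only, not the program’s actual model): • Resources: museum staff and exhibits, program funding, after-school meeting time • Activities: after-school sessions, exhibit development, college-preparation activities • Outputs: students served, sessions held, exhibits produced • Short-term outcomes: increased interest and confidence in doing science • Mid-term outcomes: greater science literacy and awareness of science careers • Long-term outcomes: students prepared for post-secondary education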
Evaluation process (recap) • 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting
Questions: Think Goldilocks • Specific but not too detailed • Important but not too broad in scope
Key Questions: Part One • How does EVO prepare students for college or high school? • How are EVO students involved in developing an exhibit at the museum? • Do students develop increased “science literacy,” as defined by EVO staff?
Key Questions: Part Two • How (if at all) do students express more confidence about and interest in doing science? • Are students more aware of careers in science? • How (if at all) do students demonstrate increased knowledge of the college application process, and develop criteria for choosing a college that meets their needs?
Evaluation process (recap) • 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting
Data collection methods • Observation • Interviews/ focus groups • Surveys • Document/artifact review
Technical considerations: Validity • Will the data answer the questions? • Are we asking the right questions?
Triangulation • Is there adequate triangulation (use of multiple methods and/or data sources) to ensure validity?
Drafting your own matrix: What data will help you answer your questions?
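As an illustration, a matrix for the EVO questions above might pair each question with the data sources most likely to answer it (a sketch drawn from this example, not the evaluators’ actual plan): • Student confidence and interest in doing science → student focus groups, observation of program sessions, surveys • Awareness of science careers → student focus groups, interviews with program staff • Knowledge of the college application process → student surveys, interviews with staff and parents • Involvement in developing a museum exhibit → observation, document/artifact review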
Evaluation process (recap) • 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting
Collecting data • Make sure your plan is doable given the time and resources available. • Design instruments to focus your data collection, ensure consistency, and avoid bias. • Be organized: take notes and develop a system for tracking and filing your data.
Collecting data • Communicate clearly about what you are doing, why, and how the findings will be shared and used. • Be mindful of human subjects protections. Does your organization have an institutional review board (IRB)?
The First Year: site visit • On-site data collection • Focus groups with students • Interviews with the director and project staff • Observation of the end-of-year event • Parent interviews
Evaluation process (recap) • 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting
Analyzing data • What stands out? • What are the patterns? • What are the similarities? • What are the differences? • Is more information needed?
Reliability • Are the patterns in the data, or judgments about the data, consistent?
Validity, again • Is the data helping you answer the questions? • Is the data credible?
Evaluation process (recap) • 1. Goals/logic model → 2. Questions → 3. Evaluation plan → 4. Data collection → 5. Data analysis → 6. Reporting
Reporting • Consider purpose and audience(s) • Report relevant findings, questions, and recommendations • Engage stakeholders in discussion • Use findings to inform next steps
Results of the first-year evaluation • Impact of the evaluation on EVO: a more focused program, clearer objectives, and suggestions for sustainability. • Evidence of program success: retention, student engagement, and positive changes in students’ views of doing science and of scientists.
The ongoing evaluation: shaping the program • Implementation of evaluator suggestions, for example: informational interviewing, developing a smaller exhibit, and refining requirements for students
EVO: 2006 to today • Continued development and expansion of the program, from approximately 40 to more than 80 students, plus the introduction of internships and Sci Corps. • Shifting areas of science focus (environmental awareness, geoscience), depending on funding sources.
Evaluation resources • W.K. Kellogg Foundation Evaluation Handbook www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf • Kellogg Logic Model Development Guide www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf • Basic Guide to Program Evaluation www.managementhelp.org/evaluatn/fnl_eval.htm
Evaluation resources • Program Evaluation & Research Group Lesley University 29 Everett St. Cambridge, MA 02138 www.lesley.edu/perg.htm 617-349-8172 perg@lesley.edu