Basics of Evaluation: Parks and Recreation • Karla A. Henderson, Ph.D., Professor, North Carolina State University • karla_henderson@ncsu.edu
Framework • Evaluation Inventory (handout) • Needs Assessment Regarding Program Evaluation (Types of Needs Explained) • Small Group Needs Identification (6 people per group): rate importance (Very, Moderately, Slightly, Not) and select 5-10 topics to discuss together
What is Evaluation? Evaluation = the systematic collection and analysis of data to address criteria and make judgments about the worth or improvement of something; making decisions based on identified criteria and supporting evidence. Assessment = the examination of some type of need that provides a foundation for further planning. Evaluation is sometimes like solving a mystery.
Types of Program Evaluation • Macro (System Evaluation) • Micro (Activity or Event Evaluation)
Formal (Systematic) Evaluation (MACRO and MICRO approaches) • Provides rigor • Systematic gathering (procedures and methods) of evidence • Leads to decisions and action • Criteria • Evidence • Judgment
Assessments • Why assess? • To use the input to design programs and objectives • To generate new program ideas • To give constituents a say • To help you be responsive • What is a need? A want? An intention? • Assessments determine all three of these so YOU can figure out how to promote what you are doing
Types of needs (Caution!) • Normative needs: what should be available • Felt needs: what individuals believe they would like • Expressed needs: needs fulfilled through participation
Potential Purposes: • Efficiency: How are we doing? (Management/Process Focused) • Effectiveness: What difference do our efforts make? (Impact/Outcome Focused)
Accountability Era: DO YOU AGREE? • What gets measured gets done • If you don't measure results, you can't tell success from failure • If you can't see success, you can't reward it • If you can't reward success, you're probably rewarding failure (Osborne & Gaebler, 1992, Reinventing Government; cited in University of Wisconsin-Extension, Program Development and Evaluation)
Accountability Era • If you can't see success, you can't learn from it • If you can't recognize failure, you can't correct it • If you can demonstrate results, you can win public support (Osborne & Gaebler, 1992, Reinventing Government; cited in University of Wisconsin-Extension, Program Development and Evaluation)
Evaluation Process: “Begin with the end in mind.” Covey (1990), The 7 Habits of Highly Effective People
A thinking process used by evaluators (from Richard Krueger): • Reflecting: Develop a theory of action, a logical sequence that results in change. Begin with what is supposed to happen, the results. • Listening: Share the theory of action with others. • Measuring: Determine your measurement strategies, that is, how you are going to look at the program. • Adding value to the program: What can evaluation do to contribute to the program? How can evaluation make the program better, more enjoyable, focused on results, accountable, and satisfying to participants and educators?
Ways of Thinking: • Goal Based Thinkers - "We look for goals" • Audit Thinkers - "We investigate and find out what's wrong" • Utilization Focused Thinkers - "We make evaluation useful" • Empowerment Focused Thinkers - "We empower local people" • Positivistic Thinkers - "We are scientists" • Number Thinkers - "We count, and we do it well" • Qualitative Thinkers - "We tell stories" (Richard Krueger)
Practical tips for successful evaluation (Richard Krueger): • Involve others. Utilization, impact, and believability emerge from involving colleagues and clientele. If you want the information used, then involve others! • Ask yourself: Do I have a program? Is it worthy of evaluation? • Consider your purpose for evaluating (see earlier slide) • Consider who wants the evaluation: who requested it? • Use a variety of evaluation methods when possible • Keep costs low by sampling strategically • Keep interest high by adding payoff for the participant • Start with goals, but don't be unduly limited by goals • Consider "early evaluation" • Design the evaluation carefully. The evaluation should: enhance the program, yield information beneficial to stakeholders, and conserve resources
Differences Between Assessment, Evaluation, and (Action) Research
Evaluation = the systematic collection and analysis of data to address criteria and make judgments about the worth or improvement of something; making decisions based on identified criteria and supporting evidence. Assessment = the examination of some type of need that provides a foundation for further planning. Action Research = evaluation leading to decisions/changes.
Steps • Problem/Idea Identified • Problem Statement/Purpose Determined • Instrument/Method Chosen • Data Sources • Data Collection • Data Analysis • Conclusions/Recommendations
Evaluation Process: “Begin with the end in mind.” Covey (1990), The 7 Habits of Highly Effective People
Areas to Evaluate • Personnel • Places • Policies • Programs • Participant Outcomes
Potential Purposes: • Efficiency: How are we doing? (Management/Process Focused) • Effectiveness: What difference do our efforts make? (Impact/Outcome Focused)
Levels of Evaluation: • END RESULTS (Impact) • PRACTICE CHANGE (Outcomes) • KASA CHANGE (knowledge, attitudes, skills, and aspirations) (Outcomes) • REACTIONS (Satisfaction) (Outputs) • PEOPLE INVOLVEMENT (Outputs) • ACTIVITIES (Outputs) • INPUTS (Resources)
What is sampling? • A population is the theoretically specified aggregation of study elements. • A sample is a smaller set of elements selected to represent that population.
Types of Sampling • Probability • Non-Probability • Theoretical
Probability • Probability sampling: samples are selected in accord with probability theory, typically involving some random selection mechanism. Types: Random, Stratified Random, Systematic, Cluster • Representativeness: the quality of a sample having the same distribution of characteristics as the population from which it was selected.
Nonprobability • Nonprobability sampling: a technique in which samples are selected in a way that is not suggested by probability theory. Types: Purposive, Convenience, Quota, Expert, Snowball
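To make the probability strategies above concrete, here is a minimal sketch of simple random, systematic, and stratified random selection in Python, using only the standard library. The participant list, the age-group strata, and the sample size of 30 are hypothetical values invented for illustration, not data from any real agency.

```python
import random

# Hypothetical registration list: 300 participants, each tagged with an age group.
population = [{"id": i, "age_group": "youth" if i % 3 == 0 else "adult"}
              for i in range(1, 301)]
n = 30  # desired sample size

# Simple random sample: every participant has an equal chance of selection.
simple_random = random.sample(population, n)

# Systematic sample: take every k-th participant after a random starting point.
k = len(population) // n
start = random.randrange(k)
systematic = population[start::k][:n]

# Stratified random sample: sample within each stratum (here, age group)
# in proportion to its share of the population.
strata = {}
for person in population:
    strata.setdefault(person["age_group"], []).append(person)
stratified = []
for group, members in strata.items():
    share = round(n * len(members) / len(population))
    stratified.extend(random.sample(members, share))

print(len(simple_random), len(systematic), len(stratified))
```

A convenience or purposive sample, by contrast, would skip the random selection step entirely, which is why its representativeness cannot be assumed.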
WHO DOES THE EVALUATION? • Internal: You! Staff. Agency evaluation personnel. • External: Consultants. University students! Regardless, YOU have to know your purpose, goals, and appropriate methods!
Timing • Assessments (planning): find out where to begin based on what you know • Formative (process): concerned with efficiency and effectiveness • Summative (product): overall performance
Approaches to Needs Assessments • Literature/Professional Development • Advisory Groups • Structured Interviews (individual and focus groups) • Surveys
Formative Evaluation • Evaluation in Process • Allows Changes to be Made Immediately • Most often Focused on Inputs and Outputs
Summative Evaluation • At the END of something • “What was?” • Recommendations for the Future
Data Collection Methods MAJOR ONES: • Questionnaires/Surveys • Interviews (Individual and Focus Groups) (Pros and Cons)
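As one illustration of how questionnaire data can be turned into evidence, the sketch below summarizes Likert-scale responses with Python's standard library. The questions, ratings, and the "percent agree" summary are invented for demonstration and do not come from any particular survey instrument.

```python
from statistics import mean

# Hypothetical responses on a 5-point scale:
# 1 = Strongly Disagree ... 5 = Strongly Agree
responses = {
    "Staff were helpful":              [5, 4, 4, 5, 3, 4],
    "Facilities were well maintained": [3, 4, 2, 3, 4, 3],
    "I would recommend this program":  [5, 5, 4, 4, 5, 4],
}

for question, ratings in responses.items():
    avg = mean(ratings)
    pct_agree = 100 * sum(r >= 4 for r in ratings) / len(ratings)
    print(f"{question}: mean {avg:.1f}, {pct_agree:.0f}% agree or strongly agree")
```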
Other Methods: • Systematic Observation • Checklists • Field Observations • Unobtrusive Measures • Physical Evidence • Archives • Covert Observations • Visual Analyses • Experimental Designs • Case Studies
To consider (see below for further info) • Logic Models ("outcome focused") • Trends Analysis • Benchmarking • PRORAGIS (NRPA)
CAPRA Standards • 10.1 Systematic Evaluation Program: There shall be a systematic evaluation plan to assess outcomes and the operational efficiency and effectiveness of the agency. • 10.2 Demonstration Projects and Action Research: There shall be at least one experimental or demonstration project, or involvement in some aspect of research, as related to any part of parks and recreation operations, each year.
CAPRA Standards • 10.3 Evaluation Personnel: There shall be personnel either on staff or a consultant with expertise to direct the technical evaluation/research process. • 10.4 Employee Education: There shall be an in-service education program for professional employees to enable them to carry out quality evaluations.
Evaluation Approaches: KEY POINTS! • Multiple LEVELS of evaluation: • Inputs (costs, personnel, etc.) • Outputs: activities, people involvement, reactions • Outcomes: KASA (knowledge, attitudes, skills, aspirations) change; behavior CHANGE • Impacts: long-term benefits
What are the goals of the program? • What do we expect to happen? • What do we want participants to do, gain, learn? BEGIN WITH THE END IN MIND!!
Goals and Objectives • Goals • Broad, long-range statements that define the programs/services that are going to be provided • Objectives • Specific statements (about the attainable parts of the goal) that are measurable and have some dimension of time.
Objectives • Specific: must be clear and concrete • Measurable: must be some way to determine whether or not the desired results have been achieved • Achievable: must be attainable and reality-based! • Relevant: must be useful; must have worth to your organization • Time-limited/Time-connected: must specify a time frame for accomplishment (Adapted from Edginton, Hanson, & Edginton, 1980)