Readings
• Text: Riddick & Russell
  • Ch 1 – stakeholders, p. 10
  • Ch 2 – an evaluation system
  • Proposal, pp. 25-36
  • Ch 4 – Literature Review
• Coursepack
  • GAO report, Ch 1 & 2, pp. 177-185
  • MSU Sports, pp. 256-260
Evaluation is a process
• Evaluability assessment
• Evaluation research proposal
• Review of the proposal
• Conduct the evaluation
• Report results
• Implementation of findings
Evaluability Assessment
• Is the program ready to be evaluated?
• Description of the program
• Goals and objectives
• Is the organization ready? Identify decision-makers
• Political and social factors
• Model of how the program works
• Are resources to conduct an evaluation available?
• What part of the program will be evaluated?
Steps for an Evaluability Study
• Program description – review documents
• Identify targets, objectives, inputs, outputs
• Interview key personnel
• Scout the program
• Develop a program model
• Get agreement to proceed and cooperation
A Program Model (flow diagram)
INPUTS (labor, time, capital $$$, land/facilities) → Process → Outputs (use measures) → Effects/Impacts ("benefits to users", $$$)
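One way to read the model is as a chain from inputs through process to outputs and effects. The sketch below lays that chain out as a plain Python data structure; the program name, numbers, and stage contents are hypothetical illustrations, not figures from the readings.

```python
# A minimal sketch of the inputs -> process -> outputs -> effects chain.
# All names and values are hypothetical, not taken from the course materials.
program_model = {
    "program": "campus intramural sports (hypothetical)",
    "inputs": {"staff_hours": 1200, "budget_dollars": 50_000, "facilities": ["gym", "fields"]},
    "process": ["schedule leagues", "train officials", "run games"],
    "outputs": {"participants": 800, "games_played": 240},         # use measures
    "effects_impacts": ["improved fitness", "benefits to users"],  # downstream effects
}

for stage in ("inputs", "process", "outputs", "effects_impacts"):
    print(stage, "->", program_model[stage])
```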
Evaluation Criteria
• Effort – quantity & quality of inputs
• Performance – quantity & quality of outputs
• Adequacy – does the program meet needs?
• Efficiency – costs vs. benefits (a worked sketch follows this slide)
• Process – how & why the program works
• Equity – who benefits, who pays?
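Of the criteria above, efficiency is the one that reduces to simple arithmetic. A minimal sketch, using hypothetical dollar figures, is:

```python
# Efficiency criterion: compare program benefits to program costs.
# The dollar amounts below are hypothetical illustrations.
def efficiency(benefits: float, costs: float) -> dict:
    """Return net benefit and benefit-cost ratio."""
    return {"net_benefit": benefits - costs, "bc_ratio": benefits / costs}

print(efficiency(benefits=120_000, costs=80_000))
# {'net_benefit': 40000, 'bc_ratio': 1.5}  -> a ratio above 1 means benefits exceed costs
```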
Purposes of the Proposal
• Communicate with the client
• Demonstrate your grasp of the problem
• Plan the study in advance, so others can evaluate the study approach:
  • Will it work?
  • Have you overlooked something?
  • Will the results be useful to the client?
  • Can we afford it?
Proposal Format
1. Problem Statement – define the program to be evaluated/problem to be studied, and the users & uses of the results. Justify the importance of the problem/study.
2. Objectives – a concise listing. In evaluation studies, the objectives usually focus on the key elements of the program to be evaluated & the evaluation criteria. These are the study objectives, NOT the program objectives.
3. Background/Literature Review – the place for a more extensive history/structure of the program. Focus on the aspects most relevant to the proposed evaluation. Discuss previous studies or the relevant methods.
4. Methods – details on procedures for achieving the objectives: data gathering and analysis, population, sampling, measures, etc. Who will do what to whom, when, where, how, and why?
5. Attachments – budget, timeline, measurement instruments, etc.
NOTE: Most "programs" must be narrowed to specific components to be evaluated. Think of a "program of studies" rather than a single evaluation study. The proposal should define this specific study & how it fits into a broader program of studies.
Sample Evaluation Objectives
1. Estimate benefits and costs of the program
2. Estimate economic impacts of the program on the local community (social, environmental, fiscal)
3. Determine effects of the program on the target population
4. Describe users and non-users of the program
5. Assess community recreation needs and preferences
6. Determine market/financial feasibility of the program
7. Evaluate adequacy or performance of the program
Typical Research Objectives (with snowmobiler examples)
• Describe a sample or population – e.g., the average income of MI snowmobilers (SB) in 1998 is $45K
• Identify/test relationships between variables in a population
  • Statistical relationship – SB with higher incomes spend more money
  • Cause-effect relationship – after a safety program, SB have fewer accidents
• Quantify the relationship – e.g., SB spend per day = $25 + 0.4 × income (a worked sketch follows this slide)
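To make "quantify the relationship" concrete, the sketch below evaluates the slide's example equation and then recovers the same straight-line form from data. The assumption that income is measured in $1,000s, and the data points themselves, are illustrative, not from the text.

```python
import numpy as np

# The slide's illustrative equation: spend per day = $25 + 0.4 * income.
# Assumption (not stated on the slide): income is in $1,000s, so a $45K income
# predicts 25 + 0.4 * 45 = $43 per day.
def predicted_spend(income_thousands: float) -> float:
    return 25 + 0.4 * income_thousands

print(predicted_spend(45))  # 43.0

# Quantifying the relationship from data means estimating the two coefficients.
# Hypothetical observations: (income in $1,000s, spend per day in $)
income = np.array([20, 35, 45, 60, 80])
spend = np.array([33, 40, 43, 48, 58])
slope, intercept = np.polyfit(income, spend, 1)   # degree-1 least-squares fit
print(round(intercept, 1), round(slope, 2))       # roughly 25 and 0.4 for these made-up points
```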
Variable Terminology
• Variable: any characteristic that varies across individuals in a population (i.e., takes on different values for different individuals).
• Dependent variable: the one you are trying to predict or explain; usually the focus of your study.
• Independent variables: the ones that help explain the dependent variable.
• In program evaluation, the outcomes are generally the dependent variables, and characteristics of the program or target populations are the independent variables (see the sketch below).
• In a cause-effect relationship, the cause is independent & the effect is dependent.
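As a small illustration of the dependent/independent distinction in an evaluation setting (all column names and values below are hypothetical):

```python
# Hypothetical records for a snowmobile-safety-program evaluation.
# The outcome (accidents) is the dependent variable; program participation and
# rider characteristics are independent variables.
records = [
    {"attended_safety_program": 1, "rider_age": 34, "accidents": 0},
    {"attended_safety_program": 0, "rider_age": 22, "accidents": 2},
    {"attended_safety_program": 1, "rider_age": 41, "accidents": 1},
]
dependent = "accidents"
independents = ["attended_safety_program", "rider_age"]

for r in records:
    print({k: r[k] for k in independents}, "->", r[dependent])
```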
MSU Sports Programs (case discussion)
• What is the program (or programs)?
• Inputs, outputs, process
• Stakeholders
• Which piece to evaluate?
• Evaluation criteria
• Methods to use
Methods Choices
• Overall approach/design
  • Qualitative or quantitative
  • Primary or secondary data
  • Survey, experiment, case study, etc.
• Who to study – population, sample (a sampling sketch follows this slide)
  • Individuals, market segments, populations
• What to study – concepts, measures
  • Behavior, knowledge, attitudes
• Cost vs. benefit of the study
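For the "who to study" choice, the simplest quantitative answer is a simple random sample drawn from a list of the population. The sketch below is only an illustration; the sampling frame and sample size are hypothetical.

```python
import random

# Draw a simple random sample from a hypothetical sampling frame of 500 program
# participants; the fixed seed just makes the example reproducible.
random.seed(1)
frame = [f"participant_{i:03d}" for i in range(1, 501)]
sample = random.sample(frame, k=50)   # a 10% simple random sample
print(len(sample), sample[:3])
```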
Qualitative vs. Quantitative Approaches
• Qualitative: focus groups, in-depth interviews, case studies, participant observation, secondary data analysis
• Quantitative: surveys, experiments, structured observation, secondary data analysis
Qualitative vs. Quantitative
• Purpose – Quantitative: general laws, test hypotheses, predict behavior. Qualitative: the unique/individual case, understanding, meanings/intentions.
• Perspective – Quantitative: outsider-objective. Qualitative: insider-subjective.
• Procedures – Quantitative: structured formal measures, probability samples, statistical analysis. Qualitative: unstructured open-ended measures, judgment samples, interpretation of data.
Primary or Secondary Data
• Secondary data are data that were collected for some purpose other than your study, e.g., government records, internal documents, previous surveys
• Choosing between primary and secondary data:
  • Costs (time, money, personnel)
  • Relevance, accuracy, adequacy of the data
Survey vs. Experiment
• Survey – measures things as they are; a snapshot of a population at one point in time; generally refers to questionnaires (telephone, self-administered, personal interview)
• Experiment – manipulates at least one variable (the treatment) to evaluate the response; used to study cause-effect relationships (field and lab experiments)
General Guidelines on When to Use Different Approaches
1. Describing a population – surveys
2. Describing users/visitors – on-site survey
3. Describing non-users, potential users, or the general population – household survey
4. Describing observable characteristics of visitors – on-site observation
5. Measuring impacts and cause-effect relationships – experiments
Guidelines (cont.)
6. Anytime suitable secondary data exist – secondary data
7. Short, simple household studies – phone survey
8. Captive audience or very interested population – self-administered survey
9. Testing new ideas – experimentation or focus groups
10. In-depth study – in-depth personal interviews, focus groups, case studies