Got Grants? A Grant Evaluation Toolbox for Institutional Researchers
Bri Hays, San Diego Mesa College
Daylene Meuschke, College of the Canyons
RP Conference, April 11, 2014
Session Objectives • Explore evaluation tools and resources • Discuss challenges in grant development and evaluation, and strategies for overcoming them • Compare formative and summative evaluation approaches • Illustrate a program theory using a logic model • Summarize evaluation goals and activities using an evaluation plan
Poll: Who serves as the point of contact for grants at your college?
When is your research office typically brought into the grant conversation? • Grant development phase? • After the grant is awarded? • Just before the annual performance report is due?
Discussion: What other grant-related challenges do your research offices face? What other tools or resources would be helpful to you?
Program Evaluation: An Overview • 3 Major Categories of Evaluation: • Needs Evaluation/Needs Assessment • Process Evaluation • Outcomes Evaluation
Tools for Grant Development and Outcomes Evaluation • Logic Models • Framing the evaluation: What is the context? • Evaluation Plan Templates • Communication/Dissemination Plan Templates
Logic Models: What are they, and how can they help my research office?
Logic Models: What Are They? • Clearly identify the specific aims of the program or project • Outline the program/project theory, rationale, or model • Depict visually how the program/project will work
How Can Logic Models Help You? • The visual outline can serve as a basis for program planning and development • Communicate to internal and external stakeholders how your program/project works • Clearly identify links between program activities and outcomes, both short- and long-term • Set the stage for the evaluation plan • And…they are a required component of an increasing number of federal grant applications
Formative Evaluation • How is it working? • Is it working as intended? • Is it reaching the intended population? • What is working well? • What can be improved? • Continuous quality improvement! [Figure: Formative Evaluation Cycle]
Summative Evaluation • What worked? • How do we know that it worked? • What didn’t work? • What are the lessons learned from the program or project? • How does this fit into the larger body of knowledge about this intervention/program/project?
Evaluation Plan Nuts and Bolts • Program/Project Objectives • What are the specific aims of the program or project? • Key Stakeholders • Evaluation Goals • What information does the grant team need? • Needs assessment? • Process evaluation? • Outcome evaluation? • Evaluation Plan Outline/Timeline • What are the major steps or tasks in the evaluation? • When do the internal and external stakeholders need the information?
Getting Started on Evaluation Plans • Examine the grant’s Request for Proposals (RFP) to determine the evaluation requirements. • Each grant will have different requirements for reporting and evaluation. • Develop a comprehensive evaluation plan, including: • Responsible parties • Data collection • Timelines • Dissemination
Sample Evaluation Plan: Upward Bound Grant Evaluation • Determined the requirements for the evaluation as set forth by the Department of Education. • Built these specifications into an evaluation template. • Developed a spreadsheet to track student data through post-secondary completion. • Identified who was responsible for each grant objective. • Coordinated with the Upward Bound director and staff to ensure proper data collection. • Created a shared network drive to enable shared access to the data collection spreadsheet.
Dissemination Plans • How will you share the results of the evaluation? • What types of information need to be reported, and to whom? • When does information need to be shared with internal and external stakeholders? • What level of detail is required for each report? • How will the information be used?
Sample Report: Comprehensive Annual Report • San Joaquin Delta College FIPSE Report on Learning Communities (see Appendices)
Sample Report: Evaluation Briefs • A brief is written to summarize the evaluation of the grant. • The brief is often used as a stand-alone document. • A project director may also use the brief to support reporting to the funding agency. • Sections of the brief can be reused in ad hoc reports. • The brief often includes recommendations based on the evaluation findings.
Discussion: What other tools or resources would be helpful to you?
A Few Words of Advice • Consider the capacity of your office and the availability of data when developing the evaluation plan • Is an external evaluator required? • If so, what are their roles and responsibilities? • What are the research office’s responsibilities? • Is a data sharing agreement required? • What are the external (i.e., funding agency) reporting deadlines? • How much lead time does the project director need to review annual reports?
Evaluation Tools and Resources • American Evaluation Association • Kellogg Foundation Evaluation Handbook • Kellogg Foundation Logic Model Development Guide • University of Wisconsin-Extension Program Development and Evaluation Web Page • Centers for Disease Control and Prevention Program Performance and Evaluation Office Web Page • Western Michigan University Evaluation Center
Contact Information
Bri Hays • Campus Based Researcher • San Diego Mesa College • bhays@sdccd.edu
Daylene Meuschke • Director, Institutional Research • College of the Canyons • Daylene.Meuschke@canyons.edu