
The State of the High School Equivalency Program (HEP) Evaluation

  1. The State of the High School Equivalency Program (HEP) Evaluation U.S. Department of Education Office of Migrant Education Webinar August 22, 2013 2:00 pm – 3:30 pm EDT

  2. Overview • Background and Purpose • OME Review of Evaluation Reports • Rubric for Review • Evaluation Reports Reviewed • Observations • Notable Evaluation Reports • Initial Grantee Feedback • Today’s Feedback • Next Steps

  3. Background and Purpose

  4. I. Background and Purpose • Compliance • Improvement

  5. Compliance: EDGAR • EDGAR -- Education Department General Administrative Regulations • 34 CFR §75.590 requires a grantee to evaluate annually: • The recipient’s progress in achieving the objectives in its approved application; • The effectiveness of the project in meeting the purposes of the program; and • The effect of the project on served participants

  6. Compliance: HEP Application Evaluation Plans • Each approved and funded grant application included an evaluation plan within the Narrative and Quality of Project Evaluation (Section 7) • “…In determining the quality of the evaluation, the Secretary considers the following factors: • (i) The extent to which the methods of evaluation are appropriate to the context within which the project operates. (3 points possible) • (ii) The extent to which the methods of evaluation provide for examining the effectiveness of project implementation and strategies. (3 points possible) • (iii) The extent to which the methods of evaluation will provide performance feedback and permit periodic assessment of progress toward achieving intended outcomes. (4 points possible)”

  7. Compliance: HEP Application Section 7 Note: Quality of the Project Evaluation • Important note about the project evaluation: A strong evaluation plan should be included in the application narrative and should be used, as appropriate, to shape the development of the project from the beginning of the grant period. The plan should include benchmarks to monitor progress toward specific project objectives and also outcome measures to assess the impact on teaching and learning or other important outcomes for project participants. More specifically, the plan should identify the individual and/or organization that has agreed to serve as evaluator for the project and describe the qualifications of that evaluator. (continued)

  8. Compliance: Application Section 7: Note, continued The plan should describe the evaluation design indicating: • (1) What types of data will be collected • (2) When various types of data will be collected • (3) What methods will be used • (4) What instruments will be developed and when • (5) How the data will be analyzed • (6) When reports of results and outcomes will be available • (7) How the applicant will use the information collected through the evaluation to monitor progress of the funded project and to provide accountability information both about success at the initial site and effective strategies for replication in other settings. • Applicants are encouraged to devote an appropriate level of resources to project evaluation

  9. Compliance: APR: Evaluation References • Annual Performance Report (APR) – a requirement for grantees • Includes evaluation material: • APR Section D: Project Goals and Objectives • Section 2: Explanation of Progress (includes qualitative data and data collection information; maximum 2,500 words) • 1. For each project objective and associated performance measures, indicate what data (quantitative and/or qualitative) were collected and when they were collected, the evaluation methods that were used, and how the data were analyzed. Clearly identify and explain any deviations from your approved evaluation plan, including changes in design or methodology, or in the individual or organization conducting the evaluation…. • 4. Indicate how you used data and information from your evaluation to monitor the progress of your grant and, if needed, to make improvements to your original project plan (e.g., project activities and milestones) that are consistent with your approved objectives and scope of work.

  10. Improvement: Continuous Improvement Cycle

  11. OME Review of Evaluation Reports

  12. II. OME Review of Evaluation Reports OME initiated a review of evaluation reports received from HEP and CAMP grantees: • Requested current grantees submit evaluation reports by April 30, 2013 • 25 HEP and CAMP evaluation reports were submitted to OME • Evaluation reports addressed periods of review ranging from 2009 through 2012

  13. HEP Grantee Institutions and Agencies

  14. Rubric for Review

  15. III. Rubric for Review Rubric for review of evaluation reports: • Developed to permit consistent review of all submitted evaluation documents • Identified core information of particular interest to OME, based on approved applications • Scored on a general scale: • “0” – Does Not Meet Expectations • “1” – Needs Improvement • “2” – Meets Expectations • “3” – Exceeds Expectations

  16. Rubric for Report Review, continued • Scoring of Components 1–5 was influenced by the degree to which evaluation documents provided qualitative and/or quantitative data • Maximum score possible: 108 • Average HEP evaluation report score: 40.6

  17. Rubric: Outline of Components Rubric incorporated 5 components: • Component 1 – GPRA Results • Component 2 – Fidelity of Implementation to Design • Component 3 – Effectiveness of Project Design • Component 4 – Collaborative Agreements • Component 5 – Recommendations

  18. Rubric: Component 1 – GPRA 1 and GPRA 2 • Evaluation reports scored in Component 1 on: • 1.a.–1.b. Presentation of GPRA 1 and GPRA 2 results and related progress toward meeting performance requirements • 1.c. Presentation of performance measures included in the grantee’s approved application and target versus actual results • 1.d. Information on data validation for GPRA 1 and GPRA 2 results • Maximum points available – 12 points

  19. Rubric: Component 2 – Fidelity of Implementation to Design • Rubric Component 2 scored evaluation report’s qualitative or quantitative data for grantee level of fidelity to implementation of: • 2.a. Instructional Services Design • 2.b. Support Services Design • 2.c. Placement Services Design • 2.d. Management Plan Design • 2.e. Recruitment Plan Design • Maximum points available – 15 points

  20. Rubric: Component 3 – Effectiveness of Project Design • Component 3 examines in greater detail the areas covered in Component 2 • Evaluation reports scored for information on specific topics, including: • Review of eligibility screening tools • Use of student/staff surveys • Staff qualifications • Student stipends • Planned versus actual costs • Student demographic information • Maximum points available – 57 points

  21. Rubric: Component 3.a. – Effectiveness of Project Design, continued • 3. Effectiveness of Project Design • 3.a. Instructional Services Design • 3.a.1. The report includes data on the number of instruction hours provided and schedule/availability of instruction hours to meet individual needs. • 3.a.2. The report includes data on the type and usage of eligibility screening tools. • 3.a.3. The report includes data on how instructional requirements of individual students are assessed and met.

  22. Rubric: Component 3.b.–3.d. – Support, Placement, Management • 3.b. Support Services Design • 3.c. Placement Services Design for HEP projects • 3.d. Management Plan Design. Analyzes the effectiveness of the grantee’s management plan design • 3.d.1. Teaching and administrative staff structure • 3.d.2. Senior project administrative staff • 3.d.3. Qualifications of project instructors • 3.d.4. Professional staff development • 3.d.5. Student records • 3.d.6. Approved Procedures Policy Manual

  23. Rubric: Component 3.d.–3.e. – Management, Recruitment • 3.d.7. Financial/cost-sharing issues with host institutions • 3.d.8. Student stipends • 3.d.9. Planned and actual costs • 3.e. Recruitment Plan Design. Analyzes the effectiveness of the grantee’s recruitment plan design • 3.e.1. Demographic information • 3.e.2. Training for recruiters

  24. Rubric: Component 4 – Collaborative Agreements • In Component 4, evaluation reports scored based on information and analysis that addressed: • 4.a. Existing collaborative agreements with educational institutions or service providers • 4.b. Planned collaborative agreements • 4.c. Relationships with host institutions, including unresolved facilities, equipment/computers, HR and related issues that affect project performance • Maximum points available – 9 points

  25. Rubric: Component 5 – Recommendations • In Component 5, evaluations scored on recommendations for key areas that were linked to information included in the report: • 5.a. Instructional Services • 5.b. Support Services • 5.c. Placement Services • 5.d. Management Plan • 5.e. Recruitment Plan • Maximum points available – 15 points
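  As a quick arithmetic check on the rubric described above, the component maxima listed on these slides (12, 15, 57, 9, and 15 points) sum to the 108-point maximum noted earlier. A minimal sketch in Python, using only the figures given in the slides (the variable names and the percentage comparison are illustrative, not part of OME's rubric):

      # Maximum points per rubric component, as listed on the slides above.
      component_max_points = {
          "Component 1 - GPRA Results": 12,
          "Component 2 - Fidelity of Implementation to Design": 15,
          "Component 3 - Effectiveness of Project Design": 57,
          "Component 4 - Collaborative Agreements": 9,
          "Component 5 - Recommendations": 15,
      }

      maximum_score = sum(component_max_points.values())
      print(maximum_score)  # 108, matching the maximum score on the earlier slide

      # The average HEP evaluation report score reported earlier was 40.6,
      # i.e. roughly 38% of the maximum possible score.
      average_report_score = 40.6
      print(f"{average_report_score / maximum_score:.0%}")  # ~38%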

  26. Evaluation Reports Reviewed

  27. HEP Evaluation Reports • 15 evaluation reports submitted from 14 grantees • Reports submitted from 10 states: Arizona, Arkansas, California, Kansas, Maine, New Jersey, Oklahoma, Pennsylvania, Texas, and Washington * One grantee submitted documents for two different periods of performance that were scored as separate evaluations

  28. Type of HEP Evaluations Submitted • Wide range of evaluation reports • Several reports focused specifically on: • Questionnaires • Student surveys • Site visit reports • Focus group results • Some reports were detailed, 15-30 page formative or summative evaluations with numerous appendices • Each document submitted to OME was reviewed through the rubric

  29. Type of HEP Evaluations Submitted

  30. Observations

  31. General Observations • Grantees determined the format for their evaluation reports • Not all grantees may have submitted every evaluation report on file in response to OME’s request • Although approved applications define grantee evaluation processes, a number of submitted reports did not reference, or fully reflect, the processes indicated in the funded application • Few reports addressed implementation of a planned budget

  32. Observations: Component 1 – GPRA 1 and GPRA 2 • All HEP formative or summative evaluations provided Component 1 – GPRA 1 and GPRA 2 data • Five HEP evaluation documents presented general information on validation of data for GPRA results

  33. Observations: Component 2 – Fidelity of Implementation to Design • Formative and summative evaluations presented data on progress in meeting GPRA 1 and 2 and measurable performance objectives • Few evaluations presented data on grantee fidelity to implementation of design for instructional, support and placement services and management and recruitment plans • Expanded information might have been provided through: • Documentation of services • Student and staff survey results • Tutoring and counseling logs • Class attendance sheets

  34. Observations: Component 3 – Effectiveness of Project Design • Instructional Services • HEP evaluations provided minimal data on the instructional services offered • Support Services • References in several evaluations were limited to a simple listing of the support services • One evaluation included detailed information regarding delayed stipend payments, related staff interviews, and proposed interim solutions

  35. Observations: Component 3 – Effectiveness of Project Design, continued • Management Plan • Limited discussion was included in reports about: • Management plan • Senior management • Administrative staff • Security and confidentiality measures for student records • Fiscal management

  36. Observations: Component 3 – Effectiveness of Project Design, continued • Recruitment Plan • To evaluate the effectiveness of the recruitment plan design, few reports provided: • Demographic data • Selection and non-selection rates • Screening tools • References to trainings offered to recruiters

  37. Observations: Component 4 – Collaborative Agreements • Collaborative agreements are key to helping grantees leverage Federal, State, and local resources • Ineffective execution of agreements can be a detriment to grantee success • Few evaluations included discussion of these relationships or of plans to expand collaborative partnerships

  38. Observations: Component 5 – Recommendations • Many evaluation documents did not provide extensive recommendations • Recommendations often did not link directly to discussion within the report • Five HEP evaluations made no recommendations • Texas A&M International University evaluation provided comprehensive recommendations that were directly related to issues in the report

  39. Notable Evaluation Reports

  40. Notable Evaluation Reports: Validation, Surveys Reports notable for data validation, survey results and collaborative agreements • Data Validation: • 2011-2012 Heritage University HEP • Survey Results: • 2011 Texas State Technical College, Harlingen HEP • Collaborative Agreements: • 2011 Texas State Technical College, Harlingen HEP

  41. Notable Evaluation Reports: Overall Notable Overall Evaluations: • 2012 Hartnell Community College District, Hartnell College HEP • 2009-2010 Texas A&M International University (TAMIU) HEP • 2011 Texas State Technical College, Harlingen HEP

  42. Initial Grantee Feedback (HEP-CAMP ADM Session)

  43. Initial Grantee Feedback • Group 1: Effectiveness of Project Design as an Umbrella • Requires support from the following designs: • Management plan • Recruitment plan • Support services • Placement services • Instructional services design serves as the “handle” that supports the umbrella

  44. Initial Grantee Feedback • Group 2: Rubric = Trash Receptacle • Repurpose the rubric by simplifying it • Most importantly, emphasize: • Fidelity to the grant application and objectives • Number served, and fidelity to the recruitment plan • Meeting or exceeding GPRA 1 and GPRA 2

  45. Initial Grantee Feedback • Group 3: Successful Evaluation is Dependent Upon • Data Validation • Data on Performance Measures (GPRAs) • Graduate HEP • Postsecondary, upgraded employment, military

  46. Initial Grantee Feedback • Group 4: HEP-CAMP Success = School • Takes nourishment and collaboration with • Agencies • K-12 MEP • Private Sector

  47. VIII. Today’s Feedback

  48. Today’s Feedback • Please consider the rubric and provide feedback: • What component(s) would you keep? Why? • What component(s) would you eliminate? Why? • What component(s) would you add? Why?

  49. IX. Next Steps – Provide Continued Feedback

  50. Recommendations for Grantee Evaluations • In the future, OME may develop guidance/an outline for evaluation reports • Further grantee input is requested by e-mail to: Edward.Monaghan@ed.gov • Resources may include: • HEP-CAMP Toolkit: http://www.hepcamptoolkit.org/hep/index.php/overseeing-your-grant/evaluation/ • MEP Evaluation Toolkit: http://results.ed.gov/sites/results.ed.gov/files/pe-toolkit.pdf
