
A Strategic Measurement and Evaluation Framework to Support Worker Health

A Strategic Measurement and Evaluation Framework to Support Worker Health. COMMITTEE ON DHS OCCUPATIONAL HEALTH AND OPERATIONAL MEDICINE INFRASTRUCTURE, June 10-11, 2013. Ron Z. Goetzel, Ph.D., Emory University and Truven Health Analytics.


Presentation Transcript


  1. A Strategic Measurement and Evaluation Framework to Support Worker Health COMMITTEE ON DHS OCCUPATIONAL HEALTH AND OPERATIONAL MEDICINE INFRASTRUCTURE June 10-11, 2013 Ron Z. Goetzel, Ph.D., Emory University and Truven Health Analytics

  2. Workplace Health Promotion/Health Protection Programs: What Should be Evaluated? • Structure • Process • Outcomes

  3. LOGIC MODEL: WORKSITE HEALTH PROMOTION/PROTECTION PROGRAMS • [Diagram: STRUCTURE → PROCESS (employees) → OUTCOMES] • Modified Worksite Health Promotion (Assessment of Health Risk with Follow-Up) Logic Model adopted by the CDC Community Guide Task Force

  4. EVALUATION MEASURES

  5. Program Structure Structure defines the program – how it works: the WHAT, HOW, and WHEN • Individual components, e.g., HRA, feedback reports, mailings, internet services, high-risk counseling, referral to community resources, incentives • Environmental components, e.g., organizational policies, cafeteria/vending machine choices, time off for health-promoting activities, senior management support, access to physical activity programs, walking paths, shower/change facilities, healthy company culture

  6. http://www.cdc.gov/niosh/docs/2010-140/

  7. Environmental Assessment Tool J Occup Environ Med. 2008 Feb;50(2):126-37.

  8. Checklist of Health Promotion Environments at Worksites

  9. Leading By Example Assessment Am J Health Promot. 2010 Nov-Dec;25(2):138-46.

  10. HERO SCORECARD – Sample results based on ABC Inc.'s response and database average as of [May 1, 2009]. http://www.the-hero.org/scorecard_folder/scorecard.htm, accessed 5/12/12.

  11. CDC WORKSITE HEALTH SCORECARD http://www.cdc.gov/dhdsp/pubs/worksite_scorecard.htm

  12. PROGRAM PROCESS Program process evaluation defines how well the program is carried out: • Participation rates • Satisfaction with the program/process/people • Completion rates

  13. PROGRAM PROCESS COMPONENTS • GOAL: To summarize program implementation and form hypotheses about how implementation may affect program outcomes, and to monitor progress during implementation so the program can be adjusted to improve quality • Program Fidelity (quality) – how well the program was implemented • Dose Delivered (completeness) – frequency and intensity of the program • Dose Received (satisfaction) – how participants react to the intervention • Program Reach (participation rate) – the proportion of eligible employees who participated in the various components of the program
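The reach and completion measures above reduce to simple ratios of head counts. A minimal sketch of that arithmetic; all counts and function names are hypothetical, not figures from the presentation:

```python
# Minimal sketch of two process metrics; every count here is hypothetical.

def program_reach(participants: int, eligible: int) -> float:
    """Reach: the proportion of eligible employees who participated."""
    return participants / eligible

def completion_rate(completers: int, participants: int) -> float:
    """A simple 'dose received' proxy: the proportion of participants who finished."""
    return completers / participants

if __name__ == "__main__":
    eligible, participants, completers = 5_000, 1_800, 1_200   # hypothetical counts
    print(f"Reach: {program_reach(participants, eligible):.1%}")             # 36.0%
    print(f"Completion: {completion_rate(completers, participants):.1%}")    # 66.7%
```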

  14. EXAMPLE ASSESSMENT

  15. Satisfaction

  16. Performance Rating

  17. Program Impacts

  18. PROGRAM OUTCOMES • Program outcomes are evaluated by determining whether program objectives are achieved, at a given level of quality, and within a defined time frame • Health outcomes: behavior change, risk reduction • Medical care outcomes: health care utilization, health care costs • Productivity outcomes: absenteeism, disability, workers' compensation/safety, presenteeism

  19. RESEARCH DESIGN • Pre-experimental • Quasi-experimental • True experimental • Validity of results increases as you move down this list • All are tools that can help understand the impact of the program

  20. NON-EXPERIMENTAL (PRE-EXPERIMENTAL) DESIGN • [Chart: outcome trend over time, with program start marked]

  21. GENERAL TREND OR PROGRAM EFFECT? • [Chart: outcome trend over time, with program start marked]

  22. PROBLEMS WITH A PRE-EXPERIMENTAL DESIGN: REGRESSION TO THE MEAN • [Chart: costs for the same people before the intervention vs. during the intervention period – apparent savings?] • The simplest analysis may produce the wrong answer

  23. REGRESSION TO THE MEAN
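Regression to the mean is easy to demonstrate with a short simulation: pick a "high-cost" cohort on year-1 costs and its year-2 costs drift back toward the population average even with no program at all. A minimal sketch; the cost distribution and every figure are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
true_mean = 5_000                                    # hypothetical average annual cost
year1 = true_mean + rng.normal(0, 2_000, n)          # observed year-1 costs
year2 = true_mean + rng.normal(0, 2_000, n)          # independent year-2 costs, no program

cohort = year1 > np.percentile(year1, 90)            # "high-cost" cohort selected on year-1 costs
print(f"Year 1 mean, cohort: ${year1[cohort].mean():,.0f}")
print(f"Year 2 mean, cohort: ${year2[cohort].mean():,.0f}")
# The cohort's year-2 mean falls back toward $5,000 with no intervention at all:
# an apparent "savings" that is entirely regression to the mean.
```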

  24. RESEARCH DESIGN: QUASI-EXPERIMENTAL • Pretest-posttest with comparison group • Experimental group: O1  X  O2 • Comparison group: O1  O2
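With this design, a common way to estimate the program effect is a difference-in-differences: the pre-to-post change in the experimental group minus the change in the comparison group. A minimal sketch; the per-employee cost figures are hypothetical placeholders:

```python
def diff_in_diff(exp_pre: float, exp_post: float,
                 comp_pre: float, comp_post: float) -> float:
    """Program effect = change in the experimental group minus change in the comparison group."""
    return (exp_post - exp_pre) - (comp_post - comp_pre)

# Hypothetical per-employee annual costs (pre- and post-period means)
effect = diff_in_diff(exp_pre=4_800, exp_post=5_100,
                      comp_pre=4_750, comp_post=5_400)
print(f"Estimated program effect: {effect:+,.0f} per employee (negative = savings)")
```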

  25. ANNUAL GROWTH IN NET PAYMENTS • [Chart: annual growth in costs, Highmark, Inc., for matched participants and non-participants over four years, with start of program marked]

  26. RETURN ON INVESTMENT AND NET PRESENT VALUE • Return on Investment (ROI) = Savings / Program Cost (break-even = $1) • Net Present Value (NPV) = Savings – Program Cost (break-even = $0)
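The slide's two formulas translate directly into code. A minimal sketch, using hypothetical savings and cost totals (and ignoring discounting, as the slide does):

```python
def roi(savings: float, program_cost: float) -> float:
    """ROI = Savings / Program Cost; $1.00 is break-even."""
    return savings / program_cost

def npv(savings: float, program_cost: float) -> float:
    """NPV = Savings - Program Cost; $0 is break-even (no discounting shown here)."""
    return savings - program_cost

savings, cost = 565_000.0, 250_000.0   # hypothetical program-level totals
print(f"ROI: ${roi(savings, cost):.2f} returned per $1.00 invested")   # $2.26
print(f"NPV: ${npv(savings, cost):,.0f}")                              # $315,000
```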

  27. Cost-Benefit (ROI) Analysis Wellness Program Costs, Highmark, inflation-adjusted to 2005 dollars
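Putting costs from different years on a common (here, 2005) dollar basis is a standard step in this kind of analysis: scale each nominal amount by the ratio of the base-year price index to that year's index. A minimal sketch; the index values below are placeholders, not actual CPI figures:

```python
# Hypothetical price index by year (placeholders, not actual CPI values)
price_index = {2002: 100.0, 2003: 102.3, 2004: 105.0, 2005: 108.4}

def to_2005_dollars(amount: float, year: int) -> float:
    """Scale a nominal amount from `year` into 2005 dollars."""
    return amount * price_index[2005] / price_index[year]

print(f"$100 spent in 2002 is about ${to_2005_dollars(100.0, 2002):.2f} in 2005 dollars")
```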

  28. Assessing Causality • Evaluators must explicitly state the intervention pathway and the metrics used to measure: the "cause" (the actual intervention) and the "effect" (the proximate and/or ultimate outcomes that result from the intervention) • Hypotheses that outcomes are "caused" by the HP program must be articulated and tested • [Diagram: HP Program (cause) → Proximate Outcomes → Ultimate Outcomes (effect)]

  29. CRITICAL STEPS TO SUCCESS • Awareness → Participation → Increased Knowledge → Improved Attitudes → Behavior Change → Risk Reduction → Reduced Utilization → Financial ROI

  30. HEALTH RISKS – BIOMETRIC MEASURES – ADJUSTED • [Chart] Results adjusted for age, sex, and region; * p<0.05, ** p<0.01

  31. HEALTH RISKS – HEALTH BEHAVIORS – ADJUSTED • [Chart] Results adjusted for age, sex, and region; * p<0.05, ** p<0.01
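"Adjusted" results like these are typically obtained by regressing each outcome on a program indicator plus the covariates (age, sex, region) and reading off the program coefficient and its p-value. A minimal sketch on simulated data; the model form, variable names, and numbers are illustrative, not the study's actual analysis:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2_000
df = pd.DataFrame({
    "program": rng.integers(0, 2, n),                 # 1 = participant, 0 = comparison
    "age": rng.integers(20, 65, n),
    "sex": rng.choice(["F", "M"], n),
    "region": rng.choice(["NE", "S", "MW", "W"], n),
})
# Simulated outcome: a small program effect plus an age effect and noise
df["bmi"] = 27 + 0.05 * df["age"] - 0.4 * df["program"] + rng.normal(0, 3, n)

model = smf.ols("bmi ~ program + age + C(sex) + C(region)", data=df).fit()
print(model.params["program"], model.pvalues["program"])  # adjusted program effect and p-value
```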

  32. ADJUSTED MEDICAL AND DRUG COSTS VS. EXPECTED COSTS FROM COMPARISON GROUP • Average savings 2002-2008 = $565/employee/year • Estimated ROI: $1.88 to $3.92 returned per $1.00 invested
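As a back-of-envelope check using the ROI definition from slide 26 (ROI = savings / program cost), the reported per-employee savings and ROI range would imply a per-employee program cost of roughly $140 to $300 per year. A quick illustrative calculation, not a figure from the study:

```python
# Back-calculation under the slide-26 ROI definition; purely an illustrative check.
savings = 565.0                        # reported average savings per employee per year
for roi_value in (1.88, 3.92):         # reported ROI range per $1.00 invested
    print(f"ROI ${roi_value:.2f}: implied cost ≈ ${savings / roi_value:.0f}/employee/year")
```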

  33. Summary • Evaluation of health promotion/protection programs is doable, but tricky • Know your audience – the level of sophistication in conducting financial analyses varies significantly, and well-done studies are complex and expensive • It's easy to come up with the "wrong" answer if the proper research design is not used • Ask for help – good evaluation studies require a team of individuals with diverse backgrounds and skill sets • Tell the truth, the whole truth, even if it means saying the program didn't work
