1. SIG Performance Measurement Workshop
February 28, 2005
Presented to: Performance Track Conference Participants
Presented by:
Yvonne M. Watson, Evaluation Support Division
U.S. EPA’s National Center for Environmental Innovation
Office of Policy, Economics, and Innovation
2. 2 Presentation Goals
Enable participants to:
understand the requirements of EPA’s Environmental Results Order.
develop a common understanding of performance measurement terminology and concepts.
understand the steps involved in developing performance measures using a logic model approach.
3. 3 Orientation Exercise In small groups, discuss and complete the following incomplete sentences:
A program is …
Performance Measurement is …
Performance Measurement leads to …
Program Evaluation is …
Program Evaluation leads to…
One thing we want to learn from this workshop is…
After completing the sentences, select a reporter.
4. 4 Session Agenda
Module 1: Building a Common Understanding of Performance Measurement and Program Evaluation
Module 2: Planning for Performance Measurement
Module 3: Developing Performance Measures
5. Module 1: Building a Common Understanding of Performance Measurement and Program Evaluation
6. 6 Definitions
Performance measure – a metric used to gauge program or project performance.
Performance measurement – the ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
Indicators – measures, usually quantitative, that provide information on program performance and evidence of a change in the “state or condition” of the system.
Indicators measure the “state of” something, typically in the natural environment; performance measures help us assess the effect of our programs (as defined by King County DNRP).
7. 7 What is the difference between an Indicator and a Performance Measure?
8. 8 Uses of Performance Information
Provide an unbiased way to assess performance.
Assess allocation of resources to support activities.
Set program priorities (difficult to do without evaluation).
Assess whether program/project goals are being met.
Provide information for policy/project decision-making.
Demonstrate value to stakeholders and the public:
Good management practices
Return on investment
9. 9 Uses of Performance Information (Cont’d)
Adopt new program or project approaches or change work processes.
Coordinate program or project efforts with other internal or external programs/organizations.
Set new or revise existing performance goals/objectives.
Provide information needed to conduct an evaluation.
10. 10 Limitations and Pitfalls in Performance Measures
Provide descriptive data; they are not rigorously evaluative.
Can encourage undesirable behavior, such as goal displacement.
May require too much time and effort.
Can be ignored; performance information is not automatically used.
11. 11 Program Evaluation Defined
While program evaluation can take many forms, it is generally described as an individual, systematic study that uses objective measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
The aim is to “decrease uncertainty and increase understanding.”
12. 12 What can evaluation do for you?
Ensure program goals & objectives are being met
Determine if allocated resources are yielding the greatest environmental benefit
Identify what works well, what does not and why
Identify program areas that need improvement
13. 13 Orientations for Measurement & Evaluation
PERFORMANCE MEASUREMENT
Accountability
What objectives/outcomes have been accomplished?
PROGRAM EVALUATION
Learning & Program Improvement
What outcomes have been achieved and why?
What aspects of my program lead to these outcomes?
What roles did context play in my outcomes?
14. 14 Differences between PM and PE
Performance Measurement
Ongoing monitoring and reporting of accomplishments.
Examines achievement of program objectives.
Describes program achievements in terms of outputs and outcomes in a given time period, against a pre-established goal.
Early warning to management.
Program Evaluation
In-depth, systematic study.
Conducted periodically or on ad-hoc basis.
Examines achievement of program objectives and a broader range of information on program performance than is feasible to monitor on an ongoing basis.
Explains why the results occurred.
Longer term review of effectiveness.
15. 15 Relationship between PM and PE
Performance measurement data provide information needed to conduct the evaluation and assess program performance.
Lack of performance measurement data is a major obstacle to conducting an evaluation.
16. 16 Performance Measurement Framework
17. Module 2: Planning for Performance Measurement
18. 18 I. Identify Stakeholders/Develop PM/PE Plan
19. 19 Identify Stakeholders/PM Team
Step A: Identify key stakeholders and team members:
Individuals responsible for designing, implementing and reporting performance measures.
Staff with intimate knowledge of the program/project.
Persons with a vested interest in the conduct/impact of the program/project.
Identify a SKEPTIC!
20. 20 The Performance Measurement Plan (Outline)
Plan Components (a template sketch follows the list):
Project/program mission
Primary audience
Scope
Program Description/Logic Model
Context (organizational, management, political)
Roles and expectations for program staff, participants, and key stakeholders
Performance measurement questions
Data collection/analysis
Reporting
Feedback Loop
Resources (Staff and Budget)
Timeline
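Because these components recur from project to project, the outline can double as a reusable template. Below is a minimal sketch in Python; the key names simply mirror the components listed above and are illustrative, not prescribed by the Order or the workshop.

# Skeleton of a performance measurement plan. Keys mirror the plan
# components listed above; placeholder values are filled in by the PM team.
pm_plan = {
    "mission": "",
    "primary_audience": "",
    "scope": "",
    "program_description_logic_model": "",
    "context": {"organizational": "", "management": "", "political": ""},
    "roles_and_expectations": "",
    "performance_measurement_questions": [],
    "data_collection_and_analysis": "",
    "reporting": "",
    "feedback_loop": "",
    "resources": {"staff": "", "budget": ""},
    "timeline": "",
}

Keeping the plan in a structured form like this makes it easy to confirm that no component was skipped before checking in with management and stakeholders.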
21. 21 II. Describing the Program
22. 22 Something to Consider
Successful programs –
Have a well-articulated, research-based, experience-based theory or road map.
Follow the road map during the trip!
23. 23 LOGIC MODEL
24. 24 Elements of the Logic Model
Resources/Inputs: Programmatic investments available to support the program.
Activities: The things you do; the activities you plan to conduct in your program.
Outputs: Product or service delivery/implementation targets you aim to produce.
Customer: User of the products/services. Target audience the program is designed to reach.
Outcomes: Changes or benefits resulting from activities and outputs.
Outcome Structure
Short-term (Attitude): Changes in learning, knowledge, attitudes, skills, or understanding.
Intermediate (Behavior): Changes in behavior, practice, or decisions.
Long-term (Condition): Changes in condition.
External Influences: Factors that will influence change in the affected community.
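These elements map naturally onto a simple data structure, which makes the later table and diagram steps concrete. A minimal sketch in Python; the class and field names are illustrative only and are not part of any EPA standard.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    # Programmatic investments available to support the program.
    resources: list = field(default_factory=list)
    # Activities you plan to conduct in your program.
    activities: list = field(default_factory=list)
    # Product or service delivery/implementation targets.
    outputs: list = field(default_factory=list)
    # Users of the products/services; the target audience.
    customers: list = field(default_factory=list)
    # Outcome structure: attitude -> behavior -> condition.
    short_term_outcomes: list = field(default_factory=list)
    intermediate_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)
    # Factors that will influence change in the affected community.
    external_influences: list = field(default_factory=list)

Each field corresponds to a column of the logic table built in Step 3 below, and the arrows of the diagram in Step 5 connect entries across fields.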
25. 25 Benefits of Logic Modeling
Communicates the performance story of the program or project.
Focuses attention on the most important connections between actions and results.
Builds a common understanding among staff and with stakeholders.
Helps staff “manage for results” and informs program design.
Finds “gaps” in the logic of a program so staff can work to resolve them.
26. 26 Steps in the Logic Model Process
Establish a stakeholder work group and collect documents.
Define the problem and context for the program or project.
Define the elements of the program in a table.
Verify the logic table with stakeholders.
Develop a diagram and text describing logical relationships.
Verify the Logic Model with stakeholders.
Then use the Logic Model to identify and confirm performance measures, and to plan, conduct, and report performance measurement and evaluation.
27. 27 Step 1. Establish a stakeholder work group and collect documents and information.
Convene/consult a stakeholder work group
provides different perspectives and knowledge
attempts agreement on program performance expectations
Review sources of program or project documentation
Strategic and operational plans
Budget requests
Current metrics
Past evaluations
Conduct interviews of appropriate staff
28. 28 Step 2. Define the problem the program addresses and the context.
29. 29 Step 3. Define the elements of the program in a table.
30. 30 Step 4. Verify the logic table with Stakeholders.
31. 31 Step 5. Develop a diagram and text describing logical relationships.
Draw arrows to link the causal relationships between the logic model elements.
Limit the number of arrows. Show only the most critical feedback loops.
Work from both directions (right-to-left and left-to-right)
There are many different forms of logic model diagrams.
32. 32 Step 6. Verify logic with stakeholders.
Seek review from the same, or an even broader, group of stakeholders.
Check the logic again:
How-Why questions: start at Outcomes and ask “How?”; start at Activities and ask “Why?”
If-Then questions: start at Activities and move toward Outcomes, asking “If this, then that?”
Compare the model to what the units in the organization actually do, and define their contributions to the outcomes.
Check the logic by checking it against reality.
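Both question patterns amount to walking the model’s causal chain in opposite directions. The sketch below illustrates this with a hypothetical dry-cleaner program; the chain entries are invented for illustration.

# Hypothetical causal chain, ordered from activities to long-term outcomes.
chain = [
    ("Activity", "train dry cleaners in solvent substitution"),
    ("Output", "training delivered to participating cleaners"),
    ("Short-term outcome", "cleaners understand substitution options"),
    ("Intermediate outcome", "cleaners switch to low-VOC solvents"),
    ("Long-term outcome", "ambient VOC levels decline"),
]

# If-Then reading, left to right: test each link as a causal claim.
for (_, cause), (_, effect) in zip(chain, chain[1:]):
    print(f"If {cause}, then {effect}?")

# How-Why reading, right to left: ask "How?" of each step in turn.
for level, item in reversed(chain):
    print(f"{level}: {item} -- how is this achieved?")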
33. Logic Modeling Exercise 1: Brief application of logic modeling
34. 34 Some Sample Logic Models: “Can You See the Logic?”
Is the program’s outcome structure described, and is it logical?
Are the program’s customers described and are they the right customers, given the outcomes?
Are the program’s major resources, processes, and outputs described and are they logically consistent and sufficient to achieve outcomes?
Is the program’s context described?
39. 39 “Z” Logic
In real life, programs achieve their strategic results through a series of actions.
Action A produces a set of outcomes that become inputs to Action B.
Action B produces a set of outcomes that become inputs to Action C.
Action C produces a set of outcomes that lead to the final strategic goal of the program.
These actions could be thought of as nested programs within the larger program.
40. 40 “Z” Logic
41. ‘Z’ Logic Model for an Energy RD&D Program
42. 42 III. Developing Performance Measurement Questions
43. III. Developing Performance Measurement Questions
44. 44 Performance Measurement Questions
What are they?
Questions designed to assess progress/ accomplishments of various aspects of a program/project.
Performance measurement questions tell you what your program is doing.
45. Performance Questions Across the Performance Spectrum
46. 46 Steps for Developing Performance Questions
Review the purpose and objectives of the performance measurement system.
Review the program mission/goals and objectives.
Review the logic model and identify potential questions to be addressed.
Review existing evaluation results that identify areas of poor performance.
Identify the aspects of the program you are responsible for measuring and reporting on, as required by legislation, a service standard, or a strategic plan.
Identify the aspects of your program/project whose performance you are most interested in measuring.
Generate a potential list of questions.
47. 47 Basic Steps Cont’d
For each question, ask, “Why is the question important? How will the information be used, by whom?” (for measurement or evaluation)
Identify the standard/baseline that will be used to assess the information (e.g., target level of performance).
Assess the feasibility of the question in terms of data collection, analysis and reporting.
Prioritize your questions and select a final list – you won’t be able to answer all questions.
Check in with management and stakeholders.
Revisit your questions periodically and continue to check in with management.
48. Performance Measurement Questions Exercise 2: Linking performance questions to the logic model. What are the important questions you need to ask?
49. Module 3: Developing Performance Measures
50. 50 IV. Developing Measures
51. Types of Performance Measures
52. Performance Measures Exercise 3: Brief application of performance measurement terminology
53. 53 Something to Consider…
As you think about developing your performance measures, keep in mind your program or project’s sphere of influence: the elements of the logic model you have direct or indirect control or influence over.
54. ORD Is Using Logic Models to Communicate Program Theory and Outcomes
55. ORD Is Using Logic Models to Communicate Program Theory and Outcomes
56. ORD Is Using Logic Models to Communicate Program Theory and Outcomes
57. 57 Steps for Developing Measures
Step 1: Identify and Define the Measures
Step 2: Evaluate/Assess the Measures
Step 3: Choose the Most Important Measures
58. 58 Step 1: Identify and Define the Measures
A. Review existing requirements specified in:
Legislation
Strategic plan
Court Orders
59. 59 Step 1: Identify and Define the Measures
B. Consider what information is needed to assess whether your program/project is meeting its goals and objectives. Examine your program/project’s existing:
Mission
Goals
Objectives
Service standards
60. 60 Step 1: Identify and Define the Measures
C. Review the logic model (activities, outputs, and outcomes from across the performance spectrum) and identify the types of measures you would like to track: resource, output, outcome, customer satisfaction, etc.
D. Review other performance measurement or evaluation reports and the scientific literature.
61. 61 Step 1: Identify and Define the Measures
E. Generate a list of potential measures, seeking out those that are the best gauges of your program’s progress.
Resources: research and evaluation information from this and related programs.
F. Discuss with your team the contextual factors that could influence the program, positively or negatively, and generate measures for those most likely to matter, so that you can gather evidence about their influence.
62. 62 Step 1: Identify and Define the Measures
Type of Data (a worked example follows below):
Raw Numbers (tons of VOCs reduced)
Averages (mean tons of VOCs reduced)
Percentages (% of dry cleaners reporting VOC reduction)
Ratios (Cost per ton of VOCs reduced)
Rates (tons of VOCs reduced per 100 dry cleaners)
Unit of Measure
Is it appropriate to the measure? Are supporting data available in this specific unit of measure?
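A worked example of the data types above, using a hypothetical set of per-facility VOC reductions (all numbers invented for illustration):

# Hypothetical data: tons of VOCs reduced at each participating dry cleaner.
reductions = [1.2, 0.0, 3.4, 0.8, 2.1]
program_cost = 25_000.00  # hypothetical total program cost, in dollars
n = len(reductions)

total = sum(reductions)                         # raw number: tons reduced
mean = total / n                                # average: mean tons reduced
pct = 100 * sum(r > 0 for r in reductions) / n  # percentage reporting a reduction
cost_per_ton = program_cost / total             # ratio: cost per ton reduced
rate = 100 * total / n                          # rate: tons per 100 dry cleaners

print(f"{total:.1f} tons total; {mean:.2f} mean; {pct:.0f}% reporting; "
      f"${cost_per_ton:,.0f}/ton; {rate:.1f} tons per 100 cleaners")

Note how the unit-of-measure question applies here: the rate and the percentage only make sense because the underlying data are collected per facility.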
63. Examples of Performance Measures Across the Performance Spectrum
64. Performance Measures Exercise 4: Brief application of performance measure development
65. 65 Step 2: Evaluate the Measures Evaluate/assess the feasibility of the measures in terms of:
data quality
data collection
analysis
reporting
66. 66 Step 2: Evaluate the Measures for Data Quality
Reliability
Validity
Objectivity
67. 67 Step 2: Evaluate the Measures for Data Collection
Evaluate each measure against the following criteria (a scoring sketch follows the list):
Data availability
Frequency of data collection
Data available for use
Continued data collection
Supports an acceptable baseline
Overall implementation cost
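One lightweight way to apply these criteria is a scoring grid, as sketched below; the candidate measures, the 1-to-5 scale, and all scores are hypothetical.

# Data collection criteria from the list above, abbreviated.
criteria = ["availability", "frequency", "usability",
            "continuity", "baseline", "cost"]

# Score each candidate measure from 1 (poor) to 5 (strong) per criterion.
candidates = {
    "tons of VOCs reduced":      [4, 3, 4, 4, 5, 2],
    "% of cleaners trained":     [5, 5, 5, 4, 4, 5],
    "ambient VOC concentration": [2, 2, 3, 3, 2, 1],
}

# Rank measures by total score to focus discussion with the team.
for name, scores in sorted(candidates.items(), key=lambda kv: -sum(kv[1])):
    print(f"{sum(scores):2d}  {name}")

The totals are a conversation starter, not a decision rule; a measure that scores poorly on cost may still be required by legislation.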
68. 68 Step 2: Evaluate the Measures for Analysis
Will the measure change over time, in response to program implementation, so that it picks up the program’s effects?
Do you have direct or near direct control over this measure, or is it significantly affected by external factors? If so, list the major factors.
Can the impact of these external factors on the measure be taken into account?
69. 69 Step 2: Evaluate the Measures for Reporting
Identify and analyze the prospective reporting format for each measure (a sketch follows the list):
Identify how frequently this measure should be reported.
Identify how the indicator will be measured and reported, that is, identify the display format (for example, charts, tables, diagrams, text).
Verify that this reporting format is meaningful to the intended audience.
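These reporting decisions can be recorded alongside each measure so they are easy to revisit. A hypothetical sketch:

# Hypothetical reporting decisions for one measure.
reporting_spec = {
    "measure": "tons of VOCs reduced",
    "frequency": "quarterly",
    "display": "bar chart by region, plus a table of raw values",
    "audience": "regional program managers",
}

print(f"{reporting_spec['measure']}: reported {reporting_spec['frequency']} "
      f"as {reporting_spec['display']} for {reporting_spec['audience']}.")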
70. 70 Step 3: Choose the Most Important Measures
Assess the value of the measures in relation to the goals and objectives of the program:
Required
Important
Interesting
Select final list of measures – you won’t be able to collect data for all measures.
Check in with managers and stakeholders
71. 71 Tips for Developing Measures
For each measure, ask…
Does the measure clearly relate to the project goal and objective?
Is the measure important to management and stakeholders?
Is it possible to collect accurate and reliable data for the measure?
Taken together, do the measures accurately reflect the key results of the program, activity or service?
Is there more than one measure for each goal and/or objective?
Are your measures primarily outcome, efficiency or quality measures?
72. 72 Criteria for Useful Performance Measures