
The Measurement and Evaluation of the PPSI Oregon Pilot Program

Presentation Transcript


  1. The Measurement and Evaluation of the PPSI Oregon Pilot Program
  Paint Product Stewardship Initiative, Portland, Oregon, December 10, 2009
  Matt Keene, Office of Policy, Evaluation Support Division, United States Environmental Protection Agency
  Jennifer Nash, Director of Policy and Programs, Product Stewardship Institute, Inc.

  2. Purpose of the Presentation • Provide an overview of the process of integrating measurement and evaluation into the Oregon Pilot Program and describe the work of the evaluation committee

  3. Presentation Outline
  • Introduction to Program Evaluation
  • Participatory, Utility-Focused Evaluation
  • Integrating Evaluation into the OR Pilot
  • Evaluation Questions
  • Data and Measures
  • Questions, Comments, Discussion

  4. Program Evaluation
  • Definition: a systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
  • Orientation/approaches to evaluation:
    • Accountability (external audience)
    • Learning & program improvement (internal/external audiences)
  • Conduct a program evaluation to gain value-added insight:
    • Learn what works well, what does not, and why
    • Learn how a program could be improved
    • Help inform front-end program design
    • Provide performance information for accountability purposes

  5. Measurement and Evaluation
  • Program Evaluation: a systematic study with a definable methodology that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
  • Performance Measurement: a basic and necessary component of program evaluation that consists of the ongoing monitoring and reporting of program progress, using pre-selected performance measures.
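To make the distinction on this slide concrete, here is a minimal sketch of what "ongoing monitoring and reporting using pre-selected performance measures" could look like in code. This is not part of the PPSI or Oregon DEQ materials; the measure name, reporting periods, and figures are hypothetical placeholders.

```python
# Hypothetical sketch: performance measurement as ongoing monitoring
# against a pre-selected measure. All names and numbers are invented.
from dataclasses import dataclass, field


@dataclass
class PerformanceMeasure:
    name: str                                          # e.g. "volume collected"
    unit: str
    observations: list = field(default_factory=list)   # (period, value) pairs

    def record(self, period: str, value: float) -> None:
        """Append one reporting period's result (the 'ongoing monitoring' part)."""
        self.observations.append((period, value))

    def latest(self):
        return self.observations[-1] if self.observations else None


# Performance measurement: routinely record progress on the measure.
volume = PerformanceMeasure(name="volume collected", unit="gallons")
volume.record("Q1", 12000.0)   # hypothetical figure
volume.record("Q2", 15500.0)   # hypothetical figure
print(volume.latest())

# Program evaluation would then go further and ask *why* the trend looks
# the way it does, combining monitoring data with other analyses.
```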

  6. The Evaluation Committee: Participatory, Utility-Focused
  • Participatory, Utility-Focused Evaluation
  • Evaluation Committee:
    • Funding and Support
    • Purpose
    • Progress until now
  • Evaluation Template:
    • Evaluation Questions
    • Audiences
    • Approach
    • Data Sources

  7. Building Measurement & Evaluation into the OR Pilot Program
  [Framework diagram: the PPSI Oregon Pilot Program at the center, surrounded by four components]
  • Program: 1. Team, 2. Mission, 3. Goals & Objectives, 4. Logic Model
  • Questions: 1. Context, 2. Audience, 3. Communication, 4. Use
  • Measures: 1. Data Sources, 2. Collection Methods & Strategy, 3. Analysis Tools, 4. Data Collection, 5. Data Management
  • Documentation: 1. Evaluation Methodology, 2. Evaluation Policy
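As an illustration only, the four framework components on this slide could be captured as a simple data structure for planning and checklist purposes. This is an assumption about how one might organize the framework, not an artifact of the Oregon Pilot.

```python
# Hypothetical sketch: the slide's framework components as a plain dict,
# keyed by component name, with each component's numbered elements.
framework = {
    "Program": ["Team", "Mission", "Goals & Objectives", "Logic Model"],
    "Questions": ["Context", "Audience", "Communication", "Use"],
    "Measures": [
        "Data Sources",
        "Collection Methods & Strategy",
        "Analysis Tools",
        "Data Collection",
        "Data Management",
    ],
    "Documentation": ["Evaluation Methodology", "Evaluation Policy"],
}

# Walking the structure reproduces the checklist shown on the slide.
for component, elements in framework.items():
    print(component)
    for i, element in enumerate(elements, start=1):
        print(f"  {i}. {element}")
```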

  8. The Program
  • The Pilot Program is our program
  • Describing the Program:
    • Evaluation Team
    • Mission
    • Goals and objectives
    • Logic model
  [Framework component "Program": 1. Team, 2. Mission, 3. Goals & Objectives, 4. Logic Model. Slide caption: Integrating Evaluation into Program Design]

  9. Evaluation Questions
  • What are the critical questions for understanding the success of the OR program?
  • What contextual factors may influence the answers to each question?
  • Who are the audiences for each question?
  • What’s the best way to communicate with each audience?
  • How might each audience use the answer to each question?
  [Framework component "Questions": 1. Context, 2. Audience, 3. Communication, 4. Use]

  10. Measures
  • What can we measure to answer each question?
  • Where is the information for each measure?
  • How can we collect the information? What will be our strategy?
  • What analytical tools will give us the most useful information?
  • How will we implement the collection strategy?
  • How will we manage the data?
  [Framework component "Measures": 1. Data Sources, 2. Collection Methods & Strategy, 3. Analysis Tools, 4. Data Collection, 5. Data Management]

  11. Documentation
  • Evaluation Methodology: a systematic approach structures a methodology
  • Evaluation Policy: guides strategy and planning for evaluation and program management
  • Method and Policy become part of the OR Program…and the National System?
  [Framework component "Documentation": 1. Evaluation Methodology, 2. Evaluation Policy]

  12. Building Measurement & Evaluation into the OR Pilot Program
  [Framework diagram repeated from slide 7: the PPSI Oregon Pilot Program at the center, surrounded by the Program, Questions, Measures, and Documentation components and their elements]

  13. General Evaluation Questions
  • Was the Pilot program collaborative & cooperative?
  • What is the Paint Stewardship Organization and how did the financing mechanism work?
  • Did consumer awareness and/or behavior change? Why?
  • Was the Oregon paint management system environmentally beneficial, economical, and convenient?
  • What are alternative products and markets for post-consumer paint?
  • How is the evaluation useful to the rollout of a national paint management system?

  14. Measures and Data
  • Data Sources: Consumers, PPSI, OR DEQ, PSO
  • Collection Methods: Focus group, ongoing reporting, document review, survey & interview
  • Potential Measures:
    • Stakeholder ratings of collaboration
    • Inventory of outreach materials
    • Change in consumer recycling behavior
    • Volume collected
    • Cost per unit collected
    • Change in demand for paint products
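One of the potential measures above, cost per unit collected, can be computed directly from ongoing PSO reporting once volume and cost are tracked. The sketch below is a hedged illustration only: the reporting field names and all figures are hypothetical assumptions, not data from the Oregon Pilot.

```python
# Hypothetical sketch: computing "cost per unit collected" from
# quarterly reports. Field names and numbers are invented for illustration.
from typing import Dict, List


def cost_per_gallon(reports: List[Dict[str, float]]) -> float:
    """Total program cost divided by total volume collected, in $/gallon."""
    total_cost = sum(r["program_cost_usd"] for r in reports)
    total_volume = sum(r["gallons_collected"] for r in reports)
    if total_volume == 0:
        raise ValueError("no collection volume reported")
    return total_cost / total_volume


# Hypothetical quarterly reports from a Paint Stewardship Organization (PSO).
quarterly_reports = [
    {"gallons_collected": 12000.0, "program_cost_usd": 96000.0},
    {"gallons_collected": 15500.0, "program_cost_usd": 117800.0},
]

print(f"Cost per gallon collected: ${cost_per_gallon(quarterly_reports):.2f}")
```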

  15. Discussion
  • Challenges ahead: timeframe, baselines, reporting, interpreting results
  • What is most important to understand about how the program is working?
  • What are your concerns?

  16. Thank You!
  Environmental Evaluators Network (LinkedIn Group: environmental evaluators network), www.nfwf.org/een
  Jennifer Nash, Director of Policy and Programs, Product Stewardship Institute, Inc., 617.236.4853, jennifer@productstewardship.us
  Matt Keene, Evaluation Support Division, Office of Policy, U.S. Environmental Protection Agency, 202.566.2240, Keene.Matt@epa.gov
