
RTF Savings Estimation Guidelines for Custom Measures and Program Impact Evaluation




Presentation Transcript


  1. RTF Savings Estimation Guidelines for Custom Measures and Program Impact Evaluation March 13, 2012 Regional Technical Forum Presented by: Michael Baker, SBW Richard Ridge, Ridge and Assoc.

  2. Important Assumptions • RTF will be silent on IPMVP adherence • Evaluators may or may not be third parties • Agencies (individual utilities, BPA, ETO, and NEEA) need separate portfolio control, but there is mutual benefit in collaboration where possible • Comparable reliability of savings across all estimation types (UES, standard protocol, custom protocol, and program impact evaluation)

  3. Guidelines Structure – Existing and Proposed
  • Adopted
    • Main Body: Scope and Purpose; Measure Specification; Unit Energy Savings (UES); Standard Protocols for Site-Specific Savings Estimates; Custom Protocol for Site-Specific Savings Estimates; Program Impact Evaluation
    • App A: Guidelines Checklist
    • App B: UES Measure Summary Sheet
    • App C: Standard Protocol Example and Template
  • Draft
    • App D: Estimating Savings for Custom Measures
    • App E: Program Impact Evaluation Protocols
  • Proposed
    • App F: Statistical Energy Modeling (Site-Specific and Cross-Sectional Models)
    • App G: Physical Energy Modeling (Bin and Simulation)
    • App H: Sampling (based in part on BPA Sampling Guide)
    • App I: Glossary (based in part on BPA Glossary)
    • App J: Curriculum Guide (based in part on BPA examples throughout its guidelines and protocols)

  4. Programs that Need Studies and Program-Level Relative Error
  • Selecting programs to study
    • Large programs (>10% of portfolio) with a changing measure or customer mix, or not evaluated in the last two years
    • Innovative programs with large expected savings
    • Sum of small programs does not exceed 20% of portfolio
    • Every 4 years for all programs
  • Relative error in program savings estimate
    • ±20% at a confidence level of 80%
    • Free of substantial bias
    • Portfolio-level error expected to be significantly less
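The ±20%-at-80%-confidence precision target above maps to a planning sample size via the standard formula n = (z · cv / rp)², where cv is the coefficient of variation of savings. As a minimal sketch (not from the RTF guidelines themselves), assuming a hypothetical planning cv of 0.5:

```python
from statistics import NormalDist
import math

def required_sample_size(cv: float, rel_precision: float = 0.20,
                         confidence: float = 0.80) -> int:
    """Initial sample size n0 = (z * cv / rp)**2 for estimating mean
    savings to +/- rel_precision at a two-sided confidence level."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # z ~= 1.2816 at 80%
    return math.ceil((z * cv / rel_precision) ** 2)

# Assumed cv of 0.5 is illustrative only, not an RTF value:
print(required_sample_size(0.5))  # -> 11
```

Tighter precision or a more variable population raises n quickly, which is why the guidelines reserve the criterion for the program level and expect portfolio-level error to be smaller.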

  5. Selecting a Program Impact Evaluation Approach
  • UES Measures
    • RTF-Approved UES
      • Savings Guidelines 6.1: inspection-based verification of a sample
    • Other UES Measures
      • Baseline (current practice or pre-conditions) determines the need for pre-conditions data or a current practice study
      • Requires primary data collection for a sample
  • Standard Protocol Measures
    • Conduct a faithful-application review for a sample
    • No primary data collection by the evaluator (Cx documentation proves installation)
    • When measures fail to meet the faithful-application criteria, there may be other options
  • Custom Protocol Measures
    • Re-estimate savings for a sample
    • No primary data collection by the evaluator if site-specific reporting conforms to Appendix D
    • When measures fail to conform to Appendix D, there may be other options

  6. Impact Estimation Roadmap

  7. Custom Protocol Method Selection

  8. Roles of Program Operator and Evaluators • The program operator should • Collect data and estimate savings for standard protocol and custom protocol measures • Maintain a transparent and well-documented program-tracking database • Conduct pre-conditions data collection consistent with study-specific specifications, with training and oversight provided by the evaluator • The evaluator should do • Everything else

  9. Decisions • Should Appendices D and E remain appendices or be merged into the main body? • Should we continue to assume that the proposed appendices (F–J) will be developed?

  10. Next Steps • Comments on draft Appendices by March 27th • Second draft distributed April 10th with annotated Main Body showing needed changes • RTF April 17th meeting • Discussion of 2nd draft • Action • Adoption, or … • Strategy for further dialog
