
Stormwater Management Challenges



Presentation Transcript


  1. Massachusetts Stormwater Technology Evaluation Program: Features, Updates. What to look for when evaluating BMPs. November 8, 2012

  2. Stormwater Management Challenges • Variability of Flows (Duration, Frequency, Intensity) • Objectives: peak control or water quality treatment? • Different water quality constituents require different treatment mechanisms • Site-to-site variability of quantity and quality • Maintenance of non-centralized treatment units • Monitoring and measurement

  3. Water Quality Stormwater Constituents • Sediment • Nutrients: nitrogen and phosphorus • Oil, grease, and organic chemicals • Bacteria and viruses • Salt • Metals http://www.txnpsbook.org, 2002

  4. Stormwater Constituents: Median Concentrations Source: U.S. EPA, Nationwide Urban Runoff Program, 1983.

  5. Detention Basins • TSS Removal Efficiency: • 60-80% average • 70% design • Key Features: • Large area • Peak flow control • Maintenance: low • Cost: low to moderate

  6. Constructed Wetlands • Removal Efficiency: • 65-80% average • 70% design • Key Features: • Large area • Peak flow control • Biological treatment • Maintenance: low to moderate • Cost: marginally higher than wet ponds http://www.txnpsbook.org, 2002 Source: MADEP/MACZM Massachusetts Stormwater Management, Volume 2: Stormwater Technical Handbook, March 1997

  7. Water Quality Swales • Removal Efficiency: • 60-80% average • 70% design • Key Features: • Higher pollutant removal rates than drainage channels • Transport peak runoff and provide some infiltration • Maintenance: low to moderate • Cost: low to moderate http://www.txnpsbook.org, 2002 Source: MADEP/MACZM Massachusetts Stormwater Management, Volume 2: Stormwater Technical Handbook, March 1997

  8. Inlets and Catch Basins • Removal Efficiency: • 15-35% average • 25% design • Design Features: • Debris removal • Pretreatment • Maintenance: moderate to high • Cost: low to high Source: MADEP/MACZM Massachusetts Stormwater Management, Volume 2: Stormwater Technical Handbook, March 1997

  9. Innovative BMPs - Advanced Sedimentation • Removal Efficiency: • 50-80% average • 80% design • Design Features: • small area • Oil and Grease control • Maintenance: moderate • Cost: moderate Rinker Inc, 2002

  10. Innovative BMPs - Hydrodynamic • Removal Efficiency: • 50-80% average • 80% design • Design Features: • small area • Oil and Grease control • Maintenance: moderate • Cost: moderate Vortechs Inc, 2002

  11. Innovative BMPs – Media Filtration • Removal Efficiency: • 50-80% average • 80% design • Design Features: • small area • Oil and Grease control • Maintenance: moderate • Cost: moderate Stormwater Management Inc, 2002 Massachusetts Stormwater Technology Evaluation Project, UMass
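For side-by-side comparison, the BMP characteristics on slides 5-11 can be tabulated in a few lines; a minimal Python sketch (the dictionary layout and field names are my own, the numbers are transcribed from the slides):

```python
# TSS removal ranges (%) and qualitative ratings transcribed from slides 5-11.
bmps = {
    "Detention basin":        {"avg_removal": (60, 80), "design": 70, "maintenance": "low",              "cost": "low to moderate"},
    "Constructed wetland":    {"avg_removal": (65, 80), "design": 70, "maintenance": "low to moderate",  "cost": "above wet ponds"},
    "Water quality swale":    {"avg_removal": (60, 80), "design": 70, "maintenance": "low to moderate",  "cost": "low to moderate"},
    "Inlet / catch basin":    {"avg_removal": (15, 35), "design": 25, "maintenance": "moderate to high", "cost": "low to high"},
    "Advanced sedimentation": {"avg_removal": (50, 80), "design": 80, "maintenance": "moderate",         "cost": "moderate"},
    "Hydrodynamic separator": {"avg_removal": (50, 80), "design": 80, "maintenance": "moderate",         "cost": "moderate"},
    "Media filtration":       {"avg_removal": (50, 80), "design": 80, "maintenance": "moderate",         "cost": "moderate"},
}

# Print a one-line summary per BMP.
for name, b in bmps.items():
    lo, hi = b["avg_removal"]
    print(f"{name:24s} {lo}-{hi}% avg, {b['design']}% design, maintenance: {b['maintenance']}")
```

This kind of table is essentially what the BMP performance comparison slides later in the deck present in graphical form.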

  12. Stormwater Management/Monitoring Challenges • Difference between peak control and treatment objectives • Different water quality constituents require different treatment mechanisms • Variability of Flows (Duration, Frequency, Intensity) • Site-to-site variability of quantity and quality • Maintenance of non-centralized treatment units • Consistent monitoring -> reliable performance comparison

  13. Water Quality Monitoring TARP - Technology Acceptance Reciprocity Program • Address technology review and approval barriers in policy and regulations; • Accept the performance tests and data from a partner state's review to reduce subsequent review and approval time; • Use the Protocol for state-led initiatives, grants, and verification or certification programs; and • Share technology information with potential users in the public and private sectors using existing state-supported programs. Participating states: CA, IL, MA, MD, NJ, NY, PA, VA, TX

  14. Performance Verification - TARP • Storm Event Criteria to Sample • More than 0.1 inch of total rainfall. • A minimum inter-event period of 6 hours, where cessation of flow from the system begins the inter-event period. • Obtain flow-weighted composite samples covering a minimum of 70% of the total storm flow, including as much of the first 20% of the storm as possible. • A minimum of 10 water quality samples (i.e., 10 influent and 10 effluent samples) should be collected per storm event. • Determining a Representative Data Set • At least 50% of the total annual rainfall must be sampled, for a minimum of 15 inches of precipitation and at least 15, but preferably 20, storms. Massachusetts Stormwater Technology Evaluation Project, UMass

  15. Performance Verification - TARP • Stormwater Sampling Locations • Samples for stormwater BMPs should be taken at the inlet and outlet. • Sampling Methods • Programmable automatic flow samplers with continuous flow measurement should be used. • Grab samples used for: pH, temperature, cyanide, total phenols, residual chlorine, oil and grease, total petroleum hydrocarbons (TPH), E. coli, total coliform, fecal coliform and streptococci, and enterococci. • Stormwater Flow Measurement Methods • Primary and secondary flow measurement devices are required.
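Flow-weighted composite sampling supports computing an event mean concentration (EMC) from paired flow and concentration records; a small illustrative sketch (not code from the program):

```python
def event_mean_concentration(flows, concs):
    """Flow-weighted EMC = sum(Q_i * C_i) / sum(Q_i) over the sampled intervals."""
    if len(flows) != len(concs) or not flows:
        raise ValueError("need paired, non-empty flow and concentration series")
    total_load = sum(q * c for q, c in zip(flows, concs))  # mass flux, summed
    total_flow = sum(flows)
    return total_load / total_flow

# Example: higher concentrations early in the storm (first flush),
# weighted by the hydrograph. Flows in consistent volume units, concs in mg/L.
emc = event_mean_concentration([2.0, 5.0, 3.0], [300.0, 150.0, 80.0])  # -> 159.0 mg/L
```

Weighting by flow rather than averaging the raw concentrations is what makes the composite representative of the whole event.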

  16.-25. BMP Performance Comparison Tables (image slides; not transcribed) Massachusetts Stormwater Technology Evaluation Project, UMass

  26. What do we look for in a study? • Documented quality control • Standardized, documented methods • Third-party studies • Particle size: mean < 100 microns; distribution 55% sand, 40% silt, 5% clay • Influent sediment concentration 100-300 mg/L
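The sediment specification above can be expressed as a simple acceptance check; a sketch assuming a tolerance of ±5 percentage points on the gradation fractions (the tolerance is my assumption, not from the slide):

```python
def sediment_spec_ok(mean_micron, frac_sand, frac_silt, frac_clay,
                     influent_mg_per_l, tol=0.05):
    """Does a test sediment match the gradation and concentration the program
    looks for? Mean < 100 microns; ~55% sand, 40% silt, 5% clay; 100-300 mg/L."""
    return (mean_micron < 100
            and abs(frac_sand - 0.55) <= tol
            and abs(frac_silt - 0.40) <= tol
            and abs(frac_clay - 0.05) <= tol
            and 100 <= influent_mg_per_l <= 300)
```

A coarse test sediment (large mean particle size) or a dilute influent would fail the check, which matters because both inflate apparent removal performance relative to field conditions.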

  27. What do we look for in a study? Field Studies • 15+ storms • Consecutive storms • Sample all year • Some adverse conditions Lab Studies • 15 test runs • Flow rates: 25%, 50%, 75%, 100%, 125% • Scour test: 50%, 100% initial sediment loading, 125% flow rate

  28. MASTEP Rating System • Category 1: TARP-compliant field study or equivalent lab study data available for this product • Category 2: Sound field or lab study data available - some caveats • Category 3: Data of moderate scientific validity exists - significant caveats • Category 4: Reliable performance data lacking

  29. Reasons for lower ratings • Narrow / inappropriate focus • Small sample size • Inadequate documentation • Faulty methods

  30. Impact of Particle Size on Performance

  31. Impact of Particle Size on Performance

  32. Characterizing the Size Distribution of Particles in Urban Stormwater by Use of Fixed-Point Sample-Collection Methods. Selbig and Bannerman, USGS Open-File Report 2011-1052

  33. Removal Efficiency Calculation Source: MADEP/MACZM Massachusetts Stormwater Management, Volume 2: Stormwater Technical Handbook, March 1997

  34. Removal Efficiency Calculation Source: MADEP/MACZM Massachusetts Stormwater Management, Volume 2: Stormwater Technical Handbook, March 1997
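The calculation itself is on the slide images; as a sketch, the widely used efficiency-ratio form based on influent and effluent event mean concentrations (offered as a common approach, not necessarily the handbook's exact method):

```python
def removal_efficiency(influent_emc, effluent_emc):
    """Percent removal by the efficiency-ratio method:
    ER = (EMC_in - EMC_out) / EMC_in * 100."""
    if influent_emc <= 0:
        raise ValueError("influent EMC must be positive")
    return (influent_emc - effluent_emc) / influent_emc * 100.0

# e.g. 250 mg/L TSS in, 75 mg/L out -> 70% removal
```

Note that this ratio depends strongly on the influent concentration and particle size distribution, which is why the program scrutinizes those test conditions before accepting a reported percentage.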

  35. Small Sample Size • Too few storms (field) / test runs (lab) • < 50% annual rainfall (field) • < 15” rain (field) • Short duration / missing winter months

  36. Inadequate Documentation /Faulty Methods • No QAPP, no QC or raw data • Non-standard or undocumented methods • Sampling, analysis errors • No scour test / missing storms

  37. A higher rating does NOT mean better performance. MASTEP evaluates the quality of performance DATA, NOT BMP performance results.

  38. When lower ratings might be acceptable • Narrow / inappropriate focus • Conditions match your site? • Small sample size • Multiple studies, similar results? • Inadequate documentation • Trusted lab / testing agency? • Faulty methods – hardest to justify

  39. [Image slide] Massachusetts Stormwater Technology Evaluation Project, UMass

  40. MASTEP and the BMP Approval Process

  41. “The effectiveness of Proprietary BMPs varies with the size of the unit, flow requirements, and specific site conditions. The UMass Stormwater Technologies Clearinghouse database evaluates the quality of proprietary BMP effectiveness studies. MassDEP urges Conservation Commissions to use this database when verifying the effectiveness of Proprietary BMPs: www.mastep.net” • Excerpt from MA Stormwater Handbook, Volume 2, Chapter 4

  42. Two Ways to Approve or Deny the Use of Proprietary Stormwater BMPs 1. MassDEP has reviewed the performance of a technology as determined by TARP or STEP and assigned a TSS removal efficiency. • If the conditions under which it is proposed to be used are similar to those in the performance testing, presume that the proprietary BMP achieves the assigned TSS removal rate. • Look at sizing, flow and site conditions. 2. Issuing Authority makes a case-by-case assessment of a specific proposed use of a proprietary technology at a particular site and assigns a TSS removal efficiency. • Proponent must submit reports or studies showing effectiveness of BMP. • MassDEP strongly recommends using UMass Stormwater Technologies Clearinghouse database to ensure that reports and studies are of high quality (www.mastep.net). • Look at sizing, flow and site conditions. • For ultra-urban and constrained sites, proprietary BMPs may be the best choice.

  43. BMP Selection Decision Process Goal: Target Pollutant Removal Efficiency (e.g. 80%) For each component of the system (i.e. treatment train): • 1. If the BMP has a DEP-assigned % removal credit, use that (e.g. 35%) • 2. Else if the BMP is listed on www.MASTEP.net: • Use MASTEP info to advise what % to credit • Check the category rating. Why downgraded? • Compare test site conditions with your site. Similar? • PSD, flow rates, watershed size, system size, etc. • Use BPJ to assign a % credit • DON'T ASSUME a #1 performs better than a #2, a #2 better than a #3, etc. • DON'T just accept the % removal reported • 3. Else if not on the MASTEP site: • Obtain similar literature on PSD, flow rates, etc. • Use BPJ to assign a % credit
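The slide's decision flow for one treatment-train component can be sketched as a short function; the `bpj` callback standing in for best professional judgment, and all field names, are illustrative:

```python
def tss_credit(bmp, bpj):
    """Assign a TSS removal credit to one BMP in a treatment train.
    bmp: dict describing the BMP; bpj: callable applying best professional
    judgment to whatever study information is available."""
    if bmp.get("dep_assigned_credit") is not None:
        # 1. MassDEP has already assigned a % removal credit: use it.
        return bmp["dep_assigned_credit"]
    if bmp.get("on_mastep"):
        # 2. Listed on www.MASTEP.net: weigh the category rating and how
        # closely test conditions (PSD, flows, sizing) match the site;
        # do not simply accept the reported % removal.
        return bpj(bmp.get("mastep_info", {}))
    # 3. Not listed: gather comparable literature and apply BPJ.
    return bpj(bmp.get("literature", {}))
```

For example, `tss_credit({"dep_assigned_credit": 35}, bpj=some_reviewer)` returns 35 without consulting the reviewer, mirroring the first branch on the slide.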

  44. New at MASTEP
