Green Datacenter Initiatives at SDSC Matt Campbell, SDSC Data Center Services Manager mattc@sdsc.edu
SDSC Data Center Overview • 19,000 sq ft • 13 MW of on-site power • 100+ projects operating IT equipment in the data center
Key Points • How do we illustrate energy loss and quantify savings? • Establishing a common unit of measurement for data centers • What areas can be improved? • Mechanical • Electrical • IT equipment
Power Usage Effectiveness PUE = Total Facility Power / IT Equipment Power • Industry-accepted unit of measurement • 2+ is still the data center industry ‘norm’ • SDSC’s datacenter was audited last year with a resulting PUE of 1.35 – 1.42 • Systems located at SDSC would save 28-65% on energy alone (see the worked example below)!
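A minimal worked example of the PUE arithmetic above. The 1 MW IT load is an assumed, illustrative figure, not an SDSC measurement; only the PUE values come from the slide.

```python
# Illustrative only: the IT load is assumed, not a measured SDSC figure.
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

it_load_kw = 1000.0              # assumed IT equipment load
at_sdsc = it_load_kw * 1.35      # total facility draw at the audited PUE of 1.35
at_norm = it_load_kw * 2.0       # total facility draw at the industry-'norm' PUE of 2+

print(f"PUE check: {pue(at_sdsc, it_load_kw):.2f}")   # 1.35
print(f"Total energy saved vs. a PUE-2.0 site: {(at_norm - at_sdsc) / at_norm:.1%}")
# ~32.5% less total power for the same IT load, in line with the savings range on the slide
```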
Energy Savings Multiply (courtesy Jack Pouchet, Emerson)
Working on Many Fronts • Distributing higher-voltage (277/480V) power to the floor to reduce transformer and distribution losses • Experimenting with DC-only power distribution • Efficient system lifecycles • “Free cooling” opportunities • High-efficiency UPS retrofit • Air handling retrofit with variable frequency drive fans and modulating chilled water valves to match cooling to real-time loads • Cold/hot aisle containment
Data Center Improvements to Lower PUE • Mechanical Systems • Aisle containment • CRAH VFD retrofits • Blanking panels, floor brushes • Electrical Systems • High efficiency UPS • System Lifecycles • Virtualization • Consolidation of ‘ghost’ servers
Cold/Hot Aisle Containment • Expect 50-80% better cooling airflow efficiency than traditional hot/cold aisle layouts • Separates cold and hot air, eliminating the need for over-cooling and over-blowing • Allows for high temperature differentials, maximizing efficiency of cooling equipment (see the sketch after this slide)
Cold Aisle Containment • Knuerr Coolflex, first deployment in the United States • Requires a standardized datacenter • Enclosed cold aisles (70-78F) • Servers exhaust to the room as a whole, which runs hot (90-100F)
Hot Aisle Containment • Essentially a “heat chimney” on top of hot aisles with doors at each end • Flexible, can be phased incrementally • Cold room as a whole (70-78F) • Exhaust to enclosed hot aisle (100+F), ducted directly to cooling systems
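To see why the large temperature differentials mentioned above matter, here is a rough sketch based on the standard sensible-heat relation for air (about 1.2 kJ per m³ per kelvin). The heat load and ΔT values are assumptions for illustration, not SDSC measurements.

```python
# Sensible-heat sketch: heat removed (kW) ≈ 1.2 kJ/(m³·K) × airflow (m³/s) × ΔT (K).
# The load and temperature differentials below are assumed, not SDSC data.
AIR_VOLUMETRIC_HEAT = 1.2  # kJ per m³ per K, approximate for air near room conditions

def required_airflow_m3s(heat_load_kw, delta_t_c):
    """Airflow needed to carry a given heat load at a given supply/return ΔT."""
    return heat_load_kw / (AIR_VOLUMETRIC_HEAT * delta_t_c)

load_kw = 500.0  # assumed heat load for a row of racks
print(required_airflow_m3s(load_kw, 10))  # ~41.7 m³/s when cold and hot air mix (small ΔT)
print(required_airflow_m3s(load_kw, 20))  # ~20.8 m³/s with containment (large ΔT)
```

Doubling the usable ΔT roughly halves the airflow the cooling plant has to move, which is the efficiency gain containment is after.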
Filling In The Gaps • Brushes to cover tile cutouts • Allow cables to pass through while blocking air • Cost-effective blanking panels • Minimize leakage
CRAH VFD Retrofits • Added variable frequency drives to computer room air handlers • Allows CRAH units to modulate fan speed based on a control signal, matching airflow to real-time cooling load (see the sketch below)
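A hedged sketch of the fan affinity laws that make the VFD retrofit worthwhile: airflow scales roughly with fan speed, while fan power scales roughly with the cube of speed. Actual CRAH savings depend on the specific units and control scheme; the speeds below are illustrative.

```python
# Fan affinity-law approximation: power fraction ≈ (speed fraction)³.
def fan_power_fraction(speed_fraction):
    """Approximate fan power as a fraction of full-speed power."""
    return speed_fraction ** 3

for speed in (1.0, 0.8, 0.6):
    print(f"{speed:.0%} speed -> ~{fan_power_fraction(speed):.0%} of full fan power")
# 100% -> 100%, 80% -> ~51%, 60% -> ~22%
```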
Electrical Systems • High-efficiency UPS units that can save ~500,000 kWh/year • Higher voltage to IT equipment • Providing 277/480V power on the floor to more efficiently power HPC systems without transformer and distribution losses (rough figures sketched below) • Exploration of DC power for IT equipment
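A back-of-the-envelope sketch of the distribution-loss argument for 277/480V delivery. The 1 MW load and the ~2% per-transformation loss are assumptions for illustration, not SDSC figures.

```python
# Each voltage transformation step loses a few percent; delivering 277/480V directly
# to the floor removes one such step. Load size and loss fraction are assumed.
HOURS_PER_YEAR = 8760

def annual_transformer_loss_kwh(it_load_kw, loss_fraction):
    """Energy lost per year in one distribution/transformation step."""
    return it_load_kw * loss_fraction * HOURS_PER_YEAR

# Assumed 1 MW HPC system fed through a 480V -> 208V step at ~2% loss:
print(f"{annual_transformer_loss_kwh(1000, 0.02):,.0f} kWh/year")  # ~175,200 kWh/year avoided
```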
Hardware Lifecycle Efficiency • Hardware retirement • More frequent equipment refreshes leverage technology improvements • Virtualization • Maximize equipment utilization • Consolidation of ‘ghost’ servers • Underutilized IT equipment • ‘Forgotten’ equipment