This document discusses greening data center operations. It notes that data centers consume large amounts of power, with US data centers using over 125 TWh of power per year. It then discusses measuring data center efficiency using Power Usage Effectiveness (PUE) and provides examples of efficient PUE levels. It also describes optimization strategies used at the SDSC data center like thermal containment and increased supply temperatures. Finally, it outlines a conceptual design for an ultra-efficient data center in collaboration with McGill University, aiming for a PUE of 1.06 using free cooling, ice storage, and low-cost hydroelectric power.
1. Greening Data Centers
Dallas Thornton
SDSC Division Director, Cyberinfrastructure Services
March 2, 2011
San Diego Supercomputer Center, University of California, San Diego, Cyberinfrastructure Services Division
2. Data Centers Are Enormous Users of Power
[Bar chart, annual electricity use: US data centers 125 TWh/yr (projected) and 61 TWh/yr; US televisions (248 million units) 27 TWh/yr]
Sources: Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431, U.S. Environmental Protection Agency ENERGY STAR Program, August 2, 2007; Kaufman, Ron. Television's Hidden Agenda. TurnOffYourTV.com, 2004
4. PUE Tabletop Reference…
PUE: Level of Efficiency
• 3.0: Very Inefficient / 2.5: Inefficient / 2.0: Average
  Typical server rooms, from office conversions (worst) to basic hot/cold aisle legacy data centers (better)
• 1.5: Efficient / 1.2: Very Efficient
  Optimized data centers: hot/cold aisle containment, HVAC throttling based on loads, and high-efficiency UPSes
• 1.0: Ideal
  Greenfield design in Canada: all of the above + innovative climate-leveraging technologies and designs (SDSC/McGill University Joint Data Center Design)
Sources: Green Grid, 2008; UC NAM Data Center Audit, 2009; UCSD/SDSC NAM Data Center Audit, 2010
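PUE is simply total facility power divided by IT equipment power, so the bands above can be applied mechanically. A minimal Python sketch (the function and band mapping just mirror this table; they are not from the original deck):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

def efficiency_label(p: float) -> str:
    """Map a PUE value onto the tabletop reference bands above."""
    for threshold, label in [(1.0, "Ideal"), (1.2, "Very Efficient"),
                             (1.5, "Efficient"), (2.0, "Average"),
                             (2.5, "Inefficient")]:
        if p <= threshold:
            return label
    return "Very Inefficient"

# Example: a facility drawing 5.3 MW total to support a 5 MW IT load
print(pue(5300, 5000))                    # 1.06
print(efficiency_label(pue(5300, 5000)))  # Very Efficient
```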
5. SDSC Data Center Overview
• ~19,000 sq. ft., 13 MW of on‐site power
• Regional co‐location data center for UC system
• 100+ projects from 6 campuses
• Energy-efficient alternative to server closets, offices, etc.
• Home of SD‐NAP
• Many 10 Gb and 1 Gb connections to other organizations and networks:
• CENIC, Cox, Time Warner, Salk Institute, Scripps Research Institute, SDSC, etc.
6. Optimizing Features
• Aisle Thermal Containment
• 15°F ΔT from top to bottom of rack → 1°F ΔT
• 10°F to 15°F increase in return temperatures (see the sketch below)
• Cold aisle and hot aisle options
• Fire code considerations
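Why the return-temperature bump matters: sensible cooling delivered by an air handler scales with airflow times the return-to-supply ΔT, so containment that raises return air 10°F to 15°F buys capacity without more fan power. A rough sketch using the standard-air rule of thumb; the CFM and temperatures below are hypothetical, not SDSC measurements:

```python
def sensible_cooling_btuh(cfm: float, return_f: float, supply_f: float) -> float:
    """Sensible cooling (BTU/hr) ~= 1.08 * CFM * (return - supply) for standard air."""
    return 1.08 * cfm * (return_f - supply_f)

# Hypothetical 20,000 CFM air handler supplying 65°F air:
mixed = sensible_cooling_btuh(20_000, 75, 65)      # uncontained aisles, 75°F return
contained = sensible_cooling_btuh(20_000, 90, 65)  # contained aisles, +15°F return
print(contained / mixed)  # 2.5x the cooling from the same airflow
```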
7. Optimizing Features (Cont.)
• Increased Supply Temperatures
• Move to near the top of the ASHRAE spec (80°F)
• Drives AHU return temperatures higher, allowing more cooling from chilled water
• VFD Fans on AHUs
• Allows for fan energy savings… IF accurate controls can be put in place (see the affinity-law sketch below)
• Adaptive Controls
• Address redundancy and inefficient cooling
• Allow 'big picture' control of cooling, throttling based on real-time loads
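The fan affinity laws are what make VFD throttling pay off: fan power falls with roughly the cube of speed, so controls that shed even modest airflow save disproportionate energy. A quick illustration (speeds chosen for illustration only):

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Affinity law: fan power scales with the cube of fan speed."""
    return speed_fraction ** 3

for pct in (100, 90, 80, 70, 60):
    print(f"{pct}% speed -> {fan_power_fraction(pct / 100):.0%} of full power")
# e.g. 80% speed -> 51% of full power; 60% speed -> 22% of full power
```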
8. Optimizing Features (Cont.)
• Rack Blanking Panels
• Cost-effective solutions: Coroplast panels
• Floor Brushes
• Conveyor belt brush: sold in varying lengths
• Efficient Electrical Systems
• 480V/277V or (even better) 400V/240V power
• Efficient UPS and generator configs (see the sketch below)
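Each conversion stage in the electrical chain multiplies its losses into the total, which is why distributing at 480V/277V or 400V/240V (skipping a PDU transformer stage) and using high-efficiency UPSes both show up directly in PUE. A sketch with illustrative stage efficiencies; the percentages are assumptions, not figures from the deck:

```python
def chain_efficiency(*stages: float) -> float:
    """Overall efficiency of a power chain is the product of its stages."""
    total = 1.0
    for eff in stages:
        total *= eff
    return total

legacy = chain_efficiency(0.92, 0.97)  # older UPS, then 480V->208V PDU transformer
direct = chain_efficiency(0.96)        # high-efficiency UPS, 400V/240V direct to rack
print(f"legacy chain: {legacy:.1%}, direct chain: {direct:.1%}")  # 89.2% vs 96.0%
```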
9. SDSC/McGill Data Center Conceptual Design
• Goal: Most Efficient Class One Data Center in North America
• Optimize Cooling Systems for Quebec Climate
• Evaporative free cooling – Primary cooling
• Seasonal ice storage – Top-up cooling
• No compressor-based cooling
• 1.06 PUE means UC could achieve full CapEx recovery in less than 10 years with energy cost savings (a rough payback sketch follows below)
• Lower-cost, green hydro power
• $0.045/kWh vs. $0.08–$0.15/kWh in California
• Design funded by grants from Canada-California Strategic Innovation Partnerships (CCSIP) and CLUMEQ
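A back-of-the-envelope payback sketch along the lines the slide implies; the IT load, the California comparison point, and especially the CapEx figure are hypothetical placeholders, not numbers from the deck:

```python
HOURS_PER_YEAR = 8760

def annual_power_cost(it_mw: float, pue: float, usd_per_kwh: float) -> float:
    """Annual facility power cost for a constant IT load."""
    return it_mw * 1000 * pue * HOURS_PER_YEAR * usd_per_kwh

california = annual_power_cost(5, 1.35, 0.10)   # ~$5.9M/yr (assumed comparison DC)
quebec = annual_power_cost(5, 1.06, 0.045)      # ~$2.1M/yr at hydro rates
savings = california - quebec                   # ~$3.8M/yr
capex = 30e6                                    # hypothetical build cost
print(f"simple payback: {capex / savings:.1f} years")  # 7.8 years, under 10
```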
10. Free Cooling Analysis with 65°F CHWS
[Psychrometric chart: humidity ratio (lbs H2O per lbs dry air) vs. dry bulb temperature (°F), marking three operating regimes for a 65°F chilled water supply: Full Free Cooling 7,228 hrs/yr; Partial Free Cooling 1,380 hrs/yr; Auxiliary Cooling 152 hrs/yr]
Data source: Government of Canada, National Climate Data & Information Archive; data set: WMO #71627, Montreal/Pierre Elliott Trudeau Airport, typical year; elevation: 118 feet; air pressure: 14.633 psia
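Behind the chart is an hour-by-hour classification of typical-year weather against the 65°F chilled water supply. A minimal sketch of that logic; the approach-temperature threshold and the sample readings are assumptions, not the study's actual model:

```python
from collections import Counter

def classify_hour(wet_bulb_f: float, chws_f: float = 65.0,
                  approach_f: float = 7.0) -> str:
    """Bucket one weather hour by how much evaporative free cooling it allows."""
    if wet_bulb_f <= chws_f - approach_f:
        return "full free cooling"     # cooling towers alone meet the CHWS setpoint
    if wet_bulb_f < chws_f:
        return "partial free cooling"  # free cooling helps but needs topping up
    return "auxiliary cooling"         # ice storage or other supplemental source

hourly_wet_bulb = [28.0, 55.0, 61.5, 70.2]  # stand-in for 8,760 hourly readings
print(Counter(classify_hour(wb) for wb in hourly_wet_bulb))
# Counter({'full free cooling': 2, 'partial free cooling': 1, 'auxiliary cooling': 1})
```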
11. Supplemental Cooling:
Seasonal Ice Storage Pond System
14. Results
Design point at a glance (a quick arithmetic cross-check follows below):
• Cooling mix: 10% air cooled / 90% water cooled
• Supply temperatures: air cooled 29.4°C (85.0°F); water cooled 23.9°C (75.0°F)
• Hours of free cooling: 8,532 hrs/yr (97% of the year)
• PUE: 1.06
• Annual energy use: 74,543,000 kWh/yr, costing $4,323,000 at $0.058/kWh
• Mechanical cooling needed: 228 tons of additional load in extreme weather (wet bulb = 68.7°F); 0 hrs/yr once ice-storage carry-over is counted
• Water usage: 33,200,000 gal evaporation + 8,100,000 gal blowdown, costing $228,000 at $5.52/1,000 gal
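The table's dollar figures follow directly from its own unit costs, which makes for a useful sanity check:

```python
energy_kwh = 74_543_000
print(f"${energy_kwh * 0.058:,.0f}")       # $4,323,494  (~$4,323,000 as shown)

evaporation, blowdown = 33_200_000, 8_100_000
print(f"${(evaporation + blowdown) / 1000 * 5.52:,.0f}")  # $227,976 (~$228,000)

print(f"{8_532 / 8_760:.0%} of the year")  # 97% free cooling
```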
15. Potential Facility‐Related Cost Savings
Assumptions
• 5 MW IT Load
• 24x7 Operation
• Typical Local DC: 2.0 PUE; 10 MW consumption; $0.10/kWh power costs; $8.8M power bill
• Efficient Local DC: 1.35 PUE; 6.75 MW consumption; $0.10/kWh power costs; $5.9M power bill
• Ultra-Efficient DC: 1.06 PUE; 5.3 MW consumption; $0.05/kWh power costs; $2.3M power bill
Potential Cost Savings of 74% and Energy Savings of 47% Through Facility Changes Alone! (verified in the sketch below)
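The headline percentages fall straight out of the stated assumptions. A quick verification, with constants taken from the slide:

```python
HOURS = 8760  # 24x7 operation

def annual_bill(consumption_mw: float, usd_per_kwh: float) -> float:
    """Annual power bill for a constant facility draw."""
    return consumption_mw * 1000 * HOURS * usd_per_kwh

typical = annual_bill(10.0, 0.10)  # $8,760,000 (~$8.8M)
ultra = annual_bill(5.3, 0.05)     # $2,321,400 (~$2.3M)
print(f"cost savings:   {(typical - ultra) / typical:.1%}")  # 73.5%, ~74% using
                                                             # the rounded bills
print(f"energy savings: {(10.0 - 5.3) / 10.0:.0%}")          # 47%
```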
16. “Anyone who knows all the answers most
likely misunderstood the questions.”