HVAC for Data Centers
Sponsored by:
Join the discussion about this Webcast on Twitter at #CSEdatacenterHVAC
Today’s Webcast Sponsors:
Learning Objectives:
1. The audience will learn about codes and guidelines, such as ASHRAE 90.1: Energy Standard for Buildings Except Low-Rise Residential Buildings, and U.S. Green Building Council LEED v4
2. Attendees will learn the relationships between HVAC efficiency and power usage effectiveness (PUE)
3. Viewers will understand the advantages and drawbacks of using an elevated IT equipment inlet temperature
4. Viewers will learn how running IT equipment at partial load affects data center energy efficiency.
Bill Kosik, PE, CEM, BEMP, LEED AP BD+C,
HP Enterprise Business, Technology Services,
Chicago, Ill.
Tom R. Squillo, PE, LEED AP,
Environmental Systems Design Inc.,
Chicago, Ill.
Moderator: Jack Smith,
Consulting-Specifying Engineer and Pure Power,
CFE Media, LLC
Presenters:
Energy Code Requirements
for Data Centers
Tom R. Squillo, PE, LEED AP,
Environmental Systems Design Inc.,
Chicago, Ill.
Energy Code Requirements for Data Centers
International Energy Conservation
Code: IECC
• Adopted by eight states and many local
jurisdictions
• Written in enforceable code language
ASHRAE Energy Standard for
Buildings: ASHRAE 90.1
• Standard instead of code. Now written
in enforceable language so it can be
adopted locally
• Has more language specific to data
centers
California Building Energy Efficiency
Standards: Title 24
Energy Code Requirements for Data Centers
Where Do They Apply?
• Check local jurisdiction for specific
requirements or city energy codes
• Many local jurisdictions refer to
state codes
• IECC allows compliance with
ASHRAE 90.1 instead. This may
be an advantage in some instances
• Title 24 compliance required in
California
Current Energy Code Adoption Status (U.S. DOE)
Projected Energy Code Adoption by end of 2015
(U.S. DOE)
Energy Code Requirements for Data Centers
IECC – 2012:
• IECC delineates between simple
and complex systems. Stand-alone
DX AC units may fall under
the simple category. Only very
small units under 33,000 Btu/h
capacity do not require
economizers
• All cooling systems with some
form of common piping distribution
fall under the complex category
• All complex systems require
economizers
Energy Code Requirements for Data Centers
IECC – 2012:
C403.4.1 Economizers. Economizers shall comply with Sections
C403.4.1.1 through C403.4.1.4
• This section requires either an air or water economizer
C403.4.1.1 Design capacity. Water economizer systems shall be
capable of cooling supply air by indirect evaporation and
providing up to 100% of the expected system cooling load at
outdoor air temperatures of 50 F dry bulb/45 F wet bulb and
below.
• Exception for small systems below 33,000 Btu/h
• Unlike ASHRAE 90.1, IECC has no specific exceptions for
data centers that allow lower dry-bulb/wet-bulb temperatures.
Dry-coolers are not allowed.
Energy Code Requirements for Data Centers
ASHRAE 90.1-2010
• Data centers, considered “process
cooling,” were excluded from the
requirements of ASHRAE 90.1-2007
• The 2010 version eliminates the
“process cooling” exemption, and
adds specific language for computer
rooms
• To comply with the IECC-2012 code,
ASHRAE 90.1-2010 may be used
instead.
Energy Code Requirements for Data Centers
ASHRAE 90.1 – 2010:
6.5.1 Economizers. Each Cooling System that has a fan shall include
either an air or water economizer meeting the requirements of Sections
6.5.1.1 through 6.5.1.4.
For data centers, economizers are not required for:
• Small fan-cooling units less than 135,000 Btu/h or 65,000 Btu/h,
depending on climate zone
• Extremely hot and humid climate zones
• Buildings with no central CHW plant, in which the total computer room
cooling capacity is less than 250 tons
• Buildings with a central CHW plant, and the computer room cooling load
is less than 50 tons
• Where cooling towers are not allowed
• Addition of less than 50-ton computer room capacity to existing building
• Various essential facilities (national defense, emergency response, etc.)
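The exception list above can be read as a simple decision rule. The sketch below is a minimal, hypothetical reduction of it to a yes/no check; it simplifies the actual standard (climate-zone capacity thresholds, essential-facility definitions, the hot-and-humid climate exception) and is not a compliance tool.

```python
# Rough sketch of the ASHRAE 90.1-2010 data center economizer exceptions listed above,
# reduced to a yes/no check. All threshold handling is simplified for illustration.

def economizer_required(unit_btuh, climate_zone_threshold_btuh,
                        has_central_chw_plant, computer_room_tons,
                        cooling_towers_allowed=True, is_small_addition=False,
                        is_essential_facility=False):
    if unit_btuh < climate_zone_threshold_btuh:       # small fan-cooling unit exception
        return False
    if not cooling_towers_allowed or is_small_addition or is_essential_facility:
        return False
    if has_central_chw_plant:
        return computer_room_tons >= 50                # 50-ton threshold with a central CHW plant
    return computer_room_tons >= 250                   # 250-ton threshold without one

print(economizer_required(unit_btuh=200_000, climate_zone_threshold_btuh=135_000,
                          has_central_chw_plant=False, computer_room_tons=300))  # True
```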
Energy Code Requirements for Data Centers
ASHRAE 90.1 – 2010:
6.5.1.2 Water Economizers
6.5.1.2.1 Design capacity. Water economizer systems shall be
capable of cooling supply air by indirect evaporation and by
providing up to 100% of the expected system cooling load at
outdoor air temperatures of 50 F dry bulb/45 F wet bulb and below.
• For data centers, the requirement is relaxed slightly to allow
100% economizer cooling at 40 F dry bulb/35 F wet bulb and
below
• The code also allows dry-coolers for data centers, but they must
provide 100% economizer cooling at 35 F dry bulb
Energy Code Requirements for Data Centers
Important Changes to ASHRAE 90.1-2013:
6.5.1.2 Water Economizers
• For data centers, the outdoor temperature limits for 100% water
side economization are not a single condition, but are based on
the individual climate zones
6.5.1.6 Economizer Humidification Impact. Systems with
hydronic cooling and humidification systems designed to maintain
inside humidity at a dew-point temperature greater than 35 F shall
use a water economizer if an economizer is required by Section
6.5.1.
• This essentially bans air side economizer systems for most
data center systems if using a prescriptive approach.
Energy Code Requirements for Data Centers
Important Changes to ASHRAE 90.1-2013:
6.6 Alternative Compliance Path
• For data centers, the HVAC systems can comply by meeting
minimum PUE requirements instead of Section 6.5-Prescriptive
Path.
• The minimum PUE values are based on the climate zone and
range from 1.30 to 1.61.
• PUE calculation is based on the Green Grid recommendation
document dated May 2011.
• Option 1: Use peak PUE calculation (at 50% and 100% IT load), as sketched below
• Option 2: Use annual PUE, calculated with an approved hourly
energy analysis program (DOE-2, BLAST, EnergyPlus, etc.)
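To show how the Option 1 check might be tabulated, here is a minimal sketch. The equipment loads and loss fractions are illustrative assumptions, not values from the Green Grid document or the standard; only the 1.30 limit comes from the slide above (the most stringent climate-zone value).

```python
# Minimal sketch of the Option 1 check: design (peak) PUE at 50% and 100% IT load.
# The HVAC, loss, and lighting coefficients are assumptions for illustration only.

def pue(it_kw, hvac_kw, electrical_loss_kw, lighting_kw):
    """PUE = total facility power / IT power."""
    return (it_kw + hvac_kw + electrical_loss_kw + lighting_kw) / it_kw

design_it_kw = 2000.0               # 2 MW IT design load (matches the bin analysis later)
climate_zone_pue_limit = 1.30       # most stringent climate-zone limit cited on the slide

for fraction in (0.5, 1.0):         # Option 1: evaluate at 50% and 100% IT load
    it = design_it_kw * fraction
    hvac = 0.18 * it + 40.0         # assumed: load-proportional HVAC power plus fixed fan/pump power
    losses = 0.06 * it + 20.0       # assumed: UPS/distribution losses with a fixed component
    lighting = 15.0                 # assumed constant lighting load, kW
    print(f"{int(fraction*100)}% IT load: PUE = {pue(it, hvac, losses, lighting):.2f} "
          f"(limit {climate_zone_pue_limit})")
```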
Energy Code Requirements for Data Centers
Title 24 – 2013: Highlights
Specific to Data Centers
• Data centers are exempt from normal
economizer requirements
• Air or water economizer required
• Air economizer must provide 100%
economization at 55 F dry bulb
• Water economizer must provide 100%
economization at 40 F dry bulb/35 F
wet bulb
• Economizer exceptions exist for small
systems
• Nonadiabatic humidification (steam,
infrared) is prohibited
Energy Code Requirements for Data Centers
Title 24 – 2013: Highlights
Specific to Data Centers
• Variable speed supply fans required
for DX systems over 5 tons and all
CHW systems
• Supply fans shall vary airflow rate as a
function of actual load
• Containment required for data centers
with a design load exceeding 175
W/sq ft
• Containment exception for expansions
and racks below 1 W/sq ft
• Chilled water plants can have no more
than 300 tons of air-cooled chillers
Relationships Between HVAC
Efficiency and Power Usage
Effectiveness (PUE)
Bill Kosik, PE, CEM, BEMP, LEED AP BD+C,
HP Enterprise Business, Technology Services,
Chicago, Ill.
• Extreme regional variations in CO2 from electricity generation
• Determine appropriate balance of water and electricity usage
• Climate WILL impact HVAC energy use – select sites carefully
• Use evaporative cooling where appropriate
• Economizer strategy will be driven by climate characteristics
• Design power and cooling modularity to match IT growth
• Plan for power-aware computing equipment
• Use aisle containment or direct-cooled cabinets
• Design in ability to monitor and optimize PUE in real time
• Push for highest supply temperatures and lowest moisture levels
• Identify tipping point of server fan energy/inlet temperature
• Minimize data center footprint by using high-density architecture
Data Center / Climate / Synergies / Convergence
Levels of Optimization
Typical Data Center Cooling Strategies

Mechanical Cooling Options:
• Air-Cooled Chiller: coupled to chilled water (CHW) coil in air handling unit (AHU)
• Direct Expansion: packaged in AHU, or separate DX coil and remote condenser
• Water-Cooled Chiller: coupled with CHW coil in AHU; typical with open cooling tower (CT) and flat plate heat exchanger (HX)
• Air-Cooled Chiller: coupled to CHW coil in AHU; typical with closed CT

Cooling Configuration Options:
• Interior AHU: direct outside air with direct evaporative cooling
• Exterior AHU: indirect outside air and indirect evaporative cooling
• CRAH Unit: perimeter air delivery with chilled water coil
• In-Row Unit: close-coupled in rack containment system with module fans and CHW coil
• Rear Door HX: individual rack door CHW HX; passive system with no fan
• Overhead Coil: modular CHW coils; passive system with no fan

Air Side Economizers:
• System 1 – DEC: Direct outside air economizer with direct evaporative cooling
• System 2 – IEC: Recirculating (closed) air system with indirect evaporative cooling
• System 3 – IDEC: Recirculating (closed) and direct outside air system, 2-stage indirect-direct evaporative cooling
• System 4 – IOA+EC: Indirect air-to-air HX with direct evaporative cooling in secondary air

Water Side Economizers:
• System 5 – OCT+FP HX: Direct (open) evaporative cooling tower with flat plate HX
• System 6 – CCT w/Spray: Indirect (closed) cooling tower with spray
0.35 difference in PUE based
on climate and cooling system
type
PUE Varies with Climate
Impacts of Climate on Economization Strategy
This analysis shows the percent of total ton-hours that require mechanical cooling.
The graphs depict two systems with two water temperatures, 12°C and 18°C:
1. Air-cooled chiller with dry-cooled economization
2. Air-cooled chiller with evaporative-cooler economization
The indirect evaporative and indirect air cooling systems have the lowest
compressor energy used to cool the facility. The air-cooled DX and air-cooled
chiller systems have the highest compressor energy. The air-cooled chiller with
economizer is in the middle of the other options.
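A sketch of how a ton-hour tally like this might be set up follows. The wet-bulb bins, approach temperature, and load profile below are illustrative assumptions, not the data behind the Santiago graphs.

```python
# Sketch of a bin calculation for the fraction of ton-hours needing mechanical cooling.
# The weather bins and approach temperature are assumptions for illustration only.

# (outdoor wet-bulb °C, hours per year in that bin) -- assumed bins
wet_bulb_bins = [(2, 500), (6, 900), (10, 1400), (14, 1900), (18, 2100), (22, 1500), (26, 460)]

def mechanical_ton_hour_fraction(chw_supply_c, approach_c, load_tons):
    """Fraction of annual ton-hours the economizer cannot carry alone."""
    total = mech = 0.0
    for wb, hours in wet_bulb_bins:
        ton_hours = load_tons * hours                # constant IT load assumed
        total += ton_hours
        # The evaporative economizer produces water at roughly wet bulb + approach;
        # if that exceeds the required CHW supply temperature, compressors must run.
        if wb + approach_c > chw_supply_c:
            mech += ton_hours
    return mech / total

for chw in (12, 18):                                 # the two water temperatures from the slide
    pct = mechanical_ton_hour_fraction(chw_supply_c=chw, approach_c=4.0, load_tons=500)
    print(f"CHW supply {chw}°C: {pct:.0%} of ton-hours need mechanical cooling")
```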
Impacts of Climate on Economization Strategy
Santiago, CHL
HVAC System and PUE
Five HVAC options are shown. Each option was analyzed using the same input
parameters such as climate attributes, air and water temperatures, etc. Each
system performs differently based on the inherent strategies used to cool the
data center and provide the proper airflow. For each option, the annual HVAC
energy consumption and annual PUE are shown.
HVAC System and PUE
Two options are shown.
The only difference
between the two options
is the location of the data
centers. Everything else, including power, cooling, and ancillary systems, is
modeled identically. Month-by-month PUE values are shown, as well as monthly HVAC,
electrical losses, lighting
and other electrical
energy. The energy
consumption of the data
center located in
Singapore is markedly
higher based on the hot
and humid climate.
Elevated IT Equipment Temperatures
Tom R. Squillo, PE, LEED AP,
Environmental Systems Design Inc.,
Chicago, Ill.
Elevated IT Equipment Inlet Temperatures
Legacy Data Center Design
• Data center supply air set to
50 F to 55 F
• DX systems cycled on/off and
fought each other, with little
capacity control or
communication
• Chilled water temperatures of
40 F to 45 F
• No containment
• Wide variation of
temperatures entering IT
equipment
Elevated IT Equipment Inlet Temperatures
Why Were Low Supply Temperatures Needed?
• Design needed to take into account massive mixing of hot air with supply air
• Temperature of air entering IT equipment at tops of racks still acceptable
• Cold room allowed some ride-through if cooling failed
Why Was This Bad?
• Wastes Energy
– Too much airflow (low delta T)
– Inefficient chiller operation
– Limited economizer use
– Unnecessary dehumidification
• Hot spots
• Inconsistent temperature control
• Inconsistent humidity control
Source: 42u.com
Elevated IT Equipment Inlet Temperatures
ASHRAE Thermal Guidelines
• Recommended range for IT inlet conditions
– Temperature: 64.4 F to 80.6 F
– Humidity: 41.9 F to 59 F dew point or 60% RH
• Extended range for other classes of IT equipment
Source: 42u.com
Elevated IT Equipment Inlet Temperatures
Advantages of Elevated Temperatures
• Increased equipment efficiency
– 1.5% to 2.5% increase in chiller efficiency per degree increase in chilled water
temperature
– Increasing CHW supply temperature from 50 F to 60 F decreases chiller energy by up to
25%
– Actual increase depends on chiller type and selection
• Decreased unwanted dehumidification at air handling units
– Coil temperature never gets below dew point if CHW temperature is raised
– Eliminates condensate removal issues
• Additional economizer hours
– Actual advantage highly dependent on climate and system type
– Longer equipment life
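The chiller claim above can be checked with a quick calculation: if each 1°F rise in CHW supply temperature improves efficiency by the slide's 1.5% to 2.5%, compounding over a 10°F rise gives roughly 14% to 22% less chiller energy, consistent with the "up to 25%" figure when the gain is applied without compounding. The baseline kW/ton below is an assumption for illustration only.

```python
# Back-of-envelope check of the chiller savings claim for a 50°F -> 60°F CHW rise.

baseline_kw_per_ton = 0.60       # assumed chiller efficiency at 50°F CHW supply
delta_t = 60 - 50                # °F rise in CHW supply temperature

for gain_per_degree in (0.015, 0.025):
    kw_per_ton = baseline_kw_per_ton * (1 - gain_per_degree) ** delta_t
    savings = 1 - kw_per_ton / baseline_kw_per_ton
    print(f"{gain_per_degree:.1%}/°F -> {kw_per_ton:.3f} kW/ton, "
          f"~{savings:.0%} less chiller energy")
```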
Air Side Economizer System: Phoenix, 60 F SA Temperature
• Economization
available for 5,038
hr/yr
• Chillers off when
outside air
temperature is below
60 F (1,910 hr)
• Economization
available for 7,396
hr/yr
• Huge gain in
economizer hours
due to dry climate
• Chillers off when
outside air
temperature is below
75 F (4,294 hr)
Air Side Economizer System: Phoenix, 75 F SA Temperature
Air Side Economizer System: Charlotte, 60 F SA Temperature
• Economization
available for 5,300
hr/yr
• More hours of full
economization
(3,778 hr) than
Phoenix
Air Side Economizer System: Charlotte, 75 F SA Temperature
• Economization
available for 5,630
hr/yr
• Due to humid climate,
increase in
economizer hours is
minimal
• Chillers off when
outside air
temperature is below
75 F (5,300 hr)
Water Side Economizer System: Phoenix, 60 F SA Temperature
• Economization
available for 3,829
hr/yr
• Chillers off when
outside air wet bulb
temperature is below
33 F (55 hr)
Water Side Economizer System: Phoenix, 75 F SA Temperature
• Economization
available for 8,629
hr/yr
• Huge gain in
economizer hours
due to dry climate
• Chillers off when
outside wet bulb
temperature is below
53 F (4,420 hr)
Water Side Economizer System: Charlotte, 60 F SA Temperature
• Economization
available for 4,174
hr/yr
• Chillers off when
outside wet bulb
temperature is below
33 F (1,009 hr)
Water Side Economizer System: Charlotte, 75 F SA Temperature
• Economization
available for 8,334 hr/yr
• Water side economizer
has a huge increase in
economizer hours
because fewer hours are
locked out due to OA
humidity
• Chillers off when
outside wet bulb
temperature is below
53 F (4,513 hr)
Elevated IT Equipment Inlet Temperatures
Bin Data Energy Use Analysis
• Design Criteria
– Typical enterprise/co-location data center load
• 10,000 sq ft
• 200 W/sq ft
• 2 MW of total IT load
– Within ASHRAE recommended conditions
• Supply temperature = 75 F
• Return temperature = 95 F
• Space dew point temperature between 42 F and 59 F
• Efficient adiabatic humidification used for analysis
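These criteria imply a large design airflow. As a quick check, a 2 MW IT load rejected across the 20°F rise (75°F supply, 95°F return) requires roughly 316,000 cfm, using the standard-air sensible heat relation Q [Btu/h] = 1.08 × cfm × ΔT [°F].

```python
# Supply airflow implied by the bin-analysis design criteria (standard air density assumed).

it_load_kw = 2000.0                    # 2 MW total IT load (from the slide)
delta_t_f = 95.0 - 75.0                # return minus supply temperature

q_btuh = it_load_kw * 3412.14          # convert kW to Btu/h
cfm = q_btuh / (1.08 * delta_t_f)      # required supply airflow

print(f"Heat to remove: {q_btuh:,.0f} Btu/h")
print(f"Required airflow: {cfm:,.0f} cfm (~{cfm/10_000:.0f} cfm per sq ft of the 10,000 sq ft room)")
```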
Elevated IT Equipment Inlet Temperatures
Two systems and two climates were analyzed and compared:
System Options:
1. Direct Outside Air Economizer
• Multiple 500 kW capacity rooftop supply/exhaust AHUs
• OA Economizer control
• Air-cooled chiller system for supplemental cooling
2. Water-cooled Chillers with Cooling Towers
• High efficiency variable speed chillers (0.4 kW/ton)
• Induced draft cooling towers
• Plate and frame heat exchangers in series w/chillers
• CRAH units with high efficiency EC fans on raised floor
Climates:
1. Phoenix – Hot and dry
2. Charlotte – Warm and humid
Elevated IT Equipment Inlet Temperatures
[Bar chart: Phoenix annual energy consumption (kWh), broken down by chiller, CHW pump, CW pump, AHU fan, tower fan, supply fan, exhaust fan, and humidification, for the Direct Outdoor Air and WC Chillers/CT systems at 60°F, 65°F, 70°F, and 75°F supply air. MPUE values labeled on the chart: 1.34, 1.30, 1.26, 1.22, 1.20, 1.17, 1.15, and 1.13; MPUE decreases as supply air temperature increases.]
Elevated IT Equipment Inlet Temperatures
[Bar chart: Charlotte annual energy consumption (kWh), broken down by chiller, CHW pump, CW pump, AHU fan, tower fan, supply fan, exhaust fan, and humidification, for the Direct Outdoor Air and WC Chillers/CT systems at 60°F, 65°F, 70°F, and 75°F supply air. MPUE values labeled on the chart: 1.24, 1.22, 1.20, 1.19, 1.19, 1.17, 1.15, and 1.13; MPUE decreases as supply air temperature increases.]
Elevated IT Equipment Inlet Temperatures
Disadvantages of elevated temperatures
• Working conditions in hot aisle
– Hot aisle temperatures may rise above 100 F in some cases
– OSHA requirements may come into effect
– Think about temporary spot cooling for technology workers
• Temperature ratings of cables and sprinkler heads in hot aisle
– Some cabling rated for 40 C (104 F)
• Reduced ride-through time during cooling failures
– Critical server temperatures can be reached in minutes or seconds in some
cases
– Good containment will help reduce hot air recirculation, though may starve
servers if system airflow is interrupted
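The ride-through concern can be bounded with an order-of-magnitude estimate. The sketch below asks how long the contained air volume takes to warm from supply temperature to a critical inlet temperature if cooling stops; the room volume, air properties, and critical limit are assumptions, and real events also involve recirculation and equipment thermal mass, so this is optimistic at best.

```python
# Order-of-magnitude ride-through estimate after a cooling failure (assumed room and limits).

it_load_kw = 2000.0                   # 2 MW of IT load (from the bin analysis criteria)
room_air_volume_ft3 = 10_000 * 12     # assumed 10,000 sq ft room with a 12 ft ceiling
air_density_lb_ft3 = 0.075
air_cp_btu_lb_f = 0.24

air_mass_lb = room_air_volume_ft3 * air_density_lb_ft3
delta_t_f = 95.0 - 75.0               # assumed rise from supply temp to a critical inlet temp
heat_needed_btu = air_mass_lb * air_cp_btu_lb_f * delta_t_f
heat_rate_btu_per_min = it_load_kw * 3412.14 / 60

ride_through_min = heat_needed_btu / heat_rate_btu_per_min
print(f"Air mass: {air_mass_lb:,.0f} lb; ride-through ≈ {ride_through_min:.1f} minutes")
```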
Elevated IT Equipment Inlet Temperatures
Conclusions:
• Increasing IT inlet temperatures can help reduce overall
energy use substantially by:
– Increasing chiller efficiency (a 10-degree rise can increase
efficiency by up to 25%)
– Reducing humidification requirements
– Increasing economizer hours, often dramatically
• Be careful of very high temperature conditions in the hot
aisles affecting worker comfort and equipment ratings
• Advantages highly dependent on climate and system type
– Look at the psychrometric chart for economizer and lockout hours
– Air side and water side economizer systems will be affected
differently
Partial Loads
Bill Kosik, PE, CEM, BEMP, LEED AP BD+C,
HP Enterprise Business, Technology Services,
Chicago, Ill.
PUE Sensitivity to Low IT Loads
How running
IT equipment
at partial load
affects data
center energy
efficiency.
• Multiple systems allow for
growth without over-
provisioning
• Modularity lowers fan
energy and increases
compressor effectiveness
• Modularity is not the same
as spreading the load
across multiple pieces of
equipment
Efficiency Through Modularity
Electrical losses, as a percentage of the IT load, will increase as the IT load decreases.
This increase must be included in the cooling load at
different loading points.
Efficiency Through Modularity
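The sketch below illustrates why PUE is sensitive to low IT loads: fixed overheads (lighting, controls, base fan and pump power) and load-dependent electrical losses do not shrink in proportion to the IT load, and the cooling plant must also reject those losses. All coefficients are assumptions for illustration only.

```python
# Sketch of PUE sensitivity to partial IT load (assumed fixed and proportional overheads).

def facility_pue(it_fraction, design_it_kw=2000.0):
    it = design_it_kw * it_fraction
    electrical_loss = 0.03 * it + 50.0            # assumed: UPS/distribution loss with a fixed part
    hvac = 0.15 * (it + electrical_loss) + 60.0   # cooling must also reject the electrical losses
    fixed = 25.0                                  # lighting and miscellaneous, kW
    return (it + electrical_loss + hvac + fixed) / it

for frac in (1.0, 0.75, 0.5, 0.25):
    print(f"IT at {frac:.0%} of design: PUE = {facility_pue(frac):.2f}")
```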
Electrical System Topology and System Efficiency
As the cooling load is distributed over an increasing number of chillers, the
overall power (and energy) grows. To maintain the highest efficiency, the
chillers should be run as close as possible to their peak efficiency point.
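A minimal sketch of that staging decision follows: run only as many chillers as needed so each operates near its best-efficiency point, rather than spreading the load thinly. The part-load kW/ton curve below is a generic assumption, not data for any specific chiller.

```python
# Sketch of chiller staging to stay near the peak-efficiency point (assumed part-load curve).

def kw_per_ton(load_fraction):
    """Assumed part-load curve: best near ~80% load, worse when lightly loaded."""
    return 0.55 + 0.9 * (load_fraction - 0.8) ** 2

def plant_kw(total_tons, chillers_running, chiller_capacity_tons=400):
    per_chiller = total_tons / chillers_running
    frac = per_chiller / chiller_capacity_tons
    if frac > 1.0:
        return None                        # not enough capacity online
    return total_tons * kw_per_ton(frac)

load = 700                                 # tons of cooling load
for n in (2, 3, 4, 5):
    kw = plant_kw(load, n)
    if kw is not None:
        print(f"{n} chillers at {load/n:.0f} tons each: {kw:.0f} kW")
```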
Servers Are More Efficient but Use More Power
Trends in Server Turn-Down Ratio
Server Modularity
• 45 hot-plug cartridges: compute, storage, or combination; x86, ARM, or accelerator
• Single-server = 45 servers per chassis
• Quad-server = 180 servers per chassis (future capability), that is, 1,800 servers per cabinet or 45 kW
• Dual low-latency switches: switch module (45 x 1 GB downlinks)
The PUE values are predicted using data center energy use simulation techniques. Many assumptions are
made which affect the predicted energy use and PUE. These ranges are meant to be indicators of the PUE
envelope that might be expected based on sub-system efficiency levels, geographic location, and methods
of operation. Detailed energy use simulation is required to develop more granular and accurate analyses.
Input data for "high" PUE case (Singapore, SGP):
• Use water economizer: NO
• Use adiabatic cooling: NO
• Lighting (W/sq ft): 1.50
• Misc power (% of IT): 6.0%
• Electrical system loss (%): 10.0%
• Air-cooled evap temp (°F): 65.0
• Fan pressure: 2.0

Input data for "low" PUE case (Helsinki, FIN):
• Use water economizer: YES
• Use adiabatic cooling: YES
• Lighting (W/sq ft): 1.00
• Misc power (% of IT): 4.0%
• Electrical system loss (%): 8.5%
• Air-cooled evap temp (°F): 65.0
• Fan pressure: 2.0
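As a rough illustration of how input parameters like these roll up into a PUE estimate, the sketch below combines the lighting, misc power, and electrical loss fractions from the table with a placeholder HVAC fraction. The HVAC fractions and the 200 W/sq ft IT density are assumptions standing in for the full hourly simulation.

```python
# Rough roll-up of sub-system overheads into a PUE estimate (HVAC fractions assumed).

def rough_pue(lighting_w_per_sf, misc_pct_of_it, elec_loss_pct, hvac_pct_of_it,
              it_w_per_sf=200.0):
    it = it_w_per_sf
    overhead = (lighting_w_per_sf
                + misc_pct_of_it * it
                + elec_loss_pct * it
                + hvac_pct_of_it * it)
    return (it + overhead) / it

high = rough_pue(1.50, 0.060, 0.100, hvac_pct_of_it=0.45)  # Singapore-like case, HVAC share assumed
low = rough_pue(1.00, 0.040, 0.085, hvac_pct_of_it=0.15)   # Helsinki-like case, HVAC share assumed
print(f"'high' case PUE ≈ {high:.2f}, 'low' case PUE ≈ {low:.2f}")
```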
Codes and Standards References from Today’s Webcast
• HVAC: ASHRAE 90.1: Energy Standard for Buildings
Except Low-Rise Residential Buildings
• HVAC: ASHRAE 62.1, 62.2, and Air Movement
• International Energy Conservation Code
• U.S. Green Building Council LEED v4
• California Building Energy Efficiency Standards: Title 24
Bill Kosik, PE, CEM, BEMP, LEED AP BD+C,
HP Enterprise Business, Technology Services,
Chicago, Ill.
Tom R. Squillo, PE, LEED AP,
Environmental Systems Design Inc.,
Chicago, Ill.
Moderator: Jack Smith,
Consulting-Specifying Engineer and Pure Power,
CFE Media, LLC
Presenters:
Thanks to Today’s Webcast Sponsors:
Webcasts and Research
• Modular data center design
• HVAC: ASHRAE 62.1, 62.2, and Air
Movement
• 2013 HVAC, BAS state of the industry
report
HVAC for Data Centers
Sponsored by:
Join the discussion about this Webcast on Twitter at #CSEdatacenterHVAC