Aurora HPC energy efficiency
Energy hungry datacenters

• Electricity used by data centers doubled between 2000 and 2005 alone!
• Servers are becoming more powerful, denser and more numerous, while storage
  keeps growing larger and larger
• Availability needs are on the rise
                     ALL OF THE ABOVE EQUALS
• MORE power consumed by the servers AND more consumed for
  cooling
Challenges
• Energy-demanding servers pose several challenges:
   –   Cost
   –   Energy waste
   –   Power availability
   –   Cooling
   –   Hot spots
   –   Carbon footprint
Motivations for energy efficiency
Quote from Meijer Huber, LRZ
• Energy Efficiency and SuperMUC
• Motivation
   • Academic and governmental institutions in Bavaria use electrical
     energy from renewable sources
   • We currently pay 15.8 cents per kWh
   • We already know that we will have to pay at least 17.8 cents per
     kWh in 2013
Motivations for energy efficiency
quote from Steve Hammond, NREL
Motivation
•   Data centers are highly energy-intensive facilities
•   10-100x more energy intensive than an office.
•   Server racks well in excess of 30kW.
•   Surging demand for data storage.
•   ~3% of U.S. electricity consumption.
•   Projected to double in next 5 years.
•   Power and cooling constraints in existing facilities.
Sustainable Computing: Why should we care?
•   Carbon footprint.
•   Water usage.
•   Mega$ per MW year.
•   Cost OpEx > IT CapEx!
Thus, we need a holistic approach to sustainability and TCO for the
entire computing enterprise, not just the HPC system
PUEs in various data centers
Source: Intel

Global bank's best data center (of more than 100): PUE 2.25 (air)
EPA Energy Star average: PUE 1.91 (air/liquid)
Intel average: PUE >1.80 (air)
ORNL: PUE 1.25 (liquid)
Google: PUE 1.16 (liquid coils, evaporative tower, hot aisle containment)
Leibniz Supercomputing Centre (LRZ): PUE 1.15 (direct liquid)
National Center for Atmospheric Research (NCAR): PUE 1.10 (liquid)
Yahoo Lockport (PUE declared in project): PUE 1.08 (free air cooling + evaporative cooling)
Facebook Prineville: PUE 1.07 (free cooling, evaporative)
National Renewable Energy Laboratory (NREL): PUE 1.06 (direct liquid + evaporative tower)
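For reference, PUE is the ratio of total facility power to IT power. A minimal sketch (Python, illustrative only; the 1 MW IT load is an assumption taken from the TCO example later in this deck) of what the table values imply:

  def pue(total_facility_kw, it_kw):
      """Power Usage Effectiveness: total facility power divided by IT power."""
      return total_facility_kw / it_kw

  def overhead_kw(it_kw, pue_value):
      """Non-IT power (cooling, conversion, lighting) implied by a given PUE."""
      return it_kw * (pue_value - 1.0)

  it_load_kw = 1000.0  # assumed 1 MW of IT equipment
  for p in (2.25, 1.80, 1.25, 1.06):
      print(f"PUE {p:.2f}: {overhead_kw(it_load_kw, p):6.0f} kW of overhead")

So at PUE 2.25 the facility spends 1.25 kW on overhead for every kW of IT, while at PUE 1.06 only about 60 W per IT kW is spent outside the servers.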
WAYS TO IMPROVE PUE AND
ENERGY EFFICIENCY
Ways to improve PUE and energy efficiency
Total vs local energy optimization
Ways to improve PUE and energy efficiency
Acting at different levels

IT equipment level
• Increasing processor efficiency
• Increasing memory efficiency
• Increasing storage efficiency
• Optimizing networks (e.g. 3D-torus vs fat-tree networks)
• Optimizing algorithms
• Optimizing software (e.g. data locality…)
• Optimizing job scheduling to maximize processor utilization
Data center level
• About 50% of the energy entering a data centre goes into the «house load»,
  i.e. it is used for ancillary activities not directly related to the IT equipment
  (see the sketch below)
• Reducing the house load brings a considerable improvement in the data
  centre's energy efficiency
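The 50% figure maps directly onto PUE. A minimal check (Python, same illustrative spirit as above):

  def pue_from_house_load(house_load_fraction):
      """If a fraction f of incoming energy is house load, IT gets (1 - f), so PUE = 1 / (1 - f)."""
      return 1.0 / (1.0 - house_load_fraction)

  print(pue_from_house_load(0.50))  # 50% house load -> PUE = 2.0
  print(pue_from_house_load(0.05))  # 5% house load  -> PUE ~ 1.05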
3 main opportunity areas for energy efficiency

1. IT equipment: maximize Flops/Watt, maximize efficiency
2. Data center: reduce house load, reduce cooling energy consumption
3. Data center or ecosystem: optimize power conversion, reuse thermal energy
MAXIMIZE EFFICIENCY (FLOPS/WATT)
Energy efficient design
• Eurora has been designed using standard components, but making choices
  for the best possible energy efficiency
• Eurora benefits from Eurotech's experience in progressively raising the
  power conversion chain efficiency of the Eurotech Aurora system from 89% to 97%
The approach has been:
• Choice of the most efficient components on the market, that is, components
  that minimize energy consumption while providing the same functionality
  and performance
• Choice of the best «working points», at the top of the components'
  efficiency curves
• Water cooling, to lower the working temperature of components, maximize
  their efficiency and eliminate fans
Gain in DC/DC conversion efficiency
• The chosen DC/DC converters gain over 2% in efficiency, from 95.5% to 98%
  (see the sketch below)
• Choice of the optimal current (I) to work at the top of the conversion curves

[Figure: efficiency curves of the existing DC/DC conversion vs the new upgraded DC/DC conversion]
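A quick, illustrative calculation of what the 95.5% to 98% step is worth (Python; the 1 MW downstream load is an assumption, not a figure from this deck):

  def conversion_loss_kw(load_kw, efficiency):
      """Power dissipated in the DC/DC stage while delivering load_kw downstream."""
      return load_kw * (1.0 / efficiency - 1.0)

  load_kw = 1000.0                  # assumed 1 MW delivered to the boards
  old_eff, new_eff = 0.955, 0.98
  saved_kw = conversion_loss_kw(load_kw, old_eff) - conversion_loss_kw(load_kw, new_eff)
  print(f"loss at 95.5%: {conversion_loss_kw(load_kw, old_eff):.0f} kW")   # ~47 kW
  print(f"loss at 98.0%: {conversion_loss_kw(load_kw, new_eff):.0f} kW")   # ~20 kW
  print(f"saved: {saved_kw:.0f} kW")                                       # ~27 kW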
Water cooling and efficiency
178 nodes – AMD Opteron 6128HE CPUs (Magny Cours) – 16 GB RAM. Measurements taken by LRZ.

• With air cooling the CPUs operate at about 5 °C below the maximum case
  temperature
• Normal operation of a water-cooled server is with water at 20 °C, which is
  about 40 °C below the maximum case temperature
Water cooling = No fans, Low noise

• Fans consume 5-8% of peak power… per se a small contribution, but the SUM of
  all the contributions described gives a considerable positive delta in
  energy efficiency
Cold-water cooling
• Cold water often needs chillers to be generated, so it impacts the PUE
  negatively
• Ideally, cold water should come from natural sources such as lakes and
  rivers, or from natural sources of cold such as cold climates, high
  mountains or geothermal exchange
• Eurotech can design solutions that accommodate the use of natural sources
  of cooling
ADDITIONAL EFFICIENCY!!!
REDUCE HOUSE LOAD
Efficiency and economics - Energy use in
data centers




Data from APC
Efficiency and economics - “typical” power
breakdown in datacenters




                                   Data from APC
Reducing cooling energy
Ways to reduce cooling energy consumption
• Air cooling optimization (hot and cold aisle containment…)
• Free cooling: avoid compressor-based cooling (chillers) by using cold air from
  outside the data center. Possible only in cold climates, or seasonally
• Free cooling with heat exchangers (dry coolers). Dry coolers consume much less
  energy than chillers (see the sketch after this list)
• Liquid cooling, to increase cooling efficacy and reduce the power absorbed by chillers
• Liquid cooling with free cooling: the liquid is cooled not by chillers but by dry coolers
• Hot liquid cooling allows the use of dry coolers all year round, even in warm climates
• Liquid cooling using a natural source of cooling
• Alternative approaches: spray cooling, oil submersion cooling

Eurotech Aurora approach:
• Direct hot water cooling with no chillers, only dry coolers
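A rough, hedged comparison of why dry coolers win (Python). The COP figures below are generic engineering assumptions, not numbers from this deck: a compressor-based chiller removes a few kW of heat per kW of electricity, while a dry cooler only spends fan and pump power.

  def cooling_power_kw(heat_kw, cop):
      """Electrical power needed to remove heat_kw of heat; for a dry cooler,
      'COP' here is simply heat removed per unit of fan/pump power."""
      return heat_kw / cop

  heat_kw = 1000.0        # 1 MW of IT heat, as in the rest of the deck
  chiller_cop = 4.0       # assumed typical compressor-based chiller
  dry_cooler_cop = 25.0   # assumed typical dry cooler with hot-water loops
  print(f"chiller:    ~{cooling_power_kw(heat_kw, chiller_cop):.0f} kW")    # ~250 kW
  print(f"dry cooler: ~{cooling_power_kw(heat_kw, dry_cooler_cop):.0f} kW") # ~40 kW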
Aurora liquid cooling infrastructure

[Diagram: external circuit with dry cooler, filter, heat exchanger and pump feeding the internal cooling loop, which serves Loop #1 … Loop #6 … Loop #12]
• Pumps consume energy, but they can control the flow rate
• Increasing the flow rate is less energy demanding than switching on a chiller

[Diagram: chillers / dry coolers feeding LOOP #1, coupled to LOOP #2 through a heater and a by-pass]
Advantages of the Eurotech approach

Hot liquid cooling → no chillers → save energy
• Avoid/limit expensive and power-hungry chillers with the only cooling
  method that almost always requires dry coolers only
• Minimize PUE and hence maximize energy cost savings
• Reuse thermal energy for heating, air conditioning, electrical energy or
  industrial processes
• “Clean” free cooling: no dust, no filters needed to filter external air

Direct liquid cooling via cold plates → effective cooling
• Allows very limited heat spillage
• Maximizes the effectiveness of cooling, allowing hot water to be used
  (up to 55 °C inlet water)

Comprehensive → more efficiency
• Cools every source of heat in the server (including the power supply)
Optimize power conversion
Standard power distribution steps




 Data from Intel
Moving towards DC reduces steps in power
conversion




 Data from Intel
Aurora power distribution

[Diagram: 230 V AC input (with optional UPS) → 48 Vdc distribution (97% efficiency) → 10 V at board level (98% efficiency)]
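Assuming the two labelled stages are in series, the end-to-end conversion efficiency is simply their product (a minimal sketch, Python):

  stage_efficiencies = [0.97, 0.98]   # 230 V AC -> 48 Vdc, then 48 Vdc -> board voltages
  chain = 1.0
  for eta in stage_efficiencies:
      chain *= eta
  print(f"end-to-end conversion efficiency: {chain:.3f}")   # ~0.951
  # At 1 MW delivered to the boards, roughly 1000 * (1/chain - 1) ~ 52 kW is lost in conversion.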
ADDITIONAL GREEN!!!
THERMAL ENERGY RECOVERY
Minimize waste: thermal energy re-use

[Diagram: Three Stages Cooling + Heat Recovery. Computing system racks (rack 1, rack 2 … rack #n) are cooled in series at 20 °C, 25 °C and 30 °C; liquid-to-liquid heat exchangers couple the internal loop to the recovery circuit. Of the 1 MW of IT heat, 0.13 MW is removed through the heat exchanger and 0.87 MW is recovered for re-use as the water rises from 30 °C to 55 °C.]

PUE < 1 !!   Thermal energy re-use
Minimize waste: thermal energy re-use


• The ability to effectively re-use the waste heat from the outlets increases
  with higher temperatures.
• Outlet temperatures starting from 45 °C can be used to heat buildings;
  temperatures starting from 55 °C can be used to drive adsorption chillers.
• Higher temperatures may even allow for trigeneration, the combined
  production of electricity, heating and cooling.
• Warm water can also be used in industrial processes.
Thermal energy recovery and swimming pools
      A swimming pool 50 m long, 4 lanes, 2 m deep, that loses 2 °C per day if not heated.
      The heat exchange system has 90% efficiency.

      Volume of water = 2.50 m x 4 x 50 m x 2 m = 1000 m^3 = 10^6 litres = 10^6 kg
      Specific heat of water = 4186 J/(kg K)
      Water target temperature = 28 °C

      How much power do I need to keep the swimming pool at 28 °C?

      P(W) = Q(J)/t(s) = m(kg) * c(J/(kg K)) * deltaT(K) / t(s)
           = 10^6 kg * 4186 J/(kg K) * 2 K / (24*60*60 s) = 96,900 W = 96.9 kW

      So we need a supercomputer generating roughly 110 kW.
      Assuming an energy efficiency of 900 Mflops/W…
      …to heat the swimming pool we would need to install a 100 Tflop system.
      That is, one Eurotech Aurora HPC 10-10 rack.
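A short sketch reproducing the arithmetic above (Python; the 90% recovery efficiency and 900 Mflops/W are the assumptions stated in the slide):

  volume_m3 = 2.5 * 4 * 50 * 2          # lane width x lanes x length x depth = 1000 m^3
  mass_kg = volume_m3 * 1000.0          # ~1000 kg per m^3 of water
  c_water = 4186.0                      # specific heat of water, J/(kg*K)
  delta_t_per_day = 2.0                 # temperature lost per day, K
  seconds_per_day = 24 * 60 * 60

  power_w = mass_kg * c_water * delta_t_per_day / seconds_per_day
  print(f"heating power needed: {power_w / 1000:.1f} kW")                      # ~96.9 kW

  recovery_efficiency = 0.90
  it_heat_w = power_w / recovery_efficiency
  print(f"IT heat required at 90% recovery: {it_heat_w / 1000:.0f} kW")        # ~108 kW
  print(f"system size at 900 Mflops/W: {it_heat_w * 900e6 / 1e12:.0f} Tflops") # ~97 Tflops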
IMPACT ON TOTAL COST OF
OWNERSHIP
Total cost of ownership
Total cost of ownership
A comparison between datacenters: initial cost
Both datacenters with roughly 1 MW of IT equipment installed

Values in k$: optimal air-cooled datacenter (PUE = 1.8) / hot-liquid-cooled datacenter (PUE = 1.05)
Cost of IT (HW and SW): $8,200 / $8,200
Facilities (building, raised floor, fire system...): $960 / $410
Racks and rack management software: $220 / $100
Liquid cooling: $0 / $620
Total for network equipment: $710 / $710
Cooling infrastructure/plumbing: $4,280 / $580
Electrical: $5,710 / $3,880
TOTAL INVESTMENT COST: $20,080 / $14,500
Total cost of ownership
A comparison between datacenters: annualized TCO
Both datacenters with roughly 1 MW of IT equipment installed

Values in k$: optimal air-cooled datacenter (PUE = 1.8) / hot-liquid-cooled datacenter (PUE = 1.05)
Cost of energy: $2,690 / $1,060
Retuning and additional CFD: $5 / $0
Total outage cost: $440 / $370
Preventive maintenance: $150 / $150
Annual facility and infrastructure maintenance: $460 / $220
Lighting: $4 / $2
Annualized 3-year capital costs: $3,480 / $3,440
Annualized 10-year capital costs: $1,420 / $720
Annualized 15-year capital costs: $100 / $40
ANNUALIZED TCO: $8,749 / $6,002
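Most of the annual gap comes from the energy line, which scales with PUE. A hedged sketch of that relationship (Python; the tariff and a constant 1 MW average IT load are assumptions, so the figures will not match the table exactly):

  def annual_energy_cost_kusd(it_load_mw, pue, usd_per_kwh, hours_per_year=8760):
      """Annual electricity cost in k$: facility power = IT load x PUE, converted to kWh."""
      kwh = it_load_mw * pue * hours_per_year * 1000.0
      return kwh * usd_per_kwh / 1000.0

  for pue in (1.8, 1.05):
      cost = annual_energy_cost_kusd(it_load_mw=1.0, pue=pue, usd_per_kwh=0.15)
      print(f"PUE {pue}: ~${cost:,.0f}k per year")   # ~$2,365k vs ~$1,380k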
GREEN considerations
A comparison between datacenters



Optimal air-cooled datacenter (PUE = 1.8) / hot-liquid-cooled datacenter (PUE = 1.05)

Total tons of CO2 in 5 years: 26,600 / 15,500
Tons of CO2 saved: 0 / 11,070

CO2 savings
CO2 equivalent
          11,000 tons of saved CO2 are equivalent to:

          1,500 cars that do not circulate for 1 year
          11,500 adult trees saved
          15 km2 of rain forest left untouched
Editor's Notes

  1. NREL = National Renewable Energy Laboratory
  2. This slide shows that to get PUEs < 1.8, dedicated infrastructure is needed to implement either free cooling or liquid cooling