
Liquid cooling and heat re-use experiences

Gert Svensson, Deputy Director, PDC, KTH

November 11, 2015


Contents

• PDC Overview

• Heat re-use at PDC

• Liquid cooling

• Case studies


KTH and PDC

• KTH Royal Institute of Technology is the largest technical university in Sweden.

• PDC Center for High Performance Computing, located at KTH, is the largest supercomputer centre in Sweden.


PDC’s Mission

• Operation of world-class:

• Supercomputers

• Data storage & long-term data archive

• Provide world-class:

• User support

• Training

• Education

• mainly to Swedish academia

• Take part in world-class research in the area


Funding

• Part of SNIC (6 centers in a meta center)

• Most funding from the Swedish Research Council (VR) and KTH

• Many EU projects

• Some industrial collaboration (mainly Scania)

• Note: Academic use is free


How is a supercomputer built today?

Cluster → Nodes (similar to a computer without a keyboard and monitor) → Processors → Cores


PDC Main System: Cray XC40 Beskow


Cooling at PDC: background

• Listed buildings

• KTH is located between the Royal National City Park and the city centre

• District cooling since 2004

  – Environmentally friendly (sea water and central heat re-use)

  – Has been fairly expensive, but this has changed recently

  – Backup cooling needed


Energy consumption at PDC


First Heat Re-Use Project, 2009–2014

• Cray XE6, 600 kW

• Retrofitted an air-cooled system with water cooling

• Heating of one building (the chemistry lab)

• Heating the incoming air to the building


Collecting a high enough temperature

• Our computer room air conditioning (CRAC) units take in water at 8 °C and return it at 18 °C

• Room temperature is normally around 20 °C

• The CRAC return temperature is not high enough for heat re-use

• The Cray XE6 came in two versions:

  • Water cooled (too low temperature)

  • Air cooled (air output 35–40 °C)

• We decided to purchase the air-cooled version and design the cooling ourselves
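A rough sense of scale for the air-side design (my own back-of-the-envelope sketch in Python, not from the talk; the temperatures are the slide's figures, the material properties are textbook values):

# Sketch: air flow needed to carry the Cray XE6's ~600 kW when air enters
# at roughly room temperature (~20 C) and leaves at ~38 C, using the heat
# balance Q = m_dot * c_p * dT.

def air_flow_for_load(load_kw, t_in_c, t_out_c):
    """Return the required air volumetric flow in m^3/s."""
    cp_air = 1.005   # kJ/(kg*K), specific heat of air near room conditions
    rho_air = 1.2    # kg/m^3, approximate air density
    mass_flow = load_kw / (cp_air * (t_out_c - t_in_c))  # kg/s
    return mass_flow / rho_air

print(f"{air_flow_for_load(600, 20, 38):.0f} m^3/s")  # roughly 28 m^3/s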


Cooling the computer

• Hot air is collected by industrial air-water heat exchangers

• Custom-made chimneys connect the computer racks to the heat exchangers


[Diagram: server racks take in air at 16–21 °C and exhaust it at 35–44 °C into the air-water heat exchangers; the water enters at 16 °C and leaves at ~30 °C. The 16 °C supply air comes from the computer room air conditioners (CRAC).]
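The water-side counterpart (again an illustrative sketch, using the diagram's 16 °C in / ~30 °C out) shows why water is the better heat carrier: about 10 kg/s of water replaces roughly 30 kg/s of air:

# Sketch: water mass flow needed to carry 600 kW with water heated from
# 16 C to ~30 C (Q = m_dot * c_p * dT).

def water_flow_for_load(load_kw, t_in_c, t_out_c):
    """Return the required water mass flow in kg/s."""
    cp_water = 4.186  # kJ/(kg*K), specific heat of water
    return load_kw / (cp_water * (t_out_c - t_in_c))

flow = water_flow_for_load(600, 16, 30)
print(f"{flow:.1f} kg/s (~{flow * 3.6:.0f} m^3/h)")  # ~10 kg/s, ~37 m^3/h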


Building to heat

• The nearby chemistry lab was undergoing renovations

• Mainly heated by air

• Needs a large amount of air because of the chemicals


Transporting water to the chemistry lab

• Use existing pipes from the district cooling system

• Change direction of flow when in re-use mode

[Diagram: PDC heat-recycling loop to the Chemistry Building (winter mode)]


Water distribution

• One heat exchanger for re-use

• One for district cooling

• If everything fails, use tap water

[Diagram: district cooling system and the PDC computer hall. The Cray and CRAC loops connect to the district cooling network via external heat exchangers; indicative temperatures are 6–8 °C on the external supply, 16–19 °C returns, and up to 33.5 °C on the re-use side, with tap water as the last-resort backup.]


IRL


Lessons learned

• It is possible to build something custom-made, but it is a lot of work

• Keep it as simple as possible

• Re-use existing infrastructure

• The computer industry and the building industry have different perspectives

• Requires a cold climate, but:

  • Heat can be turned into cold using absorption chillers


Evaluation, Winter 2012–2013

• Saved 540 MWh of district cooling

• Saved 270 MWh of district heating

• Saved 50 kEuro per season

• Saved 50,000 kg of CO2 per season

[Chart: recovered power (kW) and outdoor temperature (°C), 25 Oct 2012 – 28 Mar 2013]
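A quick consistency check of these evaluation figures (the per-MWh values below are implied by the slide, not stated in it):

# Implied unit values from the winter 2012-2013 evaluation.
saved_cooling_mwh = 540    # district cooling avoided
saved_heating_mwh = 270    # district heating avoided
saved_eur = 50_000
saved_co2_kg = 50_000

total_mwh = saved_cooling_mwh + saved_heating_mwh
print(f"Implied value: {saved_eur / total_mwh:.0f} EUR/MWh")    # ~62 EUR/MWh
print(f"Implied CO2:   {saved_co2_kg / total_mwh:.0f} kg/MWh")  # ~62 kg/MWh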


Second heat re-use project

• New Cray XC40, 700 kW

• Water cooled

• Central heat re-use facility with heat pump


Cray XC40 water cooling (hybrid cooling)

• The CPU boards are air cooled, but the air is cooled by heat exchangers and fans in the adjacent racks

• A simple solution, but with a lower water temperature and some energy lost to the fans (more like cooling doors)


Cray XC40 water cooling


Central heat re-use at KTH


Experience

• Supercomputers are excellent heaters

• Almost constant heat output

• Heat re-use really works and pays off:

  • Energy delivered to KTH reduced by 23,700 MWh/year (25 %)

  • CO2 reduced by 890,000 kg/year (44 %)
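A one-line sanity check of the 25 % figure (my own arithmetic, not from the talk):

# If 23,700 MWh/year is a 25 % reduction, KTH's implied baseline is:
print(f"{23_700 / 0.25:,.0f} MWh/year")  # 94,800 MWh/year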


Why liquid cooling?

Liquid cooling is being adopted for a variety of reasons:

• Silence (no fans)

• Resilience (more stable, efficient cooling)

• Efficiency (fans replaced by pumps)

• Environment & cost

  – Higher temperature -> more free cooling

  – Higher temperature -> possibility for heat re-use

• Total Cost of Ownership (TCO)

• Hostile or complex environments (not depending on clean air)


Different types of cooling


• Air Cooling

• Indirect Liquid Cooling: air is passed through the servers and then through a rear-door or in-row air-water heat exchanger.

• Direct Liquid Cooling: liquid is taken directly to some components; fans are still needed.

• Total Liquid Cooling: all components are cooled directly by liquid, so air-side losses are minimised; inside the DC, the system has no fans and breathes no air.

Slide courtesy of Iceotope


ASHRAE Water Cooling

American Society of Heating, Refrigerating & Air-Conditioning Engineers


Direct Liquid Cooling Widely Available

• Cooling with liquid more or less close to the CPU, memory, etc.

• Advantages:

  • Little extra energy used for cooling (just a liquid pump)

  • Efficient cooling: possibility to run the CPU faster

    • Depends on how close to the CPU

  • Higher liquid temperature than when going via air

    • Free cooling for a larger part of the year (see the sketch below)

    • Possibility for heat re-use
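To make the free-cooling point concrete, here is an illustrative estimate (entirely my assumption, not from the talk): it counts the hours per year when outdoor air is cold enough to cool the return liquid directly, using a crude sinusoidal model of Stockholm's annual temperature (mean ~7 °C, amplitude ~10 °C) and a 5 K heat-exchanger approach:

# Sketch: fraction of the year with free cooling, for different coolant
# return temperatures, under a simple sinusoidal outdoor-temperature model.
import math

def free_cooling_fraction(coolant_return_c, approach_c=5.0):
    """Fraction of hours when outdoor temp <= coolant return minus approach."""
    limit = coolant_return_c - approach_c
    cold_hours = sum(
        1 for h in range(8760)
        if 7 + 10 * math.sin(2 * math.pi * (h / 8760 - 0.25)) <= limit
    )
    return cold_hours / 8760

for t_c in (18, 30, 45):  # CRAC-grade vs warm-water vs hot-water cooling
    print(f"return {t_c} C -> free cooling {free_cooling_fraction(t_c):.0%} of the year")

With an 18 °C return, free cooling covers only part of the year in this toy model; at 30 °C and above it covers essentially all of it, which is the whole argument for warmer liquid loops.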


PUE is not that important!

Heat re-use gives more savings than improving PUE.
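An illustrative comparison with assumed prices (the numbers below are mine, not the talk's) shows why:

# Sketch: annual savings from a PUE improvement vs. crediting re-used heat,
# for a 700 kW IT load (assumed electricity and heat prices).
it_kw = 700.0
hours = 8760
price_el_eur_per_mwh = 60.0    # assumed electricity price
price_heat_eur_per_mwh = 50.0  # assumed value of recovered heat

# Cutting PUE from 1.3 to 1.2 trims overhead power by 10 % of the IT load.
pue_savings = it_kw * (1.3 - 1.2) * hours / 1000 * price_el_eur_per_mwh

# Re-using, say, 80 % of the IT heat displaces purchased district heating.
reuse_credit = it_kw * 0.8 * hours / 1000 * price_heat_eur_per_mwh

print(f"PUE 1.3 -> 1.2:   {pue_savings:,.0f} EUR/year")   # ~37,000 EUR
print(f"80 % heat re-use: {reuse_credit:,.0f} EUR/year")  # ~245,000 EUR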


Problems and some solutions

• Corrosion and bacteria in hot water

  • Filters, devices to remove air, chemicals, black magic…

• Computers' short lifetime (4 years vs. 10–30 years for infrastructure)

• Lack of standards for liquid cooling

  • Different temperatures and pressures

  • Different requirements on clean water

• Flexible solutions required!


Experience

• High temperature is everything

  • Collect the heat close to the CPU

  • Best: direct liquid cooling

• Encapsulate the heat

  • Don't mix high and low temperatures (see the sketch below)
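A tiny mixing sketch (illustrative figures) shows why blending streams is so costly: temperature, the thing heat re-use needs, averages away immediately:

# Sketch: temperature of two merged water streams (energy balance, equal c_p).
def mix_temp(flow1_kg_s, t1_c, flow2_kg_s, t2_c):
    """Mixed temperature of two water streams."""
    return (flow1_kg_s * t1_c + flow2_kg_s * t2_c) / (flow1_kg_s + flow2_kg_s)

# 10 kg/s of 30 C return water blended with 5 kg/s of 16 C CRAC water:
print(f"{mix_temp(10, 30, 5, 16):.1f} C")  # ~25.3 C: too cool for re-use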


Case study: HP Apollo 8000

• Closed passive heat pipes on the CPU cards

• Heat is transferred to a separate liquid loop in the rack


Heat Pipe Demo


HP case study continued


HP case study continued


HP case study continued

• Modularized plumbing system


Another liquid: Mineral oil

• Green Revolution Cooling

• The fluid is dielectric

• Electronics can be submerged in the fluid (but not moving parts)

• One phase

• Heat is transported by circulation

• High temperatures possible (60 °C)


Yet another liquid: 3M Novec

• The fluid is dielectric

• Electronics can be submerged in the fluid

• One-phase system:

  • Heat is transported by circulation

  • Novec 7300, with a boiling point of 98 °C, is used

• Two-phase system:

  • Boiling and condensation

  • Novec 649, with a boiling point of 49 °C, is used

• 3M claims the fluid is environmentally safe (though I have my doubts)

• The fluid is very expensive
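For intuition about why the two-phase variant is attractive, here is a sketch comparing circulation rates; the property values are approximate figures of my own, not from the talk:

# Sketch: fluid circulation needed for a hypothetical 10 kW immersion tank.
h_fg = 88.0   # kJ/kg, approx. latent heat of vaporization of Novec 649 (assumed)
cp = 1.1      # kJ/(kg*K), approx. liquid specific heat (assumed)
load_kw = 10.0

boil_flow = load_kw / h_fg            # kg/s evaporated (two-phase)
liquid_flow = load_kw / (cp * 10.0)   # kg/s at a 10 K rise (single-phase)

print(f"two-phase: {boil_flow:.3f} kg/s vs single-phase: {liquid_flow:.2f} kg/s")

In this sketch, boiling moves about eight times more heat per kilogram of fluid than sensible heating, which is why two-phase systems can be completely passive inside the tank.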


Very simple submerged cooling

• An interconnect company doing cooling on the side

• A box of Intel Phi with Extoll interconnect

• Boiling Novec

• Water-cooled coil


Iceotope


Iceotope: Secondary Coolant Flow

• 45 °C in

• Up to 55 °C out


Rack with the highest power

• RSC

• 400 kW/rack

• 1.2 PFLOPS

• 1024 Intel® Xeon Phi™ 5120D


Lots of products available, but not much standardization

• Water temperatures

• Water quality

• Connectors

• ASHRAE is working on some standards

[Chart: liquid cooling activity]


gert@pdc.kth.se


Questions?