
September 2014 - LiquidCool Solutions

Page 1:

LCS Value Proposition September 2014

Page 2:

Data Center Trends 2014

• Improving Data Center Power Efficiency & Reducing Carbon Footprints

• Initiatives for raising data center temperatures

• Data center upgrades for cooling and power management

• Low power server alternatives (ARM) – HP Moonshot, etc.

• DCIM offerings for efficient management of DC infrastructure

• Increasing power density & core counts for HPC clusters with a focus on energy efficiency.

• “Green 500” configurations with pervasive adoption of GPGPU and Intel Phi cores for compute

• Moves to higher voltage power inputs to reduce copper and power distribution challenges

• Adoption of alternatives to traditional air cooling schemes

• Cold plates utilizing water and water/ethylene glycol fluids

• Heat pipes used in conjunction with water-based heat exchangers

• Single-phase and phase-change submersion cooling employing dielectric fluids

• Market growth for modular approaches to add Data Center capacity

• Attractive economics, flexibility, and lead time compared to brick-and-mortar options

• Modular Data Center Market projected by 451 Research to be $2.5 Billion annually by 2015

Page 3:

LCS Cooling Complements Trends

• Improving Data Center Power Efficiency

• Reductions in “Power to Cool” at the device level of up to 98% (Traditional Air vs. LCS)

• Typical reductions in overall data center power consumption of 40%*

• Inherent ability to recover and utilize waste heat for facility purposes

• Potential for achieving “true PUEs” as low as 1.02

• Increasing Power Density

• Vastly superior physics for heat transfer – volumetric heat capacity ~1500x that of air (see the sketch at the end of this page)

• More compact packaging options – no need for fans

• Cooling system components can be remotely configured to increase IT density

• Modular Data Center Configurations

• Lower cost infrastructure for fluid distribution and heat exchange

• More compact footprints than with typical air-cooled options

• Lower operating costs and maintenance requirements than air cooled configurations

• No considerations or provisions required for local air quality – sealed systems


* Based on a comparison with the findings of the 2012 ASHRAE data center report
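As a sanity check on the “~1500x volumetric heat capacity” figure cited on this page, the sketch below compares air with a representative dielectric coolant. The fluid property values (density ~830 kg/m³, specific heat ~2.0 kJ/kg·K) are generic illustrative assumptions, not LCS fluid specifications.

```python
# Sanity check of the "~1500x volumetric heat capacity vs. air" claim.
# Fluid properties are generic illustrative values, not LCS specifications.

# Air at roughly 25 C and 1 atm
rho_air = 1.18      # kg/m^3
cp_air = 1005.0     # J/(kg*K)

# Representative dielectric coolant (assumed values)
rho_fluid = 830.0   # kg/m^3
cp_fluid = 2000.0   # J/(kg*K)

vhc_air = rho_air * cp_air        # volumetric heat capacity, J/(m^3*K)
vhc_fluid = rho_fluid * cp_fluid

print(f"Air:        {vhc_air / 1e3:6.1f} kJ/(m^3*K)")
print(f"Dielectric: {vhc_fluid / 1e3:6.0f} kJ/(m^3*K)")
print(f"Ratio:      ~{vhc_fluid / vhc_air:.0f}x")   # on the order of 1,400x
```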

Page 4:

Fans are a Problem

Fans Waste Energy

• 15% of total data center energy is used to move air

• Additionally, fans in the chassis can use up to 20% of IT power at the device level

• Fans are inefficient and generate heat that must be removed (a rough estimate of the energy spent moving air follows below)
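To put the two percentages above in context, here is a minimal back-of-the-envelope sketch. The 250 kW IT load and 500 kW facility draw are illustrative assumptions borrowed from the modular data center example later in this deck, not measurements.

```python
# Back-of-the-envelope estimate of power spent moving air, using the
# figures above. The IT load and facility draw are illustrative assumptions.

IT_LOAD_KW = 250.0            # assumed IT load (matches the later modular example)
TOTAL_FACILITY_KW = 500.0     # assumed total facility draw (air-cooled case)

CHASSIS_FAN_FRACTION = 0.20   # chassis fans: up to 20% of IT power
FACILITY_AIR_FRACTION = 0.15  # ~15% of total data center energy moves air

chassis_fan_kw = IT_LOAD_KW * CHASSIS_FAN_FRACTION
facility_air_kw = TOTAL_FACILITY_KW * FACILITY_AIR_FRACTION

print(f"Chassis fans:        up to {chassis_fan_kw:.0f} kW of {IT_LOAD_KW:.0f} kW IT load")
print(f"Facility air movers: ~{facility_air_kw:.0f} kW of {TOTAL_FACILITY_KW:.0f} kW total draw")
```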

Fans Waste Space

• Racks need room to breathe

• CRAC units require space around the racks

Fans Reduce Reliability

• Fans fail

• Thermal fluctuations drive solder joint failures

• Structure-borne vibration frets electrical contacts

• Fans expose electronics to air

• Oxidation/corrosion of electrical contacts

• Exposure to electrostatic discharge events

• Sensitivity to ambient particulate, humidity, and temperature conditions

Four companies offer technologies that eliminate fans

• Iceotope

• Green Revolution Cooling

• HP

• LCS

Page 5:

Iceotope

• Iceotope mounts off-the-shelf motherboards inside sealed hot-swappable cartridges that are flooded with a dielectric fluid, 3M Novec.

• Novec, which remains a liquid, moves heat to a hot plate mounted on the side of the cartridge.

• Water circulates in a secondary circuit through the hot plate, and heat is transferred from Novec to water in the hot plate by conduction.

• Aster Capital, an investment group backed by Alstom, Schneider Electric and Solvay, recently invested $10 million in Iceotope.

• Along with the investment, Schneider announced that it intends to commercialize Iceotope technology.

Page 6:

Green Revolution Cooling

• Green Revolution Cooling’s CarnotJet™ system circulates a mineral-oil-based dielectric fluid through a tank containing submerged IT devices.

• The system resembles a rack tipped over on its back, with modified servers inserted vertically into slots in the tank.

• Each 42U tank is filled with roughly 250 gallons of the mineral oil.

• Maintenance access is through the top of the tank

Page 7:

HP

• HP’s Apollo 8000 system includes custom servers, racks and cooling distribution units.

• Racks measure 24” x 56” x 94” and weigh 4,700 lbs.

• Sealed heat pipes transfer heat by conduction to hot plates mounted on the sides of the servers.

• Heat is transferred by conduction from the hot plates to water-cooled heat exchangers mounted on rack sidewalls.

• Entering water temperature < 30 °C

• The water side operates at sub-atmospheric pressure to limit damage when a leak occurs.

Page 8:

LCS Technology Eliminates Fans

Patented Directed-Flow Technology

• No fans or other moving parts in the chassis

• Total liquid submersion in an eco-friendly dielectric fluid

• Rack-mounted devices are easy to maintain

• Within a device, “cool” liquid is circulated directly to the components with the highest power density

• The remaining components are cooled by bulk flow as the dielectric fluid is drawn through the unit to a return manifold

• Electronics are decoupled from the environment


How it Works

Page 9:

Heat Dissipation

LCS Cooling System Elements

• Pump supplies “cool” dielectric liquid to multiple IT racks

• If there is no energy recycling option, “hot” fluid is circulated to an evaporative fluid cooler

• Incoming “cool” fluid can be as warm as 45°C for most applications (a rough flow-rate sketch follows below)
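For a rough sense of the flow involved, the heat carried by the coolant follows the usual relation Q = ṁ · cp · ΔT. The sketch below sizes the flow for one ~21 kW rack (the per-rack figure used later in this deck); the fluid properties and the 10 °C temperature rise are illustrative assumptions, not LCS operating data.

```python
# Rough flow-rate sizing for the loop above, using Q = m_dot * cp * dT.
# Fluid properties and the temperature rise are illustrative assumptions.

Q_W = 21_000.0   # heat load per rack, W (~21 kW, per the later facility example)
CP = 2000.0      # specific heat of the dielectric fluid, J/(kg*K) (assumed)
RHO = 830.0      # fluid density, kg/m^3 (assumed)
DELTA_T = 10.0   # fluid temperature rise across the rack, K (assumed)

m_dot = Q_W / (CP * DELTA_T)              # required mass flow, kg/s
vol_flow_lpm = m_dot / RHO * 1000.0 * 60  # volumetric flow, liters per minute

print(f"Mass flow:   {m_dot:.2f} kg/s per rack")
print(f"Volume flow: ~{vol_flow_lpm:.0f} L/min per rack")
```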

Page 10:

Reliability


LiquidCool technology decouples electronics from the room, enhancing reliability by eliminating the root causes of failure:

• Dramatic reduction in thermal fluctuations, which drive solder joint failures

• Much lower operating temperatures for the board and components (typically 20–30 °C cooler device temperatures than with air)

• No oxidation/corrosion of electrical contacts

• No fretting corrosion of electrical contacts induced by structural vibration

• No moving parts within the device enclosure (fan failures are one of the highest service triggers for electronics)

• No exposure to electrostatic discharge events

• No sensitivity to ambient particulate, humidity, or temperature conditions

When maintenance is required to upgrade components:

• The swap-out procedure takes less than 2 minutes with no measurable loss of fluid

• An IT device can be removed from a rack, drained, opened, serviced, reassembled, refilled, and reinstalled within a 15 minute turnaround window

Page 11:

Robust IP Portfolio

17 Issued & 21 Pending Patents

• Liquid tight server case with dielectric liquid for cooling electronics

• Extruded server case used with liquid coolant of electronics

• Computer with fluid inlets and outlets for direct-contact liquid cooling

• Case and rack system for liquid submersion cooling of an array of electronic components

• Computer case for liquid submersion cooled components

• Liquid submersion cooled computer with directed flow

• Gravity assisted directed liquid cooling of electronics

Page 12:

Any Shape or Size

8 servers with liquid-to-liquid cooling distribution unit


Industrial and embedded computing applications

Page 13:

Any Shape or Size

Liquid Submerged Computer with Passive Radiator


Liquid Submerged Computer with Stacked Boards

Page 14:

64-Server Configuration


Connection to remote CDU

Page 15:

Low-Cost “Clamshell” Server


• Motherboard sandwiched between two sealed enclosures

• Rack-mountable

• I/O connectors remain outside the liquid enclosure

Page 16:

Clamshell Server System

Page 17:

Example – Modular Data Center

Air-Cooled: Input Power – 500 kW, IT Power – 250 kW

LiquidCool: Input Power – 320 kW, IT Power – 250 kW
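These two configurations translate directly into a PUE comparison (PUE = total input power ÷ IT power). A minimal calculation using only the figures above:

```python
# PUE comparison for the modular data center example above.
# PUE = total facility input power / IT power.

configs = {
    "Air-Cooled": {"input_kw": 500.0, "it_kw": 250.0},
    "LiquidCool": {"input_kw": 320.0, "it_kw": 250.0},
}

for name, c in configs.items():
    pue = c["input_kw"] / c["it_kw"]
    overhead_kw = c["input_kw"] - c["it_kw"]
    print(f"{name}: PUE = {pue:.2f}, non-IT overhead = {overhead_kw:.0f} kW")

# Same 250 kW of IT work; the LiquidCool case draws 180 kW less at the facility input.
```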

Page 18:

1 MW Hybrid HPC Module


Overall dimensions: 12’ x 12’ x 42’

Page 19:

200 kW Hybrid HPC Module

Liquid-cooled section for compute

Air-cooled section for data storage & switches

Overall dimensions: 12’ x 12’ x 20’

Page 20:

Fluid Distribution for LSS 220 Rack

Page 21:

250 kW Liquid-to-Liquid CDU

Fully Redundant Heat Exchangers, Pumps, and Pump Control Units

Approximate size: 48” wide x 48” deep x 30” high

Page 22:

Federal Data Center Upgrade

Current Facility Layout (Air-Cooled)

• 50’ x 100’ Facility Space

• 115 racks of air-cooled IT equipment

• Estimated IT compute power consumption of 500 kW

• 7 Air Handling Units fully operational at capacity

Facility Space Reduction Using LCS Servers

• 50’ x 100’ Facility Space

• 24 liquid-cooled 48U racks with 72 servers per rack

• 500+ kW of IT compute capacity at 21 kW/rack (see the arithmetic check below)

• Approximate footprint = 450 ft²

• Air Handling Units marked in red may be decommissioned

• Remainder of space available for repurposing or expansion
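The liquid-cooled layout above can be cross-checked with simple arithmetic. This is a minimal sketch using only the figures on this page (rack count, per-rack power, and footprint), not additional LCS data.

```python
# Cross-check of the liquid-cooled layout figures above.

racks = 24
servers_per_rack = 72
kw_per_rack = 21.0
footprint_ft2 = 450.0
facility_ft2 = 50 * 100  # 50' x 100' facility

total_servers = racks * servers_per_rack   # 1,728 servers
total_it_kw = racks * kw_per_rack          # 504 kW, consistent with "500+ kW"
ft2_per_rack = footprint_ft2 / racks       # ~18.75 ft^2 per rack as laid out

print(f"Servers: {total_servers}")
print(f"IT capacity: {total_it_kw:.0f} kW")
print(f"Footprint: {ft2_per_rack:.1f} ft^2 per rack, {footprint_ft2:.0f} of {facility_ft2} ft^2 total")
```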

Page 23:

TROPEC – Transformative Reductions in Operational Energy Consumption

TROPEC Objective – Allow enterprise servers and communication/network equipment to continuously operate in tropical environments with no mechanical cooling

• LiquidCool submitted a proposal for “Modular System for High-Efficiency Electronics Cooling at Expeditionary Base Camps” in September 2013

• A comprehensive assessment of the LiquidCool system has been completed at Lawrence Berkeley National Laboratory

• US Navy (PACOM) has recommended that the LiquidCool system be moved forward to the field assessment phase

• Independent testing of the TROPEC system achieved successful cooling of an HPC server and cooling unit at an ambient temperature of 101°F for 24 hours, achieving a true PUE of 1.019

Page 24:

Benefits from Eliminating Fans


Cools High-Power Electronics
- Lower operating temperatures result in lower leakage current
- All internal components are kept within normal operating temperature ranges

Saves Energy
- For LCS, device power-to-cool can be reduced by up to 98% vs. air-cooled devices
- For LCS, the “true” cooling PUE is ~1.03
- Waste heat can easily be recovered for other uses

Saves Space
- Higher rack density because there is no need for hot/cold aisles and no need to circulate air in the racks
- For LCS, 64 IT devices in a 42U rack or 72 devices in a 48U rack

Enhanced Reliability
- Sealed fluid circuits prevent failures from corrosion and contamination
- Liquid submersion reduces thermal fatigue of solder joints

Operates Silently
- Fan noise is eliminated

Page 25:

Additional LCS Benefits


Any Shape or Size – Green Revolution, HP, Iceotope

No Water – HP, Iceotope

Scalable – Green Revolution

Easy to Swap Devices – Green Revolution

Easy to Maintain Devices – Green Revolution, HP, Iceotope

Harsh Environment Deployments – Green Revolution, HP

Costs Less than Air – HP, Iceotope

Liquid Submerged Computer Operating Underwater in an Aquarium

Page 26:

Questions


Rick Tufty, VP Engineering [email protected] 507-535-5829

Herb Zien, CEO [email protected] 414-803-6010

Jay Ford, VP Sales & Marketing [email protected] 847-370-7296

Steve Einhorn, Chairman [email protected] 414-453-4488

