8/9/2019 Who is Killing Who - Not So Smart Dc Decisions
Who's Killing Who? Not So Smart DC Decisions
Carrie Higbie, RCDD/NTS
Global Director, Data Center Solutions and Services
TechTarget Ask the Experts and columnist
Member: IEEE, USGBC, WGBC, Uptime Institute, 7x24 Exchange
Columnist: Performance Networking, SearchCIO, SearchMobile, ZeroDowntime, ComputerWorld, IT World, Network World, Financial Review, Strategic Downtime
A Greener Approach to Network Cabling
Siemon Culture
3,300 acres of conserved forest
ISO 14001
Responsible 220-kilowatt power plant
Zero landfill
Green Data Centers Begin with Green Infrastructure Choices
To truly build a green data center, many factors come into play:
Maximize useful life
Power consumption/thermal management
Reduce material usage
Physical infrastructure represents an opportunity for energy and resource savings
Siemon provides both the expertise and technology to put a green data center infrastructure plan into action.
Top Data Center Concerns
(DataCenterDynamics research)
According to the Uptime Institute, the three-year cost of powering and cooling servers is currently one and a half times the cost of purchasing the server hardware.
Server energy demand doubled from 2000 to 2005
1.2% of US electrical usage
Equal to five 1,000 MW power plants (study by LBL and Stanford)
A growing share of data centers' TCO
EU directive to drive a 20% reduction in energy use by 2020
U.S. Federal Executive Order 13423
Improve energy efficiency and reduce greenhouse gas emissions through reduction of energy intensity by:
3 percent annually through the end of fiscal year 2015, for a total reduction of 30%
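The "one and a half times" figure above can be sanity-checked with back-of-the-envelope arithmetic. The sketch below is illustrative only: the server price, wattage, PUE, and electricity rate are assumptions, not numbers from this presentation.

```python
HOURS_PER_YEAR = 24 * 365

SERVER_PRICE = 2000   # assumed purchase price, USD (not from the slides)
SERVER_WATTS = 500    # assumed average draw in watts (not from the slides)
PUE = 2.0             # assumed facility overhead: cooling roughly doubles the IT load
RATE = 0.12           # assumed electricity rate, USD per kWh

def three_year_energy_cost(watts=SERVER_WATTS, pue=PUE, rate=RATE):
    """Power-plus-cooling cost of three years of 24x7 operation."""
    kwh = watts / 1000 * HOURS_PER_YEAR * 3 * pue
    return kwh * rate

cost = three_year_energy_cost()
print(f"3-year power+cooling: ${cost:,.0f} "
      f"({cost / SERVER_PRICE:.2f}x the purchase price)")
```

With these assumed inputs the ratio lands near 1.5x; the point is that ordinary wattages and rates reproduce the order of magnitude of the claim, not the exact figure.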
2% (now) of overall power and emissions
A large share of that power goes to cooling; 29% is servers
Chassis switches
48-port switches
CPU processing causes power spikes
Where will you be without planning?
CRACs are more efficient
Networking, servers, and SANs operate at higher temperatures
Temperature Rise Profiles
More devices
More data
PoE and PoE Plus are coming
40G and 100G are on the horizon
Cable abatement is critical
We want an installation that is once and done
Virtualization
Consolidation
Increasing storage requirements
Blade technologies
Facilities
Security
Networking
Servers
The Challenges
IT budgets versus Facilities budgets
Budgets based on a % of construction costs
Budgets based on older numbers
Reactionary decisions
Larger cables
Bigger pathways
Larger equipment (…mm won't work for all sites)
New technology for fiber increases strand count
Different budgets for different departments
Self-serving vendors and misplaced alliances
Poor designs lead to cable waste
TIA-942 Design Considerations
Top of Rack is allowed as an exception here only!
Data Center Design: The Standard ISO/IEC 24764 (Draft)
Top of Rack is allowed as an exception here only, and must be in rack or in close proximity!
TIA-942
Horizontal and vertical channels shall be run accommodating growth so these areas do not have to be revisited
No more bus or direct connections unless specifically REQUIRED by the equipment manufacturer
No shared-sheath media
Category 6A recommended (echoed by Cisco: 6A or 7)
ISO/IEC 24764
Horizontal and vertical channels shall be run accommodating growth so these areas do not have to be revisited
All systems shall be connected via a structured cabling system; point-to-point connections are allowed for short-run cables in server cabinets only
Category 6A minimum (UTP or F/UTP)/Class EA; Category 7 and 7A/Class F and FA
10GBASE-EX, 10GBASE-LX, 10GBASE-SX (fiber)
10GBASE-T (copper)
10GBASE-CX4 (twinax point-to-point copper, limited to 5-15m)
InfiniBand (CX4 cable, fiber, 4-pair copper)
ATM/SONET equivalents (varies with installation)
SFP+ (limited to 1-5m)
Maintenance costs are normally based on original equipment purchase prices
Fibre Channel and FC-BaseT (2, 4, 6, 8, 16, 32, 64, 128 Gig over twisted pair)
100G and 40G under development in IEEE as the next speed increase
Twisted-pair higher speed to follow
EEE (Energy Efficient Ethernet): new technology to address power; may also include PoE
First 10GBASE-T chips: 10-17W per port (proof of concept)
Gigabit was first introduced at 6.5W per port; per-port power is far lower now
Switch
SFP+: >1W
Server NIC
SFP+: ~18W
10GBASE-T: ~7W (low-power mode)
10GBASE-T: ~15.5W (100m mode)
Net average power will be less with EEE
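Per-port wattages like those above translate directly into annualized operating cost. A minimal sketch, assuming an electricity rate and PUE that are not from this presentation:

```python
HOURS_PER_YEAR = 24 * 365

RATE = 0.10   # assumed electricity rate, USD per kWh (not from the slides)
PUE = 2.0     # assumed: every watt of IT load costs roughly another watt of cooling

def annual_port_cost(watts, rate=RATE, pue=PUE):
    """Yearly power-plus-cooling cost of one port running 24x7."""
    return watts / 1000 * HOURS_PER_YEAR * pue * rate

for name, watts in [("SFP+ server NIC", 18.0),
                    ("10GBASE-T, 100m mode", 15.5),
                    ("10GBASE-T, low-power mode", 7.0)]:
    print(f"{name}: ${annual_port_cost(watts):.2f}/year")
```

Multiplied across hundreds of ports, the gap between low-power and full-power PHYs becomes a visible line item, which is what the annualized cost comparison that follows is illustrating.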
Comparison Chart of Annualized Costs
[Chart: annualized cost comparison in dollars ($0-$2,000) at 1G and at 10G, by category of cabling system]
Temperature Rise Profiles
Greatest versatility over time
Significant cost savings in switches
Easier to manage and document
Point-to-point connections create spaghetti
Locks you into a solution and a vendor
Impact to cooling
Ease in troubleshooting
Data centers that have grown out of need rather than planning are a mess
MAC (move/add/change) work is more expensive than project-related channels
MAC quality may or may not be as good
The standards tell us to cable accommodating growth over the life of the system (cabling should support 2-3 generations of electronics)
Recabling is NOT green: recycling of plastics, waste of copper, etc.
A data center is not a simple environment. A DC must be:
Flexible
TCO compliant
Data centers have to be:
Carefully designed
Constantly managed
Killing Data Center Cooling Efficiency: Air Damming
Cooling the cabinet
Airflow is critical
What's Wrong with This?
Design Practices: Where Does Green Come In?
EDAs
HDAs
MDA
Cables are easier to trace
Easier when blade changes are required
Colder air at the bottom of the cabinet is better for equipment
Failure rate in the top 1/3 of a cabinet is 3x greater than in the bottom 2/3
Power/cooling constraints
Any to All (4-connector)
Point to Point
[Diagram: central "any-to-all" patching area with two chassis switches (each with six 48-port blades), and a 48-port patch panel in each server cabinet cabled back to the central patching area. Legend: power supply, fixed channel, patch cord/jumper]
In a data center with 20 server cabinets housing 14 servers each, requiring two network connections each (560 total ports required):

Design               Number of switches                        Power supplies   Total ports   Unused ports
Point to point       20 (one 48-port switch per cabinet;       40               960           400
                     28 ports used per cab)
Central any-to-all   2 chassis switches with 6 x 48-port       4                576           16
                     blades each
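The figures in this comparison fall out of simple port arithmetic. A short sketch reproducing them (the two-power-supplies-per-switch assumption is inferred from the totals, not stated on the slide):

```python
CABINETS = 20
SERVERS_PER_CABINET = 14
CONNECTIONS_PER_SERVER = 2
required = CABINETS * SERVERS_PER_CABINET * CONNECTIONS_PER_SERVER  # 560 ports

def design_totals(switches, ports_per_switch, supplies_per_switch=2):
    """Return (total ports, power supplies, unused ports) for a design."""
    total_ports = switches * ports_per_switch
    return total_ports, switches * supplies_per_switch, total_ports - required

# Point to point: one 48-port switch per cabinet.
print("point-to-point:", design_totals(20, 48))          # (960, 40, 400)
# Central any-to-all: 2 chassis, each with 6 x 48-port blades.
print("central any-to-all:", design_totals(2, 6 * 48))   # (576, 4, 16)
```

The central design strands 16 ports instead of 400 and powers 4 supplies instead of 40, which is where the switch-cost and energy savings claimed earlier come from.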
Hot spots
Landlocked
Emerging Applications Beyond 10Gbps
Proposed 40Gbps Ethernet
Target: 100m, 850nm laser-optimized 50/125µm multimode optical fiber (OM3)
Target: 10m, copper cabling assembly
Proposed 100Gbps Ethernet
Target: 100m, 850nm laser-optimized 50/125µm multimode optical fiber (OM3)
Target: 10m, copper cabling assembly
Application    Wavelength   62.5µm     62.5µm     50µm       50µm        SMF
                            160/500    200/500    500/500    2000/500
100BASE-SX     850nm        300m       300m       300m       300m
1000BASE-SX    850nm        220m*      275m*      550m       550m
1000BASE-LX    1300nm       550m       550m       550m       550m        5km
10GBASE-SX     850nm        28m        28m        86m        300m
10GBASE-LX     1310nm                                                    10km
10GBASE-EX     1550nm                                                    40km
10GBASE-LRM    1310nm       220m*      220m*      220m*      220m
10GBASE-LX4    1310nm       300m*      300m*      300m*      300m        10km

*Mode-conditioning patch cords will add to channel costs.
Only 50 micron 2000/500 and SMF are in the 100G/40G IEEE HSSG.
Designation      Data rate   Media   Signaling                No. of fibers   Reach
40GBASE-SR4      40Gb/s      OM3     4 x 10Gb/s (parallel)    8               100m
40GBASE-LR4      40Gb/s      OS1     4 x 10Gb/s (WDM)         2               10km
100GBASE-SR10    100Gb/s     OM3     10 x 10Gb/s (parallel)   20              100m
100GBASE-LR4     100Gb/s     OS1     4 x 25Gb/s (WDM)         2               10km
100GBASE-ER4     100Gb/s     OS1     4 x 25Gb/s (WDM)         2               40km
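The fiber counts in this list follow from the signaling scheme: a parallel interface needs one fiber per lane per direction, while a WDM interface multiplexes all lanes onto a single fiber in each direction. A quick sketch of that arithmetic:

```python
def fiber_count(lanes, wdm=False):
    """Fibers for a duplex link: one Tx + one Rx path per lane,
    unless WDM folds all lanes onto one fiber per direction."""
    return 2 if wdm else lanes * 2

print(fiber_count(4))            # 40GBASE-SR4: 4 x 10G parallel -> 8 fibers
print(fiber_count(10))           # 100GBASE-SR10: 10 x 10G parallel -> 20 fibers
print(fiber_count(4, wdm=True))  # 100GBASE-LR4: 4 x 25G WDM -> 2 fibers
```

This is why parallel 40G/100G optics drive strand counts (and MPO trunk cabling) up, while WDM variants keep the familiar duplex pair.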
No Cable Management
No cable management
Cables covering fans, blocking airflow
Cables cutting off airflow on side-discharge switches
Bend radii violated
No routing cabinet to cabinet
Swing arms blocking airflow
Doors removed due to lack of room
Aesthetically unpleasing
Can't use doors
Aesthetically unpleasing
Exhaust fans blocked
No channel for cables
Bend radii violations
Can't use doors
Cables blocking fans
Improper routing of cables
Improper cable support
Can you find the exhaust fan in this picture?
Cabinet-to-cabinet routing means no doors!
Horizontal-only cable management provides only what is strictly needed
Improper support
Bend radii issues
Permanent link cables need to be supported and protected
Fibers hanging over the cabinet, mixed in with power
Rack-mount PDUs end up in tangled messes
Mixing power and data under the floor is a code violation
Problems!
Patch cords block air supply in cold aisles
Density not supported in smaller cabinets
Patching areas are obscured due to lack of management
PDUs covering exhaust fans
Exhaust fans are backwards, blowing hot air into the cold aisle and pulling in heated exhaust from servers
After!
Before
After!
Skype ID: chigbie