How Microsoft Does Data Centres
John Dwyer, Area Data Centre Manager - International, Data Centre Solutions
The Global Services Foundation: across the company, all over the world, around the clock
Zurich .Net Online
MBS Online
Office Labs
Sharepoint.Microsoft.com
Azure
Plus over 150 more sites and services
Scale and Market Growth
Server Infrastructure Doubling Every Year
Network Capacity 9x Growth in Four Years
Tripling of Data Centre Instances
Dramatic Expansion of Server & Network Geo-Diversity
Managed Growth of Power Capacity & Consumption
[Chart: Internet Population Growth Rate 2000-2008, by region - Middle East, Africa, Latin America, Asia, Europe, Oceania / Australia, North America. Source: http://www.internetworldstats.com]
Data Centre Economics Have Changed!
• Cost of physical space was once the primary consideration in data centre design.
• The cost of power and cooling has risen to prominence.
• Data centre managers must now prioritize investment in efficient power and cooling systems to lower the total cost of operation (TCO) of their facilities.
Belady, C., “In the Data Center, Power and Cooling Costs More than IT Equipment it Supports”, Electronics Cooling Magazine (Feb 2007)
Site Selection
• Internet Population
• Internet Peering / Network
• Mobile Users
• Power Pricing
• Environmental
• Construction Costs
• Tax Climate
• IT Labor Availability
• Corporate Citizenship
Composite Heat Map
Why Power Matters…
• In 2006, U.S. data centres consumed an estimated 61 billion kilowatt-hours (kWh) of energy, about 1.5% of total U.S. electricity consumption that year.
• In the EU, data centres consumed an estimated 56 billion kWh in 2007.
• As an industry segment, data centres are the fastest-growing energy segment in the US.
• Current projections put data centre power consumption above 100 billion kWh by 2011 in the US and by 2020 in the EU.
• That level of consumption in the US would necessitate the construction of 10 additional power plants.
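A rough sanity check on that last figure, assuming typical large plants of about 500 MW running near full output (an assumption of ours, not the slide's): growing from 61 to 100 billion kWh leaves roughly 39 billion kWh/year to cover, and one such plant produces about 0.5 GW × 8,760 h ≈ 4.4 billion kWh/year, so 39 / 4.4 ≈ 9, on the order of 10 plants.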
Relevant Metrics at Microsoft
• PUE / DCiE (The Green Grid): PUE = Total Facility Power / IT Equipment Power; DCiE = (IT Equipment Power / Total Facility Power) × 100%
• DC Utilization
• Server Utilization
• Cost: move from Cost = f(space) to Cost = f(power)
• SCRY
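A minimal sketch of these two Green Grid metrics in Python (the example numbers are illustrative, not Microsoft's):

def pue(total_facility_kw, it_equipment_kw):
    # PUE: total facility power divided by IT equipment power (always >= 1.0)
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw, it_equipment_kw):
    # DCiE: the reciprocal of PUE, expressed as a percentage
    return 100.0 * it_equipment_kw / total_facility_kw

# Example: a facility drawing 2,000 kW to support 1,250 kW of IT load
print(pue(2000, 1250))   # 1.6
print(dcie(2000, 1250))  # 62.5 (%)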
Setting Aggressive PUE Targets
[Chart: Annual Average PUE Targets, 2006-2012; PUE axis from 1.0 to 2.2]
Environmental Control Standards
Where Data Centre Power Goes
Source: EYP Mission Critical Facilities Inc., New York
• IT Equipment: 50% — virtualization, active power management
• Cooling: 25% — widening the environmental envelope can remove chillers and drive this to zero
• Air Movement: 12%
• Electricity Transformer / UPS: 10% — offline UPS technologies can drive this substantially down
• Lighting, etc.: 3%
GFS' Infrastructure Services is focusing on all the pieces of the pie. Opportunities are everywhere.
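This breakdown pins down the implied facility PUE and shows how the levers above move it. A small illustrative model in Python (the percentages are the slide's; the "improved" scenario is our hypothetical):

# Share of total facility power, per the EYP breakdown above
baseline = {"it": 0.50, "cooling": 0.25, "air": 0.12, "ups": 0.10, "lighting": 0.03}

def pue_from_shares(shares):
    # PUE = total facility power / IT power = 1 / (IT share of total)
    return 1.0 / shares["it"]

print(pue_from_shares(baseline))  # 2.0

# Hypothetical: chillers removed (cooling -> ~0), offline UPS halves UPS loss.
# IT power is unchanged, so recompute the total around it (treat shares as kW
# out of an original 100 kW facility draw).
it_kw = 50.0
new_total_kw = it_kw + 0.0 + 12.0 + 5.0 + 3.0   # = 70 kW
print(new_total_kw / it_kw)                     # 1.4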
Three of our Data Centres
Data Centre Costs in the US
• Land: 2%
• Core & Shell: 9%
• Architectural: 7%
• Mechanical / Electrical: 82%
Since 2004: 16% increase year over year
Where the costs are: >80% scale with power, <10% scale with space
SCRY – Window to Our World
SCRY Helps Demonstrate Continuous Improvement
[Chart: Microsoft Average PUE Targets for New DCs, 2006-2012; PUE axis from 1.0 to 2.2]
• 22% improvement over 3 years
• Follows Moore's Law
• Runs on existing data centres and helps set goals for new data centres at Microsoft
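Since Total Facility Power = PUE × IT Power, a 22% reduction in PUE cuts total facility energy by 22% for the same IT load. Illustratively (the starting point is our assumption, not a figure from the deck): from a PUE near 1.6, 1.6 × (1 − 0.22) ≈ 1.25, the target later quoted for the Chicago containers.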
Where We Think Things Are Going …
Futures – Containers (Chicago)
Why We Like Containers
1) Can deploy at scale
2) Plug and play
3) Drives innovation
4) Abstracts away religious wars in a competitive bid: AC vs DC, air vs liquid
5) Cost can include maintenance
6) Allows for easy cost and performance measurement
7) Creates an environment to drive competition around efficiency
8) One throat to choke
Question: Is this water cooling or air cooling?
Container Solutions
Use a standard ISO shipping container (40', 20', or 10' long × 8' wide × 8'6" high) to hold servers.
Portability allows the delivery and operation of servers in self-contained units.
Moves costs from long-lead to short-lead equipment, increasing return on invested capital. Optimizes server delivery at 1,000U+ as a unit vs. 40+U in a rack, with a single SKU & warranty.
Containers are usually seen as a solution to burst demand and temporary capacity; Microsoft's approach is different: use them as the primary packaging unit.
Cost: It costs less to ship 2,000+ servers in one container than it does to ship and then install individual racks. Additional savings come from not needing raised floors or fans for each server, and from requiring far less wiring, packaging and shipping.
Container Solutions
The container gives us the opportunity to test new technology such as increased supply-air temperature, removal of fans from servers, managed airflow with hot-aisle containment, significantly increased watts per square foot (WPSF), and more efficient electrical distribution.
Microsoft has stand-alone units running in production today and is running proofs of concept on newer technology.
Energy efficiency: At more than 1,000 watts per square foot, containers allow us to power a lot more servers in a given area. This is the key reason containers have become economically viable. PUE numbers tested in a POC measured ~1.3 at peak.
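The 1,000+ WPSF figure can be sanity-checked from the Chicago container numbers quoted later in this deck (375 kW in a standard 40-foot container), assuming the container's own footprint of roughly 40 ft × 8 ft = 320 sq ft: 375,000 W / 320 sq ft ≈ 1,170 watts per square foot.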
Container POC
GFS DCS ran a proof of concept on a container system in Seattle, Washington, USA.
PUE came in between 1.2 and 1.3.
Positives:
• Ran the unit up to full load, measured at 107 kW = 178 watts per server
• Dropped power to the unit; it came back online with no issues
• Fans and servers ran at full speed for 7 minutes on batteries
• Container spec completed and vendor RFP underway
Watch items:
• Battery temperature remained at 75°F (measured with probe and IR camera); the back is exposed to 85°F. Need to place a temperature probe at the rear of the battery.
• Ambient air temperature had a large effect on the temperature inside the trailer due to lack of insulation
• Permit and UL inspection took 90 days to obtain
Negative:
• Harmonics above 15%, varying across all 3 phases
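From those full-load numbers, the POC unit held roughly 107,000 W / 178 W ≈ 600 servers (our arithmetic; the deck does not state this unit's server count).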
Virtual Earth Case Study
Timeline started with the container in late August. PO approved in October; first unit live in January: 5 months from engineering to live.
Delays encountered:
• 1 month: flood-plain re-plan (new location, elevated foundation)
• 1 month: Xcel Energy transformer install
• 1 month: final concrete delayed due to snow
Actual planning, permitting and construction effort totaled about 3 months. First container live Jan 5th, third container Feb 2nd.
Container vendor committing to a 6-8 week turnaround on orders long term.
Containers - Chicago Data Centre
Elevated spine connection
Microsoft container:
• 2,400 servers
• 375 kW
• Standard 40-foot shipping container
• Target PUE 1.25
Top floor: 10.8 MW traditional colo capacity
Ground floor: 20 MW container capacity
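The implied per-server draw (our arithmetic): 375,000 W / 2,400 servers ≈ 156 W per server, consistent with, and a bit below, the 178 W per server measured at full load in the Seattle POC.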
But More Change Is Coming…
Generation 4 Modular Data Centres: Challenging Many Data Centre Conventions
• Prefabrication of shell and M&E plant
• Pre-assembled containers
• Power density > 1,000 watts / square foot
• Totally flexible configuration (Classes)
• PUE < 1.1 (depending on Class)
• 3-5 month time-to-market
• Reduced cost of entry
• Applying the Model-T approach
http://loosebolts.wordpress.com
http://blogs.technet.com/msdataCentres/