
5.3 Data Center
DESIGN STANDARDS | WORKPLACE ENVIRONMENTS

These guidelines are the general design standards for all Dell global projects. See the standard below for region-specific details that may vary from location to location.

REQUIREMENTS AND PLANNING

General Requirements
• Data Centers can have several types and/or configurations based on the number of racks or cabinets required for the business unit's equipment.
• The type and size of the data center is determined by the factors below:
  · Types of corporate applications being supported
  · Number of racks or cabinets
  · Type of specific location
  · Power and cooling requirements
  · Configuration of the space to be installed in
  · Weight of racks for the layout, compared to existing facility structural conditions

Overview

The data center is the facility that centralizes Dell Technologies' IT operations and equipment, as well as where it stores, manages, and disseminates its data. Data centers house a network's most critical systems and are vital to the continuity of daily operations.

Planning Specifics
1. Racks are provided in several different sizes and configurations. These need to be confirmed and coordinated with the business unit.
2. The Data Center layout will utilize a hot and cold aisle configuration to increase cooling efficiency.
3. Evaluate cooling options on an individual project basis so that the specific needs of a project can be met. The following is a brief list of items which must be taken into account during the selection of a mechanical system for a project:
   a. Project size (this includes both the total project area as well as the area that can be dedicated to mechanical and electrical equipment)
   b. Load requirements (kilowatts per rack, watts per square foot/meter); see the sizing sketch after this list
   c. Geographic location
   d. Budget
   e. Maintenance and reliability
   f. Scalability
   g. Efficiency
   h. Life cycle cost analysis
4. Enterprise Data Centers should be reviewed for the best-case cooling design.
5. The following is a list of items which must be taken into account during the selection of an electrical system for a project:
   a. Project size and load requirements (kilowatts per rack, watts per square foot/meter)
   b. Maintenance and reliability
   c. Budget
   d. Efficiency
   e. Scalability
   f. Life cycle cost analysis
6. Follow the piping and piping color standards for proper color-coding of systems.
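The rack count and per-rack load drive both the mechanical and electrical selections above. The short sketch below walks through that arithmetic with hypothetical example values (rack count, kW per rack, white-space area); it is illustrative only and not part of the standard.

    # Illustrative planning sketch (hypothetical values, not Dell requirements):
    # total IT load and area density from rack count and per-rack power.
    def data_center_loads(rack_count: int, kw_per_rack: float, whitespace_sqft: float):
        """Return total IT load (kW) and power density (W per sq ft)."""
        total_it_kw = rack_count * kw_per_rack
        watts_per_sqft = total_it_kw * 1000 / whitespace_sqft
        return total_it_kw, watts_per_sqft

    # Example: 40 racks at 8 kW each within 2,000 sq ft of white space
    total_kw, density = data_center_loads(rack_count=40, kw_per_rack=8.0, whitespace_sqft=2000.0)
    print(f"IT load: {total_kw:.0f} kW, density: {density:.0f} W/sq ft")  # 320 kW, 160 W/sq ft

These figures feed directly into items 3.b and 5.a above and into the cooling and UPS sizing later in this section.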

GENERAL INFORMATION

Space Design
• All finish selections to be reviewed and approved by Global Space Planning. Selections to be coordinated with the PMO budget and schedule.
• Flooring:
  • Provide a sealed concrete subfloor able to withstand design loads for computer room equipment and prevent dusting.
  • Where raised floors are utilized, depressed floor slabs are preferred to avoid the need for ramps. Provide proper curbs, drains, and pumps to avoid area flooding. Provide curbs for equipment. Provide floor hole covers designed for the raised floors.
  • Utilize a 3'-0" (900mm) space between the concrete subfloor and the raised floor when possible.
  • Design or confirm the design of the elevated floor slab loading to be 150 pounds/square foot (approximately 730 kilograms/m2). Confirm with local codes. Assume fully loaded equipment would be 5000 pounds (2300 kilograms) or verify with actual server cabinet limits (see the loading check after this section).
  • Install high-pressure laminate raised floor panels on a grounded raised floor system. Use materials tested for delamination in the environment where the raised floor will be installed.
  • Provide electro-static dissipative vinyl tile flooring (ESD VCT) to avoid any static buildup where a raised floor is not provided.
  • Wall base shall be installed along the perimeter wall of the data center to provide an adequate seal between the raised floor and the wall (where applicable). 4" (100mm) rubber wall base is acceptable as a durable and cost-effective material (refer to Building Standards).
  • Install leak detection where a raised access floor is installed. Install a rope-style system with intelligence to indicate the location of the wet rope. Include a zoning map.
• Walls:
  • Install metal stud walls with drywall, acoustically insulated, full height from floor to deck above and painted full height to avoid dusting.
  • Seal dividing walls to prevent air infiltration and leakage from and to adjacent non-critical space areas.
  • Exterior walls, ceilings and slabs shall have vapor barriers. Consider installing an interior wall vapor barrier.
  • If adjacent walls are common to non-Dell space, Global Security will complete an assessment of the site to determine the risk of forced access and recommended solutions (for example, steel plate/glass break detection). Understand lease building requirements during planning.
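As a quick illustration of the floor loading criteria above, the sketch below compares a fully loaded cabinet against the 150 lb/sq ft design loading. The cabinet weight and load-path footprint are hypothetical example values, and real raised-floor checks must also consider point loads on individual panels and pedestals; this is a rough planning aid, not part of the standard.

    # Rough planning check (hypothetical values): distributed cabinet load vs. the
    # 150 lb/sq ft (approx. 730 kg/m2) raised-floor design loading noted above.
    def floor_loading_ok(cabinet_weight_lb: float, footprint_sqft: float,
                         design_load_psf: float = 150.0) -> bool:
        """True if the cabinet's distributed load is within the floor design loading."""
        distributed_psf = cabinet_weight_lb / footprint_sqft
        return distributed_psf <= design_load_psf

    # Example: a 5,000 lb cabinet with its load spread over a 40 sq ft load path
    print(floor_loading_ok(cabinet_weight_lb=5000, footprint_sqft=40))  # True  (125 psf)
    print(floor_loading_ok(cabinet_weight_lb=5000, footprint_sqft=24))  # False (~208 psf) - structural review needed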

• Exterior windows and skylights within the data center space are not allowed. Windows on the interior partitions are not allowed.
• Doors:
  • Personnel door openings shall be provided at a minimum 3'-0" (900mm) width x 8'-0" (2400mm) height.
  • Equipment entrances shall be provided with double doors at a minimum 3'-0" (900mm) width x 8'-0" (2400mm) height for each door leaf.
  • Egress door hardware only, with an "Alarm Will Sound" sign. (Confirm requirement with data center ops.)
  • Keypad/badge reader on the interior side. (Confirm requirement with data center ops.)
  • Provide adequate space to move equipment in and out and minimize turns. Door height to accommodate a minimum of 52U racks.
  • Coordinate with local codes and requirements.
  • Doors to be fire rated. Coordinate with local codes and requirements.
• Ceilings:
  • Install a lay-in ceiling within all new data centers if height is not restricted. Ceiling heights shall be between 10'-0" (3000mm) minimum and 12'-0" (3600mm); the higher in this range, the better. Coordinate the optimal ceiling elevation with the rack height, power distribution, cooling delivery method, containment system, and cable tray structure.
  • A structural ceiling grid may be considered as an alternative to the unistrut structure to carry the below-ceiling systems. Confirm with local codes and anticipated design loads. (http://www.datacenteraircontainment.com/structural-ceiling/)
  • In the event a ceiling is not required in a data center, seal all exposed insulation to prevent dusting and flaking. The sealing method/solution used to prevent dusting and flaking must be pre-approved by Dell.
  • Ceilings or other means of containment are recommended for high-density areas within the data center to maximize the efficiency of the cooling system. Ceiling and/or cooling solutions will be determined as part of the overall cooling design.
  • A ceiling plenum can be used as a containment option for hot air return or supplemental cold air supply if in-row cooling is not provided.
  • Ceilings are to be bright, humidity-resistant, sound-absorptive acoustic ceiling tile to allow easy access above the ceiling.

• Supplemental structure: includes metal channel strut framing spaced 8'-0" (2400mm) by 8'-0" (2400mm) on center. Overall spacing may vary depending on the maximum spacing between supports for electrical and telecommunications/data equipment requirements. This could be integral to or below the ceiling grid. Coordinate support for this based on local codes. Provide for 20 pounds/square foot (100 kilograms/square meter).
• Align the ceiling grid with the floor grid.
• Containment:
  • Containment systems are based on the required power densities. Containment shall be solid-wall hot aisle or cold aisle containment with sliding or swinging doors at the end of rows. Blank-off panels are required in openings in racks.
• Security:
  • Install access control consisting of a keypad/card reader. Add a phone near the entrance for connection to security in case of maintenance and to create an automatic work order ticket.
  • Provide security cameras for the required views and locations as per current Dell Security Standards.
  • Coordinate security design requirements with Dell Security for updated standards.
• Plumbing:
  • Provide non-potable make-up water where required for mechanical equipment. Provide shut-offs and pressure-reducing valves as coordinated with the mechanical equipment. Provide an appropriate in-line sediment filter.
  • Where required by code, install clear-water indirect waste drains on storm risers for HVAC condensate discharge. The drains shall be trapped, vented, and fitted with a check valve.
  • Water lines that are not essential to the data center should not run within the data center. This is not applicable to condensate lines associated with the data center cooling equipment.
• Fire Protection:
  • Provide a wet system or single-interlocked preaction system. Each room shall be a single zone.
  • Install FM Approved quick-response sprinklers with "intermediate temperature" sprinkler heads that have a temperature rating from 175 to 225 degrees F (79 to 107 degrees C).
  • Minimum density of 0.1 gpm/sq ft (approximately 4.1 mm/min) over a 1,500 sq ft (139 m2) demand area; see the demand calculation after this section.
  • Coordinate the sprinkler system with the containment solution when provided.
  • Ask the owner for additional fire suppression requirements.
  • Refer to Risk Management for additional requirements.
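The density and demand-area criterion above implies a minimum sprinkler water demand, worked out in the short sketch below. It is an illustrative calculation only (hose stream allowance and hydraulic losses are excluded); actual values must come from the project's hydraulic calculations.

    # Illustrative minimum sprinkler demand from the density/area criterion above.
    def sprinkler_demand_gpm(density_gpm_per_sqft: float = 0.1,
                             demand_area_sqft: float = 1500.0) -> float:
        """Minimum flow over the hydraulically most remote demand area."""
        return density_gpm_per_sqft * demand_area_sqft

    print(sprinkler_demand_gpm())  # 150.0 gpm (~568 L/min), before any hose stream allowance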

• Fire Suppression: In general, the data center areas supporting mission-critical applications shall be designed and constructed to provide fire ratings that meet local codes and Factory Mutual approval. All design elements (walls, doors, windows, ceilings, floors) should meet the same rating level.
  • Tier 1 and 2 – 1 hour minimum
  • Tier 3 – 2 hour rating
  · The data center, UPS room, and battery room shall be protected with an air-supervised, dry-pipe, pre-action fire sprinkler system. An FM200 system or equivalent is an acceptable alternative where local code allows.
  · Dry-Pipe Pre-action System: Provide a dual-interlock system with electric-electric controls (no pneumatics). Compressed air is to be maintained at 35 psi, alarms are to be set at 25 psi, and the compressor is to turn on at 30 psi. Provide a dry-pipe, pre-action zone, including a releasing panel for the Data Center, connected to the building wet-pipe fire sprinkler system.
  · The pre-action system shall be dual-interlock type, where one side is the activation of two smoke detectors and one heat sensor (or two heat sensors), and the second side is a loss of air pressure that is supervisory-monitored and sends a signal to the alarm system when the air pressure drops below 25 psi. System air pressure will be maintained at 35 psi during normal operation. (See the release-logic sketch after this section.)
  · All piping shall be galvanized Schedule 40 FM-approved piping.
  · A supervisory signal shall be installed to monitor low air.
  · Notifier-type system panel, separate from the building panel.
  · Smoke and heat detectors shall be co-located in the same locations.
  · All smoke duct detectors are to be supervised only by the fire panel. Air conditioning controls shall isolate the unit and alarm accordingly. The fire panel does not perform any control functions. Call out internal smoke detection in CRACs in the A/C section.
  · Under-floor smoke and heat detector heads shall not be located under equipment and shall be clearly labeled on the floor and easily accessible. All piping is to be located below the ceiling.
  · All extinguishers shall be clean agent type (typically CO2).
  · The inspector's drain outlet shall be located outside of the data center, either to the exterior of the building or to a sufficiently sized floor drain.
  · All fire suppression wiring shall be in a combination of EMT and/or MC flexible conduit.
  · All penetrations and ceiling and floor connections need to be waterproofed and/or fire-caulked in a manner that meets the fire rating levels. Motors and similar heavy-duty electrical devices shall have a local disconnect within sight of the load device.
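The dual-interlock requirement above means water is released only when both conditions are present. The sketch below restates that release logic as code; it is a simplified illustration of the condition described in this standard, not the actual releasing-panel program, and detector counts and thresholds must follow the approved fire protection design.

    # Simplified dual-interlock release logic from the description above (illustrative only).
    LOW_AIR_ALARM_PSI = 25.0  # supervisory low-air threshold

    def detection_confirmed(smoke_active: int, heat_active: int) -> bool:
        """Side 1: two smoke detectors plus one heat sensor, or two heat sensors."""
        return (smoke_active >= 2 and heat_active >= 1) or heat_active >= 2

    def release_water(smoke_active: int, heat_active: int, air_pressure_psi: float) -> bool:
        low_air = air_pressure_psi < LOW_AIR_ALARM_PSI  # Side 2: sprinkler head opened / piping breached
        return detection_confirmed(smoke_active, heat_active) and low_air

    print(release_water(2, 1, 20.0))  # True  - both interlocks satisfied, water released
    print(release_water(2, 1, 35.0))  # False - detection only, piping still holds air
    print(release_water(0, 0, 20.0))  # False - low air only (e.g., accidental head damage)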

• Mechanical:
  • Provide an (N+1) redundant mechanical system (see the unit-count sketch after this section).
  • Coordinate with the electrical system to provide concurrent maintainability.
  • Design high-density areas to provide proper cooling per the design load and increase cooling efficiency.
  • Computer Room Air Conditioning (CRAC) or Computer Room Air Handling (CRAH) units can be DX, chilled water, or dual-source type units. All cooling units are most efficient when higher aisle temperatures are returned to the cooling unit. The locally adopted energy code, economizer requirements, and available ceiling/underfloor space will all have large impacts on what type of system and air distribution is selected.
  • Larger ducted CRAC/H units should be designed to return hot air from the hot aisles and supply cold air to the cold aisles. If a ducted return is not feasible, an option might be to place CRAC/H units at the end of hot aisles and use return grille options from the cooling unit manufacturer.
  • Computer Room Air Conditioning Units (CRACs): CRACs use outdoor air-cooled condensing units and refrigerant piping to provide a flexible, economical option for small to mid-size Data Centers. They can be used in any climate and are most efficient when higher inlet temperatures are supplied. Due to their required proximity to the racks, they reduce the amount of white space available within the Data Center. CRACs can distribute and return air under floor, overhead, or a combination of the two. Pressurized supply plenums are not preferred but are sometimes unavoidable. CRACs are also available in in-row cooling (IRC) models, which increase cooling efficiency and remove the need for ductwork.
  • Computer Room Air Handling Units (CRAHs): CRAHs use chilled water to provide cooling. Depending on the adopted energy code of the location, chilled water is normally used for medium to large data centers or facilities that have an existing chilled water plant with available capacity. CRAHs are available in options similar to the CRACs described above.

  • In-Row Cooling Units (IRCs): IRCs provide a flexible, economical option for small to mid-size Data Centers. They can be used in any climate and are most efficient when higher inlet temperatures are supplied. IRCs are placed within the row of racks to provide direct cooling to the cold aisle and direct return from the hot aisle, thereby increasing cooling efficiency; because each unit takes up a rack space, they reduce the quantity of racks per row. Aside from being employed as a primary cooling system, in-row cooling units can also be considered for supplemental cooling within a Data Center where higher-density rack rows are installed. These generally smaller units also make meeting N+1 requirements easier by adding one additional in-row unit per aisle. Using in-row units reduces the future flexibility of the rack layout since they are installed within the rack rows; however, end-of-row space is not required as it is for larger ducted CRAC/H units. In-row cooling eases cooling of higher-density areas by increasing the number of in-row units around the higher-density rack areas.
  • Ducted CRAH units can be used in conjunction with IRC chilled water units to supply cooling to supplemental areas such as electrical and UPS rooms. Both systems would use the same chilled water system for cooling.
  • Custom Air Handling Units (Air-Side Economizer): Custom air handling units provide a highly efficient cooling system when installed in areas with mild climates. This system is primarily used in large Data Centers and provides the greatest amount of white space utilization by eliminating the installation of mechanical equipment inside the Data Center space. When an under-floor air distribution system is selected, supply plenum shafts must be provided within the building in order to deliver supply air from the AHU to under the raised floor. When the economizer cycle is not available due to unfavorable temperature and humidity conditions, the custom air handling units provide cooling to the Data Center through chilled water mechanical cooling.
  • Chiller systems can be water- or air-cooled depending on the most efficient design analysis and local energy codes.
  • Cooling systems will require emergency power backup consisting of a backup generator. HVAC controls should be on the UPS system.
  • Cooling shall not be backed up by the IT equipment UPS system.
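The (N+1) mechanical requirement above translates into a simple unit count once the design load and unit capacity are known. The sketch below shows that arithmetic with hypothetical example values; actual selections must reflect the cooling study for the specific project.

    import math

    # Illustrative N+1 unit count (hypothetical values, not Dell requirements).
    def cooling_unit_count(design_load_kw: float, unit_capacity_kw: float,
                           redundant_units: int = 1) -> int:
        """Units required = ceil(load / unit capacity) + redundant units."""
        return math.ceil(design_load_kw / unit_capacity_kw) + redundant_units

    # Example: 320 kW of IT load served by nominal 100 kW CRAH units
    print(cooling_unit_count(design_load_kw=320, unit_capacity_kw=100))  # 5 (4 duty + 1 standby)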

  • Design the humidification system in accordance with local climatic conditions; it may include controls on each air handling unit, a centralized system, or a combination of both.
  • Central humidification systems may be used if approved by IT and Facilities and if they meet the ASHRAE performance recommendations.
  • The parameters for humidity are a target of 20-80% relative humidity and are data center specific. Reference the ASHRAE TC 9.9 thermal guidelines.
  • If humidification is supplied on an individual basis at each air handling unit, the system must be equipped to operate in a uniform manner such that all units are in the same humidification mode (humidify or dehumidify, but not both at the same time).
  • Install drip pans under the evaporator/CRAC unit stands below the raised floor and equip them with leak detection systems.
  • Provide a separation of the chilled water piping system to be able to detect and isolate leak locations.
  • Route cabling and electrical distribution above the racks where a raised floor is provided. Piping shall be installed below the raised floor.
• Electrical:
  • Research the quality of the power service and the distribution equipment: draw-out type switchgear construction, circuit breakers, cast coil transformers, copper bus and feeders, and SPD devices.
  • Provide a dedicated emergency generator power system with N+1 redundancy, sized for 100% coverage of the data center equipment and support of the mechanical loads. Design with paralleling distribution switchgear and dedicated automatic transfer switches with the bypass/isolation option for computer equipment and for mechanical, lighting, and general power (receptacle) equipment.
  • Provide an uninterruptible power supply (UPS) system, (2N) redundant, to serve separate A and B side power distribution units (PDUs). The UPS is to be a static, double-conversion, high-efficiency UPS system with either flywheel or battery power backup. The UPS is to provide power for the Data Center electronic equipment, data communication network equipment, and the facility control/monitoring systems.
  • Power should be located above the floor to avoid interaction with water piping.
  • Follow Tier 3 requirements. Generators are N+1 redundant and all electrical distribution downstream is 2N.
  • A fifteen-minute battery ride-through time is required at full load, rated for end-of-life capacity, with a 10-year design lifetime (see the sizing sketch below). Review battery versus flywheel UPS systems on a case-by-case basis based on the reliability of the power provider.
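The fifteen-minute full-load ride-through above sets the minimum battery energy for each UPS module. A minimal sizing sketch follows, assuming hypothetical values for module load, inverter efficiency, and end-of-life derating; actual battery selection must come from the UPS manufacturer's sizing data.

    # Illustrative battery ride-through sizing (hypothetical efficiency/aging factors).
    def battery_energy_kwh(ups_load_kw: float, ride_through_min: float = 15.0,
                           inverter_efficiency: float = 0.95,
                           end_of_life_factor: float = 0.8) -> float:
        """Nameplate battery energy so end-of-life capacity still covers full-load ride-through."""
        energy_at_load = ups_load_kw * (ride_through_min / 60.0) / inverter_efficiency
        return energy_at_load / end_of_life_factor

    # Example: a 400 kW UPS module
    print(round(battery_energy_kwh(ups_load_kw=400), 1))  # ~131.6 kWh nameplate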

  • Critical Systems Infrastructure design and components shall achieve "concurrent maintainability" based on the Tier 3 definition.
  • Include redundant utility power feeders from separate substations if possible.
  • Install electrical panels within the lab.
  • Provide bus tap boxes and busway to distribute the power overhead to each rack.
  • Allow for the utilization of more than one breaker per power receptacle.
  • Provide busway for flexibility on power receptacles as a lab may evolve. Refer to the Facilities Electrical Safe Work Practices for Busways regarding busway selection criteria.
  • Provide power receptacles matching the PDU power cord (L6-30, for example).
  • Provide 2N redundancy and physical separation for electrical service and power distribution equipment: N+1 generators, dual UPSs, pad-mounted transformers, unit substations, switchboards, and distribution panels.
  • Generators:
    • Generator size shall be coordinated with the UPS and other critical equipment manufacturers and include technical and mechanical loads in the calculations.
    • Tier 3: N+1 designed generator system for the Data Center, IDF rooms, and life safety equipment, where permitted by code, for a minimum 36-hour run time at full load. Approximate size will be based on site capacity and shall carry the data center loads (UPS and mechanical loads).
    • Generators shall be installed outside of the building where local code allows.
    • The generator shall have a weather-proof and sound-attenuated housing. Exhaust shall be "turned up" or designed to eliminate the propensity for exhaust to be pulled back into the generator intake air louvers or discharged directly onto shrubbery or personnel.
    • Fuel tanks must be readily accessible and provide a secondary containment system sufficient to contain 120% of the tank volume (see the sizing sketch after this section). Where containment areas are located outside, adequate covers shall be installed to prevent the collection of storm water. Do not allow bio-diesel. Provide a fuel polishing station on a case-by-case basis based on the use of stored diesel.
    • As an alternative to covering outside secondary containment areas, containment may be sized to hold the volume of the tank plus the anticipated rainfall accumulation as defined by the authority having jurisdiction (for example, a 100-year, 24-hour rain event), with drainage valves that can be locked in the closed position.
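The 36-hour full-load run time and the 120% containment requirement above size the on-site fuel storage. The sketch below walks through that arithmetic with a hypothetical full-load fuel-burn rate; real values must come from the selected generator's datasheet, and it assumes the tank is sized to the required fuel volume.

    # Illustrative fuel and secondary-containment sizing (hypothetical burn rate).
    def fuel_and_containment(full_load_burn_gal_per_hr: float, run_time_hr: float = 36.0,
                             containment_factor: float = 1.2):
        """Return (usable fuel required, minimum secondary containment volume), in gallons."""
        fuel_gal = full_load_burn_gal_per_hr * run_time_hr
        containment_gal = containment_factor * fuel_gal  # 120% of tank volume; tank ~= required fuel
        return fuel_gal, containment_gal

    # Example: a generator burning 70 gal/hr at full load
    fuel, containment = fuel_and_containment(full_load_burn_gal_per_hr=70.0)
    print(f"Fuel: {fuel:.0f} gal, containment: {containment:.0f} gal minimum")  # 2520 gal, 3024 gal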

    • Double-walled fuel piping is required where piping runs outside of the secondary containment area.
    • Fuel tanks shall have an accessible means for re-fueling.
    • Fuel tanks shall have a leak alarm and a low-fuel alarm.
    • A load bank connection point with an approved disconnecting device shall be provided in an accessible location.
    • Power from the generators shall be provided through an automatic transfer switch or through automatically controlled (PLC-based control) switchgear with ties to the main power distribution boards.
    • Tier 3: Provide an isolation disconnect breaker that allows for concurrent maintenance on the generator.
    • The generator/transfer scheme shall be capable of providing full emergency power within 15 seconds of power loss.
    • Locate a "Generator running" annunciator within the data center, by the main lobby or security office, and tie it into the BAS.
    • Generators and/or fuel tanks may require pre-construction and/or operating permits from local authorities having jurisdiction (AHJs), typically fire or environmental agencies. Additionally, an AHJ may require specialized pollution abatement equipment, minimum exhaust stack height, on-board emissions monitoring, and other considerations, even when no permit is required. Permits may take 6-24 months to obtain before equipment may be installed, depending on the proposed units and the AHJ's processes, so it is important to consult with EHS regarding these requirements well before the project starts.
  • Install the UPS in a separate room from the data center.
  • The power metering/monitoring system is to incorporate the electrical switchgear, unit substations, generator power system, UPS, PDUs, computer room panels and/or busways, and their branch circuit breakers and/or tap cans via a data communication interface using a protocol consistent with the facility BMS (see the polling sketch after this section).
  • Panel boards, switchgear, motor control centers, and similar distribution equipment shall be capable of being made safe through the installation of lockout/tag-out devices when needed to protect workers downstream of the distribution equipment location.
  • PDUs are to support dual-power-fed computer equipment (A and B side). Power distribution runs from the PDUs either to the A and B side panels at the end of the equipment rows or to the A and B power-track-type busways overhead of each equipment row. Coordinate the quantity and size of the panels' circuit breakers or the busways' tap cans with a matrix for the Data Center equipment power connections.
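The metering requirement above calls for power data to reach the facility BMS over a standard data communication protocol (the monitoring section later in this standard names SNMP and Modbus). Below is a minimal polling sketch assuming a Modbus TCP meter and the third-party pymodbus package (3.x API; verify against the installed version); the host address, register number, and scaling are hypothetical and must be taken from the actual meter documentation.

    # Minimal Modbus TCP polling sketch (hypothetical meter address, register, and scaling).
    from pymodbus.client import ModbusTcpClient

    METER_HOST = "10.0.0.50"   # hypothetical meter IP
    KW_REGISTER = 100          # hypothetical holding register for total kW x 10
    SCALE = 0.1                # hypothetical scaling: register counts -> kW

    def read_meter_kw() -> float:
        """Read one power value so it can be forwarded to the BMS/DCIM."""
        client = ModbusTcpClient(METER_HOST)
        try:
            client.connect()
            result = client.read_holding_registers(KW_REGISTER, count=1)
            if result.isError():
                raise RuntimeError("Modbus read failed")
            return result.registers[0] * SCALE
        finally:
            client.close()

    if __name__ == "__main__":
        print(f"PDU input power: {read_meter_kw():.1f} kW")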

  • The ground system shall be designed in accordance with local codes and soil conditions. Where local code specifies a lesser standard than described herein, this standard, NFPA 70: NEC Article 250, shall supersede local code. Grounding shall be designed and installed in accordance with the Dell IT Telecommunications Grounding Specification (latest version). Reference the telecom standards (BICSI) for telecom grounding.
  • Provide a facility grounding system, including bonding of the raised floor grounding grid.
  • Lighting fixtures and controls: Selection should be coordinated with the Architect to achieve a scheme that complements the aesthetics and functional use of the space. System efficiency and controls are to comply with the latest energy code and sustainability requirements. In-row lighting levels shall be designed to a minimum of 538 lux (50 fc) at 36" (1m) above the finished floor in the aisles between rows of cabinets and between cabinets and walls. Where direct lighting fixtures are installed, locations shall be coordinated with the rack layout to center fixtures within each aisle. Where indirect lighting is installed, fixtures shall be installed above and perpendicular to the rack rows.
  • For an under-raised-floor power distribution option, provide emergency power off (EPO) buttons at all computer equipment area exits to de-activate power in the room only if required by local codes. Coordinate EPO requirements (if any) for the overhead power distribution system with the local AHJ; provide them only if the AHJ requires them (apply for a variance if possible). If EPO is required, require a dual-action button to avoid accidental activation.
• Controls and Monitoring:
  • The manufacturer will supply the computer room units with a control package. The building automation system will provide gateway communications to the package control system and control the units to prevent simultaneous cooling by one unit and heating by another.
  • The BMS may monitor all functions of the building's mechanical/electrical systems supporting the Data Center.
  • The BMS may be monitored by the user's data center management system.
  • The Monitoring and Alarm System shall have SNMP or Modbus capabilities for interfacing with diverse mechanical and electrical equipment and systems, and have web-enabled capabilities for remote monitoring access.
  • An external battery monitoring system that monitors the integrity (open cell) of the battery string is required for battery systems and must be tied into the monitoring system.

  • All UPS, PDU, battery, generator, fire alarm system, EPO system, cooling system, incoming power, critical water and air systems, standby power fuel systems, and leak detection systems shall have alarm points actively tied to the monitoring system.
  • Power monitoring: BAS and power monitoring (DCIM from IT) specific to data center operations. Coordinate the points into the BMS.
  • Leak detection ropes installed under each CRAC unit or in other locations will be configured to provide a warning alarm tied into the BMS.
  • All monitoring and controls equipment panels and supporting servers should preferably be installed in a secure, facilities-controlled area. Where this is not possible, the data center is an acceptable alternative.
  • Power to critical controls should be provided by two redundant UPS sources where the tier level requires it. Use of critical computing UPS power sources is acceptable in redundant configurations.
  • VESDA: early smoke detection for the dry system; add it to the monitoring system.
  • Fire protection: the dual interlock needs to be coordinated with the other systems. Follow FM Global recommendations when financially prudent.
• Structured Cabling: (Requirements to be further reviewed by the Data Center Infrastructure Team)
  • Provide cable tray above each rack/row. Wire mesh, or "basket" type, cable tray is the preferred cable management system, and multiple tiers of tray may be required. Provide cable tray with the appropriate width to support the anticipated cable volume. Tray size calculations should incorporate a 40% maximum fill ratio (see the fill-ratio sketch after this section).
  • Cable Tray Support: provided by a unistrut and threaded-rod trapeze support system attached to the structure above and braced according to local codes.
  • Waterfall dropouts will be installed on the tray system to support and protect cables transitioning from the tray down to the server cabinets.
  • Cable Tray Intersections and Turns: radius corners provided at all turns in the cable tray. Tray intersections will be cut, joined, fabricated, and supported according to manufacturer instructions. Use manufacturer-recommended components to support and join tray sections. The installed tray will be free of sharp edges at intersections, corners, and end locations.

  • Install a #2 AWG bare copper conductor from the Telecommunications Grounding Busbar (TGB) along the entire length of the cable tray system. Install the ground wire inside the cable tray and secure it to the cable tray for a neat appearance. Bond each section of cable tray to the Telecommunications Grounding System. Reference the BICSI standard for grounding.
  • Horizontal Cabling Systems: Cabling requirements to be determined by IT.
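The 40% maximum fill ratio above can be checked quickly for a proposed tray size. The sketch below does that with hypothetical cable counts and outside diameters; use actual cable OD values from the manufacturer datasheet when sizing tray for a project.

    import math

    # Illustrative cable tray fill-ratio check (hypothetical cable count and OD).
    MAX_FILL_RATIO = 0.40

    def tray_fill_ratio(cable_count: int, cable_od_in: float,
                        tray_width_in: float, tray_depth_in: float) -> float:
        """Cross-sectional fill ratio of round cables in a rectangular tray."""
        cable_area = cable_count * math.pi * (cable_od_in / 2) ** 2
        tray_area = tray_width_in * tray_depth_in
        return cable_area / tray_area

    # Example: 300 Cat6A cables (~0.30 in OD) in a 12 in x 4 in basket tray
    ratio = tray_fill_ratio(cable_count=300, cable_od_in=0.30, tray_width_in=12, tray_depth_in=4)
    print(f"Fill ratio: {ratio:.0%} (limit {MAX_FILL_RATIO:.0%})")  # ~44% -> exceeds the limit; upsize the tray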

Ergonomics, Safety and Sustainability
• Ergonomics: Refer to the EHS and Sustainability design criteria for ergonomic design.
• Safety: Refer to the EHS and Sustainability design criteria, including but not limited to walking surfaces, electrical service, and cord management to prevent tripping hazards.
  • Provide soundproofing materials to assist with noise control.
  • Provide aisle clearance of at least 48" (1.2m) for moving heavy equipment and racks within the space.
  • Electrical equipment, such as PDUs and electrical panels, is to have a minimum of 36" (0.9m) clearance in front of and at the sides of the equipment.
  • Refer to the provided equipment manuals for recommended clearance distances for all equipment.
  • Lighting should be adequate, based on Dell's defined best practices or the local AHJ, and coordinated within the rows of racks.
  • Provide a staging area adjacent to the Data Center for storing boxes and other packaging/combustibles (outside of the data center).
  • Design and/or provide a supplier for floor hole covers for raised floors.
  • Provide signage for raised floors stating that the space below the raised floor is a "Confined Space".
  • Provide eye wash stations adjacent to any wet lead-acid UPS batteries.
  • An Emergency Power Off (EPO) button is only required where required by local regulations. The EPO is to be located by the exit door serving as the emergency exit.
  • Provide adequate racking for safe storage of any materials in the data center.
  • Provide an area against a wall where ladders and gas cylinders can be secured and stored safely.
• Sustainability: Refer to the EHS and Sustainability design criteria, including but not limited to materials, installation, battery fluid and fuel containment, energy-efficient lighting and UPS systems, segregated hot or cold aisle configurations to optimize cooling efficiency, and fuel tank containment design.

  • Cooling equipment shall meet Dell's GHG reduction goal criteria and local regulatory requirements for refrigerant type. Single-pass water-cooled units are prohibited.
  • Provide UPS rooms with secondary containment and spill clean-up equipment for battery fluid leaks.
  • If generators and fuel tanks are to be provided, see the note above regarding emergency generator and fuel tank permits.

Technology
• Data and security to be provided as noted.

Signage
• Room identifier label for the data center.

Specialties and Equipment
• Raised Access Floor System
• Containment System

Design References
• Building Standards
• Brand Guidelines: Refer to https://brand.delltechnologies.com/dell-facilities/ for Dell Brand Guidelines.
• EHS and Sustainability Design Criteria: Refer to EHS documentation.
• IT Standards
• Risk Management: Refer to Dell/local FM Global representative for guidelines and recommendations.

PHOTOS | DIAGRAMS | DRAWINGS
• Lab/Data Center, Durham, NC (photos)
• Anti-Passback Device (photo)

