Stoops Consulting, Inc.
Information Technologies Planning / Engineering / Project Management
PO Box 1224, Marysville, WA 98270
Phone: (360) 913-0102   Fax: (206) 274-4886
www.stoopsconsulting.com

DATA CENTER DESIGN CRITERIA

DATE: APRIL 22, 2008
AUTHOR: MARK D. STOOPS, RCDD



TABLE OF CONTENTS

General
Best Practices and Standards
  The Uptime Institute, Inc.
    Raised Floors
    Wattage Per Square Foot
  The ANSI TIA-942
Physical Space
Cabinet Size
Seismic Protection
Ceiling Height
Power Per Cabinet
Electrical Outlets
Additional Power Supply
HVAC
  System Sizing
  Air Handling Methods
IT System Interconnection


TABLE OF TABLES

Table 1 - The Uptime Institute's Tier Attributes and Statistics
Table 2 - Computer Room Watts Per Square Foot Comparison Based on Cabinet Size
Table 3 - Watts Per Square Foot Comparison Based on Power Supply Capacity
Table 4 - Watts Per Square Foot: Old Technology vs. New Technology
Table 5 - Suggested Updates to The Uptime Institute's Data

TABLE OF FIGURES

Figure 1 - ANSI/TIA/EIA 942 Computer Room Areas and Interconnection
Figure 2 - Simple Computer Room Design
Figure 3 - 30-Cabinet Computer Room Design


GENERAL

In today's business environment, the computer room is a mission-critical core component. Computer rooms must be designed to support the business not only in the current business setting, but as the business grows and changes. The key to a scalable computer room is a design based on industry-accepted standards and best practices.

Computer room design must address both the services and systems that support the center (e.g., electricity, HVAC) and the systems and services that will be installed in the data center (e.g., servers, switches). The focus of this section is on the systems and services that support the computer room, as well as the physical infrastructure within the computer room that supports the systems and services residing there.

Scalability, as it relates to the design of a computer room, refers to the ease with which the computer room can be expanded or upgraded. Scalability means new computing equipment can be deployed easily, while legacy equipment can be replaced or upgraded to support new missions. Ensuring scalability requires that some difficult issues be settled at the outset so that they do not become roadblocks later on. The issues that absorb the most effort during computer room design are physical space, electrical load, and HVAC.

From a best practices/standards perspective, there are two key areas for discussion. The first is the accepted best-practice tiered classification for data centers (computer rooms) established by The Uptime Institute, Inc., which defines a tier structure based on key attributes and statistics. The second is the ANSI TIA-942 Telecommunications Infrastructure Standard for Data Centers, which defines a standardized model for the makeup of the computer room regardless of its tier classification.

BEST PRACTICES AND STANDARDS

THE UPTIME INSTITUTE, INC.

The Uptime Institute's tier classification established a four-tier ranking for computer rooms. This ranking is based on the following computer room attributes:

Power and cooling delivery paths

Redundant components

Support space to raised floor ratio

Initial watts per square foot

Ultimate watts per square foot

Raised floor height

Floor loading pounds per square foot

Utility voltage

In addition to the attributes listed above, the following statistics add to the comparison of the four tiers:

Months to implement


Year first deployed

Construction $ per square foot

Annual IT downtime due to site

Site availability

The following table from The Uptime Institute compares the above attributes and statistics for the four tiers.

Table 1 - The Uptime Institute's Tier Attributes and Statistics

Attribute / Statistic                 Tier I      Tier II     Tier III       Tier IV
Power and Cooling Delivery Paths      1 Active    1 Active    1 Active +     2 Active
                                                              1 Passive
Redundant Components                  N           N + 1       N + 1          2(N + 1)
Support Space to Raised Floor Ratio   20%         30%         80 – 90%       100%
Initial Watts / Square Foot           20 – 30     40 – 50     40 – 60        50 – 80
Ultimate Watts / Square Foot          20 – 30     40 – 50     100 – 150      150+
Raised Floor Height                   12”         18”         30 – 36”       30 – 36”
Floor Loading Pounds / Square Foot    85          100         150            150+
Utility Voltage                       208, 480    208, 480    12 – 15 kV     12 – 15 kV
Months to Implement                   3           3 – 6       15 – 20        15 – 20
Year First Deployed                   1965        1970        1985           1995
Construction $ / Square Foot          $450        $600        $900           $1,100+
Annual IT Downtime Due to Site        28.8 hrs    22.0 hrs    1.6 hrs        0.4 hrs
Site Availability                     99.671%     99.749%     99.982%        99.995%
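The last two rows of Table 1 are two views of the same quantity: annual downtime is simply the unavailable fraction of an 8,760-hour year. A short script (not part of the original report) confirms the rows agree:

```python
# Cross-check Table 1: annual downtime = (1 - availability) * hours/year.
HOURS_PER_YEAR = 8760

tiers = {  # site availability figures from Table 1
    "Tier I": 0.99671,
    "Tier II": 0.99749,
    "Tier III": 0.99982,
    "Tier IV": 0.99995,
}

for tier, availability in tiers.items():
    downtime_hrs = (1 - availability) * HOURS_PER_YEAR
    print(f"{tier}: {downtime_hrs:.1f} hrs/yr")
# Matches the table: 28.8, 22.0, 1.6, and 0.4 hours respectively.
```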

There are two issues presented by The Uptime Institute that need further discussion: raised floors and watts per square foot.

RAISED FLOORS

Raised floors are common in computer rooms that were designed and built in the 1970s; a raised-floor environment was a necessity for the equipment of that era, which was designed to be cooled by cold air traveling up through the cabinet from below. Many modern computer rooms are designed with a raised floor simply because that is the way it has always been done. The equipment in a modern computer room is not designed to take advantage of this up-flow


HVAC design. It is, in fact, designed to be cooled by air flow from front to back. For equipment cooled by an updraft system, good efficiency could be achieved by ensuring the raised-floor system was properly pressurized. The same cannot be said for today's computer room environment.

There is much discussion about hot and cold aisles in today's computer rooms. The concept is to concentrate the cold air at the front of the equipment cabinets and allow the front-to-rear internal fan systems to capture the cold air and move it across the components, exhausting the now-hot air to the rear hot aisle. In a raised-floor computer room, a number of things work against the end goal of adequate cooling for all equipment in the cabinet.

First, consider the supply air flow dynamics. Cold air tends to migrate down, not up. Therefore, "stacking" a column of cold air into the cold aisle requires pushing a significant volume of cold air up from the under-floor area. This was not a problem for the older equipment designed for updraft cooling, because the raised-floor system had large openings under the cabinets allowing this air movement, and the cabinets had large internal air paths. Today, this air must flow up into the aisle where the staff must also be able to walk and work, meaning the air has to flow up through perforated tiles. These tiles were not intended to support the volume needed for equipment cooling. In addition, any floor tiles that have been removed will significantly impact air flow in the under-floor plenum. Tiles may be removed during normal maintenance or new installation, and these disruptions may last for hours, during which there will be an adverse effect on the flow of cooling air. It is also important to remember that in a raised-floor environment there is a significant amount of network and electrical cabling in the under-floor area. These act as restrictors to the air flow, again impacting the delivery of the necessary volume of cold air to the cold aisle.

As equipment densities increase, the heat load in the computer room also increases. As the heat load increases, the volume of cold air for cooling must also increase; this is why the height of the raised floor increases as the heat load increases. So with higher heat loads, it is necessary to move a greater volume of cold air through the restrictive under-floor space and the perforated tiles.

At the other end of the heat transfer cycle, we have the exhausted hot air. This hot air wants to move up. As the hot air moves up out of the hot aisle, it begins to cool and as such tends to flow downward. The volume of hot air coming out of the hot aisle keeps this air from falling back into the hot aisle, which leaves the cold aisle as the remaining place for it to go. The result is that the upper level of the cold aisle contains warm, not cold, air, meaning the intake air for equipment in the upper 30% of the cabinet is already warmer than many hardware manufacturers' recommendations. This is why in many computer rooms you will find the upper 30% of the cabinets either unused or used only for inactive components.

WATTAGE PER SQUARE FOOT

The wattage per square foot noted in Table 1 is an interesting quantity. First, consider the footprint of data cabinets. In the 80s, a deep equipment cabinet was 36 inches, and the most common depth was 30 inches. IT equipment manufacturers have been delivering hardware in decreasing height form factors, and this decrease in hardware height has been accompanied by an increase in equipment depth. Considering the depth of today's hardware, the management of cables at the rear of the hardware, the management of network cabling at the rear of the cabinets, and the delivery of power via power distribution units, cabinets are now over 48 inches in depth.


The following table compares two computer rooms. Each room has two rows of cabinets, and in each the cabinets are 24 inches wide. The first has 30-inch deep cabinets; the second has 48-inch deep cabinets. The table shows that, when fully utilized, roughly two-thirds of the floor area is aisle space and one-third is equipment space.

Table 2 - Computer Room Watts Per Square Foot Comparison Based on Cabinet Size

Cabinet   Room    Room    Total       Sq Footage      % Sq Footage
Depth     Width   Depth   Sq Footage  For Equipment   Equipment
30”       14’     23’     322         100             31%
48”       20’     23’     469         160             34%
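The equipment-space percentages above can be reproduced directly from the stated dimensions. This sketch assumes the 24-inch cabinet width from the text and the 20-cabinet, two-row layout implied by the 100 and 160 sq ft equipment figures:

```python
# Recompute Table 2's equipment-space percentages.
# Assumptions: cabinets are 24 in wide, 20 cabinets per room (two rows).
def equipment_pct(cab_depth_in, cabinets, room_sqft):
    cab_sqft = (24 / 12) * (cab_depth_in / 12)  # cabinet footprint in sq ft
    return 100 * cab_sqft * cabinets / room_sqft

print(round(equipment_pct(30, 20, 322)))  # 30" cabinets -> 31 (%)
print(round(equipment_pct(48, 20, 469)))  # 48" cabinets -> 34 (%)
```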

Now consider the increase in the power/heat load per cabinet over the past ten years. For comparison, I have used a cabinet with 42U¹ of equipment mounting space. Table 3 compares 7U servers with 400W power supplies to 1U servers with 700W power supplies.

Table 3 - Watts Per Square Foot Comparison Based on Power Supply Capacity

Server   Power Per   Servers Per   Watts Per   Watts Per Cabinet Sq Ft   Watts Per Cabinet Sq Ft
Size     Server      Cabinet       Cabinet     (30” Deep Cabinet)        (48” Deep Cabinet)
7U       400W        6             2,400W      480W                      300W
1U       700W        21²           14,700W     2,940W                    1,838W

Now consider the watts-per-square-foot difference between computer rooms populated with 7U 400W servers and those populated with 1U 700W servers:

Table 4 – Watts Per Square Foot: Old Technology vs. New Technology

Cabinet   Server   Watts Per   Cabinets   Total Watts       Total Sq Footage   Watts Per Data
Size      Size     Cabinet                Per Data Center   Of Data Center     Center Sq Foot
30”       7U       2,400W      20         48,000W           322                149W
30”       1U       14,700W     20         294,000W          322                913W
48”       7U       2,400W      20         48,000W           469                102W
48”       1U       14,700W     20         294,000W          469                627W
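The watts-per-square-foot column follows from the cabinet loads, the cabinet count, and the Table 2 room footprints. As a sanity check on the arithmetic (the 20-cabinet count is taken from the table itself):

```python
# Recompute Table 4's watts per data center square foot.
# Assumptions: 20 cabinets per room; room footprints of 322 sq ft
# (30" cabinets) and 469 sq ft (48" cabinets) from Table 2.
CABINETS = 20

def watts_per_sqft(watts_per_cabinet, room_sqft):
    return watts_per_cabinet * CABINETS / room_sqft

for room_sqft in (322, 469):
    for server, w_per_cab in (("7U", 2400), ("1U", 14700)):
        print(f"{room_sqft} sq ft, {server}: "
              f"{watts_per_sqft(w_per_cab, room_sqft):.0f} W/sq ft")
```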

Table 4 shows that, based on currently available equipment, the watts-per-square-foot data in The Uptime Institute table is low and should perhaps be increased as shown in Table 5.

¹ 1U of equipment mounting space equals 1.75” of vertical space.

² Research and development efforts by today's data equipment manufacturers have shown the current maximum manageable heat load for a cabinet with passive cooling is 10 kW to 15 kW. On this basis, with 1U servers, at best only 50% of a cabinet can be utilized.


Table 5 - Suggested Updates to The Uptime Institute's Data

Attribute                      Tier I      Tier II     Tier III    Tier IV
Initial Watts / Square Foot    100 – 150   100 – 150   100 – 150   100 – 150
Ultimate Watts / Square Foot   650 – 950   650 – 950   650 – 950   650 – 950

THE ANSI TIA-942

The ANSI TIA-942 Telecommunications Infrastructure Standard for Data Centers has been adopted as the industry standard for data center and computer room design. The standard addresses architectural design, electrical and HVAC, bonding and grounding, and structured wiring.

The architectural design speaks to the size and construction of the space. It is important that the computer room and associated support spaces have the necessary space for both the equipment and the people who will work on the equipment. It is also important that the space and the equipment in it be designed to address both personal safety and equipment longevity. For example, it is possible to develop a design for seismic bracing that provides a safe environment for personnel yet does nothing to address the impact of a seismic event on hard disk drives.

Electrical and HVAC design are interrelated: the more electrical load within a space, the more HVAC load; in other words, watts in means BTUs out. There is also the issue of the larger electrical load presented by the high-density electronics installed in computer rooms today.

Bonding and grounding are as important now as they have always been. The standard addresses that importance and provides guidance for the design of the bonding and grounding system.

Structured wiring has been an important design issue for a number of years as well. Much has been done to guide the designer in cabling office spaces; this standard brings the computer room into the structured cabling design arena.

The standard provides a model for computer room design based on zones or areas where similar systems and services are grouped. This gives designers a model that allows for maximum utilization of space for active components while providing for efficient interconnection of equipment through the structured cabling. ANSI/TIA/EIA 942 defines four key areas for the computer room. Those areas are:

The Entrance Room - where service providers deliver service to the building. There will often be diverse³ entrance paths for the service providers.

³ Diverse routing was defined by the telephone companies in the late 70s as two service connections to a business whose routes never come closer than 30 feet once they leave the customer's property. This 30-foot standard came about because the typical right of way for an alley is 20 feet in width, while a typical street right of way is at least 50 feet in width. This allowed route diversity to be provided with two cables on opposite sides of a street so long as the routes never traverse the same alley. The standard was established because it is quite common for a construction excavation to be open across an alley, while it is very uncommon for one to be open across an entire city street: both legs of a diverse route might be dug up in an alley, but it is not likely they would both be dug up in a street.


The Office and Support Areas - note these are separate rooms, not part of the computer room equipment area.

A Telecom Room or Rooms - these rooms serve as the point of termination for the horizontal cabling coming from the office and support areas. The workgroup switches serving the office and support areas are located here.

The Computer Room - the actual equipment room/space housing the network/computer hardware. This room is further broken down into zones:

The Equipment Distribution Area houses the servers and application-specific hardware. The Equipment Distribution Area is connected to the Horizontal Distribution Area by horizontal cabling.

In very large computer rooms/data centers, there are Zone Distribution Areas, where a number of Equipment Distribution Areas aggregate via horizontal cabling to a switch which is, in turn, connected through horizontal cabling to the Horizontal Distribution Area.

The Horizontal Distribution Area houses the LAN/SAN/KVM hardware. The Horizontal Distribution Areas are tied to the Main Distribution Area by backbone cabling.

The Main Distribution Area contains the routers, backbone switches, multiplex equipment, and PBX. The Main Distribution Area has backbone cable connections to the Entrance Room and the Telecom Rooms.

The following diagram provides a representation of the areas and their interconnection in a computer room as described by ANSI/TIA/EIA 942.


Figure 1 – ANSI/TIA/EIA 942 Computer Room Areas and Interconnection

PHYSICAL SPACE

Considering the information from The Uptime Institute and the ANSI TIA-942 Telecommunications Infrastructure Standard for Data Centers, further discussion is necessary in the areas of physical space,



power, HVAC, and IT system interconnection as they relate to the Computer Room within the Data Center.

Raised floor versus solid floor will be a topic of energetic debate for years to come. Consider the issues discussed above relating to cold air delivery to the cold aisle in a raised-floor environment, and also the added cost of a raised-floor system over a solid floor. A detailed cost-benefit analysis specific to a project design might be necessary to show which solution is best for that particular project, but in general the problems outweigh the benefits. A solid-floor computer room will have a lower initial cost and a more cost-efficient HVAC delivery system; the raised-floor environment does not appear to offer offsetting advantages.

CABINET SIZE

Before any recommendations can be provided for the overall size of a computer room, a decision has to be reached regarding the size of the equipment cabinets that will be utilized in the space. In general, considering the depth of new 1U servers, the cable management at the rear of the server, the power delivery systems at the rear of the cabinet, and cabinet cable management in general, a deeper cabinet will serve better than a shallower one. A 48-inch deep cabinet sounds like a waste of space; however, all things considered, the difference may be the ability to have a cabinet with closed, secure doors at the rear versus an open cabinet.

Where in the past we strived for 36-inch aisles, we now need to consider the aisle width necessary for equipment installation in a 48-inch deep cabinet (and even the possibility of having to replace the cabinet itself). A standard aisle width of 4.5 feet should now be the norm.

The actual dimensions of the computer room need to be based on an inventory of the equipment that is to be initially installed and a forecast of the equipment to be added over the next 10 to 15 years.
In addition, the three types of distribution areas identified in ANSI TIA-942 need to be established and maintained. The following diagram shows the simplest computer room where there is a single cabinet for the three areas: Main Distribution, Horizontal Distribution, and Equipment Distribution.


Figure 2 – Simple Computer Room Design

Expanding this model to include three rows of ten cabinets each results in the following:



Figure 3 - 30-Cabinet Computer Room Design

This 30-cabinet module could be used as a base section to model a larger computer room in multiples of 30 cabinets. In addition, the roughly 30-foot by 30-foot size will fit well into an architectural plan for a building.

SEISMIC PROTECTION

The primary goal of any seismic protection system is preventing personal injury due to falling equipment during a seismic event. To this end, most seismic systems consist of hardware bracing and strapping



that will keep the equipment upright and in place so long as the building structure to which it is attached stays upright and in place. Looking at this from another angle, one can readily see that if the structure is moving in multiple directions in the horizontal plane at a relatively high frequency, then hard disk drives within a chassis secured to a cabinet braced to that structure are, in turn, moving in multiple directions in the horizontal plane at a relatively high frequency. In other words, during a seismic event in a seismically braced environment, hard disk drives are subject to significant shaking. If the power has not failed, these same disk drives probably have platters spinning at high speed when the significant shaking occurs. The likely result is hardware failure of the disk drives. Putting this in worst-case verbiage: not only can a seismic event damage or destroy the computer room, it can also damage or destroy the data in the computer room. Insurance can repair or replace the computer room; insurance can do nothing to restore the data.

An alternative to physical bracing is a system that allows the equipment cabinets within the computer room to remain relatively in place with reference to the space moving around them. A base isolation seismic system provides both personnel safety and some level of protection for the hard disk drives holding the all-important data.

CEILING HEIGHT

The ceiling height in the computer room is not usually a topic of discussion. It is, however, an important item to consider early in the design process. When careful consideration is given to the vertical space occupied by the cabinets, the seismic platforms, the necessary clearance between the top of the cabinet and the lowest cable tray⁴, the depth of cable runway including working space, and the required space for HVAC ducting, it is not uncommon to require a clear ceiling height of at least 12’ 6”. Depending on the type of construction, be it open web-truss or large beam, the distance from the floor to the bottom of the roof deck could be in excess of 14’ 6”.

POWER PER CABINET

Power requirements in computer rooms can be a difficult topic. The most direct approach is to begin the study by considering the possible load for a single cabinet. (It is important to stipulate this is a worst-case discussion; a customer will likely make a conscious decision not to load the cabinets to the extent used here.) Lightly loaded cabinets would result in somewhat smaller electrical loads; however, since it is possible to load a cabinet to the extent in my example, the fully utilized cabinet will be the basis for discussion. Earlier in this document I showed that a single 42U cabinet equipped with 21 servers, each with 700W power supplies⁵, equals a load of 14,700W.

When considering the electrical service to a cabinet, it is also important to consider the effect of redundant power supplies on the circuits. Computer equipment with redundant power supplies operates in a shared-load condition when power is available to both the primary and redundant power supplies. In the event of a power loss to one of the supplies, the entire load is picked up by the other supply. Therefore, when systems with redundant supplies are connected

⁴ When a base isolation seismic system is used, it is necessary to leave a minimum of 12 inches between the top of the cabinets and the bottom of the cable runway. This clearance, and the use of loose tails for power and network cable connections, provides for the lateral movement the cabinet will experience during a seismic event.

⁵ The HP DL360 G5 has dual 700W power supplies.


to separate electrical feeds, it is important that each individual circuit not be loaded above 40% of its rated capacity (based on the overall requirement of not loading a circuit breaker beyond 80% of its rating). This ensures that in the event of a circuit failure, the increased load will not overload the remaining circuit. Considering the redundant power supplies, the 21 1U servers present two equal loads of 7,350W or, at 208V, two equal loads of roughly 35A. Remember, however, that since this is a redundant power supply model, we have to consider that 35A to be the 40% loading. In the event we lose the redundant supplies in all of the servers, the load on the primary side will increase to the full 70A. Using a 208V 3-phase 30A circuit as a standard, we will need three 208V 3-phase 30A circuits for the primary supplies and another three for the redundant side, per cabinet. In addition, to allow for some ancillary 120V equipment, we should provide a 120V 20A circuit to each cabinet. It is then a simple matter of multiplying this per-cabinet requirement by the number of cabinets to be installed in the computer room.

ELECTRICAL OUTLETS

The power circuits discussed in the previous paragraphs would be considered technical outlets, meaning they will likely have network hardware connected to them. In addition to these technical outlets, the computer room will need some number of convenience electrical outlets for use with test equipment and other tools. The spacing of these outlets along the walls can be left to what the electrical code requires; the only stipulation should be that each location be a quad outlet, not a duplex outlet.

ADDITIONAL POWER SUPPLY

The remaining power issues to be discussed are the UPS and the generator. The first decision relates to generator backup for the computer room: with a generator, the associated UPS run-time can be shorter; without one, it must be longer. Without a generator, the UPS run-time needs to be long enough to allow for graceful system shutdown, and where more than a few minutes of run-time is required, a UPS with a battery string will be needed. In a generator-backed computer room, consideration can be given to a kinetic or flywheel UPS. These systems have environmental benefits as well as lower maintenance costs, as there are no batteries. They can, however, be prone to failure in a seismic event and as such may not be suitable in areas where seismic events can occur.

HVAC

There are two key topics when discussing HVAC and computer rooms: system sizing and air handling methods. The purpose of the HVAC system is to maintain the environment within the Data Center within the following parameters per ANSI TIA-942. The temperature and humidity shall be controlled to provide the following continuous operating ranges:

Dry Bulb Temperature: 20° C (68° F) to 25° C (77° F)
Relative Humidity: 40% to 55%
Maximum Dew Point: 21° C (69.8° F)
Maximum Rate of Change: 5° C (9° F) per hour


Humidification and dehumidification equipment may be required depending upon local environmental conditions.

A positive pressure differential with respect to surrounding areas should be provided.
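As an illustrative sketch only, the operating ranges quoted above can be expressed as a simple check. The function name and the decision to skip the rate-of-change limit are assumptions of this example; the thresholds are the TIA-942 figures from the text.

```python
def within_tia942_ranges(dry_bulb_c, rel_humidity_pct, dew_point_c):
    """True if measured conditions fall inside the TIA-942 continuous
    operating ranges quoted above (rate of change not checked)."""
    return (20.0 <= dry_bulb_c <= 25.0            # dry bulb 20-25 C
            and 40.0 <= rel_humidity_pct <= 55.0  # RH 40-55%
            and dew_point_c <= 21.0)              # dew point <= 21 C

print(within_tia942_ranges(22.0, 45.0, 10.0))   # -> True
print(within_tia942_ranges(27.0, 45.0, 10.0))   # -> False
```

A check of this shape could back an environmental monitoring alarm, though a real system would also track the 5° C per hour rate-of-change limit over time.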

SYSTEM SIZING
System sizing relates to the expected electrical load in the room. Using the concept of sizing per cabinet, again consider the 42U cabinet with a 14,700 W electrical load when fully loaded. Here again it is important to stipulate that this is a worst-case discussion.

50,186 BTU/hr = 3,414 BTU/hr per kW x 14.7 kW

50,186 BTU/hr requires 4.2 tons of cooling:

4.2 tons = 50,186 BTU/hr ÷ 12,000 BTU/hr per ton

For the simple three-cabinet computer room described above, the HVAC system would be 12.6 tons:

4.2 tons x 3 = 12.6 tons

For the expanded 30-foot by 30-foot computer room model with 30 cabinets, the HVAC system would be 126 tons:

4.2 tons x 30 = 126 tons
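The tonnage arithmetic above can be sketched as follows. The conversion factors are the ones used in the text; the helper name is arbitrary.

```python
BTU_PER_HR_PER_KW = 3414     # conversion factor used in the text
BTU_PER_HR_PER_TON = 12000   # one ton of cooling

def cooling_tons(load_kw):
    """Cooling required, in tons, for a given electrical load in kW."""
    return load_kw * BTU_PER_HR_PER_KW / BTU_PER_HR_PER_TON

per_cabinet = round(cooling_tons(14.7), 1)   # 4.2 tons per full cabinet
print(per_cabinet)                            # -> 4.2
print(round(per_cabinet * 3, 1))              # three-cabinet room -> 12.6
print(round(per_cabinet * 30, 1))             # thirty-cabinet room -> 126.0
```

As in the text, the per-cabinet figure is rounded before multiplying out to the room totals.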

AIR HANDLING METHODS
Delivering the HVAC into the computer room is accomplished with a supply plenum and a return plenum. The discussion above of raised-floor systems suggests that a raised floor may not be the best delivery plenum and, in fact, supports designing a computer room with both the supply and return air plenums ducted in the overhead. A continuous load of cold air dropped into the cold aisle ensures the entire cabinet is supplied with the necessary cold air, and a properly ducted return system limits the flow of hot air back to the cold aisle. Connecting the return air plenum to the top of each cabinet on the hot side will result in the most efficient heat transfer system for the computer room.

As previously mentioned, today's IT hardware is designed for cooling by air flow from the front of the equipment chassis through to the back. As such, the delivery model uses a cold front aisle and a hot rear aisle. Remember that air follows the path of least resistance. This explains the critical need for sealing the spaces between the cold and hot aisles: all space on the cold-aisle side of the equipment cabinets not occupied by network hardware should be sealed to limit the flow of cold air around, rather than through, the network equipment chassis. A few discrete network hardware items, such as the Cisco 6000 series, require a side-to-side air flow. Computer rooms that will include this hardware need specific attention paid to that unique air flow requirement.


IT SYSTEM INTERCONNECTION
The ANSI TIA-942 Telecommunications Infrastructure Standard for Data Centers brings the concept of structured cabling into the computer room. There is a tendency to implement this by using cabinet space, or at least floor space that could serve equipment cabinets, as the location for racks of patch panels. Consideration should be given to moving the interconnect fields to the walls of the computer room, freeing up the floor space for cabinets that can house equipment.

At this centralized wall-field interconnect, consideration should be given to a cross connect system that does not require patch cords. An interconnect system that utilizes jumper wire allows for excellent cross connect management as a matter of course: as the cross connects are installed, the cross connect wire is routed and managed coincident with the installation. Patch panels with patch cords, by contrast, afford the technician an opportunity to install a connection while ignoring the patch cord management system. Patch panels do serve as a convenient method of interconnection where one of the ends being connected has to be an 8-pin modular jack, such as the NIC on a server, since this allows the connection to be made with a patch cord. While these connections are easy to make, they can quickly become a management nightmare.

In addition, consideration needs to be given to how a device in Cabinet A interconnects to a device in Cabinet B. In many locations, this connection is accomplished by patch cords routed between cabinets, which exacerbates the cable management situation and increases the difficulty of system maintenance. Once an agreement has been reached on the centralized interconnect system, the design can be expanded to incorporate the equipment cabinets and switches. Each cabinet should have both copper and fiber multi-cable harnesses installed from the cabinet to the centralized interconnect location.
These harnesses would terminate in patch panels at the cabinet. At the centralized interconnect location, the fiber strands would terminate in fiber interconnect panels and the copper cables would terminate in jumper-wire-based termination blocks. Multi-port Ethernet switches and voice systems should be provided with switch tail harnesses that route from the hardware ports to the centralized interconnect location. The following steps would complete the connection from a server NIC in Cabinet A to a switch port in Cabinet B:

1. In Cabinet A, connect a patch cord from the NIC to a port in cabinet patch panel A-1.
2. In Cabinet B, connect a patch cord from the switch port to a port in cabinet patch panel B-1.
3. At the centralized interconnect location, install a cross connect jumper from the Panel A-1 termination position to the Panel B-1 position.
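The three steps above can also be recorded as structured data, which is one way of keeping the cross connect documentation current as connections are made. This sketch, including the dictionary fields and port numbers, is purely illustrative; the cabinet and panel names follow the text.

```python
# Each physical hop in the Cabinet A NIC -> Cabinet B switch port path,
# listed in the order the signal traverses them.
connections = [
    {"where": "Cabinet A", "a": "server NIC",
     "b": "Panel A-1 port 1", "media": "patch cord"},
    {"where": "centralized interconnect", "a": "Panel A-1 position 1",
     "b": "Panel B-1 position 1", "media": "jumper wire"},
    {"where": "Cabinet B", "a": "Panel B-1 port 1",
     "b": "switch port", "media": "patch cord"},
]

def trace(conns):
    """Render the end-to-end path as a readable string."""
    hops = [conns[0]["a"]] + [c["b"] for c in conns]
    return " -> ".join(hops)

print(trace(connections))
```

Documenting each hop this way makes the end-to-end path auditable, which is exactly the management discipline the jumper-wire cross connect field is meant to enforce.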