Data Center Remodeling Presented by Jim Rarus, Wayne RESA MAEDS 48 October 24, 2012


TRANSCRIPT

Data Center Remodeling


Presented by Jim Rarus, Wayne RESA

MAEDS 48, October 24, 2012

Agenda
- Brief history
- Overview of our facility
- Challenges
- Positives
- Remodeling process
- Remodeling in pictures
- Hardware highlights
- Summary

Wayne RESA
- One of 57 regional educational support centers
- Established by legislation in 1962; 50th anniversary
- Supports 34 school districts in Wayne County, 110 public school academies, 313,000 students
- Largest in the state, 13th nationwide
- Services: consolidating technical services, teacher training, special education and vocational services
- Wayne RESA Education Center

Computer Services Consortium
- Services to 72 districts, ISDs, and other agencies statewide
- Business services: payroll, finance
- Student services: grade reporting, attendance, transportation
- Technical services: countywide WAN, connectivity to application servers, Internet (Merit affiliate since 1994), network management and consulting
- 62 staff members

Facility
- Education Center built in 1976
- Computer Services wing: 10,000 square feet of raised floor
- Original data center: 4,000 square feet
- 1970s and '80s needed all the space
- The big one: IBM 3033 water-cooled mainframe

Systems Operation
- Equipment and people
- Tapes
- Disk
- Printing
- Communications
- Adjust modems

Changes
- Hardware from multiple vendors
- New functions: modems, gateway PCs (not rackable)
- Early networking: we survived dial-up Internet
- Endless number of modem power bricks

Use of space evolves
- Mainframe downsizes: one end of the data center
- More communications equipment and file servers: other end of the data center
- Use the middle for office space (more consultant staff); partial walls help block noise
- Airflow under the floor keeps the space too cool
- Lots of abandoned cables and power under the floor
- Hard to clean; done only once

Floor Plan
- Mainframe area: west
- Office area: center
- Server area: east
- Aisle in between the areas

Connectivity: more in less space
- Prior to 1994: leased analog circuits to mainframe and minicomputers; dial-up to timesharing minicomputers
- 1994 to early 2000s: digital circuits, 56 Kb and 1.5 Mb (T1)
- Early 2000s: fiber-optic connections up to 1 Gb possible; direct connect or AT&T services (tiered GigaMAN to Opt-E-MAN); LAN-like performance over a wide area
- 2001: fiber

Improved connectivity leads to...
- Increase in application servers vs. file service
- Offer more than just web service: client-server applications
- Wayne RESA starts migration of administrative services
- Need for rack space increases
- Minimum amount of cabling; limited bandwidth and redundancy
- All gear still not rack mountable (shelves), especially early application servers
- Cabling relatively modest

Life after AV (After Virtualization)
- Modest start which GROWS
- Wayne RESA installs VMware 3.0 on release, 7/2006
- 2 host HP AMD systems with Fibre Channel SAN
- Add a few more host systems
- Move to iSCSI storage (Dell EqualLogic) in 2008
- Cabling becomes more complex: trunking Ethernet connections to host computers and storage
- It becomes clear that future expansion will head us toward a cabling migraine
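The cabling growth described above can be sketched with back-of-the-envelope arithmetic: trunking several 1 Gb links per host multiplies cable runs, while a pair of 10 Gb uplinks to a top-of-rack switch keeps the count flat. The host and link counts below are illustrative assumptions, not Wayne RESA's actual figures:

```python
def cable_runs(hosts, links_per_host):
    """Total copper runs from hosts back to the aggregation switches."""
    return hosts * links_per_host

# 1 Gb era (assumed): 4 trunked data links + 2 iSCSI links per host
legacy = cable_runs(hosts=12, links_per_host=6)

# 10 Gb TOR era (assumed): 2 redundant 10 Gb links per host
tor = cable_runs(hosts=12, links_per_host=2)

print(legacy, tor)  # 72 vs. 24 runs for the same dozen hosts
```

Even with made-up numbers, the trend is the point: every new virtualization host in the 1 Gb design added a fistful of cables, which is the "cabling migraine" the slide anticipates.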

Cabling Complexity

Other Factors
- Space showing its age
- HP installer in 2006 comments on poor quality of the facility; electrical the only grade of A
- Several minor electrical problems in 2010 (Cisco 6509 power supply pops)
- Internet demand closing in on 1 Gb by spring 2010: AT&T GigaMAN at 100 Mb and AT&T Opt-E-MAN at 600 Mb
- Other building wings at RESA remodeled; recent remodel of Media and Technology area
- Waiting for payroll application to migrate off mainframe

Server room space limited
- Server racks against office wall

Limited Expansion
- Office space behind air handler (CRAC unit)

Facility concerns
- Audit issues: no swipe-card access to machine room (push-code lock), no fire suppression system, no under-floor leak detection
- Cooling issues on extremely hot days; limited space behind racks perhaps the reason
- People and equipment in the same space don't mix well: temperature and noise, cleanliness of the area, security

Factors in our favor
- Cooling: glycol system replaced in 2007, two new CRAC units
- Power: UPS replaced in 2009, plenty of excess capacity
- Diesel generator installed in the early 2000s, 10,000 KV, powers the entire building
- Board of Education and administrative team support

Connectivity limits
- Cisco 6509 installed in 2001: 1 Gb max per interface
- Upgraded supervisor and added FWSM (firewall)
- Trunking multiple gig connections from servers, some through old gigabit aggregation switches: potential reliability issues
- Didn't want to trunk Merit connections
- Packet shaping with Exinda unit: messy routing
- Server storage connections limited to trunked gig connections as well, through stackable switches

Solving the connectivity challenge
- Any upgrade would be expensive
- Discussion of remodeling made it easier to also discuss new network hardware: new place, new gear made sense
- Considered an upgrade to a Cisco Nexus 7000: no firewall blade, very expensive, more suitable to data-center-only use, not WAN

Connectivity improves, 6/2011
- Decided on a mixed upgrade
- Upgrade Cisco 6509 chassis to an E series: backplane supports 10Gb modules; use existing supervisors; FWSM used to firewall server VLANs
- Purchase Nexus 5000 unit for storage aggregation
- Use TOR (Top Of Rack) switches to connect both servers and storage over trunked 10Gb: highly reliable and scalable at a relatively low cost
- Use the 6509E for all routing
- Purchase 2 Cisco ASA 5585 firewall and IPS units: multi-gigabit firewall and IPS capable

Process starts
- Preliminary meeting with Plante Moran, 11/2010: information-gathering phase and budget discussions
- Divide the project into 2 teams
- Facilities: headed by Rick Crosby, RESA Building Manager; technical team members as needed
- Technical: headed by Jim Rarus, LAN and Network Manager
- What about the fiber optics: Facilities or Technical? Added to the Facilities budget
- Decided to re-run cabling to 7 building IDF closets; previous cable path ran down the middle of the computer room

Complications
- Drawings assembled by Barton Malow and IDS
- RFP released; general contractor selected
- Electrical challenge: UPS was large enough but over-tapped
- Complete power-down needed to provide service to new space: "Just power down for the day!" (Sure, last time was 7/09)
- Wire the new space; tap into the UPS as the first step of the move

Technical Team Learning Strategies
- Take advantage of Merit Professional Learning
- Global Knowledge Data Center Design webinar, 3 days
- Several annual meetings; MJTS sessions
- Trusted vendors: Cisco and resellers, JEM Computer, Dell Computers
- Data strategy site visit: Steelcase
- Personal research
- Challenge to staff: you know what's wrong with our current infrastructure, here is your chance to correct it! Don't come back later with a "should have done"

Fun Begins
- Formal project kick-off meeting, 7/27/11
- Team meetings at least once per month
- Establish space requirements: 20 racks over 2 aisles plus 1 aisle for future growth (1,200 square feet), with a work area (450 square feet)
- Tech team previously selected rack vendor
- Ordered 6509E, Nexus, ASA firewalls, 3 HP DL585 host systems, and 2 shelves of Dell EqualLogic storage
- Connect in new racks in existing space: demand for computer resources can't wait for construction

Timeline
- Technical team continues to upgrade hardware as needed to meet demands
- Short-term electrical work
- Construction to start over Spring Break (4/2012)
- Wall off construction area from existing server area
- Set up temporary cooling: 3 units needed for server area, 2 needed for mainframe
- Shutdown and move to start Friday evening, July 6, 2012; date set in February

Steps along the way
- Upgrades needed to support 10Gb connectivity in 9/2011; downtime needed Sunday 7/24/11, close to the start of school
- Implement Nexus and servers/storage in new racks as time allowed: no downtime
- Bring 10Gb servers and storage online over the next few months
- Firewall over winter break, 2011
- Re-rack most servers and storage: 3 days over winter break, 2011
- Lots of preliminary pre-wiring: a major reason why the final July move was so smooth!

Network upgrade Summer 2011

Winter break new racks

Construction work proceeds
- Starts with Spring Break, after my staff relocates
- New office space in another wing a concern! Computer room will be a lights-out operation
- Temporary cooling works fine; uses lots of flowing water to remove heat
- Construction wall: door left open several times between areas. Oops! Perhaps plastic with no door a better solution
- More of everything: more wiring, removal of more plumbing, still under budget

New office space: natural light

Work begins: construction wall

Down to the floor

Clean up the plumbing

Remove old offices

New plumbing

Old Fiber

Taking shape

Removing the wall

Moving Day

Terminate new fiber

Team work

Clean up

Entry via work room/trap

The New Look

Standard Rack (Raritan): recipe for expansion - MIX
- Platinum Convertible rack
- Two Dominion PX power control units (208 volt)
- Temperature and humidity sensor
- KVM over IP (above interconnect)
- Two Cisco Nexus fabric extender switches
- Cisco 2960 for management interfaces such as LOM

- Add servers, storage, or appliances
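The value of a standard rack recipe is that every new rack can be stamped out identically. As a rough illustration only, the recipe could be captured as a template in code; the `STANDARD_RACK` structure and `provision_rack` helper below are hypothetical, not anything Wayne RESA actually used:

```python
# Hypothetical template for the standard-rack "recipe" described above,
# so new racks are ordered and documented identically.
STANDARD_RACK = {
    "rack": "Platinum Convertible",
    "pdus": ["Raritan Dominion PX (208 V)"] * 2,   # two managed PDUs
    "sensors": ["temperature", "humidity"],
    "kvm": "KVM over IP",
    "tor": ["Cisco Nexus fabric extender"] * 2,    # redundant top-of-rack
    "management": "Cisco 2960 (LOM and other management interfaces)",
}

def provision_rack(rack_id: str) -> dict:
    """Stamp out a new rack record from the standard template."""
    rack = dict(STANDARD_RACK)
    rack["id"] = rack_id
    return rack

# Hypothetical expansion aisle: five identical racks, one template.
row_b = [provision_rack(f"B{n}") for n in range(1, 6)]
print(len(row_b))
```

Whether kept in code, a spreadsheet, or a purchasing checklist, the point is the same as the slide's: standardize once, then only servers, storage, and appliances vary per rack.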

Hardware highlights
- Cisco 6509E: 10Gb connectivity, infrastructure routing, and VLAN firewall
- Redundant Cisco ASA 5585 with IPS: Internet-facing, high performance
- VMware ESXi: 14 HP DL585 4-core servers (3 oldest in development lab)
- Dell EqualLogic storage: 8 shelves, 54 TB of storage
- Redundant Nexus 5548 chassis supporting 2232 (10Gb) and 2248TP (1Gb) TOR
- Exinda 8560: shaping and bandwidth monitoring (10Gb upgrade)

Top Of Rack (TOR) switching

High Level Network Diagram

Improved connectivity
- Use Cisco Twinax cables for 10Gb: far cheaper than fiber 10Gb GBICs

Proper rack layout
- Fiber cabinets near equipment

Managed PDUs and grounding

Fire Suppression: HFC-125
- Environmentally safe replacement for Halon
- Does not displace oxygen
- Extinguishes fire through chemical and physical mechanisms

What did we learn?
- Renovation takes time and is a costly, formal process
- Staff experienced with construction: wonderful
- Need a core group of dedicated technical staff
- Need to work with vendors you can trust
- You can never plan too much
- Likely need to implement over a long period of time; break up large tasks
- The final cutover: long hours for many individuals
- Do it right so you don't have to do it again
- Always remember who the customer is: you!

Design takeaways
- Failover and redundancy are part of the plan, as much as is affordable; may cause major re-engineering of your infrastructure (power, network connections, electronics, storage systems)
- Buy quality hardware
- Minimize complexity: keep cabling contained by rack, with good labeling and documentation
- Set a standard for future expansion: racks, PDUs, monitoring, TOR electronics
- Remote management and monitoring, as much as possible: bad things happen at bad times
- Upgrades lead to more upgrades: a 10Gb core requires 10Gb interfaces on support hardware (the "Give a Mouse a Cookie" analogy)