pdc site update

PDC Site Update, at HP-CAST NTIG, April 1st 2008, Linköping, by Peter Graham [email protected], PDC, Kungl Tekniska Högskolan


Page 1: PDC Site Update

PDC Site Update

at HP-CAST NTIG, April 1st 2008

Linköping

by Peter Graham [email protected], PDC

Kungl Tekniska Högskolan

Page 2: PDC Site Update


PDC premises since 2004

Page 3: PDC Site Update


Page 4: PDC Site Update


PDC, a Centre for HP Scientific Computing (HP = High Performance)

HP Itanium cluster Lucidor

Page 5: PDC Site Update


upgraded in 2007 to…

HP Itanium cluster Lucidor 2

Page 6: PDC Site Update


Lucidor2 at a glance

• The system contains 106 nodes, each with four Itanium2 (McKinley) 1.3 GHz CPUs. 22 nodes have 48 GB RAM and the rest have 32 GB RAM. At least 64 nodes are available for general (i.e. SNIC) users.

• The interconnect, Myrinet 2000, now uses the MX stack, which improves latency.

• The Myrinet M3-E128 switch is populated with 112 ports. Each card/port has a data rate of 2+2 Gbit/s, all over 50/125 multi-mode fiber.

• Linux distribution: CentOS 5.
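The per-node figures above imply the aggregate capacity of Lucidor2. A quick sanity computation (a sketch; the totals are not stated on the slide itself):

```python
# Aggregate Lucidor2 capacity from the per-node figures on the slide.
nodes = 106
cpus_per_node = 4            # Itanium2 (McKinley) 1.3 GHz
big_mem_nodes, big_mem_gb = 22, 48
std_mem_gb = 32              # remaining nodes

total_cpus = nodes * cpus_per_node
total_ram_gb = big_mem_nodes * big_mem_gb + (nodes - big_mem_nodes) * std_mem_gb
print(total_cpus, "CPUs,", total_ram_gb, "GB RAM")  # 424 CPUs, 3744 GB RAM
```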

Page 7: PDC Site Update


Latest HP addition: "Key"

HP SMP system (donated to KTH in 2008)

Key is a shared-memory system consisting of 32 IA64 (Intel) cores at 1.6 GHz with 18 MB cache. The total main memory will be 256 GB.

(named after Ellen Key)

Page 8: PDC Site Update


Current systems

• Lenngren, 442 nodes, Dell 1850
• Lucidor2, 106 nodes, HP Itanium2
• SweGrid, 100 nodes, South Pole Pentium 4
• SBC, 354 nodes, Dell P4 and South Pole Athlon XP
• Hebb, IBM BlueGene/L

New systems:
• Key, HP SMP, 16 nodes
• Ferlin, 680 nodes, Dell M600 blade
• SweGrid2, 90 nodes, Dell M600 blade
• Climate and turbulence system, under joint procurement with NSC, for SMHI, MISU at SU and Dep of Mechanics at KTH

Page 9: PDC Site Update


Infrastructure power & cooling

• Change of transformer from 800 kVA to 2 MVA, done
• Upgrade of UPS from 400 kVA to 1100 kVA, in progress
• Diesel generator, 400 kVA, existing
• Upgrade of cooling exchanger, done
• Adding a 300 kW APC cooling hut for Ferlin and SweGrid2
• Addition of a 300 kW chiller for redundant cooling, in progress

Page 10: PDC Site Update


Price-performance vs energy

• Power per node: 250-400 W
• Energy cost for 300 W over 4 years is nearly 15 kSEK
• If you pay 15 kSEK per node, you spend equal amounts on investment and energy
• Developing more energy-efficient nodes will give a competitive advantage
• We would prefer to spend money on application experts rather than on energy bills
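The "nearly 15 kSEK" figure can be checked with simple arithmetic. A minimal sketch, assuming continuous operation and an electricity price of roughly 1.4 SEK/kWh (the price is an assumption; the slide does not state it):

```python
# Sanity check of the slide's 4-year energy cost for a 300 W node.
power_w = 300                 # per-node power draw in watts
hours = 24 * 365 * 4          # four years of continuous operation
price_sek_per_kwh = 1.4       # assumed electricity price, not from the slide

energy_kwh = power_w * hours / 1000
cost_ksek = energy_kwh * price_sek_per_kwh / 1000
print(f"{energy_kwh:.0f} kWh -> {cost_ksek:.1f} kSEK")  # 10512 kWh -> 14.7 kSEK
```

At that assumed price the cost lands just under 15 kSEK, matching the slide's break-even point against a 15 kSEK node.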

Page 11: PDC Site Update


Summing up

• PDC is tripling its power capacity to meet the needs of the new systems coming in

• High-density cooling is required for the new systems, with around or above 20 kW per rack

• Energy efficiency, both for cost reasons and out of environmental concern, is becoming more important

• Our new patch cables… (for the UPS)