Tony Doyle - University of Glasgow
GridPP
EDG - UK Contributions
• Architecture
• Testbed-1
• Network Monitoring
• Certificates & Security
• Storage Element
• R-GMA
• LCFG
• MDS deployment
• GridSite
• SlashGrid
• Spitfire…
Applications (start-up phase)
• BaBar
• CDF/D0 (SAM)
• ATLAS/LHCb
• CMS
• (ALICE)
• UKQCD
£17m 3-year project funded by PPARC
CERN - LCG (start-up phase)
funding for staff and hardware...
• CERN          £3.78m
• DataGrid      £5.67m
• Tier-1/A      £3.66m
• Applications  £1.99m
• Operations    £1.88m
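As a quick consistency check (a sketch; the pairing of figures to categories is assumed from the order they appear on the slide), the line items sum to the quoted £17m project size:

```python
# Funding line items from the slide, in £m. The mapping of each figure
# to its category is an assumption based on the order shown.
items = {
    "CERN": 3.78,
    "DataGrid": 5.67,
    "Tier-1/A": 3.66,
    "Applications": 1.99,
    "Operations": 1.88,
}

total = sum(items.values())
print(f"Total: £{total:.2f}m")  # -> Total: £16.98m, i.e. ~£17m
```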
http://www.gridpp.ac.uk
GridPP – Achievements and Issues

1st Year Achievements
• Complete Project Map
  – Applications : Middleware : Hardware
• Fully integrated with EU DataGrid and LCG Projects
• Rapid middleware deployment/testing
• Integrated US-EU applications development e.g. BaBar+EDG
• Roll-out document for all sites in the UK (Core Sites, Friendly Testers, User Only)
• Testbed up and running at 15 sites in the UK
• Tier-1 Deployment
• 200 GridPP Certificates issued
• First significant use of the Grid by an external user (LISA simulations) in May 2002
• Web page development (GridSite)

Issues for Year 2
• Status: 19 Jul 2002 17:52 GMT – keep monitoring and improve testbed deployment efficiency
• Importance of EU-wide development of middleware
• Integrated testbed for use/testing by all applications
• Reduce "integration" layer between middleware and application software
• Integrated US-EU applications development
• Tier-1 Grid Production Mode
• Tier-2 Definitions and Deployment
• Integrated Tier-1 + Tier-2 Testbed
• Transfer to UK e-Science CA
• Integration with other UK projects e.g. AstroGrid
GridPP Sites in Testbed: Status 19 Jul 2002 17:52
Project Map: software releases at each site
UK Tier-1 RAL
New Computing Farm
4 racks holding 156 dual 1.4GHz Pentium III CPUs. Each box has 1GB of memory, a 40GB internal disk and 100Mb ethernet.

50TByte disk-based Mass Storage Unit
Capacity is 50TB after RAID 5 overhead. PCs are clustered on network switches with up to 8x1000Mb ethernet out of each rack.

Tape Robot
Upgraded last year; uses 60GB STK 9940 tapes. 45TB current capacity; could hold 330TB.

2004 Scale: 1000 CPUs, 0.5 PBytes
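The tape figures above are consistent with a simple cartridge count (a sketch; the slot and cartridge counts below are inferred from the quoted capacities, not stated on the slide):

```python
# Tape robot arithmetic implied by the slide's figures
# (decimal units assumed: 1 TB = 1000 GB).
tape_capacity_gb = 60      # STK 9940 cartridge capacity
current_tb = 45            # capacity currently loaded
max_tb = 330               # capacity if every slot held a tape

tapes_now = current_tb * 1000 // tape_capacity_gb   # -> 750 cartridges
slots_total = max_tb * 1000 // tape_capacity_gb     # -> 5500 slots

print(tapes_now, slots_total)  # -> 750 5500
```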
LHC Computing Challenge
[Diagram: the LHC computing model as a tiered hierarchy]

• Online System → Offline Farm (~20 TIPS): ~PBytes/sec from the detector, ~100MBytes/sec to offline
• Tier 0: CERN Computer Centre (>20 TIPS), fed at ~100MBytes/sec
• Tier 1: Regional Centres (RAL, US, French, Italian), linked at ~Gbits/sec or by air freight
• Tier 2: Tier2 Centres at ~1 TIPS each (e.g. ScotGRID++ ~1 TIPS), linked at ~Gbits/sec
• Tier 3: Institute servers (~0.25 TIPS), linked at 100-1000Mbits/sec; physics data cache
• Tier 4: Workstations

• One bunch crossing per 25 ns
• 100 triggers per second
• Each event is ~1 Mbyte

Physicists work on analysis "channels". Each institute has ~10 physicists working on one or more channels; data for these channels should be cached by the institute server.

1 TIPS = 25,000 SpecInt95
PC (1999) = ~15 SpecInt95
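A quick sanity check on the rates quoted above (a sketch: the 40 MHz raw rate follows directly from the 25 ns bunch spacing, and the offline rate from the trigger figures):

```python
# Rates implied by the trigger figures on the slide.
bunch_spacing_ns = 25
crossing_rate_hz = 1e9 / bunch_spacing_ns      # 40 MHz raw collision rate
trigger_rate_hz = 100                          # events kept per second
event_size_mb = 1                              # ~1 Mbyte per event

# 100 triggers/s x 1 MB = ~100 MBytes/sec into offline, matching the diagram.
offline_rate_mb_s = trigger_rate_hz * event_size_mb

# Unit conversion: 1 TIPS = 25,000 SpecInt95 and a 1999 PC ~ 15 SpecInt95,
# so a 1 TIPS Tier-2 centre is roughly this many 1999-era PCs:
pcs_per_tips = 25000 / 15

print(crossing_rate_hz, offline_rate_mb_s, round(pcs_per_tips))  # -> 40000000.0 100 1667
```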
UK Tier-2 ScotGRID
ScotGrid Processing nodes at Glasgow
• 59 IBM X Series 330 dual 1 GHz Pentium III with 2GB memory
• 2 IBM X Series 340 dual 1 GHz Pentium III with 2GB memory and dual ethernet
• 3 IBM X Series 340 dual 1 GHz Pentium III with 2GB memory and 100 + 1000 Mbit/s ethernet
• 1TB disk
• LTO/Ultrium Tape Library
• Cisco ethernet switches

ScotGrid Storage at Edinburgh
• IBM X Series 370 PIII Xeon with 32 x 512 MB RAM
• 70 x 73.4 GB IBM FC Hot-Swap HDD

CDF equipment at Glasgow
• 8 x 700 MHz Xeon IBM xSeries 370 with 4 GB memory and 1 TB disk

Griddev testrig at Glasgow
• 4 x 233 MHz Pentium II

BaBar UltraGrid System at Edinburgh
• 4 UltraSparc 80 machines in a rack, 450 MHz CPUs, each with 4Mb cache and 1 GB memory
• Fast Ethernet and Myrinet switching

2004 Scale: 300 CPUs, 0.1 PBytes
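Adding up the processors listed above gives a sense of how far the 2004 target sits from the 2002 installation (a sketch; node counts are from the slide, but counting one CPU per UltraSparc machine is an assumption):

```python
# Rough ScotGRID CPU count from the inventory above.
glasgow_dual_nodes = 59 + 2 + 3   # dual-CPU X Series 330/340 nodes at Glasgow
cdf_cpus = 8                      # CDF 700 MHz Xeons at Glasgow
testrig_cpus = 4                  # Griddev Pentium IIs
babar_cpus = 4                    # UltraSparc 80s at Edinburgh (assumed 1 CPU each)

total_cpus = glasgow_dual_nodes * 2 + cdf_cpus + testrig_cpus + babar_cpus
print(total_cpus)  # -> 144, about half the 300-CPU 2004 target
```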
Network
• Internal networking is currently a hybrid of
  – 100Mb(ps) to nodes of CPU farms
  – 1Gb to disk servers
  – 1Gb to tape servers
• UK: academic network SuperJANET4
  – 2.5Gb backbone upgrading to 20Gb in 2003
• EU: SJ4 has 2.5Gb interconnect to Geant
• US: new 2.5Gb link to ESnet and Abilene for researchers
• UK involved in networking development
  – internal with Cisco on QoS
  – external with DataTAG
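For scale, moving one day of triggered data (~100 MBytes/sec sustained, per the computing-model figures) over the link speeds above (a sketch; assumes ideal, uncontended links and decimal units):

```python
# Time to ship one day of triggered data over each class of link,
# assuming full line rate with no protocol overhead.
day_seconds = 24 * 3600
dataset_bytes = 100e6 * day_seconds   # ~8.64 TB per day at 100 MB/s

for name, bits_per_s in [("100Mb farm link", 100e6),
                         ("1Gb server link", 1e9),
                         ("2.5Gb SJ4 backbone", 2.5e9)]:
    hours = dataset_bytes * 8 / bits_per_s / 3600
    print(f"{name}: {hours:.1f} hours")
# The 2.5Gb backbone case comes to ~7.7 hours per day of data.
```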
Experiment Deployment
From Grid to Web… using GridSite