The National Grid Service
http://www.ngs.ac.uk
http://www.grid-support.ac.uk
http://www.eu-egee.org/
http://www.pparc.ac.uk/
http://www.nesc.ac.uk/
The National Grid Service
Guy Warner, NeSC Training Team
Acknowledgements
• This talk was written by Mike Mineter of the NeSC Training Team.
• Some NGS and GOSC slides are taken from a talk by Stephen Pickles, Technical Director of GOSC.
• Also slides from Malcolm Atkinson on the e-Science programme.
Overview
• The UK e-Science programme
• Grid Operations Support Centre
• The NGS
UK e-Science Budget (2001-2006)
Source: Science Budget 2003/4 – 2005/6, DTI(OST)
Total: £213M, + Industrial Contributions £25M, + £100M via JISC

• EPSRC: £77.7M (37%)
• PPARC: £57.6M (27%)
• MRC: £21.1M (10%)
• BBSRC: £18M (8%)
• NERC: £15M (7%)
• ESRC: £13.6M (6%)
• CLRC: £10M (5%)

EPSRC breakdown: Applied £35M (45%), HPC £11.5M (15%), Core £31.2M (40%)

Staff costs only – grid resources (computers & network) funded separately.
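As a quick consistency check, the figures quoted above can be totalled and turned into percentage shares. The short Python sketch below uses only the amounts listed; the rounded shares agree with the quoted percentages to within one percentage point (the quoted figures are adjusted so that they sum to exactly 100).

```python
# Sanity-check the UK e-Science budget figures quoted above (amounts in £M).
council = {
    "EPSRC": 77.7, "PPARC": 57.6, "MRC": 21.1, "BBSRC": 18.0,
    "NERC": 15.0, "ESRC": 13.6, "CLRC": 10.0,
}
total = sum(council.values())
print(f"Total: £{total:.1f}M")  # the amounts sum to the stated £213M

# Rounded percentage shares, e.g. MRC -> 10, CLRC -> 5.
shares = {name: round(100 * amount / total) for name, amount in council.items()}
print(shares)

# The EPSRC-internal split (Applied/HPC/Core) should sum to the EPSRC figure.
epsrc = {"Applied": 35.0, "HPC": 11.5, "Core": 31.2}
print(round(sum(epsrc.values()), 1))  # -> 77.7
```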
The e-Science Centres
• Globus Alliance
• CeSC (Cambridge)
• Digital Curation Centre
• e-Science Institute
• Open Middleware Infrastructure Institute
• EGEE
• Grid Operations Support Centre
• National Centre for e-Social Science
• National Institute for Environmental e-Science
http://www.nesc.ac.uk/centres/
Grid Operations Support Centre
GOSC and the National Grid Service
The NGS is the core UK grid, intended for production use of computational and data grid resources. It is the core service resulting from the UK's e-Science programme, is supported by JISC, and is run by the Grid Operations Support Centre (GOSC).
GOSC
The Grid Operations Support Centre is a distributed “virtual centre” providing deployment and operations support for the UK e-Science programme.
GOSC Roles: UK Grid Services
• National services: authentication, authorization, certificate management, VO management, security, network monitoring, help desk + support centre ([email protected])
• NGS services: job submission, data transfer, data access and integration, resource brokering, monitoring, grid management services, operations centre, …
• NGS core-node services: CPU, (meta-)data storage, key software
• Services coordinated with others (e.g. OMII, NeSC, LCG): integration testing, compatibility & validation tests, user management, training
• Administration: policies and acceptable-use conditions; resource providers: service level agreements, …
• Coordinate deployment and operations
The National Grid Service
Scales of collaboration
• The pattern that is emerging:
– Inside an institute: computation (e.g. Condor) and data
– Multi-disciplinary in a university: across admin domains (“Campus grids”)
– Nation-wide collaboration: National Grid Service (core resources + resources for a VO)
– International collaboration: EGEE
• Campus Grid Meeting today at NeSC, Edinburgh: http://www.nesc.ac.uk/esi/events/556/
Core NGS resources
• Nodes providing compute services:
– White Rose Grid, Leeds: grid-compute.leeds.ac.uk
– Oxford e-Science Centre: grid-compute.oesc.ox.ac.uk
– For the list of compilers, software, etc., see e.g. http://www.ngs.ac.uk/sites/ox/software/
• Nodes providing data services:
– Manchester: grid-data.man.ac.uk
– Rutherford Appleton Laboratory (RAL): grid-data.rl.ac.uk
• When you have joined the NGS you can access each of the core nodes: http://www.ngs.ac.uk/resources.html
NGS software
• Computation services based on GT2:
– Use compute nodes for sequential or parallel jobs, primarily from batch queues
– Can run multiple jobs concurrently (be reasonable!)
• Data services:
– Storage Resource Broker: primarily for file storage and access; a virtual filesystem with replicated files
– “OGSA-DAI” (Data Access and Integration): primarily for grid-enabling databases (relational, XML)
– NGS Oracle service
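The Storage Resource Broker idea mentioned above — one logical filename resolving to several physical replicas, with reads falling back across sites — can be illustrated with a small, self-contained Python sketch. The class and method names here are invented for illustration and are not the real SRB client API; only the two NGS data-node hostnames are taken from these slides.

```python
# Conceptual sketch of a replica catalogue: one logical path maps to
# several physical copies, and a read falls back across replicas.
# (Illustrative only -- not the real SRB API.)

class ReplicaCatalog:
    def __init__(self):
        self._replicas = {}  # logical path -> list of physical locations

    def register(self, logical_path, physical_location):
        self._replicas.setdefault(logical_path, []).append(physical_location)

    def locations(self, logical_path):
        return list(self._replicas.get(logical_path, []))

    def resolve(self, logical_path, available):
        """Return the first replica whose site is currently reachable."""
        for loc in self.locations(logical_path):
            site = loc.split(":", 1)[0]
            if site in available:
                return loc
        raise FileNotFoundError(f"no reachable replica of {logical_path}")

catalog = ReplicaCatalog()
catalog.register("/ngs/home/alice/results.dat", "grid-data.man.ac.uk:/store/a1")
catalog.register("/ngs/home/alice/results.dat", "grid-data.rl.ac.uk:/store/b7")

# If the Manchester node is unreachable, the read falls back to the RAL copy.
loc = catalog.resolve("/ngs/home/alice/results.dat",
                      available={"grid-data.rl.ac.uk"})
print(loc)  # -> grid-data.rl.ac.uk:/store/b7
```

The point of the sketch is the user-facing property: the logical path never changes, regardless of which physical copy actually serves the data.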
Gaining Access
NGS core nodes
• Data nodes at RAL and Manchester
• Compute nodes at Oxford and Leeds
• All access is through digital X.509 certificates – from the UK e-Science CA, or a recognized peer
National HPC services (HPCx, CSAR)
• Must apply separately to the research councils
• Both digital-certificate and conventional (username/password) access are supported
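Certificate-based access hinges on the subject distinguished name (DN) in the user's X.509 certificate. As a rough illustration, a DN in the slash-separated form that Globus tools print can be split into its attributes as below; the example DN is made up, and the exact attribute set issued by the UK e-Science CA may differ.

```python
# Parse a slash-separated X.509 subject DN of the kind Globus tools print.
# The example DN below is invented for illustration.

def parse_dn(dn):
    """Split '/C=UK/O=eScience/...' into an ordered list of (attr, value)."""
    parts = [p for p in dn.split("/") if p]
    pairs = []
    for part in parts:
        attr, _, value = part.partition("=")
        pairs.append((attr, value))
    return pairs

dn = "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=Ada Lovelace"
fields = dict(parse_dn(dn))
print(fields["CN"])  # -> Ada Lovelace
print(fields["C"])   # -> UK
```

Grid sites typically authorize users by mapping exactly such a DN string to a local account, which is why the DN must match character-for-character across services.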
[Diagram: GOSC coordinating the NGS core nodes (RAL, Oxford, Leeds, Manchester) together with partner and affiliated sites – HPCx, CSAR, Bristol, Cardiff, several universities, a commercial provider and a PSRE.]
• NGS core nodes: host core services; coordinate integration, deployment and support; free-to-access resources for all VOs; monitored interfaces + services
• NGS partner sites: integrated with NGS; some services/resources available for all VOs; monitored interfaces + services
• NGS affiliated sites: integrated with NGS; support for some VOs; monitored interfaces (+ security etc.)
Managing middleware evolution
• Important to coordinate and integrate this with deployment and operations work in EGEE, LCG and similar projects.
• Focus on deployment and operations, NOT development.
[Diagram: software from EGEE, the NGS and other sources passes to the Engineering Task Force (ETF) for deployment, testing and advice; software with proven capability and realistic deployment experience goes into operations as ‘Gold’ services on the UK, campus and other grids, while prototypes & specifications and feedback & future requirements flow back.]
Key facts
• Production: deploying middleware after selection and testing – major developments via the Engineering Task Force.
• Evolving:
– Middleware
– Number of sites
– Organisation: VO management; policy negotiation with sites and VOs
• International commitment
• Gathering users’ requirements – National Grid Service
Joining the NGS
• See the talks given at NGS Induction events at NeSC, Edinburgh: http://www.nesc.ac.uk/training
Web Sites
• NGS: http://www.ngs.ac.uk – to see what’s happening: http://ganglia.ngs.rl.ac.uk/
• GOSC: http://www.grid-support.ac.uk
• CSAR: http://www.csar.cfs.ac.uk
• HPCx: http://www.hpcx.ac.uk
Summary
• NGS is a production service
– Therefore it cannot include the latest research prototypes!
– The ETF recommends what should be deployed
• Core sites provide computation and also data services
• NGS is evolving:
– OMII, EGEE and the Globus Alliance all have middleware under assessment by the ETF for the NGS
– Selected, deployed middleware currently provides “low-level” tools
– New deployments will follow soon
– New sites and resources are being added!