
GridPP

Steve Lloyd, Chair of the GridPP Collaboration Board

Slide 2: A Brief History of GridPP

• 2001 – 2004 GridPP1 – From Web to Grid
• 2004 – 2007 GridPP2 – From Prototype to Production
• 2007 – 2008 GridPP2+ – One-year extension
• 2008 – 2011 GridPP3 – From Production to Exploitation
• 2011 – 2015 GridPP4 – Computing in the LHC era
• 2015 – 2016 GridPP4+ – One-year extension
• 2016 – 2020? GridPP5

GridPP provides UK computing resources for the LHC experiments, other HEP experiments and other activities.

Slide 3: Project Management

GridPP only exists to facilitate analysis of data from the experiments. This is reflected in the management of the project, in which the LHC experiments are directly represented.

[Organisation chart: Oversight Committee (OC), Collaboration Board (CB), Project Management Board (PMB), Operations Team (Ops-Team) and Experiments Liaison, with links labelled Review, Operation, Utilisation and Provision]

Slide 4: GridPP Activities

• Tier-1 Hardware: 23%
• Tier-1 Staff: 20%
• Tier-2 Hardware: 10%
• Tier-2 Staff: 23%
• Operations Staff: 15%
• Manage/Travel: 10%

Slide 5: Hardware Costs

STFC have signed an MoU with WLCG for the UK to provide Tier-1 and Tier-2 resources.

The LHC experiments provide estimates of their required future resources:

• These are scrutinised and approved by the CERN Computing Resource Review Board (CRRB).
• We multiply by the ~UK authorship fraction: 2% of ALICE, 12.5% of ATLAS, 8%/5% of CMS and 31.5%/21.5% of LHCb [Tier-1/Tier-2].
• We add ~10% for other experiments, then multiply by the estimated cost.
• This gives the total hardware cost per year at the Tier-1 and Tier-2s – NO DISCRETION!
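As a rough illustration of this calculation, here is a minimal Python sketch. Only the Tier-1 authorship fractions come from this slide; the experiment requests and the unit cost are placeholders, not actual CRRB figures or STFC costs.

    # Minimal sketch of the UK hardware-share calculation described above.
    # The requests and unit cost are PLACEHOLDERS, not actual CRRB figures;
    # only the UK authorship fractions are quoted on this slide.

    # Hypothetical CRRB-approved Tier-1 CPU requests (arbitrary units)
    tier1_requests = {"ALICE": 100.0, "ATLAS": 400.0, "CMS": 350.0, "LHCb": 150.0}

    # Approximate UK authorship fractions at the Tier-1 (from this slide)
    uk_fractions = {"ALICE": 0.02, "ATLAS": 0.125, "CMS": 0.08, "LHCb": 0.315}

    OTHER_EXPT_UPLIFT = 1.10  # add ~10% for other experiments
    UNIT_COST = 50.0          # assumed cost per unit of capacity (placeholder)

    # UK share = sum over experiments of (request x UK authorship fraction)
    uk_share = sum(tier1_requests[e] * uk_fractions[e] for e in tier1_requests)

    annual_cost = uk_share * OTHER_EXPT_UPLIFT * UNIT_COST
    print(f"UK Tier-1 requirement: {uk_share:.1f} units, "
          f"estimated annual cost: {annual_cost:.0f}")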

Slide 6: RAL Tier-1

• 5,760 Logical CPUs
• 59,641 HEPSPEC06
• 12 PB Disk
• 12 PB Tape

The UK Tier-1 at RAL provides:

• CPU and disk resources to meet the UK’s WLCG Tier-1 MoU requirements;

• An Archival (robotic) Tape service to preserve raw LHC data;

• Manpower to maintain and operate the disk and tape systems, Grid Middleware, Oracle databases, fabric and day-to-day production;

• Embedded manpower to interface directly with ATLAS, CMS and LHCb;

• Excellent reliability, a high level of availability and rapid responsiveness;

• Excellent Network connections including an optical private network (OPN) direct connection to CERN.

Slide 7: Tier-1 Delivery

[Chart: Tier-1 CPU Delivery 2011–2014]

Slide 8: Tier-2s

The UK Tier-2s provide:

• CPU and disk resources to meet the UK’s WLCG Tier-2 MoU requirements;

• Manpower to support Group Analysis Sites, with large amounts of disk and excellent network connections, for ATLAS and CMS;

• Manpower to provide all experiments with CPU and disk for opportunistic user analysis and Monte Carlo simulation;

• A distributed ecosystem to support UK physicists doing their analysis;

• The majority of the Deployment, Operations, and Support staff who transform the distributed resources into a coherent Grid infrastructure;

• Hardware resources for testing middleware releases, new technologies and running some core services;

• A successful framework for leveraging local resources and support;

• Opportunities for reaching out to other communities to support STFC’s impact agenda.

Slide 9: Tier-2 Resources

[Map of UK Tier-2 sites in the four distributed Tier-2s: London, ScotGrid, NorthGrid and SouthGrid]

• Large sites (green) have ~2 FTE staff
• Medium sites (yellow) have ~1 FTE staff
• Small sites (blue) have ~0 FTE staff

Some redistribution following mid-term review

Slide 10: Tier-2 Delivery

[Chart: Tier-2 CPU Delivery 2011–2014]

Slide 11: UK Tier-2 Delivery

[Chart: UK Tier-2 CPU Delivery 2011–2014]

Slide 12: External Activities

GridPP is part of WLCG, which combines:
• EGI (European Grid Infrastructure)
• OSG (Open Science Grid) in the US
• NorduGrid in the Nordic countries

There are new initiatives looking towards Horizon 2020 etc., such as:

• EU-T0 – an initiative to federate national centres into a European Computing Centre for Experimental Data Management, formed by major EU funding agencies including STFC.
• HEP Software Foundation – a collaborative framework to develop and maintain all major HEP-related software.
• VLData – a generic platform for distributed computing integrating existing Grid, cloud and other computing and storage resources.
• UK Project Directors Group.
• E-infrastructure academic user community.
• JOIN – a joint DiRAC (HPC)-GridPP Authentication and Authorization Infrastructure (AAI) initiative.

Slide 13: The Future

GridPP5

Slide 14: The Future

Hardware costs will hopefully continue to fall (Moore's Law etc.) but:
• Data accumulates, trigger rates are going up (x2.5 in Run 2, x10 in Run 3?) and events are more complex (pile-up), requiring much more hardware.
• The Grid is continually evolving – multi-core, new architectures, clouds, storage technologies, WAN access etc.
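A back-of-the-envelope Python sketch of this tension follows. The x2.5/x10 factors are the trigger-rate multipliers quoted above; the dates, the flat budget and the cost-halving time are purely illustrative assumptions, not GridPP projections.

    # Required capacity vs what a FLAT budget buys if unit costs halve
    # every ~2 years. All numbers except the x2.5/x10 demand factors
    # (quoted on this slide) are ILLUSTRATIVE assumptions.

    REQUIRED = {"Run 2": 2.5, "Run 3": 10.0}  # capacity needed, relative to 2014
    YEARS_OUT = {"Run 2": 2, "Run 3": 6}      # assumed years after 2014 (hypothetical)
    HALVING_TIME = 2.0                        # assumed years for unit cost to halve

    for run in REQUIRED:
        # A flat budget buys 2x the capacity per cost-halving period
        affordable = 2.0 ** (YEARS_OUT[run] / HALVING_TIME)
        shortfall = REQUIRED[run] / affordable
        print(f"{run}: need x{REQUIRED[run]}, flat budget buys x{affordable:.1f}, "
              f"shortfall factor {shortfall:.2f}")

Under these assumptions the shortfall factor stays above 1 for both runs, i.e. falling hardware prices alone do not absorb the growth in demand.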

Manpower requirements may decrease slightly, but not dramatically, in the face of increased resources and complexity.

GridPP will continue to be a necessary component of UK particle physics for many years, for ALL experiments.

The distinction between Tier-1 and Tier-2 is blurring.

LUX/ZEPLIN are bidding for computing and storage that will be run by GridPP, leveraging existing support and expertise. This could be a good model for other communities (T2K, Hyper-K, ILC, NA62 etc.).