
Page 1:

ATLAS Data Challenges

CHEP03, La Jolla, California, 24th March 2003

Gilbert Poulard, ATLAS Data Challenges Co-ordinator, CERN EP-ATC

For the ATLAS DC & Grid team

Page 2:

Outline

• Introduction
• ATLAS Data Challenges activities
• Grid activities in ATLAS DC1
• Beyond DC1
• Summary
• Conclusion

Page 3:

Introduction

There are several talks on the LHC project and on Grid projects at this conference, so I will not repeat what is said in the other presentations on the LHC, LCG and the Grid.

This talk concentrates on the on-going "Data Challenges" activities of ATLAS, one of the four LHC experiments.

Page 4:

ATLAS experiment

Diameter: 25 m; barrel toroid length: 26 m; end-cap end-wall chamber span: 46 m; overall weight: 7000 tons

ATLAS: ~2000 collaborators, ~150 institutes, 34 countries

Page 5:

ATLAS “and building 40”

Page 6:

Aims of the Experiment

• Search for the Higgs boson
• Measure the Standard Model Higgs boson
• Detect supersymmetric states
• Study Standard Model QCD, EW and heavy-quark (HQ) physics
• New physics?

[Plot: integrated luminosity needed for a 5σ discovery as a function of Higgs mass (e.g. H -> ZZ(*) -> 4 leptons), with the present limit and ~1, ~3 and ~4 years of ATLAS data-taking indicated]

Page 7:

The ATLAS Trigger/DAQ System

Data flow: Detectors -> Front-end pipelines -> Readout drivers -> Readout buffers -> Event builder -> Buffers & processing farms -> Data storage

LVL1
• hardware based (FPGA and ASIC)
• coarse calorimeter granularity
• trigger muon detectors: RPCs and TGCs
• input: 40 MHz bunch-crossing rate (~1 GHz interaction rate); output: <75 (100) kHz, ~100 GB/s data flow; latency: ~2 μs

LVL2
• Region-of-Interest (RoI), seeded by LVL1 RoI pointers
• specialized algorithms
• fast selection with early rejection
• output: O(1) kHz, O(1) GB/s data flow; latency: O(10) ms

Event Filter (EF)
• full event available
• offline-derived algorithms, seeded by LVL2
• best calibration / alignment
• latency less demanding (~seconds)
• output: O(100) Hz event rate, O(100) MB/s data flow

LVL2 and the EF together form the High Level Trigger (HLT).
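The quoted data-flow figures follow from output rate times event size. A minimal back-of-the-envelope sketch (Python; the ~1.5 MB event size is taken from the next slide):

```python
# Rough check: output data flow = output event rate x event size.
EVENT_SIZE_MB = 1.5                # average ATLAS event size (1-1.5 MB)

output_rate_hz = {                 # rate after each trigger level
    "LVL1": 75e3,                  # <75 (100) kHz
    "LVL2": 1e3,                   # O(1) kHz
    "EF": 100.0,                   # O(100) Hz
}

for level, rate in output_rate_hz.items():
    gb_per_s = rate * EVENT_SIZE_MB / 1000.0
    print(f"{level}: {gb_per_s:8.3f} GB/s")
# LVL1 ~112 GB/s (~100 GB/s), LVL2 ~1.5 GB/s (O(1) GB/s),
# EF ~0.15 GB/s (O(100) MB/s) -- consistent with the slide.
```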

Page 8:

Data

• Every event will consist of 1-1.5 MB (all detectors together)
• After on-line selection, events will be written to permanent storage at a rate of 100-200 Hz
• Total amount of "raw" data: ~1 PB/year
• To reconstruct and analyse these data: a complex "Worldwide Computing Model" and "Event Data Model"
  - raw data at CERN
  - reconstructed data "distributed"
  - all members of the collaboration must have access to "ALL" public copies of the data (at a reasonable access speed)
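As a sanity check on the "1 PB/year" figure, a minimal sketch assuming the canonical ~10^7 seconds of effective data-taking per year:

```python
# raw data per year = storage rate x event size x live seconds per year
rate_hz       = 100      # lower end of the 100-200 Hz storage rate
event_size_mb = 1.0      # lower end of the 1-1.5 MB event size
live_seconds  = 1e7      # canonical LHC "accelerator year" (assumption)

raw_pb = rate_hz * event_size_mb * live_seconds / 1e9   # MB -> PB
print(f"~{raw_pb:.0f} PB/year of raw data")             # ~1 PB/year
```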

Page 9:

ATLAS Computing Challenge

The emerging world-wide computing model "is an answer" to the LHC computing challenge.

In this model the Grid must take care of:
• data replicas and catalogues
• conditions database replicas, updates and synchronization
• access authorizations for individual users, working groups and production managers
• access priorities and job queues

Validation of the new Grid computing paradigm in the period before the LHC requires Data Challenges of increasing scope and complexity.

Page 10:

Systems Tests: Data Challenges

Data Challenges are the way to test the prototype infrastructure before the start of the real experiment (2007).

ATLAS plans to run one Data Challenge per year, with increasing complexity and amount of data.

Each Data Challenge consists of the following steps:
• physics event generation (Pythia, Herwig, ...)
• event simulation (Geant3, Geant4)
• background simulation, pile-up and detector response simulation (all of these depend on luminosity)
• event reconstruction
• event analysis

Data can be (re-)distributed to different production sites between any of these steps; this is the real challenge! (A minimal sketch of the chain follows below.)
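The sketch below shows the chain as composable stages. It is purely illustrative; the real production ran Athena/Atlsim jobs, and the stage names and event dictionary here are hypothetical:

```python
# Illustrative Data Challenge chain: each step consumes the output of
# the previous one, and data may move to another site between steps.
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]

def generate(evt: Dict) -> Dict:       # Pythia, Herwig, ...
    return {**evt, "generated": True}

def simulate(evt: Dict) -> Dict:       # Geant3, Geant4
    return {**evt, "simulated": True}

def pile_up(evt: Dict) -> Dict:        # depends on luminosity
    return {**evt, "pileup": True}

def reconstruct(evt: Dict) -> Dict:
    return {**evt, "reconstructed": True}

CHAIN: List[Stage] = [generate, simulate, pile_up, reconstruct]

def run(event_id: int) -> Dict:
    evt: Dict = {"id": event_id}
    for stage in CHAIN:                # (re-)distribution could happen here
        evt = stage(evt)
    return evt

print(run(1))
```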

Page 11:

DC0: readiness & continuity tests (December 2001 - June 2002)

"3 lines" for "full" simulation:
1) Full chain with new geometry (as of January 2002):
   Generator -> (Objy) -> Geant3 -> (Zebra -> Objy) -> Athena recon. -> (Objy) -> Analysis
2) Reconstruction of 'Physics TDR' data within Athena:
   (Zebra -> Objy) -> Athena recon. -> (Objy) -> Simple analysis
3) Geant4 robustness test:
   Generator -> (Objy) -> Geant4 -> (Objy)

"1 line" for "fast" simulation:
   Generator -> (Objy) -> Atlfast -> (Objy)

Continuity test: everything from the same release (3.0.2) for the full chain. We learnt a lot (we had underestimated the implications of that statement). Completed in June 2002.

Page 12:

ATLAS DC1 Phase I (July-August 2002)

Primary concern was delivery of events to the High Level Trigger (HLT) community
• Goal: ~10^7 events (several samples!)

Put in place the MC event generation & detector simulation chain
• switch to AthenaRoot I/O (for event generation)
• updated geometry
• filtering
• validate the chain:
  Athena/Event Generator -> (Root I/O) -> Atlsim/Dice/Geant3 -> (Zebra)

Put in place the distributed Monte Carlo production
• "ATLAS kit" (rpm)
• scripts and tools (monitoring, bookkeeping)
• AMI database; Magda replica catalogue; VDC
• quality control and validation of the full chain

Page 13:

Tools used in DC1

• AMI: physics metadata, permanent and transient production logs
• Magda: replica catalog
• VDC: recipe catalog
• AtCom: interactive production framework (uses AMI, Magda and VDC)
• GRAT: automatic production framework (uses AMI, Magda and VDC)

Page 14:

ATLAS Geometry

Scale of the problem:
• 25.5 million distinct volume copies
• 23 thousand different volume objects
• 4,673 different volume types
• managing up to a few hundred pile-up events
• one million hits per event on average

Page 15:

DC1/Phase I Task Flow

As an example, for one sample of di-jet events:
• Event generation: 1.5 x 10^7 events in 150 partitions (10^5 events each)
• Detector simulation: 3000 jobs

Pythia 6 (di-jet) -> event generation in Athena (Root I/O, HepMC) -> Atlsim/Geant3 + filter (Zebra) -> Hits/Digits + MCTruth

Each simulation job reads 5000 generated events, of which ~450 pass the filter and are fully simulated.
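The bookkeeping behind these numbers, as a small sketch using only the figures quoted on this slide:

```python
# Job accounting for the di-jet sample (numbers from the slide).
generated    = 1.5e7     # events produced by Pythia 6
partitions   = 150       # generator output partitions
sim_jobs     = 3000      # Atlsim/Geant3 simulation jobs
evts_per_job = 5000      # generated events read per simulation job
evts_kept    = 450       # events per job surviving the filter (approx.)

print(f"{generated / partitions:.0f} events per partition")        # 100000
print(f"{sim_jobs * evts_per_job:.2e} events fed to simulation")   # 1.50e+07
print(f"filter keeps ~{evts_kept / evts_per_job:.0%} of events")   # ~9%
```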

Page 16:

DC1: validation & quality control

We defined two types of validation.

Validation of the sites:
• We processed the same data in the various centres and compared the results
• To ensure that the same software was running in all production centres
• We also checked the random-number sequences

Validation of the simulation:
• We used both "old" generated data & "new" data
• Validation datasets: di-jets; single μ, e, γ; H -> 4e/2γ/2e2μ/4μ
• About 10^7 events reconstructed in June, July and August
• Comparisons were also made with previous simulations

"QC" is a "key issue" for success.

Page 17:

Comparison Procedure

[Figure: a test sample and a reference sample superimposed, with the per-bin contributions to the χ² displayed underneath]

Page 18:

Summary of Comparison

The comparison procedure ends with a χ² bar-chart summary, which gives a pretty good overview of how the samples compare.
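A minimal sketch of such a histogram comparison (Python/NumPy; illustrative only, not the actual DC1 validation code):

```python
import numpy as np

def chi2_contributions(test: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Per-bin (test - ref)^2 / (test + ref), assuming Poisson errors."""
    contrib = np.zeros_like(test, dtype=float)
    filled = (test + ref) > 0                          # skip empty bins
    contrib[filled] = ((test[filled] - ref[filled]) ** 2
                       / (test[filled] + ref[filled]))
    return contrib

rng = np.random.default_rng(1)
ref = rng.poisson(100, size=50)    # histogram from the reference sample
test = rng.poisson(100, size=50)   # same histogram from the test site

contrib = chi2_contributions(test, ref)   # these feed the summary bar chart
print(f"chi2/ndf = {contrib.sum() / contrib.size:.2f}")  # ~1 if compatible
```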

Page 19:

Data Samples I

Validation samples (740k events)
• single particles (e, γ, μ, π), jet scans, Higgs events

Single-particle production (30 million events)
• single π (low pT; pT = 1000 GeV with 2.8 < η < 3.2)
• single μ (pT = 3, ..., 100 GeV)
• single e and γ at different energies (E = 5, 10, ..., 200, 1000 GeV)
• fixed η points; η scans (|η| < 2.5); crack scans (1.3 < |η| < 1.8)
• standard beam spread (σz = 5.6 cm); fixed vertex z-components (z = 0, 4, 10 cm)

Minimum-bias production (1.5 million events)
• different η regions (|η| < 3, 5, 5.5, 7)

Page 20:

Data Samples II

QCD di-jet production (5.2 million events)
• different cuts on ET (hard scattering) during generation
• large production of ET > 11, 17, 25, 55 GeV samples, applying particle-level filters
• large production of ET > 17, 35 GeV samples, without filtering, full simulation within |η| < 5
• smaller production of ET > 70, 140, 280, 560 GeV samples

Physics events requested by various HLT groups (e/γ, Level-1, jet/ETmiss, B-physics, b-jet, μ; 4.4 million events)
• large samples for the b-jet trigger simulated with default (3 pixel layers) and staged (2 pixel layers) layouts
• B-physics (PL) events taken from old TDR tapes

Page 21:

ATLAS DC1 Phase 1: July-August 2002

• 3200 CPUs, 110 kSI95, 71000 CPU-days
• 5 x 10^7 events generated, 1 x 10^7 events simulated, 3 x 10^7 single particles
• 30 TB, 35,000 files
• 39 institutes in 18 countries: Australia, Austria, Canada, CERN, Czech Republic, France, Germany, Israel, Italy, Japan, Nordic, Russia, Spain, Taiwan, UK, USA
• Grid tools used at 11 sites

[Pie chart: contribution to the overall CPU time (%) per country; shares range from 0.01% to 28.66%]
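A quick consistency check on those Phase 1 figures, assuming a July-August window of roughly 62 days:

```python
# 71000 CPU-days on 3200 CPUs over the July-August window:
cpus, cpu_days, window_days = 3200, 71000, 62   # window size is an assumption

full_occupancy_days = cpu_days / cpus           # ~22 days
print(f"~{full_occupancy_days:.0f} days at 100% occupancy, i.e. "
      f"~{full_occupancy_days / window_days:.0%} average farm occupancy")
```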

Page 22:

ATLAS DC1 Phase II (November 02 - March 03)

• Provide data with and without 'pile-up' for HLT studies
  - pile-up production
  - new data samples (huge amount of requests)
  - "byte stream" format to be produced
• Introduction & testing of the new Event Data Model (EDM)
  - this includes the new Detector Description
• Production of data for Physics and Computing Model studies
  - both ESD and AOD produced from Athena reconstruction
• Testing of the computing model & of distributed analysis using AOD
• Use GRID middleware more widely

Page 23:

Luminosity Effect Simulation

• Aim: study interesting processes at different luminosity L (cm^-2 s^-1)
• Separate simulation of physics events & minimum-bias events (and cavern background for muon studies)
• Merging of:
  - primary stream (physics) (KINE, HITS)
  - background stream(s) (pile-up & cavern background) (KINE, HITS)
• Digitization: each output bunch crossing (DIGI) combines 1 physics event with N(L) background events

Page 24:

Pile-up features

Different detectors have different memory times, requiring very different numbers of minimum-bias events to be read in:
• Silicons, Tile calorimeter: t < 25 ns
• Straw tracker: t < ~40-50 ns
• LAr calorimeters: 100-400 ns
• Muon drift tubes: 600 ns

Still, we want the pile-up events to be the same in the different detectors! (See the sketch below.)

For muon studies: in addition, "cavern background".
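A sketch of what these memory times imply for the pile-up machinery, assuming 25 ns bunch spacing and Poisson-distributed minimum bias (the 23 events/crossing and 61-crossing figures appear on the next slide):

```python
import math
import numpy as np

BUNCH_SPACING_NS = 25
MEMORY_NS = {                      # memory times from the slide
    "Silicons / Tile cal": 25,
    "Straw tracker": 50,
    "LAr calorimeters": 400,
    "Muon drift tubes": 600,
}

for det, t in MEMORY_NS.items():   # crossings' worth of min bias to read
    print(f"{det}: ~{math.ceil(t / BUNCH_SPACING_NS)} bunch crossings")

# At high luminosity (10^34): on average 23 min-bias events per crossing,
# simulated over 61 crossings. Since the same min-bias events must be seen
# by all detectors, the merging needs e.g. a common seed / shared event list.
rng = np.random.default_rng(2003)
n_minbias = int(rng.poisson(23, size=61).sum())
print(f"~{n_minbias} minimum-bias events merged into one physics event")
```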

Page 25:

Pile-up task flow

ATLSIM mixes the streams (event sizes and CPU times per event):
• Physics: 2 MB (340 s)
• Minimum bias: 0.5 MB (460 s)
• Cavern background: 20 KB (0.4 s)
• Pile-up output: 7.5 MB at high luminosity; ~400 s (mixing: 80 s, digitization: 220 s)

High luminosity (10^34): 23 events/bunch crossing, 61 bunch crossings
Low luminosity: 2 x 10^33

Page 26:

[Event display: Higgs into two photons, no pile-up]

Page 27:

[Event display: Higgs into two photons, with pile-up at L = 10^34]

Page 28:

ATLAS DC1/Phase II: November 2002 - March 2003

Goals:
• Produce the data needed for the HLT TDR
• Get as many ATLAS institutes involved as possible: a worldwide collaborative activity

Participation: 56 institutes (* = using Grid; the original slide also highlighted new countries or institutes):
Australia, Austria, Canada, CERN, China, Czech Republic, Denmark*, France, Germany, Greece, Israel, Italy, Japan, Norway*, Poland, Russia, Spain, Sweden*, Taiwan, UK, USA*

Page 29:

Preparation for Reconstruction

On-going activities (in several areas):
• Put in place the infrastructure for the production
• Get the "reconstruction" software ready and validated
  - both Physics & HLT communities involved
• Include the dedicated code for HLT studies (Lvl1, Lvl2 & Event Filter)

Today we are in the validation phase. By the end of March we expect to reconstruct and analyse:
• a full high-statistics sample without pile-up
• ~10% of a high-statistics sample with pile-up

Data are being concentrated in 8 "sites". Production runs both on "standard batch" and "GRID" systems.

Page 30:

Primary data (in 8 sites)

Total amount of primary data: 59.1 TB

Site      Data (TB)   Share
Alberta      3.6        6%
BNL         12.1       20%
CNAF         3.6        6%
Lyon        17.9       31%
FZK          2.2        4%
Oslo         2.6        4%
RAL          2.3        4%
CERN        14.7       25%

Data (TB): Simulation 23.7 (40%); Pile-up 35.4 (60%), of which Lumi02: 14.5 and Lumi10: 20.9

Pile-up:
• Low luminosity: ~4 x 10^6 events (~4 x 10^3 NCU-days)
• High luminosity: ~3 x 10^6 events (~12 x 10^3 NCU-days)

Data replication using Grid tools (Magda)
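The pie-chart shares follow directly from the per-site volumes; a small check:

```python
# Per-site DC1 primary data volumes (TB) from the slide.
sites = {"Alberta": 3.6, "BNL": 12.1, "CNAF": 3.6, "Lyon": 17.9,
         "FZK": 2.2, "Oslo": 2.6, "RAL": 2.3, "CERN": 14.7}

total = sum(sites.values())                      # ~59.1 TB
for site, tb in sorted(sites.items(), key=lambda kv: -kv[1]):
    print(f"{site:>8}: {tb:5.1f} TB ({tb / total:4.0%})")
# Lyon ~30-31%, CERN 25%, BNL 20%, Alberta/CNAF 6%, Oslo/RAL/FZK 4%,
# close to the chart values (the chart rounds so shares sum to 100%).
```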

Page 31:

Grid in ATLAS DC1*

• US-ATLAS Grid: part of Phase 1 production; full Phase 2 production
• EDG Testbed Prod: reproduce part of Phase 1 data production; several tests
• NorduGrid: full Phase 1 & 2 production

[* See other ATLAS talks for more details]

Page 32:

DC1 production on the Grid

Grid test-beds in Phase 1: 11 out of 39 sites (~5% of the total production)
• NorduGrid (Bergen, Grendel, Ingvar, OSV, NBI, Oslo, Lund, LSCF)
  - all production done on the Grid
• US-Grid (LBL, UTA, OU)
  - ~10% of US DC1 production (~900 CPU-days)

Phase 2:
• NorduGrid (full pile-up production)
• US Grid: pile-up in progress
  - ~8 TB of pile-up data, 5000 CPU-days, 6000 jobs
  - will be used for reconstruction
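Those Phase 2 numbers imply the following average per-job footprint (a rough average only; actual jobs varied):

```python
# Average per-job footprint of the Grid pile-up production.
data_tb, cpu_days, jobs = 8.0, 5000, 6000

print(f"~{cpu_days / jobs * 24:.0f} CPU-hours per job")           # ~20 h
print(f"~{data_tb * 1e6 / jobs:.0f} MB of pile-up data per job")  # ~1333 MB
```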

Page 33:

Summary on DC1

Phase 1 (summer 2002) was a real success.

The pile-up production ran quite smoothly; we expect to have it completed by the end of March.

The concentration of the data is on its way; replication is mostly performed with "Magda".

Progress is being made in the organization, with integration of the tools (production, bookkeeping, replication, ...).

Validation of the "offline" reconstruction software is progressing well; the HLT dedicated software will then have to be added.

"Massive" production for reconstruction is expected by the beginning of April.

Page 34:

DC2-3-4-...

DC2: probably Q4/2003 - Q2/2004
Goals:
• full deployment of EDM & Detector Description
• Geant4 replacing Geant3
• test the calibration and alignment procedures
• use LCG common software (POOL, ...)
• use GRID middleware widely
• perform large-scale physics analysis
• further tests of the computing model (analysis)
• run on LCG-1
Scale: as for DC1, ~10^7 fully simulated events

DC3: Q3/2004 - Q2/2005; goals to be defined; scale: 5 x DC2

DC4: Q3/2005 - Q2/2006; goals to be defined; scale: 2 x DC3

Page 35:

Summary (1)

ATLAS computing is in the middle of its first period of Data Challenges of increasing scope and complexity. It is steadily progressing towards a highly functional software suite, plus a world-wide computing model that gives all ATLAS members equal access, and equal quality of access, to ATLAS data.

Page 36:

Summary (2)

These Data Challenges are executed at the prototype tier centres and use, as much as possible, the Grid middleware being developed in Grid projects around the world.

Page 37:

Conclusion

Quite promising start for ATLAS Data Challenges!

Page 38:

Thanks to all DC-team members (working in 14 work packages):

A-WP1: Event generation
A-WP2: Geant3 simulation
A-WP3: Geant4 simulation
A-WP4: Pile-up
A-WP5: Detector response
A-WP6: Data conversion
A-WP7: Event filtering
A-WP8: Reconstruction
A-WP9: Analysis
A-WP10: Data management
A-WP11: Tools
A-WP12: Teams, production, validation, ...
A-WP13: Tier centres
A-WP14: Fast simulation