WLCG LHCC mini-review LHCb Summary


TRANSCRIPT

Page 1: WLCG LHCC mini-review LHCb Summary

WLCG LHCC mini-review

LHCb Summary

Page 2: WLCG LHCC mini-review LHCb Summary


Outline

- Activities in 2008: summary
- Status of DIRAC
- Activities in 2009: outlook
- Resources in 2009-10


Page 3: WLCG LHCC mini-review LHCb Summary


Tier1s (re-)configuration

- LFC mirror, ConditionsDB replication
  - DB replication using 3D from CERN to all Tier1s
    - In place for the whole year
  - LFC service for scalability and redundancy
    - Some problems at GridKa
- Site SE migration (winter 08-09)
  - RAL (dCache to Castor2)
    - T0D1 migration went rather smoothly (FTS copy of files)
    - T1D0 migration extremely painful (staging tape by tape)
      - Took several months
  - PIC (Castor1 to dCache for T1D0)
    - Went very smoothly without file copy (file migration to Enstore)
  - CNAF (Castor2 to StoRM for TxD1)
    - Performed in May
- SURLs migrated to SRM v2.2 end-points (October)
  - Needed, as the datasets have been in use for 2.5 years now (DC06)
  - No dependency on SRM v1


Page 4: WLCG LHCC mini-review LHCb Summary


Pit 8 to Tier0-Castor transfers

- First weeks in February: continuous transfers at low rate
- As of 18 Feb: nominal rate (70 MB/s) with ~50% duty cycle
  - A few longer stops for SW upgrades
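(At a ~50% duty cycle, the 70 MB/s nominal rate corresponds to an average of roughly 35 MB/s, i.e. about 3 TB transferred per day.)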


(Plots: Migration; Tier1 Transfers)

Page 5: WLCG LHCC mini-review LHCb Summary


Castor migration


Page 6: WLCG LHCC mini-review LHCb Summary
Page 7: WLCG LHCC mini-review LHCb Summary

May CCRC: reconstruction

41.2k reconstruction jobs submitted; 27.6k jobs proceeded to the Done state

Done/created ~67%

Page 8: WLCG LHCC mini-review LHCb Summary


Cosmics and first beam data

- Cosmics used for detector commissioning
  - Of course very few detectors are hit!
  - Allows internal time alignment of subdetectors
    - Using readout of consecutive 25 ns slots
  - Partial time alignment between detectors
    - Shifting timing by 12.5 ns and equalising population in consecutive bins…
  - All subdetectors included in global runs as of end of August
- TED data
  - LHCb was first to see tracks coming from the injection line!
  - Single shots with ~2 muons/cm², but only once every 48 s!
  - First tracks in the VeLo
  - Allowed rough global detector time alignment (~2 ns)
- 10th September
  - Only muons, calorimeters and, for a short time, the OT


Page 9: WLCG LHCC mini-review LHCb Summary


First data


Page 10: WLCG LHCC mini-review LHCb Summary


DIRAC3 put in production

- Production activities
  - Started in July
  - Simulation, reconstruction, stripping
    - Includes file distribution strategy, failover mechanism
    - File access using local access protocols (rootd, rfio, (gsi)dcap, xrootd); see the sketch after this list
    - Commissioned alternative method: copy to local disk
      - Drawback: non-guaranteed space, less CPU efficiency, additional network traffic (possibly copied from a remote site)
  - Failover using VOBOXes
    - File transfers (delegated to FTS)
    - LFC registration
    - Internal DIRAC operations (bookkeeping, job monitoring…)
- Analysis
  - Started in September
  - Ganga available for DIRAC3 in November
  - DIRAC2 decommissioned on January 12th
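As an illustration of the data-access logic described above, here is a minimal, hypothetical sketch (not the actual DIRAC code; the helper names and the copy command are assumptions): try the site's local access protocols first and, only if none is available, fall back to copying the replica to the worker node's local disk.

    # Illustrative sketch only: try local access protocols, then copy to local disk.
    # Protocol names come from the slide; everything else is an assumption.
    import os
    import subprocess

    LOCAL_PROTOCOLS = ("rootd", "rfio", "dcap", "gsidcap", "xrootd")

    def access_input(replicas, scratch_dir="/tmp"):
        """replicas: dict mapping protocol name -> transfer URL (TURL) for one file."""
        for proto in LOCAL_PROTOCOLS:
            if proto in replicas:
                return ("protocol", replicas[proto])  # the job opens the TURL directly
        # Commissioned alternative: stage a full copy on local disk first.
        # Drawbacks noted above: scratch space not guaranteed, extra network
        # traffic, and the source may even be a remote site.
        src = replicas["srm"]                         # assumed SRM/gridftp URL
        dst = os.path.join(scratch_dir, os.path.basename(src))
        subprocess.check_call(["lcg-cp", src, "file://" + dst])  # era-typical copy tool
        return ("localcopy", dst)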


Page 11: WLCG LHCC mini-review LHCb Summary


DIRAC3 jobs


Page 12: WLCG LHCC mini-review LHCb Summary


Issues in 2008

- Data Management
  - Site configuration (non-scaling)
  - SRM v2.2 still not fully mature (e.g. pinning)
  - Many issues with StorageWare (mainly dCache)
- Workload Management
  - Moved to gLite WMS, but still many issues with it (e.g. mix-up of identities); better scaling behaviour than the LCG-RB though
  - LHCb moved to using "generic pilot jobs" (i.e. a pilot can execute a workload from any user or production), as sketched below
    - Not switching identity yet (gLExec / SCAS not available)
    - Not a show-stopper: required by sites, not by LHCb
- Middleware deployment
  - LHCb distributes the client middleware
    - From the distribution in the LCG-AA
    - Necessary to ensure bug fixes are available
    - Allows multiple platforms (OS, architecture, Python version)
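The "generic pilot job" model mentioned above can be pictured with a small sketch (assumptions throughout; this is not DIRAC's real matcher interface): the pilot starts on a worker node, describes its resources, then repeatedly asks a central task queue for any waiting payload, regardless of which user or production submitted it, and runs it.

    # Hypothetical pilot skeleton; the matcher object and field names are invented
    # for illustration and do not reflect the actual DIRAC services.
    import subprocess
    import time

    def describe_resources():
        # In practice: site name, platform, CPU power, local SE, remaining wall time...
        return {"site": "LCG.Example.org", "platform": "slc4_ia32_gcc34"}

    def run_pilot(matcher, max_payloads=10):
        capabilities = describe_resources()
        for _ in range(max_payloads):
            payload = matcher.request_job(capabilities)  # any user's or production job
            if payload is None:
                return                                   # queue empty: pilot exits
            # Without gLExec/SCAS the payload runs under the pilot's own identity.
            subprocess.call(payload["command"], shell=True)
            matcher.report_done(payload["id"])
            time.sleep(1)                                # be gentle on the matcher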

Page 13: WLCG LHCC mini-review LHCb Summary


LHCb Computing Operations

- Production manager
  - Schedules production work, sets up and checks workflows, reports to LHCb operations
- Computing shifters
  - Computing Operations shifter (pool of ~12 shifters)
    - Covers 14h/day, 7 days/week
    - Computing Control room (2-R-014)
  - Data Quality shifter
    - Covers 8h/day, 7 days/week
  - Both are in the LHCb Computing Control room (2-R-014)
- Daily DQ and Operations meetings
  - Week days (twice a week during shutdowns)
- Grid Expert on-call
  - On duty for a week
  - Runs the operations meetings
- Grid Team (~6 FTEs needed, ~2 missing)
  - Shared responsibilities (WMS, DMS, SAM, Bookkeeping…)

Page 14: WLCG LHCC mini-review LHCb Summary


Plans for 2009

- Commissioning for 2009-10 data taking (FEST'09)
  - See next slides
- Simulation
  - Replacing DC06 datasets
    - Signal and background samples (~300 Mevts)
    - Minimum bias for L0 and HLT commissioning (~100 Mevts)
    - Used for CP-violation performance studies
    - Nominal LHC settings (7 TeV, 25 ns, 2×10^32 cm^-2 s^-1)
  - Tuning stripping and HLT for 2010
    - 4/5 TeV, 50 ns (no spillover), 10^32 cm^-2 s^-1
    - Benchmark channels for first physics studies
      - B→µµ, Γs, B→Dh, Bs→J/ψφ, B→K*µµ …
    - Large minimum bias samples (~1 minute of LHC running)
    - Stripping performance required: ~50 Hz for benchmark channels
    - Tune HLT: efficiency vs retention, optimisation
  - Preparation for very first physics
    - 2 TeV, low luminosity
    - Large minimum bias sample (part used for FEST'09)

Page 15: WLCG LHCC mini-review LHCb Summary


FEST’09

- Aim
  - Replace the non-existing 2008 beam data with MC
  - Points to be tested
    - L0 (hardware trigger) strategy
      - Emulated in software
    - HLT strategy
      - First data (loose trigger)
      - Higher lumi/energy data (b-physics trigger)
    - Online detector monitoring
      - Based on event selection from HLT, e.g. J/ψ events
      - Automatic detection of detector problems
    - Data streaming (see the sketch after this list)
      - Physics stream (all triggers) and calibration stream (subset of triggers, typically 5 Hz)
    - Alignment and calibration loop
      - Trigger re-alignment
      - Run alignment processes
      - Validate new alignment (based on calibration stream)
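To make the streaming point concrete, a toy sketch (the trigger name and event layout are invented for illustration; this is not the real online code): every HLT-accepted event goes to the physics stream, while a small subset of triggers, of order 5 Hz, is additionally routed to the calibration stream that feeds the alignment loop.

    # Toy router for the two streams described above.
    CALIBRATION_TRIGGERS = {"Hlt2JPsi"}   # e.g. J/psi selections (assumed name)

    def route_event(event, physics_stream, calibration_stream):
        if not event.get("hlt_accepted"):
            return
        physics_stream.append(event)                      # physics stream: all triggers
        if event.get("trigger") in CALIBRATION_TRIGGERS:  # subset, typically ~5 Hz
            calibration_stream.append(event)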


Page 16: WLCG LHCC mini-review LHCb Summary


FEST’09 runs

- FEST activity
  - Define running conditions (rate, HLT version + config)
  - Start runs from the Control System
    - Events are injected and follow the normal path
  - File export to Tier0 and distribution to Tier1s
  - Automatic reconstruction jobs at CERN and Tier1s
    - Commission Data Quality green-light
- Short test periods
  - Typically a full week, ½ to 1 day every week for tests
  - Depending on results, take a few weeks' interval for fixing problems
- Vary conditions
  - L0 parameters
  - Event rates
  - HLT parameters
  - Trigger calibration and alignment loop


Page 17: WLCG LHCC mini-review LHCb Summary


Resources (very preliminary)

- Consider 2009-10 as a whole (new LHC schedule)
  - Real data
    - Split the period in two parts:
      - 0.5×10^6 s at low lumi – LHC-phase1
      - 3×10^6 s at higher lumi (1×10^32) – LHC-phase2
    - Trigger rate independent of lumi and energy: 2 kHz (see the worked example after this list)
  - Simulation: 2×10^9 events (nominal year) over 2 years
- New assumptions for (re-)processing and analysis
  - More re-processings during LHC-phase1
  - Add calibration checks (done at CERN)
  - Envision more analysis at CERN with first data
    - Increase from 25% (TDR) to 50% (phase1) and 35% (phase2)
    - Include SW development and testing (LXBATCH)
  - Adjust event sizes and CPU needs to current estimates
    - Important effort to reduce data size (packed format for rDST, DST, µDST…)
    - Use new HEP-SPEC06 benchmarking
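For orientation, the raw-data event count implied by these numbers follows directly (a back-of-the-envelope illustration only, not an official estimate):

    # Back-of-the-envelope from the figures quoted above.
    trigger_rate_hz = 2000                  # 2 kHz, independent of lumi and energy
    live_seconds = 0.5e6 + 3e6              # LHC-phase1 + LHC-phase2
    real_data_events = trigger_rate_hz * live_seconds
    simulated_events = 2e9                  # nominal year of simulation, over 2 years
    print(real_data_events, simulated_events)   # 7.0e9 raw events vs 2.0e9 simulated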


Page 18: WLCG LHCC mini-review LHCb Summary


Resources (cont’d)

- CERN usage
  - Tier0:
    - Real data recording, export to Tier1s
    - First-pass reconstruction of ~85% of raw data
    - Reprocessing (in the future, foresee using also the Online HLT farm)
  - CAF ("Calibration and Alignment Facility")
    - Dedicated LXBATCH resources
    - Detector studies, alignment and calibration
  - CAF ("CERN Analysis Facility")
    - Part of Grid distributed analysis facilities (estimate 40% in 2009-10)
    - Histograms and interactive analysis (lxplus, desktops/laptops)
- Tier1 usage
  - Reconstruction
    - First pass during data taking, reprocessing
  - Analysis facilities
    - Grid distributed analysis
    - Local storage for users' data (LHCb_USER SRM space)


Page 19: WLCG LHCC mini-review LHCb Summary


Conclusions

- 2008
  - CCRC very useful for LHCb (although, given LHCb's low throughput, running simultaneously with the other experiments was not essential)
  - DIRAC3 fully commissioned
    - Production in July
    - Analysis in November
    - As of now, called DIRAC
  - Last processing of DC06
    - Analysis will continue in 2009
  - Commission simulation and reconstruction for real data
- 2009-10
  - Large simulation requests for replacing DC06 and preparing 2009-10
  - FEST'09: ~1 week a month and 1 day a week
  - Resource requirements being prepared for the WLCG workshop in March and the C-RRB in April
