
Page 1: Trigger/DAQ/DCS

Trigger/DAQ/DCS

Page 2: Trigger/DAQ/DCS

LVL1 Trigger

[Diagram: LVL1 trigger architecture. Calorimeter trigger: Pre-Processor (analogue E_T), Cluster Processor (e/γ, τ/h) and Jet/Energy-Sum Processor (Germany, Sweden, UK). Muon trigger: Muon Barrel Trigger (Italy), Muon End-cap Trigger (Japan, Israel) and Muon-CTP Interface (MUCTPI). Central Trigger Processor (CTP) and Timing, Trigger, Control (TTC) at CERN. Inputs: ~7200 calorimeter trigger towers and O(1M) RPC/TGC channels.]

Page 3: Trigger/DAQ/DCS

Calorimeter trigger

Cluster Processor Module (CPM) for the e/γ and τ/h trigger
– New version fixes timing problems in fanned-out data
– Fabrication problems solved by using firms with better QA

Jet/Energy Module (JEM)
– Full-specification version recently made; tests so far look good

Common Merger Module (CMM)
– Tested extensively; very close to the final version

[Photos: CPM and JEM modules]

Page 4: Trigger/DAQ/DCS

Calorimeter trigger

PreProcessor Module (PPM): later than planned, but...
– Final ASIC prototype is OK
– MCM now OK; substrate problem fixed by a change of material
– PPM stand-alone tests now nearly completed

System tests
– Many subsystem tests done without the PPM, e.g. 5 DSSs + 3 CPMs + CMM (crate) + CMM (system)
– Full system tests with the PPM starting very soon

Will participate in the test beam in August/September, including the 25 ns run
– Aim to integrate with: calorimeters and receivers; Central Trigger Processor; RoI Builder; ATLAS DAQ, run control environment, etc.
– Produce simple triggers based on calorimeter signals

[Photo: 3 CPMs, 1 JEM, 2 CMMs, TCM and CPU in crate]

Page 5: Trigger/DAQ/DCS

Tile Calorimeter - PPM Test

Signal chain: TileCal trigger signals → Patch Panel (separates calorimeter and muon signals) → Receiver (converts E to E_T; see the sketch below) → PreProcessor Module (AnIn card and Multi-Chip Module with ADCs and ASIC)

[Plot: test pulse recorded in the PPM at H8]
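For context, the E to E_T conversion performed by the Receiver is the usual projection onto the transverse plane, E_T = E·sin θ = E / cosh η. A minimal sketch of the arithmetic (hypothetical helper function, not the Receiver's actual implementation):

    import math

    def transverse_energy(e, eta):
        """E_T = E * sin(theta) = E / cosh(eta) for pseudorapidity eta."""
        return e / math.cosh(eta)

    # Example: a 100 GeV deposit at eta = 1.2 contributes ~55.2 GeV of E_T
    print(transverse_energy(100.0, 1.2))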

Page 6: Trigger/DAQ/DCS

Barrel muon trigger

Preproduction of "Splitter" boxes completed; main production under way

Prototype of the final-design "Pad" boards evaluated in the lab and (last week) at the 25 ns test beam
– Seem to work well, but the test-beam data are still to be analysed

Design completed for the revised version of the CM ASIC
– Interaction in progress with IMEC on placement/routing, plus simulation to check the design
– Very urgent!

More Pad boxes being prepared for chamber integration tests
– Number limited by the availability of prototype ASICs (old version)

[Plot: correlation in measurements between two BML doublets]

Page 7: Trigger/DAQ/DCS

Endcap muon trigger

System operated successfully in last week's 25 ns test beam with the MUCTPI and the CTP demonstrator
– Even better efficiency than last year
– Many improvements to the software

Will test new PS boards with the revised version of the SLB ASIC in the August/September test beam

Revised version of the SLB ASIC is being evaluated in lab tests
– Trigger part passes all tests
– Problem detected in the readout part for certain sequences of L1A signals; probably very localized, and hopefully only a very minor design revision is required, but still under investigation

All other endcap ASICs are already final

[Plot: trigger efficiency for the PT=6, PT=5 and PT=4 thresholds]

Page 8: Trigger/DAQ/DCS

Central trigger

CTP
– Final prototypes either available or coming soon (layout, production)
– Tight schedule to test, commission and integrate for the test beam later in the summer

LTP
– Prototypes just received

MUCTPI
– Work queued behind the CTP
– Only one kind of module needs to be upgraded to achieve full functionality; the existing "demonstrator" is adequate in the short term

[Photo: LTP prototype]

Page 9: Trigger/DAQ/DCS

LVL1 Schedule

Re-baselined in line with current status and plans

Schedule for the calorimeter and central trigger electronics matches the availability of the detector systems

Production of the on-detector muon trigger electronics is later than we would like for integration with the detectors

Very tight schedule to have the barrel electronics available in time to equip chambers before installation
– Late submission of the revised version of the CM ASIC; try to advance the ASIC schedule if at all possible
– Need to prepare for very fast completion and testing of the electronics once production ASICs become available
– Need to prepare for efficient integration of the electronics with chamber assemblies prior to installation

End-cap electronics schedule is also tight for integration with the detectors in early 2005
– Detector installation is later than for the barrel, so not as critical

Page 10: Trigger/DAQ/DCS

Installation Schedule

According to the present schedule, final availability of all LVL1 subsystems is still driven by the detector installation schedule

Latest ATLAS working installation schedule (v. 6.19) shows the last TGC chambers (with on-detector trigger electronics) installed in January 2007
– Leaves little time for commissioning of the on-detector electronics before we lose access prior to first beams

Action defined for discussion with TC (and the Muon PL) to see if there is scope to optimize the installation planning

Page 11: Trigger/DAQ/DCS

HLT/DAQ

Major activity in the present phase is the test beam
– Support for the detectors and for the LVL1 trigger tests
– Organization in "support teams" who are the first point of contact and call on experts when necessary
– Dedicated training sessions were organized for the team members
– Team members participate with experts in problem solving; a good way to spread expertise
– Electronic log book very useful; could extend its use to the detector systems

HLT/DAQ studies
– Preparation and planning for a dedicated period in August
– Aim to operate the HLT/DAQ system to gain experience in a "real-life" environment; will need support from the detector systems

Generally, the experience at the test beam is very positive for T/DAQ; however, work at the test beam takes a lot of effort

In parallel, continue development and system evaluation work within the constraints of the available effort, e.g. dataflow measurements and modelling

Page 12: Trigger/DAQ/DCS

Detector integration with DAQ at H8

Muon detectors
– TGCs and MDT fully integrated; extended running in combined mode during last week's 25 ns run, together with the MUCTPI (sometimes triggered by the CTPD)
– RPC almost fully integrated; data were successfully taken in stand-alone mode

Calorimeters
– Tiles fully integrated in data-taking mode; LAr integration well advanced

Inner detectors
– Started for TRT and pixels; plan to integrate the SCT later

The exercise of joining detectors together has proven to be "easy" if the detector segment has been properly done according to the TDAQ prescriptions

Page 13: Trigger/DAQ/DCS

HLT integration for test beam

The infrastructure for the EF is prepared
– An EF cluster has been divided and pre-assigned to the different detectors
– The configuration nevertheless allows more CPUs to be assigned dynamically to the partition that requests them

The main work now is to get ATHENA integrated
– The scheme of a "rolling" unique version of the offline software (8.2.x), specially maintained for the test beam, is working well
– We are now trying to put in place the automatic procedure that sets ~80 environment variables!

The Gatherer is integrated
– Allows aggregation of histograms across multiple processors (see the sketch after this list)

The LVL2 commissioning is progressing well
– A LVL1 result has been successfully read out by the L2PU
– Progress is being made in integrating algorithms
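To make concrete what aggregating histograms across processors involves, here is a minimal sketch in plain Python. The list-based bin layout is an assumption for illustration only; the actual Gatherer operates on the online monitoring services, not on Python lists.

    # Merge per-node histograms that share the same binning by summing
    # bin contents (hypothetical data layout, for illustration only).
    def merge_histograms(histograms):
        n_bins = len(histograms[0])
        assert all(len(h) == n_bins for h in histograms), "binning must match"
        return [sum(h[i] for h in histograms) for i in range(n_bins)]

    # Example: the same 3-bin histogram filled on three processors
    node_hists = [[4, 0, 2], [1, 3, 0], [0, 2, 5]]
    print(merge_histograms(node_hists))  # [5, 5, 7]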

Page 14: Trigger/DAQ/DCS

Example of HLT algorithm work

2e15i at 2×10³³ cm⁻²s⁻¹ (new result since the TDR); efficiency for the e pair is quoted with respect to LVL1:

Selection step   Efficiency wrt LVL1   Rate
LVL1             100 %                 3.5 kHz
EF Calo          84.5 %                6.2 Hz
EF ID            71.6 %                1.5 Hz
EF ID-Calo       55.5 %                1.5 Hz

Rates consistent with TDR assumptions

H → 4e, mH = 130 GeV, L = 2×10³³ cm⁻²s⁻¹
– 4 reconstructed electrons in |η| < 2.5; at least 2 e with pT > 20 GeV
– Efficiency includes both single- and double-object triggers

Trigger selection step   Efficiency wrt LVL1   Overall efficiency
LVL1                     100 %                 99.6 %
L2Calo                   99.7 %                99.4 %
EFCalo                   98.9 %                98.5 %
EFID                     98.1 %                97.7 %
EFIDCalo                 97.1 %                96.7 %

Good trigger acceptance of Higgs events; a 2e2μ study is also being done
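As a consistency check on the second table: each step's overall efficiency is, up to rounding, the LVL1 overall efficiency multiplied by that step's efficiency with respect to LVL1. A small sketch, with the values copied from the table above:

    # Overall efficiency = LVL1 overall efficiency x efficiency wrt LVL1
    lvl1_overall = 0.996
    eff_wrt_lvl1 = {"L2Calo": 0.997, "EFCalo": 0.989,
                    "EFID": 0.981, "EFIDCalo": 0.971}
    for step, eff in eff_wrt_lvl1.items():
        print(f"{step}: {lvl1_overall * eff:.1%}")
    # EFCalo -> 98.5%, EFID -> 97.7%, EFIDCalo -> 96.7% match the table;
    # L2Calo comes out at 99.3% vs the quoted 99.4%, i.e. within rounding.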

Page 15: Trigger/DAQ/DCS

Continuous evolution of Online software
– Control
– Databases
– Monitoring

Page 16: Trigger/DAQ/DCS

Example of ongoing work: large-scale tests

[Plot: "Boot" - time in seconds vs. number of controllers]

Verified the operational scalability and performance of the Online System on a very large scale, close to the size of final ATLAS
– Partitions of up to 1000 run controllers + 1000 processes running on 340 PCs
– Individual tests on CORBA communication components and configuration database components successful

4th iteration of the Online Software large-scale tests
– 340 PCs (800 MHz to 2.4 GHz) of the CERN LXSHARE cluster, Linux RH 7.3
– Partitions and configuration trees under varying conditions
– Run control operations: boot DAQ, start run, stop run, shutdown DAQ (a minimal state-machine sketch follows below)
– "Warm start": a state transition consisting of 2 internal transitions
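The run control operations exercised in the tests map naturally onto a small state machine. A minimal sketch, where the state and command names are illustrative assumptions and not the actual Online Software states:

    # Toy run-control state machine covering the tested operations.
    TRANSITIONS = {
        ("initial", "boot"): "configured",
        ("configured", "start"): "running",
        ("running", "stop"): "configured",
        ("configured", "shutdown"): "initial",
    }

    class RunController:
        def __init__(self):
            self.state = "initial"

        def command(self, cmd):
            key = (self.state, cmd)
            if key not in TRANSITIONS:
                raise ValueError(f"'{cmd}' not allowed in state '{self.state}'")
            self.state = TRANSITIONS[key]

    rc = RunController()
    for cmd in ("boot", "start", "stop", "shutdown"):
        rc.command(cmd)
        print(cmd, "->", rc.state)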

[Plot: warm-start transition time in seconds vs. number of controllers]

Information Service performance: each provider publishes one information object, then updates it as fast as possible

[Plot: number of requests per second vs. number of simultaneous providers, for 1, 5 and 10 receivers]
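To illustrate the pattern this test exercises, here is a minimal publish/subscribe sketch in plain Python. It is an illustration only: the real Information Service is a CORBA-based component, and the class and method names below are invented.

    # Toy information service: providers publish named values, receivers
    # get a callback on every update (invented names, for illustration).
    class InformationService:
        def __init__(self):
            self._info = {}       # latest value per information name
            self._receivers = []  # subscribed callbacks

        def subscribe(self, callback):
            self._receivers.append(callback)

        def publish(self, name, value):
            self._info[name] = value
            # Fan-out cost grows with the number of receivers, which is
            # why the measured rate depends on 1 vs 5 vs 10 receivers.
            for cb in self._receivers:
                cb(name, value)

    svc = InformationService()
    svc.subscribe(lambda n, v: print(f"update: {n} = {v}"))
    for i in range(3):
        svc.publish("provider-1.counter", i)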

Page 17: Trigger/DAQ/DCS

HLT/DAQ procurement plansHLT/DAQ procurement plans

S-link source card FDR/PRR successfully concluded in February Preproduction run before end of year; mass production in 2005

ROB-in FDR successfully concluded in May Production of 10 prototype boards

of final design due in July Switches

Plan for switch evaluation exists Measurements on some “pizza box” switches in progress

Technical specification document under review Market survey later this year

ROS PCs Technical specification document under review Market survey later this year

Other PCs Will be addressed later

Page 18: Trigger/DAQ/DCS

HLT/DAQ pre-series system in preparation

Approximately a 10% slice of the full HLT/DAQ system, to validate the functionality of the final system
– 1 full ROS rack (11 PCs equipped with ROBins)
– 1 128-port Gbit Ethernet switch
– 1 LVL2 processor rack
– 1 EF processor rack (partially equipped)
– 1 RoIB (50% equipped)
– 1 EFIO rack (DFM, SFI, SFO, ...)
– 1 Online rack
– DCS equipment

Practical experience: racks, power distribution, cooling, etc.

Considering installation in USA15/SDX (as for the final system)
– Check of the infrastructure ~6 months before the main installation starts
– Subject to feasibility checks (schedule, working environment, safety issues and regulations)
– Will be discussed in the July TMB

Page 19: Trigger/DAQ/DCS

DCS

Front-end system
– ELMB: mass production ongoing (LHCC 31/8/04)
– CAN branch supervisor: prototype being tested
– Rack control system: defined, HW prototype ordered

Back-end system
– Distributed PVSS system running (SR1)
– Hierarchical system with 3 levels set up (see the sketch below)
– Logging to the (present) conditions database (H8)
– Prototype Finite State Machine running
– Connection to the DAQ fully operational
– Data retrieval from the accelerator being worked on by JCOP (LHCC 31/7/04)
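For intuition about the 3-level back-end hierarchy, here is a minimal sketch of state summarization up a control tree. The worst-state-wins rule and the state names are assumptions for illustration; the actual PVSS/JCOP FSM is richer than this.

    # Toy hierarchical control tree: a node's state summarizes its
    # children's states (worst state wins; illustrative rule only).
    SEVERITY = {"OK": 0, "WARNING": 1, "ERROR": 2}

    def node_state(children_states):
        return max(children_states, key=lambda s: SEVERITY[s])

    # Level 3 devices -> level 2 subsystem -> level 1 detector
    subsystem = node_state(["OK", "WARNING", "OK"])
    detector = node_state([subsystem, "OK"])
    print(detector)  # WARNING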