1
Trigger/DAQ/DCS
2
LVL1 Trigger
  • Calorimeter trigger (Germany, Sweden, UK): 7200 calorimeter
    trigger towers; Pre-Processor (analogue → ET); Cluster
    Processor (e/γ, τ/h); Jet/Energy-Sum Processor
  • Muon trigger: O(1M) RPC/TGC channels; Muon Barrel Trigger
    (Italy); Muon End-cap Trigger (Japan, Israel); Muon-CTP
    Interface (MUCTPI)
  • Central Trigger Processor (CTP) and Timing, Trigger,
    Control (TTC): CERN
3
Calorimeter trigger
  • Cluster Processor Module (CPM) for e/γ/τ/h
    trigger
  • New version fixes timing problems in fanned-out
    data
  • Fabrication problems solved using firms with
    better QA
  • Jet/Energy Module (JEM)
  • Full-specification version recently made, tests
    so far look good
  • Common Merger Module (CMM)
  • Tested extensively, very close to final version

CPM
JEM
4
Calorimeter trigger
  • PreProcessor Module (PPM) - later than planned,
    but...
  • Final ASIC prototype is OK
  • MCM now OK
  • Substrate problem fixed by change of material
  • PPM stand-alone tests now nearly completed
  • System tests
  • Many subsystem tests done without PPM
  • e.g. 5 DSSs → 3 CPMs → CMM (crate) → CMM (system)
  • Full system tests with PPM starting very soon
  • Will participate in test-beam in
    August/September, including 25 ns run
  • Aim to integrate with
  • Calorimeters and receivers
  • Central Trigger Processor
  • RoI Builder
  • ATLAS DAQ, run control environment, etc.
  • Produce simple triggers based on calorimeter
    signals (see the sketch below)

3 CPMs, 1 JEM, 2 CMMs, TCM and CPU in crate
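As a rough, illustrative companion to the bullet above about producing simple triggers based on calorimeter signals, the Python sketch below forms a trigger decision from a map of tower ET values by looking for any 2x2 sum of neighbouring towers above a threshold. The tower indexing, the 5 GeV threshold and the function name are assumptions made for the example; this is not L1Calo code.

# Minimal sketch (not ATLAS code): a "simple trigger" decision formed from
# calorimeter tower ET values, in the spirit of the test-beam goal above.
# Tower layout, threshold and the 2x2 cluster sum are illustrative choices.

from typing import Dict, Tuple

Tower = Tuple[int, int]          # (eta index, phi index)

def simple_calo_trigger(et: Dict[Tower, float],
                        cluster_threshold_gev: float = 5.0) -> bool:
    """Return True if any 2x2 sum of neighbouring tower ETs exceeds threshold."""
    for (ieta, iphi) in et:
        window = [(ieta + de, iphi + dp) for de in (0, 1) for dp in (0, 1)]
        cluster_et = sum(et.get(t, 0.0) for t in window)
        if cluster_et > cluster_threshold_gev:
            return True
    return False

# Example: one event with a localized deposit fires the trigger.
event = {(10, 3): 3.2, (10, 4): 2.5, (11, 3): 0.4}
print(simple_calo_trigger(event))   # True (3.2 + 2.5 + 0.4 > 5 GeV)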
5
Tile Calorimeter - PPM Test
Test pulse recorded in PPM @ H8
6
Barrel muon trigger
  • Preproduction of Splitter boxes completed
  • Main production on the way
  • Prototype of final-design Pad boards evaluated
    in lab and (last week) at 25 ns test beam
  • Seem to work well, but test-beam data still to be
    analysed
  • Design completed for revised version of CM ASIC
  • Interaction in progress with IMEC on
    placement/routing plus simulation to check design
  • Very urgent!
  • More Pad boxes being prepared for chamber
    integration tests
  • Number limited by availability of prototype ASICs
    (old version)

Correlation in φ measurements between two BML doublets
7
Endcap muon trigger
  • System operated successfully in last week's 25 ns
    test beam with MUCTPI and CTP demonstrator
  • Even better efficiency than last year
  • Many improvements to software
  • Will test new PS boards with revised version of
    SLB ASIC in test beam August/September
  • Revised version of SLB ASIC is being evaluated in
    lab tests
  • Trigger part passes all tests
  • Problem detected in readout part for certain
    sequences of L1A signals
  • Probably very localized and hopefully only very
    minor revision to design required, but still
    under investigation
  • All other endcap ASICs already final
  (Plot: trigger efficiency for the PT4, PT5 and PT6
  thresholds)

8
Central trigger
  • CTP
  • Final prototypes either available or coming soon
    (layout, production)
  • Tight schedule to test, commission and integrate
    for test beam later in summer
  • LTP
  • Prototypes just received
  • MUCTPI
  • Work queued behind CTP
  • Only one kind of module needs to be upgraded to
    achieve full functionality
  • Existing demonstrator adequate in short term

LTP prototype
9
LVL1 Schedule
  • Re-baselined in line with current status and
    plans
  • Schedule for calorimeter and central trigger
    electronics matches availability of detector
    systems
  • Production of on-detector muon trigger
    electronics is later than we would like for
    integration with detectors
  • Very tight schedule to have barrel electronics
    available in time to equip chambers before
    installation
  • Late submission of revised version of the CM ASIC
  • Try to advance ASIC schedule if at all possible
  • Need to prepare for very fast completion and
    testing of electronics once production ASICs
    become available
  • Need to prepare for efficient integration of
    electronics with chamber assemblies prior to
    installation
  • End-cap electronics schedule also tight for
    integration with detectors in early 2005
  • Detector installation is later than for barrel,
    so not as critical

10
Installation Schedule
  • According to present schedule, final availability
    of all LVL1 subsystems is still driven by
    detector installation schedule
  • Latest ATLAS working installation schedule (v.
    6.19) shows last TGC chambers (with on-detector
    trigger electronics) installed January 2007
  • Leaves little time for commissioning of
    on-detector electronics before we lose access
    prior to first beams
  • Action defined for discussion with TC (and Muon
    PL) to see if there is scope to optimize the
    installation planning

11
HLT/DAQ
  • Major activity in the present phase is the test
    beam
  • Support for detector and for LVL1 trigger tests
  • Organized in support teams who are the first
    point of contact
  • Call on experts when necessary
  • Dedicated training sessions were organized for
    the team members
  • Team members participate with experts in problem
    solving
  • Good way to spread expertise
  • Electronic log book very useful
  • Could extend use to detector systems
  • HLT/DAQ studies
  • Preparation and planning for dedicated period in
    August
  • Aim to operate HLT/DAQ system to gain experience
    in a real-life environment
  • Will need support from detector systems
  • Generally experience at test beam is very
    positive for T/DAQ
  • However, work at the test beam takes a lot of
    effort
  • In parallel, continue development and system
    evaluation work
  • ... within the constraints of the available
    effort
  • E.g. Dataflow measurements and modelling

12
Detector integration with DAQ at H8
  • Muon Detectors
  • TGCs, MDT fully integrated
  • Extended running in combined mode during 25 ns
    run last week together with MUCTPI (sometimes
    triggered by CTPD)
  • RPC almost fully integrated
  • Data were successfully taken in stand-alone mode
  • Calorimeters
  • Tiles fully integrated in data-taking mode
  • LAr integration well advanced
  • Inner Detectors
  • Started for TRT and pixels; plan to integrate SCT
    later

The exercise of joining detectors together has
proven to be easy when the detector segment has
been implemented according to the TDAQ
prescriptions
13
HLT integration for test beam
  • The infrastructure for the EF is prepared
  • An EF cluster has been divided and pre-assigned
    to different detectors
  • The configuration nevertheless allows more CPUs
    to be assigned dynamically to the partition that
    requests them
  • The main work now is to get ATHENA integrated
  • The scheme of a rolling, single version of the
    offline software (8.2.x), specially maintained for
    the test beam, is working well
  • We are now trying to put in place the automatic
    procedure that sets 80 environment variables!
  • The Gatherer is integrated
  • Allows aggregation of histograms across multiple
    processors (see the sketch after this list)
  • The LVL2 commissioning is progressing well
  • A LVL1 result has been successfully read out by
    the L2PU
  • Progress is being made in integrating algorithms
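As a rough illustration of what "aggregation of histograms across multiple processors" means in practice, the Python sketch below sums identically binned histograms published by several processing nodes. The function and node names are invented for the example; this is not the actual Gatherer or Online Software API.

# Minimal sketch (not the actual Gatherer/Online Software API): bin-wise
# aggregation of identically binned histograms coming from several
# processing nodes, the kind of operation the Gatherer provides.

from collections import defaultdict
from typing import Dict, List

Histogram = List[int]                 # counts per bin, common binning assumed

def gather(per_node: Dict[str, Dict[str, Histogram]]) -> Dict[str, Histogram]:
    """Sum each named histogram over all contributing nodes."""
    merged: Dict[str, Histogram] = defaultdict(list)
    for node, histos in per_node.items():
        for name, bins in histos.items():
            if not merged[name]:
                merged[name] = [0] * len(bins)
            merged[name] = [a + b for a, b in zip(merged[name], bins)]
    return dict(merged)

# Example: two processing nodes each publish a partial "cluster_et" histogram.
partial = {
    "efnode01": {"cluster_et": [4, 9, 2, 0]},
    "efnode02": {"cluster_et": [1, 5, 3, 1]},
}
print(gather(partial))   # {'cluster_et': [5, 14, 5, 1]}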

14
Example of HLT algorithm work
Efficiency for e pair and rates:
Selection step   Efficiency wrt LVL1 (%)   Rate
LVL1             100                       3.5 kHz
EF Calo          84.5                      6.2 Hz
EF ID            71.6                      1.5 Hz
EF ID-Calo       55.5                      1.5 Hz
  • New result since TDR
  • 2e15i at 2×10³³ cm⁻²s⁻¹
  • Rates consistent with TDR assumptions

Trigger selection step   Efficiency wrt LVL1 (%)   Overall efficiency (%)
LVL1                     100                       99.6
L2Calo                   99.7                      99.4
EFCalo                   98.9                      98.5
EFID                     98.1                      97.7
EFIDCalo                 97.1                      96.7
  • H → 4e, mH = 130 GeV
  • L = 2×10³³ cm⁻²s⁻¹
  • 4 reconstructed electrons in |η| < 2.5
  • At least 2e with pT > 20 GeV
  • Efficiency includes both single and double-object
    triggers (see the consistency check below)
  • Good trigger acceptance of Higgs events
  • 2e2μ study also being done
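As a quick consistency check on the H → 4e table above (not from the slide itself): to rounding, the overall efficiency at each step is the product of the step efficiency wrt LVL1 with the 99.6% overall LVL1 efficiency:

  ε(overall) ≈ ε(LVL1, overall) × ε(step wrt LVL1)
  e.g. EFIDCalo: 0.996 × 0.971 ≈ 0.967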

15
Continuous evolution of Online software
  • Control
  • Databases
  • Monitoring

16
Example of ongoing work: large-scale tests
Run-control operations: boot DAQ, start run, stop
run, shutdown DAQ (a minimal sketch follows the
list below)
  • Verified the operational scalability and
    performance of the Online System on a very large
    scale, close to the size of the final ATLAS system
  • Partitions of up to 1000 run controllers and 1000
    processes running on 340 PCs
  • Individual tests on CORBA communication
    components and configuration database components
    successful
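A minimal sketch, assuming a simple state machine, of the run-control cycle exercised in these tests (boot DAQ, start run, stop run, shutdown DAQ). The state names, command names and controller class are illustrative; this is not the ATLAS Online Software API.

# Minimal sketch (not the ATLAS Online Software): the run-control sequence
# exercised in the large-scale tests, driven over a set of run controllers.
# States and transition names are illustrative.

ALLOWED = {                      # legal state transitions
    "initial":    {"boot": "configured"},
    "configured": {"start_run": "running", "shutdown": "initial"},
    "running":    {"stop_run": "configured"},
}

class RunController:
    def __init__(self, name: str):
        self.name = name
        self.state = "initial"

    def command(self, cmd: str) -> None:
        try:
            self.state = ALLOWED[self.state][cmd]
        except KeyError:
            raise RuntimeError(f"{self.name}: '{cmd}' not allowed in state {self.state}")

# A partition of many controllers is driven through one full run cycle.
partition = [RunController(f"ctrl{i:04d}") for i in range(1000)]
for cmd in ("boot", "start_run", "stop_run", "shutdown"):
    for ctrl in partition:
        ctrl.command(cmd)
print(partition[0].state)   # back to 'initial' after the full cycle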

(Plot: Information Service performance. Each provider
publishes one information item, then updates it as fast
as possible; number of requests per second vs. number of
simultaneous providers. A toy sketch of such a
measurement follows the bullets below.)
  • 4th iteration of Online Software Large Scale
    tests
  • 340 PCs (800 MHz to 2.4 GHz) of the CERN LXSHARE
    cluster
  • Linux RH 7.3
  • Partitions and configuration trees under varying
    conditions

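The toy Python sketch below mimics the kind of measurement shown in the plot: a provider publishes a single information item and then updates it as fast as possible while updates per second are counted. The InfoService class is an in-process stand-in invented for the example; the real test exercises the Online Software Information Service over the network.

# Minimal sketch (not the actual Information Service API): one provider
# publishes a single information item, then updates it as fast as possible
# while updates per second are counted.

import time

class InfoService:
    """Toy in-process stand-in for a publish/update information repository."""
    def __init__(self):
        self.items = {}
    def publish(self, name, value):
        self.items[name] = value
    def update(self, name, value):
        self.items[name] = value

def measure_update_rate(duration_s: float = 1.0) -> float:
    service = InfoService()
    service.publish("provider01.counter", 0)
    n, t0 = 0, time.perf_counter()
    while time.perf_counter() - t0 < duration_s:
        service.update("provider01.counter", n)
        n += 1
    return n / duration_s

print(f"{measure_update_rate():.0f} updates/s")   # in-process upper bound only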
17
HLT/DAQ procurement plans
  • S-link source card
  • FDR/PRR successfully concluded in February
  • Preproduction run before end of year; mass
    production in 2005
  • ROB-in
  • FDR successfully concluded in May
  • Production of 10 prototype boards of final
    design due in July
  • Switches
  • Plan for switch evaluation exists
  • Measurements on some pizza box switches in
    progress
  • Technical specification document under review
  • Market survey later this year
  • ROS PCs
  • Technical specification document under review
  • Market survey later this year
  • Other PCs
  • Will be addressed later

18
HLT/DAQ pre-series system in preparation
  • Approximately 10% slice of the full HLT/DAQ system
  • Validate functionality of final system
  • 1 full ROS rack (11 PCs equipped with ROBins)
  • 1 128-port Gbit Ethernet switch
  • 1 LVL2 processor rack
  • 1 EF processor rack (partially equipped)
  • 1 RoIB (50% equipped)
  • 1 EFIO rack (DFM, SFI, SFO, ...)
  • 1 Online rack
  • DCS equipment
  • Practical experience
  • Racks, power distribution, cooling, etc
  • Considering installation in USA15/SDX (as for
    final system)
  • Check of infrastructure 6 months before main
    installation starts
  • Subject to feasibility checks (schedule, working
    environment, safety issues and regulations)
  • Will be discussed in July TMB

19
DCS
  • Front-End system
  • ELMB mass production on-going (LHCC 31/8/04)
  • CAN branch supervisor prototype being tested
  • Rack control system defined, HW prototype
    ordered
  • Back-End system
  • Distributed PVSS system running (SR1)
  • Hierarchical system with 3 levels set up (see the
    sketch after this list)
  • Logging to (present) conditions database (H8)
  • Prototype Finite State Machine running
  • Connection to DAQ fully operational
  • Data retrieval from accelerator being worked on by
    JCOP (LHCC 31/7/04)
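A minimal sketch, assuming a worst-state-wins summary, of how a three-level DCS back-end hierarchy like the one above can propagate states upward. The node names, levels and state ranking are illustrative; this is not PVSS/JCOP FSM code.

# Minimal sketch (not PVSS/JCOP code): a 3-level DCS-style hierarchy in which
# each node's state is a summary of its children, taking the "worst" state.
# Node names, levels and the state ranking are illustrative.

SEVERITY = {"OK": 0, "WARNING": 1, "ERROR": 2}   # higher = worse

class Node:
    def __init__(self, name, children=None, state="OK"):
        self.name = name
        self.children = children or []
        self._state = state

    @property
    def state(self) -> str:
        if not self.children:                    # leaf: a device/front-end
            return self._state
        # branch: worst state of all children propagates upward
        return max((c.state for c in self.children), key=SEVERITY.get)

# Level 3: devices, level 2: sub-detector control stations, level 1: ATLAS DCS.
hv_channel = Node("HV_channel_042", state="WARNING")
tile_dcs   = Node("TileCal_DCS", children=[hv_channel, Node("LV_crate_1")])
atlas_dcs  = Node("ATLAS_DCS", children=[tile_dcs, Node("Muon_DCS")])
print(atlas_dcs.state)   # WARNING, propagated up from the HV channel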