US CMS TriDAS
Transcript and Presenter's Notes

Title: US CMS TriDAS


1
US CMS TriDAS
  • US CMS Meeting
  • Wesley H. Smith, U. Wisconsin
  • CMS Trigger Project Manager
  • May 10, 2002
  • Outline
  • Calorimeter Trigger Status & Plans
  • Muon Trigger Status & Plans
  • DAQ Status & Plans (from Vivian O'Dell)
  • This talk is available at
    http://hep.wisc.edu/wsmith/cms/TriDAS_USCMS_0502.pdf

2
L1 Trigger Hardware Overview
[Diagram: L1 trigger hardware overview, indicating the US CMS trigger subsystems covered in this talk]

3
Calorimeter Trig. Overview
4
Cal. Trig 2nd Gen. Prototypes (U. Wisconsin)
  • New High-Speed Backplane
  • 160 MHz with 0.4 Tbit/sec dataflow (see the bandwidth sketch below)
  • Designed to incorporate algorithm changes
  • New Non-Isolated Electron & Tau Jet Triggers
  • New Clock Control Card
  • Fans out 160 MHz clock & adjusts phases for all
    boards
  • 50 tested successfully

[Photos: front view (VME slots, DC-DC converters, clock delay adjust) and rear view (DC-DC converters)]
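For orientation, a minimal back-of-the-envelope sketch of what the quoted backplane figures imply; the 160 MHz clock and 0.4 Tbit/sec dataflow are from this slide, while the derived bits-per-clock value is only an inference, not a stated design parameter:

    # Rough consistency check of the quoted backplane numbers (slide values);
    # the implied per-clock transfer width is an inference, not a design figure.
    backplane_clock_hz = 160e6      # 160 MHz backplane clock
    dataflow_bit_per_s = 0.4e12     # 0.4 Tbit/s total dataflow

    bits_per_clock = dataflow_bit_per_s / backplane_clock_hz
    print(f"bits moved per backplane clock: {bits_per_clock:.0f}")  # 2500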
5
New Cal. Trig. 4 Gbaud Copper Link Cards & Tester
(U. Wisconsin)
  • 8 Compact Mezzanine Cards for each Receiver
    Card accept 4 x 20 m 1.2-Gbaud copper pairs
    transmitting 2 cal. tower energies every 25 ns
    with low cost & power.
  • Uses new Vitesse Link Chips (7216-01).
  • New Serial Link Test Card
  • Status: under test
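A small sketch of what one such copper pair carries per bunch crossing; the 1.2 Gbaud line rate, the 25 ns crossing period and the 2 tower energies per pair are from this slide, while the 8b/10b overhead factor is only an assumption about the serializer encoding:

    # Payload available per bunch crossing on one 1.2 Gbaud copper pair
    # (slide figures); the 8b/10b factor is an assumed encoding overhead.
    line_rate_baud = 1.2e9     # per copper pair
    bx_period_s    = 25e-9     # LHC bunch-crossing period
    towers_per_bx  = 2         # calorimeter tower energies per crossing

    raw_bits     = line_rate_baud * bx_period_s       # 30 line bits per BX
    payload_bits = raw_bits * 8 / 10                  # ~24 bits if 8b/10b
    print(f"line bits/BX: {raw_bits:.0f}, payload bits/BX: {payload_bits:.0f}, "
          f"bits per tower: {payload_bits / towers_per_bx:.0f}")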

6
New Calorimeter Trigger Receiver Card (U.
Wisconsin)
  • Full-featured final prototype board is in test -
    initial results are good. Continue to test
    on-board ASICs & copper link mezzanine cards

[Photos: top side with 1 of 8 mezzanine link cards and 2 of 3 Adder ASICs (Phase ASICs, MLUs and DC-DC converters also visible); bottom side with all Phase & Boundary Scan (BSCAN) ASICs]
7
Cal. Trig. New Electron Isolation & Jet/Summary
Cards (Wisconsin)
Full-featured final E.I. prototype board is finished
& ready for testing.
[Photos: E.I. prototype (receiver mezzanine, Sort ASICs, EISO ASICs; region 4x4 & HF sums to Cluster Crate) - will test the Electron Isolation & Sort ASICs. Jet/Summary Card prototype ready to build pending tests of other boards]
8
Cal Trigger Status/Plans
  • Preparing second generation prototype tests
  • Crate, Backplane, Clock Control, ASICs done
  • Receiver Card & Electron Isolation Card ready.
  • Serial Link Mezzanine Card Receiver done, Tester
    Card at vendor, Transmitter Tester in design
  • Goals for 2002
  • Complete prototype tests, validate ASICs
  • Integrate Serial Links w/ ECAL, HCAL front-ends
  • Prototype Jet/Summary card manufacture
  • Ready for manufacture -- waiting for other board
    tests
  • Finalize Jet Cluster crate design

9
CSC Muon Trigger Scheme
3-D Track-Finding and Measurement
[Block diagram: Strip & Wire FE cards -> LCT cards -> Trigger Motherboard (TMB, UCLA; 2 muons/chamber) -> Muon Port Card (MPC, Rice; 3 muons/port card) -> optical links -> Sector Receiver/Processor (SR/SP, U. Florida; 3 muons/sector, in counting house) -> CSC Muon Sorter (Rice; 4 muons) -> Global Muon Trigger, combining 4 muons each from the CSC, DT and RPC systems (via the RPC Interface Module, RIM) -> Global L1]
10
1st Muon Trigger Prototypes (Florida, Rice, UCLA)
  • Successful CSC Trigger Integration test
  • Prototype Muon Port Card, Sector Receiver, Sector
    Processor, Clock Board, Backplane work &
    communicate -- Result in 2000
  • ORCA full simulation working
  • Agreement/use with hardware test

11
1st Track-Finder Crate Tests
[Photo: prototype crate for the original six-crate design - Clock Control Board (Rice), Sector Receiver (UCLA), Muon Port Card (Rice), Sector Processor (Florida), Bit3 VME Interface, custom backplane (Florida), 100 m optical fibers]
Very successful but latency too high -- new design in 2001
12
New EMU Trigger Design: U. Florida Track-Finder
  • Single Track-Finder Crate Design with 1.6
    Gbit/s optical links
  • Reduces processing time from 525 ns (old design)
    to 175 ns
  • Total Latency: 20 BX (from input of SR/SP card
    to output of MS card)
  • Crate Power Consumption: 500 W
  • 15 Optical connections per SR/SP card
  • Custom Backplane for SR/SP ↔ CCB and MS connections
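The quoted times and link counts can be cross-checked against the 25 ns LHC bunch-crossing period; a minimal sketch, where the 5-stations-times-3-muons fiber counting is my reading of the MPC slide that follows rather than a number quoted here:

    # Cross-checks of the track-finder latency and fiber counts (slide values
    # plus the standard 25 ns bunch-crossing period).
    BX_NS = 25.0

    old_ns, new_ns = 525.0, 175.0
    print(f"processing time: {old_ns / BX_NS:.0f} BX (old) -> {new_ns / BX_NS:.0f} BX (new)")
    print(f"total SR/SP-to-MS latency: 20 BX = {20 * BX_NS:.0f} ns")

    # One muon per fiber, 3 muons per Muon Port Card (next slide), and
    # 5 CSC stations per sector (chambers 1A, 1B, 2, 3, 4):
    print(f"optical inputs per SR/SP card: {5 * 3}")   # 15, as quoted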

[Diagram: single Track-Finder crate - 12 SR/SP cards (each combining 3 Sector Receivers and a Sector Processor, one card per 60-degree sector), Clock and Control Board (CCB), Muon Sorter (MS) and BIT3 controller on a custom backplane; optical inputs from the MPCs (chambers 1A, 1B, 2, 3, 4) and from Trigger Timing Control; outputs to the Global Trigger and to DAQ]
13
New Muon Port Card Design & Optical Link Tests
(Rice)
  • New MPC Design uses new high speed links
    (TLK2501) to send one muon per optical fiber
    (needed for new compact track-finder design)
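A rough sketch of the per-muon bandwidth this gives: the 1.6 Gbit/s line rate comes from the track-finder slide above, while the 8b/10b overhead is an assumption about the TLK2501 encoding rather than a figure quoted in the talk:

    # Bits available per muon per crossing on one optical fiber.
    line_rate_bps = 1.6e9     # link speed quoted for the track-finder crate
    bx_period_s   = 25e-9     # LHC bunch-crossing period

    line_bits    = line_rate_bps * bx_period_s   # 40 line bits per BX
    payload_bits = line_bits * 8 / 10            # 32 bits if 8b/10b encoding
    print(f"line bits/BX: {line_bits:.0f}, payload bits per muon: {payload_bits:.0f}")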

[Board layout: 9U 400 mm board with VME J1 connector and VME interface; FPGA implementing the CCB interface, sorter logic and input/output FIFOs; GTLP transceivers receiving TMB_1-TMB_9 over the custom peripheral backplane; serializers and optical transceivers driving 3 optical cables to the Sector Processor]
Optical Link Radiation Tests: Three serializers
tested up to 270 kRad TID with no permanent damage
or SEU. Two Finisar optical modules: no errors up to
70 kRad, failed at 70 kRad (well above the 10 kRad
TID inner CSC dose for 10 years).
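The margin over the expected dose follows directly from these numbers; a one-line sketch using only the doses quoted above:

    # Safety margins over the expected 10-year TID at the inner CSCs.
    expected_krad   = 10     # 10-year inner CSC dose
    serializer_krad = 270    # serializers: no damage or SEU up to here
    optical_krad    = 70     # Finisar optical modules failed here

    print(f"serializer margin: >= {serializer_krad / expected_krad:.0f}x")
    print(f"optical module margin: ~ {optical_krad / expected_krad:.0f}x")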
14
CSC Trigger Status/Plans
  • Prototype 1 tests now complete
  • Prototype 2 and production follow EMU components
    to optimize technology
  • MPC, SP, CCC modules, backplane milestones
  • Apr-02 Prototype 2 designs done
  • Freeze CSC-DT interface
  • Determine DDU compatibility with OSU module for
    EMU
  • Nov-02 Prototype 2 construction done
  • Apr-03 Prototype 2 testing done
  • Sep-03 Final designs done
  • Oct-04 Production done
  • Apr-05 Installation done
  • (backplane schedule 3 months ahead of above
    dates to provide platform for testing and
    integration)
  • Muon Sorter module only 1, design by Jan-04

15
Schedule Trigger Project Completion
  • Installation in Underground Counting Room
  • Expect access by March 05
  • Sufficient time for installation and some testing
    but not for completing commissioning with
    detectors
  • Slice Test (on surface)
  • With both HCAL and EMU
  • Verify trigger functions and interfaces by
    testing with detectors on surface at CERN.
  • Suggest as substitute for commissioning
    completion step.
  • Will check as much as possible on the surface before gaining
    access to underground facilities.
  • Planned for October 04 - March 05

[Photo: Underground Counting Room]
16
Original Trigger L2 Task Schedule Updates
  Task                               Original start   Original finish   New date
  Produce TDR                        8/00             12/00             ✓
  Design Final Prototypes            11/00            12/01             ✓
  Construct Final Prototypes         6/01             6/02              → 11/02
  Test/Integrate Final Prototypes    12/01            12/02             → 4/03
  Pre-Production Design & Test       6/02             6/03              → 11/03
  Production                         12/02            6/04
  Production Test                    6/03             11/04
  Trigger System Tests               5/04             5/05
  "Slice Test" (NEW)                 10/04            3/05
  Trigger Installation               3/05             9/05
  Integration Test w/ DAQ & FE       6/05             12/05
  Maintenance & Operations           10/04            -------
  • 6-month civil engineering delay of installation date

17
DAQ System Overview
Original design: L-1 @ 100 kHz. Rescope 1997:
75 kHz. 2001: Initial L-1 @ 50 kHz, but design all
elements to be able to do 100 kHz.
[Diagram: DAQ data flow - 40 MHz collision rate, 16 million detector channels, 3 Gigacell buffers; Level-1 Trigger (charge, time, pattern, energy, tracks) accepts at 75 kHz; 1 MB event data, 1 Terabit/s readout into 200 GB buffers, ~400 readout memories and 50,000 data channels; 500 Gigabit/s switch network; ~400 CPU farms filter events to 100 Hz with 5 TeraFlops; filtered events go to Computing Services, a Gigabit/s service LAN and a Petabyte archive.
EVENT BUILDER: a large switching network (400 x 400 ports) with total throughput 400 Gbit/s forms the interconnection between the sources (deep buffers) and the destinations (buffers before farm CPUs).
EVENT FILTER: a set of high performance commercial processors organized into many farms convenient for on-line and off-line applications.]
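The headline rates in the diagram are consistent with simple arithmetic; a sketch using only the event size, trigger rates and switch port count quoted above:

    # Event-builder bandwidth implied by the figures in the DAQ overview.
    event_size_bytes = 1e6      # 1 MB event data
    ports            = 400      # 400 x 400 builder switch

    for label, rate_hz in [("100 kHz design", 100e3), ("50 kHz initial", 50e3)]:
        total_bps = event_size_bytes * 8 * rate_hz
        print(f"{label}: {total_bps / 1e12:.1f} Tbit/s total, "
              f"{total_bps / ports / 1e9:.1f} Gbit/s per port")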
18
New DAQ design principle
  • Basic principle
  • Break DAQ into a number of functionally
    identical, parallel, smaller DAQ systems
  • A 64x64 system is feasible today
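A sketch of why 64x64 building blocks are enough: the ~500 readout inputs appear on the DAQ contribution slide later in this talk, and the split into 8 identical slices is from the new US contribution slide; treating both as given:

    # Splitting one large event builder into identical parallel slices.
    readout_sources = 500    # inputs quoted on the DAQ contribution slide
    slices          = 8      # functionally identical parallel DAQ systems

    per_slice = readout_sources / slices
    print(f"~{per_slice:.0f} sources per slice -> a 64x64 builder per slice suffices")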

19
Detector readout to surface
20
D2S RB EF breakdown
Data to Surface
Readout Builder
Event Filter
21
DAQ US contribution (old)
  • US: Event Manager & Builder Units

[Diagram: CERN scope - detector front-ends, Level 1 Trigger, readout systems, builder networks (switch with ~500 inputs), Run Control and Computing Services. US scope (old) - switch outputs, Event Manager (EVM) and the Builder Units (BU) of the builder and filter systems; Filter Units (FU) not included in the outputs. Other responsibilities: Detector Front-Ends - detector groups; Computing Services - infrastructure.]
22
US contribution (new)
  • Cover one segment (1/8) of the CMS DAQ plus ¼ of
    the Data-to-Surface system (plus the associated
    prototypes & preseries)
  • Segment = 1 Readout Builder + 1 Event Filter
  • US_CMS detector electronics is 1/4 of the total
  • Delivery of the system can be accomplished by the
    end of the US_CMS project (FY05)
  • Aids the experiment most in the current phase
    where cash flow is very tight
  • US R&D program can remain unchanged (to the
    extent that the basic modules are the same)
  • Roughly speaking, the US
  • (a) works on/delivers prototype system (to 2004)
  • (b) delivers the startup DAQ for CMS (2005)

23
Milestones
  • Prototype DAQ (US Contribution)
  • D2S Prototype July, 2004
  • Slice Test November, 2004 (*)
  • Readout Builder Prototype April, 2005
  • Startup DAQ (US Contribution)
  • Filter Farm Ready May, 2006
  • Readout Builder Ready August, 2006
  • Declaration of Completion (US Contribution)
  • Startup DAQ ready for beam September, 2006
  • (*) Slice Test for US-CMS detectors & DAQ will
    have full D2S proto & a few RB elements
  • (Version 33)

24
Slice Test DAQ (10-100 Hz)
  • Trigger system: GTP, TTC and sTTS
  • Detector readout: complete FED crate systems (FED-TTC-TTS,
    controller CPU & DSN)
  • Readout Units: XDAQ RU-VME-tasks running in all the FED controllers
  • Data to Surface: none, just the FED-VME bus of FED crates
  • RCN, BCN, BDN networks: DAQ Service Network (DSN, e.g. GEthernet)
  • Event manager: XDAQ EVM-task running in the GTP controller
  • Builder/Filter Units: XDAQ BU-task running in any DSN (WAN) CPU
  • Performance: a few 10 Hz (up to 100s when using GE switches in DSN as EVB)