Transcript and Presenter's Notes

Title: ATLAS Pixel Detector Status


1
ATLAS Pixel Detector Status
  • LBNL Research Progress Meeting
  • Oct. 16, 2008
  • Introduction
  • The Detector Installed
  • The Detector In Operation
  • The Future

2
Credits
  • Strong team of LBNL students and postdocs
    resident at CERN working on operation
  • Thanks to Beate Heinemann for many of the
    operation slides.

3
The LHC Pixel Detectors
ALICE 10M pixels
CMS 66M pixels
ATLAS 80M pixels
4
LHC status
  • Repairs on sector 3-4
  • Official report of damage and repairs released
    today! https://edms.cern.ch/file/973073/1/Report_on_080919_incident_at_LHC__2_.pdf
  • 24 dipoles, 5 quadrupoles, 6 tonnes of He.
  • Cannot return to beam operation until April in
    any case (CERN utility contract).

An old picture
5
Hybrid pixel technology
High data rate, high radiation
[Figure: hybrid pixel concept, readout IC bump-bonded to the sensor]
Make each pixel a tiny stand-alone detector and
operate all pixels in parallel
Data rate per pixel is small
Charged track signal is localized, but main
backgrounds scale with pixel volume
6
ATLAS Module
6cm
  • 16 readout chips bump bonded to a single sensor
    tile
  • Unique module design for entire detector
  • 50 µm x 400 µm pixel size; 46K pixels / module
  • Each module has full functionality
  • ATLAS pixel detector is an array of modules all
    connected in parallel

7
ATLAS Pixel detector overview
1744 modules
[Diagram: a bandwidth view of the pixel detector. Roughly 10 Tb/s of
pixel hits produced, roughly 100 Gb/s read out, and 0.1 Gb/s of pixel
hits to tape after the LVL1 and LVL2 triggers]
Works until every Si atom has been hit by a
charged particle
8
Pixel Detector Detailed Description
Image from the JINST pixel paper:
http://www.iop.org/EJ/abstract/1748-0221/3/07/P07007
INSERTABLE
9
Pixel Detector Construction
2006
All images from Berkeley Lab View
2005
2007
10
Inner Detector Installation Timeline
[Timeline graphic, 2006-2008: preparatory work; ID barrel installed;
ID endcaps installed; Pixel installed; connection complete;
evaporative heater saga; cooling failure; cold operation;
FIRST FUNCTIONAL TEST OF FINAL SYSTEM. Milestone dates shown:
29/5, 19/6, 1/8, 25/8 (2006-07) and 18/4, 1/5, 28/6 (2007-08)]
11
Installation photos, June 2007
12
The Detector Installed
13
Technical challenges and approaches
14
Inaccessible with high services granularity
  • 88 independent cooling circuits (500
    inaccessible custom fittings)
  • 1744 independently wired modules
  • 400 64-pin connectors at ends of package
  • 60 km of 100 µm or finer diameter wire, 30% of
    it Al.
  • Optimized for high granularity of control
  • Not heavily redundant
  • Single point failures possible

15
Not just plugging it in
8 weeks
16
CMS approach
  • The beam pipe is fixed and the detector can
    quickly (days) be inserted or removed
  • Coarse services granularity
  • Plan to fix problems in yearly accesses, instead
    of working around them with fine control
    granularity

17
Summary of on-detector losses
[Chart: summary of on-detector losses. ATLAS pixels: mostly random
dead pixels/bumps, dominated by bias voltage opens. CMS losses:
dominated by cooling leaks]
18
Open bias voltage connections
  • These dominate the non-recoverable losses we have
    today
  • In general, the defect rate of components was not
    low enough to achieve design goals
  • Relied on comprehensive QC of all components to
    achieve end yield, rather than increasing
    effective yield through redundancy, from bumps
    and wire bonds to the pins of the PP1 connectors
  • Therefore QC failures translate directly into a
    yield drop
  • QC testing was blind to a particular bias voltage
    return defect, due to parasitic current paths.
  • Additionally, the sensor HV wire bonding scheme
    had a low enough intrinsic reliability that even
    after QC some weak parts got through (one could
    argue that QC was not severe enough)
  • Can't distinguish wire-bond failures from cable and
    connector opens at this stage.

Broken bonds due to lifted flex
19
Evaporative cooling
  • On-detector plumbing has a history of problems.
  • Aluminum tubes corroded during production and
    were replaced
  • Some leaks in custom low mass fittings
  • There are 3 leaks inside the detector, all
    suspected to be on fittings like this
  • No way to access them
  • Leaks have minor effect on cooling performance,
    but fluid loss and gas volume contamination
    must be considered.
  • Long-term risks still exist
  • Cooling plant and services have been a large
    source of problems during installation and
    commissioning
  • Probably have not seen the last problem here
  • Evaporative heaters 1 order of magnitude more
    problematic than anticipated
  • Understanding long term risks from cooling plant
    still evolving
  • Damage to detector due to faults, contamination,
    etc.
  • Loss of operation

20
Optical data links
  • Unexpected failure rate in off-detector
    components for both pixels and SCT
  • Significant number of dead channels to date (30)
  • Failing components can be replaced, but spares
    are limited
  • Production of some more spares under way, and
    even more being planned.
  • No similar effect seen for on-detector components
    (so far).
  • Diode properties consistent with ESD damage at
    some unknown point (could be during initial
    production)

[Plot: normal diode I/V curve vs. typical ESD-damaged I/V curve]
21
On-detector laser diode arrays
  • Light emitted by VCSEL arrays on detector and
    received by diode arrays off detector
  • Particular Array-Array configuration does not
    permit single channel level adjustment
  • VCSEL array power and uniformity are temperature
    dependent
  • During surface integration, added resistive
    heaters to on-detector optical boards so that
    their temperature can be actively controlled

22
CMS optical readout
  • No ADC on CMS pixel chip. Analog charge
    information is sent to counting room
  • Worked well in the lab and test beams, but with
    the installed detector thermal effects on optical
    components were underestimated (sounds
    familiar?)
  • Analog Optical Hybrid very temperature sensitive,
    resulting in serious signal baseline and gain
    drifts.

Pixel address is encoded with 5 analog levels
Will it be possible to maintain this performance
for long physics runs?
23
The Detector in Operation
24
When cooling works
  • Typical temperature history for coldest and
    warmest modules during a run
  • Differences are due to known mechanical
    construction details
  • L2 max comes from staves with a double cooling
    pipe (a new pipe inside a corroded pipe)

[Plot: module temperatures (deg C) during a run, showing max and min
for Layer 0, Layer 1, Layer 2, and the disks]
25
Calibration
  • Calibration is performed using digital signal
    processors (DSPs) on the VME boards that read out
    the detector.
  • Only histograms of the calibration data are
    downloaded offline, analyzed, and used to
    derive constants.
  • The most common calibration sequence is:
  • Inject varying-size charge pulses into each pixel
  • Fit the number of recorded hits vs. charge with
    an error function (a sketch of this fit follows
    below)
  • It takes 1.5 hours to run this on the full
    detector
  • This MEASURES the threshold and noise of each
    pixel

[Plots: hits vs. injected charge for 1 pixel; results for 1 module]
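As an illustration of the error-function ("s-curve") fit described on this slide, here is a minimal standalone Python sketch. It is not the actual DSP code; the charge range, injection count, and starting values are invented for the example. The fitted mean of the error function is the pixel threshold and its width is the noise.

```python
# Minimal sketch of the per-pixel threshold/noise extraction described above.
# Hypothetical standalone example (not the real DAQ/DSP code): fit the
# occupancy-vs-injected-charge "s-curve" with an error function whose mean
# is the threshold and whose width is the noise.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def s_curve(q, threshold, noise, n_inj=100):
    # Expected hit count for n_inj injections at charge q (electrons)
    return 0.5 * n_inj * (1.0 + erf((q - threshold) / (np.sqrt(2.0) * noise)))

# Fake scan data for one pixel: 100 injections per charge point
charges = np.linspace(2000, 6000, 41)          # injected charge [e-]
true_thr, true_noise = 4000.0, 180.0
rng = np.random.default_rng(0)
hits = rng.binomial(100, 0.5 * (1 + erf((charges - true_thr) /
                                        (np.sqrt(2) * true_noise))))

popt, _ = curve_fit(s_curve, charges, hits, p0=[charges.mean(), 200.0])
print(f"threshold = {popt[0]:.0f} e-, noise = {popt[1]:.0f} e-")
```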
26
Threshold and TOT Tuning
Plots for 1 module
Already using parameters from module production
After in-situ tuning: 40 e- dispersion; can tune the
entire detector to within 100 e-
Time-over-threshold (TOT) charge measurement (every
pixel measures charge); 3-parameter chip-by-chip
calibration (a sketch follows below)
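The slide does not give the functional form of the 3-parameter TOT calibration, so the sketch below assumes a commonly used rational parameterization, TOT(Q) = A(Q + B)/(Q + C), purely to illustrate a chip-by-chip 3-parameter fit; the parameter names and data points are hypothetical.

```python
# Hypothetical sketch of a 3-parameter, chip-by-chip TOT-vs-charge calibration.
# The rational form TOT(Q) = A*(Q + B)/(Q + C) is an assumed example, not
# taken from the slide; the data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def tot_model(q, a, b, c):
    # TOT (in bunch crossings) as a function of injected charge q (electrons)
    return a * (q + b) / (q + c)

# Fake per-chip calibration points: injected charge [e-] and mean measured TOT
charge = np.array([5000, 10000, 20000, 30000, 40000], dtype=float)
tot    = np.array([4.8, 11.5, 20.0, 25.1, 28.3])

params, _ = curve_fit(tot_model, charge, tot, p0=[40.0, -3000.0, 20000.0])
a, b, c = params

def tot_to_charge(t):
    # Invert the calibration to convert a measured TOT back to charge
    return (c * t - a * b) / (a - t)

print("fit:", params, "  Q(TOT=15) =", round(tot_to_charge(15.0)))
```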
27
But First, Optical Links had to be tuned
  • For each transmit-receive pair:
  • The power of the laser diode arrays and the bias
    of the photo-diode arrays must be adjusted
  • The time delay of the clock used to register the
    data must be adjusted
  • This is done while sending a standard calibration
    data pattern (a scan sketch follows below)
  • It does not always lead to the right operating
    point for realistic data
  • Optical tuning keeps many students and postdocs
    employed

Operable fraction: 95%
[Plot: map of bit errors for a single module, photodiode bias vs.
clock delay]
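A minimal sketch of the kind of two-dimensional scan implied above: step the photodiode bias and clock delay, count bit errors against the known calibration pattern at each point, and pick an error-free setting with the largest margin to any failing neighbour. The scan ranges and the measure_errors() hook are hypothetical placeholders, not the real DAQ interface.

```python
# Hypothetical sketch of picking an optical-link operating point from a
# 2D bit-error scan (photodiode bias vs. clock delay). measure_errors()
# stands in for the real hardware/DAQ call, which is not shown here.
import numpy as np

def pick_operating_point(bias_steps, delay_steps, measure_errors):
    """Return the (bias, delay) with zero bit errors that is farthest
    from any setting that showed errors."""
    errors = np.array([[measure_errors(b, d) for d in delay_steps]
                       for b in bias_steps])
    good = errors == 0
    if not good.any():
        raise RuntimeError("no error-free operating point found")
    bad_pts = np.argwhere(~good)
    best, best_margin = None, -1.0
    for i, j in np.argwhere(good):
        # Distance (in scan steps) to the nearest point with bit errors
        margin = (np.min(np.hypot(bad_pts[:, 0] - i, bad_pts[:, 1] - j))
                  if len(bad_pts) else np.inf)
        if margin > best_margin:
            best, best_margin = (bias_steps[i], delay_steps[j]), margin
    return best

# Toy usage with a fake error model standing in for the real link
if __name__ == "__main__":
    fake = lambda b, d: 0 if (8 <= b <= 16 and 10 <= d <= 20) else 1
    print(pick_operating_point(range(0, 32), range(0, 25), fake))
```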
28
Cosmic ray tracks
  • Rate: 0.3 Hz
  • Plan is to double this sample

29
Monitoring results
Occupancy in cosmic run (layer 0)
  • Both on-line and off-line histograms generated
    automatically for shifters to monitor
  • Histograms from Tier0 become available typically
    1h after run start
  • Modules not being operated (5) are white in
    these occupancy and efficiency plots
  • Efficiency for hits on tracks: 98%

Efficiency for hits on tracks (layer 1)
30
Lorentz angle measurement
[Two plots: number of pixels per cluster vs. track angle to normal
incidence (radians)]
  • Measured 0.196 +/- 0.004 (stat); expected 0.224
    (a toy fit sketch follows below)
  • More detailed studies in progress
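The Lorentz angle is typically extracted from the minimum of the mean cluster size vs. track incidence angle. The slide does not spell out the fit, so the sketch below assumes a simple V-shaped model and invented profile points, purely as an illustration of locating that minimum.

```python
# Hypothetical sketch of extracting the Lorentz angle as the minimum of the
# mean cluster size vs. track incidence angle. Fit function and data points
# are assumptions for illustration, not the analysis shown on the slide.
import numpy as np
from scipy.optimize import curve_fit

def cluster_size_model(alpha, theta_l, slope, minimum):
    # Cluster size is smallest when the track angle compensates the
    # Lorentz drift, i.e. at alpha = theta_l
    return minimum + slope * np.abs(np.tan(alpha) - np.tan(theta_l))

# Fake profile: mean cluster size in bins of incidence angle [rad]
angles = np.linspace(-0.4, 0.6, 21)
sizes = (1.3 + 2.0 * np.abs(np.tan(angles) - np.tan(0.20))
         + np.random.default_rng(1).normal(0, 0.02, angles.size))

popt, pcov = curve_fit(cluster_size_model, angles, sizes, p0=[0.1, 1.0, 1.0])
theta_l, err = popt[0], np.sqrt(pcov[0, 0])
print(f"Lorentz angle = {theta_l:.3f} +/- {err:.3f} rad")
```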

31
Alignment results
  • Start from survey data
  • Global alignment: Pixel to SCT
  • Layer-by-layer alignment
  • Stave-by-stave alignment
  • Cosmic tracks provide unique sample for alignment
    that is complementary to IP tracks
  • Not very useful for end caps.

Difference in closest approach to beam line
32
The Future
33
Thinking about upgrades: effect of dead channels
34
B-Layer replacement project
  • Original design HAD an insertable B-Layer.
  • Layers 1 and 2 were fixed to rest of Inner
    Detector
  • Around 2001 the design was changed to make the
    full detector insertable in order to meet the
    installation schedule
  • This made the B-Layer no longer an independent
    unit
  • New approach to recover from radiation damage
  • Not a replacement at all but the insertion of a
    new sub-detector inside the pixels, a la CDF L00
    / D0 L0.
  • Nevertheless, still referred to as B-Layer
    Replacement
  • Needs new chip and sensor technology (see FE-I4
    and sensor talks)
  • Other options have been proposed, but not
    considered realizable on a short time scale.
  • Removing the full detector, refurbishing, and
    re-installing it would take 2 years.

35
A trip along the beam pipe
[Photos: views along the beam pipe at Z -1.4 m (towards IP), Z -2 m
(towards IP), Z -4 m (view towards IP), and Z -3 m (section)]
Example layer with smaller beam pipe.
36
Addition of a L00-style pixel layer in ATLAS
[Diagram sequence (C-side and A-side, showing SQP, DISK, BPSS, and R53):
1. Insert inner support, cut flange, remove one collar.
2. Pull out old beam pipe.
3. Insert new beam pipe with integrated layer.]
37
Long Term Upgrades
38
Longer term upgrades
  • Plan to replace entire inner detector
  • 10x the rate (10 tracks per atom at inner
    layer)
  • Outer layers feel like the present inner layer
  • Years of R&D already in progress
  • Lighter materials with better thermal performance
  • More radiation-hard sensor technologies
  • Higher rate capacity chips
  • Cheaper pixels (much more area)
  • More efficient, lower mass power distribution
  • Etc.

39
Conclusion
  • The ATLAS pixel detector was installed over a
    year ago, never to be accessed again until
    end-of-life removal, and is now ready for
    collisions.
  • Inaccessibility has been addressed with high
    granularity of services and control.
  • The as-installed good channel count is
    excellent, exceeding the design goal of 97%.
  • The main problem areas for operation are cooling
    and optical links.
  • Lessons for the future:
  • Be more conservative with optical communication
    (other talks will reinforce this)
  • There is no substitute for full system functional
    testing.
  • A B-Layer replacement is being developed to
    address extreme radiation damage
  • This is not really a replacement, but the
    insertion of a new pixel layer mounted on a
    smaller beam pipe.
  • R&D and planning for a major upgrade are well
    under way

40
BACKUP
41
Effect of TOT charge measurement
  • From test beam data

42
Production module ranking
43
Radiation length
44
Plant Failure and recovery
  • Plant failed on May 1st during the Pixel sign-off
    tests.
  • Compressors 3, 4 and 5 were found damaged (1, 2, 6
    were judged ok)
  • The failure occurred in the magnetic couplers of
    3, 4, 5 (used to seal the C3F8 volume) between the
    motor and the compressor shaft. In case of
    slippage for a long time, eddy currents heat
    inner metallic parts and damage (decompose,
    break) neighboring plastic parts.
  • Large plumbing runs could not be cleaned and had
    to be quickly replaced
  • Some time to shake out leaks and defects in the
    new plumbing (still on-going)

Other issues
  • Not enough monitoring in the system to properly
    assess performance; upgrades needed
  • Value engineering of the plant design has
    generally compromised performance; upgrades needed
  • Beam pipe bake-out turned out to be very risky
    and required a live human real-time protection
    system
  • Beam pipe reaches 200 C during bake-out
  • B-Layer modules are 1 cm away and must not exceed
    40 C
  • Interlock shuts down BP heat if cooling is lost
  • But this is not good enough, because the BP heat
    capacity requires cooling to go on for 30 minutes
    after heater power is cut! (see the interlock
    sketch after this list)
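A minimal sketch of the extra constraint described above: a bake-out interlock must not only cut heater power when cooling is lost, it must also keep demanding cooling for a grace period after heating. The 30-minute figure comes from the slide; the class, method names, and polling logic are a hypothetical illustration, not the real ATLAS DCS/interlock implementation.

```python
# Hypothetical sketch of a bake-out protection rule: the heater may only run
# while cooling is available, and cooling must keep running for a grace
# period (30 minutes per the slide) after the heater was last on, because
# of the beam-pipe heat capacity. Not the real ATLAS interlock code.
from dataclasses import dataclass

GRACE_PERIOD_S = 30 * 60  # cooling required this long after heater power off

@dataclass
class BakeoutInterlock:
    last_heater_on_time: float = float("-inf")

    def heater_permit(self, now: float, cooling_ok: bool) -> bool:
        """Heater is only allowed while cooling is available."""
        if cooling_ok:
            self.last_heater_on_time = now
        return cooling_ok

    def cooling_must_run(self, now: float) -> bool:
        """Cooling is still required within the grace period after heating."""
        return now - self.last_heater_on_time < GRACE_PERIOD_S

# Toy usage: heater permitted at t=0, cooling fails at t=600 s
ilk = BakeoutInterlock()
print(ilk.heater_permit(now=0.0, cooling_ok=True))      # True: heater may run
print(ilk.heater_permit(now=600.0, cooling_ok=False))   # False: heater cut
print(ilk.cooling_must_run(now=600.0))                  # True: keep cooling
print(ilk.cooling_must_run(now=600.0 + GRACE_PERIOD_S)) # False: grace elapsed
```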

45
Cooling plant and filter locations
[Diagram: cooling plant in USA15, detector volume in UX15. 6 Haug
compressors, condenser, liquid tank, sub-cooler, PR and BPR, mixed
water circuits, dummy load; 4 liquid lines and 4 gas lines serving the
pixel detector, the thermal screen, and 6x SCT; filters: 4x 12 µm
molecular sieves plus one more 12 µm sieve and 6x 0.1 µm mechanical
filters; manual, pneumatic, and safety valves; elements labelled O1-O4]