Status of ATLAS - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Status of ATLAS


1
Status of ATLAS LHCC Open Session 25th September
2007
2
Topics covered:
  • Collaboration and management
  • Integration and installation
  • Forward detectors
  • Schedule
  • Global commissioning of the detector
  • Computing
  • Data preparation
  • Operation Model
  • Upgrade organization
  • Preparation for the first data
Note that the last open LHCC presentation was given on 27th September 2006. This seems like a very long time ago, and many things have happened since then. There is more documentation here than is explained in the talk
3
ATLAS Collaboration (Status September 2007)
35 Countries, 165 Institutions, 2000 Scientific Authors total (1600 with a PhD, for M&O share)
New Expressions of Interest to be formally decided in the October CB: Santiago (PUC) / Valparaíso (UTFSM), Chile; Bogotá (UAN), Colombia
Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP
Annecy, Argonne NL, Arizona, UT Arlington,
Athens, NTU Athens, Baku, IFAE Barcelona,
Belgrade, Bergen, Berkeley LBL and UC, HU Berlin,
Bern, Birmingham, Bologna, Bonn, Boston,
Brandeis, Bratislava/SAS Kosice, Brookhaven NL,
Buenos Aires, Bucharest, Cambridge, Carleton,
Casablanca/Rabat, CERN, Chinese Cluster, Chicago,
Clermont-Ferrand, Columbia, NBI Copenhagen,
Cosenza, AGH UST Cracow, IFJ PAN Cracow, DESY,
Dortmund, TU Dresden, JINR Dubna, Duke,
Frascati, Freiburg, Geneva, Genoa, Giessen,
Glasgow, Göttingen, LPSC Grenoble, Technion
Haifa, Hampton, Harvard, Heidelberg, Hiroshima,
Hiroshima IT, Indiana, Innsbruck, Iowa SU, Irvine
UC, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto
UE, Lancaster, UN La Plata, Lecce, Lisbon LIP,
Liverpool, Ljubljana, QMW London, RHBNC London,
UC London, Lund, UA Madrid, Mainz, Manchester,
Mannheim, CPPM Marseille, Massachusetts, MIT,
Melbourne, Michigan, Michigan SU, Milano, Minsk
NAS, Minsk NCPHEP, Montreal, McGill Montreal,
FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU
Moscow, Munich LMU, MPI Munich, Nagasaki IAS,
Nagoya, Naples, New Mexico, New York, Nijmegen,
BINP Novosibirsk, Ohio SU, Okayama, Oklahoma,
Oklahoma SU, Oregon, LAL Orsay, Osaka, Oslo,
Oxford, Paris VI and VII, Pavia, Pennsylvania,
Pisa, Pittsburgh, CAS Prague, CU Prague, TU
Prague, IHEP Protvino, Regina, Ritsumeikan, UFRJ
Rio de Janeiro, Rome I, Rome II, Rome III,
Rutherford Appleton Laboratory, DAPNIA Saclay,
Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon
Fraser Burnaby, SLAC, Southern Methodist Dallas,
NPI Petersburg, Stockholm, KTH Stockholm, Stony
Brook, Sydney, AS Taipei, Tbilisi, Tel Aviv,
Thessaloniki, Tokyo ICEPP, Tokyo MU, Toronto,
TRIUMF, Tsukuba, Tufts, Udine/ICTP, Uppsala,
Urbana UI, Valencia, UBC Vancouver, Victoria,
Washington, Weizmann Rehovot, FH Wiener Neustadt,
Wisconsin, Wuppertal, Yale, Yerevan
4
(No Transcript)
5
Financial history (MCHF, CORE costing) as
discussed in the RRB
MoU: Construction Memorandum of Understanding, in 1995 MCHF
CtC: Cost to Completion, in 2002 MCHF
6
ATLAS Plenary Meeting
Collaboration Board (Chair: C. Oram, Deputy: K. Jon-And)
Resources Review Board
Spokesperson (P. Jenni; Deputies: F. Gianotti and S. Stapnes)
CB Chair Advisory Group
ATLAS Organization September 2007
Technical Coordinator (M. Nessi)
Resources Coordinator (M. Nordberg)
Executive Board
Inner Detector (L. Rossi, K. Einsweiler, P. Wells, F. Dittus)
Tile Calorimeter (B. Stanek)
Magnet System (H. ten Kate)
Data Prep. Coordination (C. Guyot)
Electronics Coordination (P. Farthouat)
Additional Members (T. Kobayashi, M. Tuts, A.
Zaitsev)
Trigger Coordination (N. Ellis)
Computing Coordination (D. Barberis, D. Quarrie)
LAr Calorimeter (H. Oberlack, D. Fournier, J.
Parsons)
Muon Instrum. (G. Mikenberg, F. Taylor, S.
Palestini)
Trigger/DAQ ( C. Bee, L. Mapelli)
Physics Coordination (I. Hinchliffe, from October K. Jakobs)
Commissioning/ Run Coordinator (G. Mornacchi)
7
Operation Model (Organization for LHC Exploitation)
(Details can be found at http://uimon.cern.ch/twiki/bin/view/Main/OperationModel)
8
Integration and installation progress of the ATLAS detector
ATLAS superimposed on the 5 floors of building 40
(The distributed construction of detector components is now essentially completed)
Diameter: 25 m; Barrel toroid length: 26 m; End-cap end-wall chamber span: 46 m; Overall weight: 7000 tons
9
The Underground Cavern at Point-1 for the ATLAS
Detector
Length: 55 m, Width: 32 m, Height: 35 m
(Figure labels: Side A, Side C)
10
Toroid system
End-Cap Toroid: 8 coils in a common cryostat; now installed in the cavern
Barrel Toroid parameters: 25.3 m length, 20.1 m outer diameter, 8 coils, 1.08 GJ stored energy, 370 tons cold mass, 830 tons weight, 4 T on superconductor, 56 km Al/NbTi/Cu conductor, 20.5 kA nominal current, 4.7 K working point
End-Cap Toroid parameters: 5.0 m axial length, 10.7 m outer diameter, 2x8 coils, 2x0.25 GJ stored energy, 2x160 tons cold mass, 2x240 tons weight, 4 T on superconductor, 2x13 km Al/NbTi/Cu conductor, 20.5 kA nominal current, 4.7 K working point
Barrel Toroid: 8 separate coils
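As a rough cross-check of the quoted parameters (an illustrative calculation, not from the slides): the stored energy and the nominal current fix an effective inductance through E = ½LI².

```latex
% Illustrative cross-check (not from the slides): effective Barrel Toroid
% inductance from the stored energy E = \tfrac{1}{2} L I^2 at nominal current.
L_{\mathrm{BT}} \;=\; \frac{2E}{I^{2}}
  \;=\; \frac{2 \times 1.08\times10^{9}\ \mathrm{J}}{(2.05\times10^{4}\ \mathrm{A})^{2}}
  \;\approx\; 5.1\ \mathrm{H}
```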
11
ATLAS Barrel Toroid test at 21 kA on 9 November
2006
  • The current was ramped up in steps to 20.5 kA (nominal current), then to 21 kA in order to prove the margin, and reduced back to 20.5 kA
  • A quench was then provoked with a fast dump; the cold mass heated to Tmax ≈ 58 K → safe operation was demonstrated!

12
End-Cap Toroids All components were fabricated
in industry, and the assembly was done at CERN,
after a major design change implementing
externally adjustable axial force transfer rods
The ECTs were tested at 85 K and 100 K on the
surface (LN2), with excellent mechanical and
ground insulation results
The picture from January 2007 shows the first of
the two ECT cold masses inserted into the
vacuum vessel, and the second one assembled as
well
13
End-cap Toroid installation
The transports and installations were major operations, also involving specialized firms. The ECTs are 250 tons, 15 m high, 5 m wide. ECT-A was lowered on 13th June, and ECT-C on 12th July 2007.
14
For ECT-A it was useful to remove 2 top BOS chambers in sectors 3/7 and part of the BT vacuum pipes. The clearances to the nose shielding and the HS structure were as on the drawings, ~100 mm: a very delicate manipulation.
ECT-A on 13th June 2007, 22:30
15
Central Solenoid (2 T) Operational since July
2006
Magnetic fields
A very large effort is underway to characterize the Toroid magnetic field:
- 1800 3-D Hall probes on chambers, to fit the coil positions and deformations
- Detailed calculations and simulations to determine the influence of all iron structures → Biot-Savart calculation + iron contribution gives the field map
The goal is to reach an accuracy of about 1 mT. During the November 2006 BT tests, first measurements were made on 3 coils
Field mapping machine in the solenoid: 250 000 points measured (4 currents)
Detailed analysis of the August 2006 measurement:
- Fit of all components within 4 Gauss rms
- Relative sagitta error of 6 x 10^-4 reached
(Coverage: 0 < pseudorapidity < 2.5)
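For context, the coil (conductor) contribution referred to above is the standard Biot-Savart integral; the iron contribution must come from the detailed simulations mentioned on the slide.

```latex
% Standard Biot-Savart law for the coil contribution to the field map.
\vec{B}(\vec{r}) \;=\; \frac{\mu_{0}}{4\pi}
  \oint_{\text{coils}} \frac{I\, d\vec{\ell}\,' \times (\vec{r}-\vec{r}\,')}
                            {\left|\vec{r}-\vec{r}\,'\right|^{3}}
```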
16
Summary remarks and plans for the magnet system
The cryogenic plant was upgraded and optimized to maximize cooling power and liquefaction in the right mix. Currently the Barrel Toroid and Central Solenoid rest and wait at 70 K
End-Cap Toroid-A
- Installation on rails completed, connections almost done
- Vacuum pumping has started, followed by cooling down
End-Cap Toroid-C
- Installation on rails completed
- Tower installation progressing, preparations for cooling down have started
Next steps
- ECT-A cool-down followed by tests at the beginning of November
- ECT-C cool-down followed by tests at the end of November
- Start Toroid system tests, BT and each ECT separately, at low current in December
- Full-current Toroid system tests/operation starting in March
Major efforts and emphasis on constructing efficient, accurate toroid magnetic field maps
Main refrigerator in USA15
17
Inner Detector (ID)
The Inner Detector (ID) is organized into four sub-systems:
- Pixels (0.8 x 10^8 channels)
- Silicon Tracker (SCT) (6 x 10^6 channels)
- Transition Radiation Tracker (TRT) (4 x 10^5 channels)
- Common ID items
18
Following the barrel SCT+TRT installation and cabling (August to December 2006), the installation suffered delays because of the evaporative cooling heater problems, which have been regularly discussed with the LHCC referees. After two failures of barrel heaters (one in February and one in May 2007), they are now relocated to an accessible place; improved heater connectors are in fabrication and will be installed in October-November, interleaved with EC SCT+TRT commissioning
44 Barrel heaters, 72 EC heaters, 88 Pixel heaters
New position on the cryostat flange
(Initial position, where the barrel heaters would be blocked by the end-caps)
19
50 cm long, 2 cm dia., 1 mm wall inox tube with a 10 W resistance coil
Heater union that failed in February
A known current was injected through the 2 shorted (38 mΩ) electrodes and measured with a high-resolution thermocamera
Many investigations and tools have been used to understand the problem; these are just two examples
20
ID end-cap installation
ID End-cap A after insertion
End-cap TRT+SCT side A was lowered into the detector on 24th May 2007
End-cap TRT+SCT side C was installed on 19th June 2007
21
June 25
Pixel installation
June 26
22
Barrel SCT+TRT sign-off after installation (End-caps and Pixels waiting for the evaporative cooling operation)
SCT: noise +60 e as compared to lab tests; only 1 additional module lost during installation (HV connection), overall 0.3% dead channels
TRT: noise level same as in the lab when operated with the SCT
Noise ratio with/without SCT
Noise Occupancy <NO> = 5.7 x 10^-5 when operated together
Noise level, Thr = 0.15 MIP
23
LAr and Tile Calorimeters
Tile barrel
Tile extended barrel
LAr hadronic end-cap (HEC)
LAr EM end-cap (EMEC)
LAr EM barrel
LAr forward calorimeter (FCAL)
24
All calorimeters are installed, and the three LAr
cryostats are cold and filled with LAr
End-cap side A in the extreme open position
25
Examples of monitoring the cryogenics behaviour: Barrel in stable conditions since August 2006
Average Temperature (LN2 pressure fluctuations)
LAr purity measurements (O2-equiv. impurities in
ppm)
26
Critical for the calorimeters are:
- The (low voltage) power supply delivery/rework schedules for the LAr and the Tile Calorimeters (almost completed now)
- Intervention on the LAr FEB electronics, ongoing (to be finished by February)
- Instabilities in the Tile Calorimeter drawers; refurbishing action underway (to be finished by February)
27
LAr LVPS retrofitting and FEB interventions on
the detector
All on-detector LVPS have been retrofitted and installed, and a back-up design is being developed with two firms, given the limited knowledge of the lifetime of the retrofitted supplies
Relatively simple modifications have to be implemented on the FEBs (to fix a problem that could lead to early ageing, and to cure a shaper instability on 0.1% of the channels). However, it is significant work to remove, repair, and reinsert all FEBs
28
LAr HV Back End system: Technically, the problems reported earlier to the LHCC have been fixed with the firm; the only remaining issue is the late delivery (October) of the completion for EC-C
LAr Back End Read Out System: Fully available for the barrel and EC-A detector commissioning, and being completed for EC-C
29
Tile Calorimeter LVPS modifications and
refurbishment of the drawers
By now all LVPS have been modified and installed. A few FE components, in particular connectors and flexible connections, are systematically checked and improved on all drawers. More than 1/3 of the drawers have been refurbished so far, curing all previously observed instabilities
30
Muon Spectrometer Instrumentation
End-Wall 1 MDT
Big Wheels (3 TGCs, 1 MDT)
Small Wheel (1 MDT and CSC, TGCs for 2nd coordinate)
Precision chambers:
- MDTs in the barrel and end-caps
- CSCs at large rapidity for the innermost end-cap stations
Trigger chambers:
- RPCs in the barrel
- TGCs in the end-caps
The Muon Spectrometer is instrumented with precision chambers and fast trigger chambers. A crucial component to reach the required accuracy is the sophisticated alignment measurement and monitoring system
31
Muon barrel chamber installation is completed (actually more than 99%, as 4 chambers are left out temporarily for easier access to the ID). End-cap muon installation has progressed in parallel on both sides, and will be detailed in the following slides. Besides the chambers proper there is the whole suite of the alignment system, B-field sensors and temperature sensors, which is crucial for the performance; all this is brought gradually into operation
Very critical for commissioning is the late delivery of power supplies from CAEN for the whole muon system; the last ones will only be available in April 2008
Barrel stations
32
MDT Big Wheel (one plane on both sides, all
installed)
TGC Big Wheel (three planes on both sides, all
installed)
33
Muon Small Wheels and JD shielding disk
integration
The work is well advanced for side C:
- TGCs mounted on the JD disk
- All MDTs and most CSCs installed
The Small Wheels will be transported as complete integrated units (one for each side). Side A preparation is also on schedule; both Small Wheels will be ready in November
Muon End-Wall MDT stations
The 192 MDTs are just completed and ready for installation in the BB5 area. Mounting supports are being installed on the end-walls of the underground cavern, and the first chamber will be mounted starting in early October
34
Cables and services
An enormous and impressive amount of work comes to an end in the coming months: the routing design, cable tray installation, and installation of more than 50000 cables, 3000 flexible pipes, 3400 metallic tubes, and large flexible chains
Examples of cables and patch panels, and of a
flexible chain
35
Beam pipe
The central three sections of the beam pipe have been installed and connected, and they passed an initial leak test. The two sections in the ECT bores are also installed, and the preparation of all other outer sections is well advanced
Installation of the ID beam pipe (with the
Beryllium section) together with the Pixel
package
Connection of the ID section to the LAr EC beam
pipe section
36
ATLAS Installation Schedule Version 9.2
37
Recent configuration:
- Completion of Big Wheels (done!), start EO chambers
- ID and calorimeter electronics work
- ECT connections and cool-down
Side C: EO chambers, ECT magnet connections
Side A: TGC2/3 completion, EO chambers, ECT magnet connections
38
Configuration 2: During ECT tests
Side A
Side C
Sides A & C: Completion of EO chambers; test the full beam pipe connections
39
Configuration 3: During Small Wheel installation and commissioning, up to closing
Sides A & C: Lower and connect SW/JD; move to large shutdown access mode
40
Configuration 4: Full magnet tests and final closing
Sides A & C: Close detector; re-install external beam pipe and JF shielding
41
ATLAS Trigger / DAQ Data Flow
Second-level trigger
SDX1
pROS
stores LVL2 output
Event data requests Delete commands
Gigabit Ethernet
Requested event data
USA15
Regions Of Interest
USA15
Data of events accepted by first-level trigger
1600 Read-Out Links
UX15
150 PCs
VME
Dedicated links
ATLAS detector
Read-Out Drivers (RODs)
Read-Out Subsystems (ROSs)
First- level trigger
RoI Builder
UX15
Timing Trigger Control (TTC)
Event data pushed at 100 kHz, 1600 fragments of ~1 kByte each
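A quick back-of-the-envelope check of these data-flow numbers (illustrative arithmetic only; the 200 Hz rate to storage appears later in the talk):

```python
# Rough TDAQ throughput estimate from the numbers on this slide.
LVL1_RATE_HZ = 100e3      # Level-1 accept rate
N_FRAGMENTS = 1600        # read-out links / event fragments
FRAGMENT_KB = 1.0         # ~1 kByte per fragment

event_size_mb = N_FRAGMENTS * FRAGMENT_KB / 1024       # ~1.6 MB per event
readout_gb_s = LVL1_RATE_HZ * event_size_mb / 1024     # aggregate input to the ROSs

EF_RATE_HZ = 200                                        # rate to storage after the HLT
storage_mb_s = EF_RATE_HZ * event_size_mb               # ~320 MB/s to Tier-0

print(f"Event size      : {event_size_mb:.2f} MB")
print(f"Read-out input  : {readout_gb_s:.0f} GB/s")
print(f"Rate to storage : {storage_mb_s:.0f} MB/s")
```

The ~150 GB/s flowing into the Read-Out System is what the Level-2 and Event Filter stages must reduce to the few-hundred MB/s written offline.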
42
Level-1
The level-1 system (calorimeter, muon and central trigger logic) is well advanced; the production and installation are almost completed, and the focus is on commissioning and operation, already in the global detector cosmics runs
43
LVL1 calorimeter trigger
  • Installation in the underground counting room is in progress and nearing completion
  • Also the integration with DAQ, HLT and the LVL1 CTP is progressing well

Signal integrity tests are in progress - Example: connectivity Tiles → Preprocessors for 1/8 of the Tile Calorimeter system
44
LVL1 muon trigger
Apart from some final module productions (RODs), the hardware is essentially all installed, with the important exception of the power supplies, which are on the critical path for commissioning
The emphasis is on system integration and commissioning, and gradually increased operation as part of global detector running with cosmic rays
End-cap TGCs
Barrel RPCs
Racks and optical fibres in USA15
45
Read-Out System (ROS)
  • All 153 ROSs are installed and commissioned standalone
    - Each ROS PC is equipped with the final number of ROBIN cards (700 in total including spares, 1600 Read-Out Links total)

ROBIN
About 2/3 of them are connected to RODs and fully commissioned, taking data regularly with the final DAQ
46
HLT/DAQ room on the surface (SDX1)
The HLT/DAQ system farm is operational and is heavily used for detector commissioning, calibrations, and global cosmics runs. The system itself is being commissioned in regular TDAQ technical runs
The first 130 HLT nodes
Final system: a total of 100 racks / 2500 highest-performance multi-core PCs (expect to have about 50% of this in place for July 2008)
47
LHCC milestones evolution
LHCC construction milestones (last full update done in April; it has become obsolete as a tool by now). Integrated progress plot since the baseline change in 2003
Construction/installation issues and risks (Top-Watch List): A list of these issues is monitored monthly by the TMB and EB, and it is publicly visible on the Web, including a description of the corrective actions undertaken:
http://atlas.web.cern.ch/Atlas/TCOORD/TMB/
48
ATLAS Forward Detectors
Very Forward Detectors

A common upgrade proposal, at first ATLAS-internal, is in preparation by the FP420 and RP220 ATLAS colleagues
There is considerable progress in this area as well. Not all financing is assured yet for the forward detectors, and new contributions are actively invited and sought. Note: ATLAS forward detector and physics efforts are treated as an integral part of ATLAS
49
LUCID
The first stage detector, with limited
instrumentation, is ready for integration with
the VJ section of the beam pipe
The two LUCID detectors in the test lab (2x16
PMTs, 10)
VJ beam pipe section
50
Zero Degree Calorimeter
The first stage detector will be installed in one arm only, and will co-exist with LHCf. Phase 2 will still be in one arm only, after completion of LHCf, and the complete detector in both arms will follow for phase 3
Phase 1
Phase 2
Inserting coordinate readout fibers in the first
module
Phase 3
51
ALFA
  • Production of a full prototype fibre detector (corresponding to one complete Roman Pot, 1/8 of the total)

Prototype of the ATLAS Roman Pot station in a
test stand with probes to map the precision of
the movements
52
ATLAS Control Room (ACR)
The control room is operational and used during
the cosmic ray commissioning runs, integrating
gradually more and more detector
components Cosmic ray data is collected through
growing segments of the full final Event
Building and DAQ system (so called Milestone
Weeks for combined running)
53
Simulated cosmics flux in the ATLAS cavern
Cosmics data: Muon impact points extrapolated to the surface, as measured by the Muon Trigger chambers (RPC). Rate 100 m below ground: O(10 Hz)
54
Commissioning Milestone Weeks (M1-M6) Schedule
2007/8
Dates Systems Integration Detector configuration Operations Cosmic run Training ACR
M1 11-19/12 2006 DAQ R/O Barrel Lar Tile CTP Barrel calorimeters Achieve combined run 2 days Tile cosmic trigger N/A Initial setup 5 desks Central DCS
M2 28/2 to 13/3 2007 DAQ/EB DAQ V. 1.7 Muon barrel (S. 13) Monitoring/DQ Barrel calorimeters Barrel Muon Combined runs Mixed runs 2 x week-ends Tile cosmic trigger RPC cosmic trigger Periodic cosmic runs after M2 After M2 week Increase to 7 desks
M3 4/6 to 18/6 2007 Barrel SCT Barrel TRT Muon EC (MDT, TGC) Offline Barrel and End Cap calorimeters Barrel muon (56) EC muon MDT Barrel SCT, TRT EC muon TGC 1st week focus on operations, checklist management, coordination between desks 1 week Tile Muon cosmic trigger (side A) 4/6 to 11/6 Towards final layout 13 desks
M4 23/8 to 3/9 2007 2 day setup 2 week ends Level-1 Calo HLT DAQ 1.8 Offline 13 Barrel EC calos Barrel EC muon Barrel TRT SCT R/O Level-1 Mu, Calo ATLAS-like operations Use of DQ assessment 1 week Try also calorimeter trigger Whole week Final setup
M5 22/10 to 5/11 2007 ID EC (TRT) Pixel (R/O only) SCT quadrant M4 Pixel (R/O only, no detector) Week 1 system assessment Week 2 ATLAS- like operation 1 week 1 week
M6 February 2008 SCT and Pixel detectors ATLAS-like Operations Global cosmic run
55
M4 Runs Configuration R/O Controls Remarks In
T0 Castor-2 250 TB pool T0 to T1 distribution N/A T0 controls operational Reconstruction histo merging jobs 24TB data 2.7 M events Real time T1 analysis 24/8 to 3/9
DQ About 10 online PCs Framework tools, 2D 3D event display N/A DQ shifts (including WE) Feedback on tools 23/8 to 3/9
DCS Linux based central desk Systems integration CIC infrastructure Central operations Central DCS shifts monitoring and operating the detector Run and slow controls status sharing Impressive integration achievements 23/8 to 3/9
Central Trigger CTP MUCTPI (new board) 2 x ROS Trigger tool utility Automatic busy mask generation In Tile,RPC,TGC, random ROI to RoIB Extensive BCID/BCR/ECR tests 23/8 to 3/9
L1Calo 50% of L1Calo hardware Full barrel coverage DCS system integrated L1Calo - Tile combined timing runs DQ histograms defined and integrated 27/8 to 3/9
HLT 120 HLT nodes 480 L2Pu 480 PT (EF) Algorithms CaloRec TrigEFIDcosmics 27/8 to 3/9
DAQ M3 7 online nodes and 4 SFOs New TDAQ version 1.8 Streaming (based on L1 trigger type) 23/8 to 3/9
SCT 4 F/E modules No detector 1 ROD 1 ROS Alarms, monitoring (to Cool), link to run control Monitoring integrated in DQMF BCID,BCR,ECR ok Established power-cut recovery procedure 27/8 to 31/8
TRT Barrel 6 top and 6 bottom stacks Needs 24/7 attendance (gas) 10 RODs 6 ROS DCS integrated with central controls operational monitoring data to Oracle Monitoring histograms defined and integrated in DQ framework BCID/BCR/ECR ok 23/8 to 3/9
LAr All partitions in Barrel A&C, EC A&C 80% of LAr 12 MB event size Complete DCS system integrated and operational Use 32 samples for cosmics Established power-cut recovery procedure DQ histograms integrated in framework 23/8 to 3/9
Tile EBALBA Bottom LBC 55 Tile HV,LV operational from central DCS Cosmic trigger ( 0.5 Hz) signal to L1 Calo DQ histograms integrated in framework BCID/BCR ok 23/8 to 3/9
MDT Ba sector 3,4,5,6 EC full wheel C 224 chambers 75000 ch. 72 RODs 8 ROS HV,LV operational from central DCS DQ assessment muon-wide DQ histograms integrated in framework No ECRs 23/8 to 3/9
RPC Sector 5 A&C Trigger logic to IP 2 ROD emulators HV,LV operational from central DCS Trigger rates 200 Hz (no selectivity), <10 Hz (pointing to IP), BCR,BCID ok 24/8 to 3/9
TGC Sector C09 wheels 1,2,3 Sector C10,C11 wheel 1 No R/O DCS fully integrated and operational Trigger uses final HpT boards, ROI to MUCTPI 2 Hz, Final database-driven initialization 31/8 to 3/9
56
Run Control panel showing many active DAQ segments during a combined cosmics run taken on 14th June 2007 (SCT readout only; calorimeter and muon triggers)
57
Example of Detector Control System (DCS) panel
operational in M3 (LAr EC-A)
58
(No Transcript)
59
A cosmics muon (Barrel Muon MDT chambers and Tile
Calorimeter) in M3
60
From M3
Online monitoring of MDT distributions
Maximum tower energies in top and bottom Barrel
Tile Calorimeter partitions
61
Cosmics through MDTs, Calorimeters, and TRT
during M3
62
First cosmics during M3 in a segment of the
end-cap Big Wheels MDT and TGC
MDT TDC distribution for cosmics triggered and
timed with TGCs, online monitoring
End-cap muon track segment
63
Cosmics in a common LAr and Tile Calorimeter run
(M3)
Landau distributions in LAr middle cells
64
M4 cosmics LAr signals
65
Another example from the M4 runs
66
Another example from the M4 runs
67
A 3-D (VP1) display from a M4 cosmics
68
Trigger algorithms and menus
This activity, very closely coupled to the physics studies, is a central and broad activity. The physics selections will be done along so-called trigger slices:
  • Electron/photon, Tau, Muon, Jet: these four slices are crucial for a large part of the physics programme, and must be operational from the start
  • B-physics: very low pT threshold, will take advantage of low luminosity
  • Missing ET, b-tagging: require refined detector understanding, and may therefore take some more time to become fully efficient
  • Minimum bias: very important for commissioning with beam
  • Cosmics: available now for commissioning, several algorithms used in the global cosmics runs (M4)
The HLT software framework and steering are operational and are being optimized
69
Examples of photon slice performance studies
Exotics di-photon studies
Direct Photon Production
G → γγ
Various jet pT samples from 25 GeV to 1 TeV
SM H → γγ trigger studies
70
Commissioning the trigger, trigger for
commissioning
Trigger menu for the initial L = 10^31 cm^-2 s^-1 being prepared. Note the affordable rate to storage: ~200 Hz (out of a ~10^6 Hz interaction rate at L = 10^31)

Item (examples)   Trigger output rate at 10^31 (not prescaled)
2e5               5-10 Hz
e15               40 Hz
2e15              1 Hz
μ20               20 Hz
μ6                55 Hz
2μ4               15 Hz
j70               27 Hz
4j23              17 Hz
τ25 + xE32        7 Hz
γ10i + γ25i       5 Hz

At low initial luminosity can afford low
thresholds w/o prescaling, simple selections,
redundant items, several triggers for calibration
and sanity checks, run High-Level-Trigger in
pass-through mode, etc. Essential to understand
trigger and detector
Preliminary, for illustration
Tracks reconstructed online by combining Muon
chambers, calorimeters and TRT (all
sub-detectors except Pixels and SCT were taking
data)
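A minimal illustration of the bookkeeping behind such a menu (not an ATLAS tool; the rates are the example numbers quoted above, overlaps between items are ignored, and the prescale helper is hypothetical):

```python
# Sketch: does a set of trigger items fit the ~200 Hz storage budget,
# and what prescale would an item need if it had to be capped?

STORAGE_BUDGET_HZ = 200.0

# item name -> unprescaled output rate (Hz) at L = 1e31 cm^-2 s^-1
menu = {
    "2e5": 7.5, "e15": 40.0, "2e15": 1.0, "mu20": 20.0, "mu6": 55.0,
    "2mu4": 15.0, "j70": 27.0, "4j23": 17.0, "tau25_xE32": 7.0, "g10i_g25i": 5.0,
}

total = sum(menu.values())   # upper bound: ignores overlaps between items
verdict = "fits" if total <= STORAGE_BUDGET_HZ else "exceeds"
print(f"Summed rate (no overlap correction): {total:.0f} Hz ({verdict} the budget)")

def prescale_needed(rate_hz: float, allowed_hz: float) -> int:
    """Smallest integer prescale N such that rate/N <= allowed."""
    return max(1, -(-int(rate_hz) // int(allowed_hz)))  # ceiling division

# e.g. if mu6 had to be capped at 10 Hz:
print("Prescale for mu6 capped at 10 Hz:", prescale_needed(menu["mu6"], 10))
```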
71
ATLAS Computing and Software Timeline 2007/8
  • Running continuously throughout the year (increasing rates):
    - Simulation production
    - Cosmic ray data-taking (detector commissioning)
  • January to August: Data streaming tests (done, first step to FDR)
  • February through May: Intensive Tier-0 tests
  • From February onwards: Data Distribution tests
  • From March onwards: Distributed Analysis (intensive tests)
  • May to end 2007: Calibration Data Challenge
  • November 2007 to spring 2008: Full Dress Rehearsal
  • April: GO!

72
Export monitoring (ARDA dashboard) of cosmic ray data taken during the M4 combined commissioning running week (from detector to offline)
(Plots: total throughput (MB/s), data transferred (GB), completed file transfers; Aug 23 - Sep 8)
73
Software chain
RAW: Event data from TDAQ, ~1.6 MB
ESD (Event Summary Data): output of reconstruction (calo cells, track hits, ...), ~1 MB
AOD (Analysis Object Data): physics objects for analysis (e, γ, μ, jets, ...), ~100 kB
DPD (Derived Physics Data): equivalent of the old ntuples, ~10 kB (format to be finalized)
TAG: Reduced set of information for event selection, ~1 kB
Huge efforts were made over the last year(s) to keep the ESD and AOD sizes to the above values (constrained by storage resources). As a result, ESD and AOD today are better optimized from the technical and content (targeted to first data taking) points of view
Note: the SW infrastructure is much more complex than in the above sketch. E.g. one important component is the Database, in particular the Condition Database, where calibration and alignment constants and most of the metadata (e.g. detector quality and luminosity information) are stored
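A quick illustrative estimate of what these per-event sizes imply for yearly data volumes (the 200 Hz rate appears elsewhere in the talk; the 10^7 seconds of data taking per year is a conventional assumption of mine, not a number from the slides):

```python
# Illustrative yearly storage estimate from the per-event sizes above.
RATE_HZ = 200                   # rate to storage quoted elsewhere in the talk
SECONDS_PER_YEAR = 1.0e7        # assumed "physics seconds" per year (not from the slides)

sizes_mb = {"RAW": 1.6, "ESD": 1.0, "AOD": 0.1, "DPD": 0.01, "TAG": 0.001}

events_per_year = RATE_HZ * SECONDS_PER_YEAR
for fmt, size_mb in sizes_mb.items():
    volume_pb = events_per_year * size_mb / 1e9   # MB -> PB
    print(f"{fmt:4s}: ~{volume_pb:5.2f} PB/year")
```

Under these assumptions the RAW stream alone is of order a few PB per year, which is why the ESD and AOD sizes are so tightly constrained.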
74
The Analysis Model is being finalized
  • This model is very elegant and clean since it allows:
    -- the same code to run at all levels, from primary AOD to DPD
    -- seamless code porting Athena → ROOT
    -- the same analysis to be performed in different frameworks (Athena batch, Athena interactive, ROOT)
  • However, it is a viable solution only if:
    -- there is no big CPU penalty compared to e.g. a flat ntuple
    -- there is adequate/positive user feedback → the new DPD will be part of Release 13
    -- production people will be strongly pushed to use it

75
Calibration Data Challenge
  • Exercise the full calibration/alignment procedure as we will need to do with first data
  • Compare the performance of the realistic detector after calibration and alignment to the nominal (TDR) performance
  • Understand systematic effects (material, B-field), test trigger robustness, etc.
  • Learn how to do analyses without knowing the exact detector layout
  • Timescale: first results available; will continue and be refined until data taking

76
Full Dress Rehearsal
  • A complete exercise of the full chain from TDAQ output to analysis: make sure all components (from SW to Computing Model) are in place and coherent, find residual bottlenecks, mimic real data-taking conditions a few months before LHC start-up
  • Produce samples emulating data from TDAQ output
    -- bytestream format (and no MC Truth)
    -- mixture of physics processes as expected at HLT output
    -- data organized into trigger-based streams: e/γ, Muon, Jets, τ/ETmiss, B-physics, minimum bias + Express Stream + calibration stream(s)
    -- files are closed at luminosity block boundaries (every ~1 min.)
  • RAW samples reconstructed at Tier-0 → produce ESD, AOD, TAGs
  • RAW, ESD, AOD, TAG replicated at Tier-1s, AOD also to Tier-2s
  • Exercise group-based DPD production at Tier-1/2s → end-user analysis
  • Re-processing test at Tier-1s
  • Time-varying detector problems will be injected in the data samples → run Data Quality on the Express Stream to spot these and fill the Condition DB
  • Will run production shifts at Tier-0 and Tier-1s
  • Several rounds of increasing complexity:
    Phase 0: Data Streaming test (done)
    Phase 1: Start November 2007
    Phase 2: Start March 2008

77
Data Preparation activities are in full swing, and are now visibly under a coherent framework spanning many areas:
Data Quality Assessment, Offline Commissioning, Data Streaming, Calibration & Alignment, B-fields, Event Display, ...
(Figure: cosmic muon during the BT test)
The TGC and MDT Big Wheel alignment checks during mounting and displacements are operational
78
Example from the Data Preparation: In situ inter-calibration of the EM calorimeter using Z→ee events.
Task: Correct calorimeter long-range inhomogeneities (module-to-module, temperature and HV instabilities, ...), and effects from extra material in front of the calorimeter
Method: Use Z-mass fits in Z→ee events (rate O(1) Hz at 10^33 cm^-2 s^-1). Inject calibration coefficients for different regions of size Δη x Δφ = 0.2 x 0.4
Constant resolution term (local fluctuations + long-range term) versus accumulated luminosity (full detector simulation)
Performance in simulation: With 200 pb^-1, initial non-uniformities of ±2.5% can be recovered to a precision of 0.7%, which meets the requirements (TDR)
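A minimal sketch of the core idea (illustrative only, not the ATLAS implementation: a real analysis fits the full m_ee lineshape per region pair and iterates; the region labels and the helper below are made up):

```python
# Toy in-situ intercalibration with Z -> ee: derive a per-region energy
# correction from the mean reconstructed dielectron mass.
from collections import defaultdict

M_Z = 91.19  # GeV, reference Z mass

def derive_corrections(events):
    """events: iterable of (region_a, region_b, m_ee) for selected Z -> ee candidates.
    Returns a per-region multiplicative energy correction factor."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for region_a, region_b, m_ee in events:
        for region in (region_a, region_b):   # attribute the candidate to both regions
            sums[region] += m_ee
            counts[region] += 1
    corrections = {}
    for region, n in counts.items():
        mean_mass = sums[region] / n
        # m_ee scales like sqrt(E1*E2): a mass bias driven by one region's energy
        # scale corresponds roughly to the square of the mass correction
        corrections[region] = (M_Z / mean_mass) ** 2
    return corrections

# toy usage: region 7 reads out ~2% low, so its Z candidates peak ~1% low
toy_events = [(7, 12, 90.3), (7, 3, 90.2), (5, 12, 91.2)]
print(derive_corrections(toy_events))
```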
79
Example from the Data Preparation: A lot of effort is being made to monitor, assess and record data quality information at all data flow levels up to Tier-0
DB from online Config, Calib DCS, DQ status
Front-end
DCS
Tier 0
RODs
LVL1
express
calib
RAW 200Hz, 320MB/s
Xpress reco, calibration
LVL2
Online DQA
Prompt reco (bulk)
Verify
Event Builder
Calib/align DQ status Archives
Online DB
updated calib
SFI (s)
T1 transfer
ESD 100MB/s AOD 20MB/s
TAG DB
updated calib
EF
EF
EF
EF
Shift Log DQ status Archives
T1 Oracle replica
T1 (Late Reproc.)
SFOs
T2 (MC prod.)
T2 replica
80
Operation Task Sharing
The operation of the ATLAS experiment, spanning from detector operation to computing and data preparation, will require a very large effort across the full Collaboration (estimated at 600 FTE of effort per year). Over part of the last year a working group has elaborated a framework, approved by the Collaboration Board in February 2007, aiming at a fair sharing of these duty tasks (Operation Tasks, OT). The main elements are:
- OT needs and accounting are reviewed and updated annually
- OTs are defined under the auspices of the sub-system and activity managements
- Allocations are made in two steps: expert tasks first, and then non-expert tasks
- The fair share is proportional to the number of ATLAS members (per Institution or Country)
- Students are favoured by a weight factor 0.75
- New Institutions will have to contribute more in the first two years (weight factors 1.5 and 1.25)
Note that physics analysis tasks, and other privileged tasks, are not OTs, of course.
An important effort is now going on to define the OTs, to set up the Web tools to manage the OT planning, and to gradually implement the sharing procedure (a small illustrative sketch of the weighted share is shown below)
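The sketch below shows how such a weighted fair share could be computed; only the weight factors (0.75 for students, 1.5 and 1.25 for new institutions in their first two years) and the ~600 FTE total come from the slide, everything else (institution data, helper names) is a made-up example.

```python
# Illustrative Operation Task (OT) fair-share calculation.
from dataclasses import dataclass

@dataclass
class Institution:
    name: str
    members: int          # ATLAS members counted for the share
    students: int         # subset of members who are students
    years_in_atlas: int   # 1, 2, or more

def weighted_count(inst: Institution) -> float:
    """Members weighted per the rules quoted above."""
    base = (inst.members - inst.students) + 0.75 * inst.students
    if inst.years_in_atlas == 1:
        return 1.5 * base
    if inst.years_in_atlas == 2:
        return 1.25 * base
    return base

def ot_shares(institutions, total_ot_fte: float = 600.0):
    """Split the total OT effort (FTE/year) proportionally to the weighted counts."""
    total_weight = sum(weighted_count(i) for i in institutions)
    return {i.name: total_ot_fte * weighted_count(i) / total_weight
            for i in institutions}

demo = [Institution("Inst A", members=40, students=15, years_in_atlas=10),
        Institution("Inst B", members=12, students=6, years_in_atlas=1)]
print(ot_shares(demo))
```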
81
TMB
Detector Operation (Run Coordinator): Detector operation during data taking, online data quality, ...
Trigger (Trigger Coordinator): Trigger data quality, performance, menu tables, new triggers, ...
Data Preparation (Data Preparation Coordinator): Offline data quality, first reconstruction of physics objects, calibration, alignment (e.g. with Z→ll data)
Computing (Computing Coordinator): SW infrastructure, GRID, data distribution, ...

Physics (Physics Coordinator): optimization of algorithms for physics objects, physics channels
(Sub)-systems: Responsible for the operation and calibration of their sub-detector and for sub-system specific software
82
(No Transcript)
83
ATLAS organization to steer R&D for upgrades
ATLAS has, in place and operational, a structure to steer its planning for future upgrades, in particular for the R&D activities needed for possible luminosity upgrades of the LHC (sLHC). This is already a rather large and broad activity. The main goals are to:
- Develop a realistic and coherent upgrade plan addressing the physics potential
- Retain detector experts in ATLAS with challenging developments besides detector commissioning and running
- Cover also less attractive (but essential) aspects right from the beginning
The organization has two major coordination bodies:
Upgrade Steering Group (USG) (existing since three years, with representatives from systems, software, physics, and relevant Technical Coordination areas)
Project Office (UPO) (operating since more than a year, fully embedded within the Technical Coordination)
Upgrade R&D proposals are reviewed and handled in a transparent way within the Collaboration. There is a good and constructive synergy from common activities with CMS where appropriate. The LHCC would be welcome to act as a global review body for the overall ATLAS upgrade plans
84
Note: The EU FP7 sLHC proposal is of direct relevance to the UPO
85
A list of current ATLAS sLHC upgrade R&D activities, and their current status (page 1)
Short name Title Principal contacts Status 17/09/07
Opto Radiation Test Programme for the ATLAS Opto-Electronic Readout System for the SLHC for ATLAS upgrades Cigdem Issever Approved by EB
Staves Development and Integration of Modular Assemblies with Reduced Services for the ATLAS Silicon Strip Tracking Layers C. Haber, M. Gilchriese Approved by EB
ABC-Next Proposal to develop ABC-Next, a readout ASIC for the S-ATLAS Silicon Tracker Module Design Francis Anghinolfi, Wladek Dabrowski Approved by EB
Radiation BG Radiation background benchmarking at the LHC and simulations for an ATLAS upgrade at the SLHC Ian Dawson Approved by EB
n-on-p sensors Development of non-inverting Silicon strip detectors for the ATLAS ID upgrade Hartmut Sadrozinski Approved by EB
SiGe chips Evaluation of Silicon-Germanium (SiGe) Bipolar Technologies for Use in an Upgraded ATLAS Detector Alex Grillo, S. Rescia Approved by EB
3D sensors Development, Testing, and Industrialization of 3D Active-Edge Silicon Radiation Sensors with Extreme Radiation Hardness: Results, Plans Sherwood Parker, now Cinzia Da Via Approved by EB
Modules Research towards the Module and Services Structure Design for the ATLAS Inner Tracker at the Super LHC Nobu Unno Recommended for approval by USG awaiting CB comments
Powering Research and Development of power distribution schemes for the ATLAS Silicon Tracker Upgrade Marc Weber Under review by USG
TRT R&D of segmented straw tracker detector for the ATLAS Inner Detector Upgrade Vladimir Peshekhonov New external reviewer to be found
86
A list of current ATLAS sLHC upgrade R&D activities, and their current status (page 2)
Gossip R&D proposal to develop the gaseous pixel detector Gossip for the ATLAS Inner Tracker at the Super LHC H. van der Graaf Expression of interest received
SoS Expression of Interest Evaluations on the Silicon on Sapphire 0.25 micron technology for ASIC developments in the ATLAS electronics readout upgrade Ping Gui and Jingbo Ye Reviewers waiting for updated proposal
Thin pixels R&D on thin pixel sensors and a novel interconnection technology for 3D integration of sensors and electronics H.-G. Moser Updated proposal under review by USG
Muon Micromegas R&D project on micropattern muon chambers V. Polychronakos, Joerg Wotschack Proposal received, waiting for external reviewer
TGC R&D on optimizing a detector based on TGC technology to provide tracking and trigger capabilities in the Muon Small-Wheel region at SLHC G. Mikenberg Expression of interest received
MDTReadout Upgrade of the MDT Readout Chain for the SLHC R. Richter Expression of interest received
MDTGas R&D for gas mixtures for the MDT detectors of the Muon Spectrometer P. Branchini Expression of interest received
Selective Readout Upgrade of the MDT Electronics for SLHC using Selective Readout R. Richter Expression of interest received
High Rate MDT R&D on Precision Drift-Tube Detectors for Very High Background Rates at SLHC R. Richter Expression of interest received
Diamond Diamond Pixel Modules for the High Luminosity ATLAS Inner Detector Upgrade M. Mikuž Under review by USG
ID Alignment ID Alignment Using the Silicon Sensors H. Kroha EoI Received
Fast Track Trigger FTK, a hardware track finder M. Shochet Under review by USG
87
Getting ready for the first physics analyses
A major ingredient for the physics preparation is
the understanding of the detector
performance gained in many test beam campaigns,
which culminated in the large 2004 Combined Test
Beam efforts with large-scale set-ups for the
barrel and end-cap regions in the SPS H8 and H6
beams
Full vertical slice of the barrel tested on
CERN H8 beam line May-November 2004
O(1%) of the ATLAS coverage
  • 90 million events collected
  • e±, π±: 1 → 250 GeV
  • π±, μ±, p up to 350 GeV
  • γ: 20-100 GeV
  • B-field: 0 → 1.4 T
  • Many configurations (e.g. additional material in the ID, 25 ns runs, ...)

(Of course only a few examples out of a rich
amount of data can be mentioned)
- All sub-detectors (and LVL1 trigger) integrated
and run together with common DAQ - Data
analyzed with common ATLAS software - Gained
experience also with Condition DB
88
Tracking and alignment in the Inner Detector
9 GeV test beam π's
Pixels
SCT
TRT
6 pixel modules and 8 SCT modules (inside B = 0-1.4 T), 6 TRT modules (outside the field)
  • Achieved alignment precision: 5-10 μm
  • Corrections (noisy/dead channels, alignment constants) stored in the Condition DB

Note: no TRT, B = 1.4 T
Alignment, reconstruction and simulation tools are in good shape
89
ATLAS preliminary
Muon sagitta resolution measured in the 2004 combined test beam
Data fitted with:
  • p: muon momentum from the beam magnet
  • K1: intrinsic resolution
  • K2: multiple scattering

Data: K1 = 50.7 ± 1.5 μm, 0.29 ± 0.01 X0
Simulation: K1 = 40 ± 3 μm, 0.32 ± 0.02 X0
(horizontal error bars give the beam spread)
Muon alignment (optical sensors) tested by moving (rotations, displacements) barrel MDTs
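The slide does not spell out the fit function; a standard parametrization of a sagitta resolution with an intrinsic term and a multiple-scattering term (my assumption, with the scattering term governed by the traversed material in units of X0 and falling as 1/p) would be:

```latex
% Assumed parametrization, not quoted on the slide.
\sigma_{s}(p) \;=\; K_{1} \,\oplus\, \frac{K_{2}(x/X_{0})}{p}
  \;=\; \sqrt{\,K_{1}^{2} + \left(\frac{K_{2}}{p}\right)^{2}}
```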
90
Background: fake ETmiss tails from instrumental effects (calorimeter non-compensation, resolution, cracks, ...)
The transition between the end-cap (EM, hadronic/HEC) and forward (FCAL) calorimeters at |η| ≈ 3.2 was studied with a dedicated combined test-beam in the H6 beam
Data are described well by MC in this complex region with 3 different calorimeters and dead material
91
Prospects for physics in 2008-2009 (examples )
We will jump immediately into a new territory
92
The first peaks
After all cuts: 4200 (800) J/ψ (Υ) → μμ events per day at L = 10^31 (for 30% machine x detector data-taking efficiency), i.e. 15600 (3100) events per pb-1
Υ
→ tracker momentum scale, trigger performance, detector efficiency, sanity checks, ...
After all cuts: 160 Z → μμ events per day at L = 10^31, i.e. 600 events per pb-1
→ Muon Spectrometer alignment, ECAL uniformity, energy/momentum scale of the full detector, lepton trigger and reconstruction efficiency, ...
Precision on σ(Z→μμ) with 100 pb-1: <2% (experimental error), ~10% (luminosity)
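A quick consistency check of the numbers above (pure arithmetic on the slide's inputs): at L = 10^31 cm^-2 s^-1 with a 30% overall efficiency, one day corresponds to roughly 0.26 pb^-1, which ties the per-day and per-pb^-1 yields together.

```python
# Cross-check of the quoted J/psi, Upsilon and Z yields.
LUMI_CM2_S = 1e31        # instantaneous luminosity
EFFICIENCY = 0.30        # machine x detector data-taking efficiency
SECONDS_PER_DAY = 86400

int_lumi_pb_per_day = LUMI_CM2_S * EFFICIENCY * SECONDS_PER_DAY / 1e36  # 1 pb^-1 = 1e36 cm^-2
print(f"Integrated luminosity per day: {int_lumi_pb_per_day:.2f} pb^-1")

for name, per_day in [("J/psi -> mumu", 4200), ("Upsilon -> mumu", 800), ("Z -> mumu", 160)]:
    per_pb = per_day / int_lumi_pb_per_day
    print(f"{name:16s}: {per_day:5d} /day  ->  ~{per_pb:6.0f} /pb^-1")
```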
93
The first top quarks in Europe
A top signal can be observed quickly, even with limited detector performance and a simple analysis, and then used to calibrate the detector and understand physics
  • σ(tt → bW bW → blν bjj) ≈ 250 pb

ATLAS preliminary
3 jets with largest Σ pT
  • Top signal observable in the early days with no b-tagging and a simple analysis (3000 evts for 100 pb-1) → measure σtt to 20%, mt to <10 GeV with 100 pb-1
    (ultimate LHC precision on mt: ~1 GeV)
  • In addition, an excellent sample to commission b-tagging, set the jet E-scale using the W → jj peak, and understand / constrain theory and MC generators using e.g. pT spectra

94
An ideal candidate for an early discovery: a narrow resonance with mass ~1 TeV decaying into e+e-
  • with 100 pb-1: a large enough signal for discovery up to m > 1 TeV
  • the signal is a (narrow) mass peak on top of a small Drell-Yan background
  • ultimate calorimeter performance is not needed

Is it a Z' or a Graviton? From the angular distribution of the e+e- pair one can disentangle a Z' (spin 1) from a G (spin 2). This requires more data (~100 fb-1). Ultimate ATLAS reach (300 fb-1): ~5 TeV
95
Another example Supersymmetry
Hints with only 100 pb-1 up to m ≈ 1 TeV, but understanding the backgrounds requires ~1 fb-1
Planning for future facilities would benefit a lot from a quick determination of the scale of New Physics. With 1 fb-1 the LHC could tell if standard SUSY is accessible to a √s ≈ 1 TeV ILC.
96
The more difficult case: a light Higgs boson
The most difficult region: need to combine many channels (e.g. H → γγ, qqH → qqττ) with small S/B
For mH > 140 GeV discovery is easier with H → ZZ(*) → 4l (narrow mass peak, small background). H → WW → lνlν (dominant at 160-175 GeV) is a counting experiment (no mass peak)
97
(No Transcript)