Title: Experiment Control System

1 Experiment Control System
- Achievements (since LHCC Comprehensive Review 5)
- Status
- Plans
2 Achievements
- Software
  - ECS / HLT interface
  - Introduction of the DATE infoLogger
  - Introduction of MySQL
  - Interactive status Web page
  - Calibration runs: framework ready, procedures implemented for some detectors
- Hardware
  - PCs with multiple screens
- Tests
  - Control of the DAQ Extended Reference System
- Information
  - Manual
  - Training
  - Conferences
3 ECS / HLT interface
- Background
  - The ECS is a layer of software on top of 4 online systems (DCS, TRG, DAQ, and HLT).
  - The exchange of information and commands between the ECS and the online systems goes through interfaces made of SMI Finite State Machines (FSM).
  - The interfaces between the ECS and DCS, TRG, and DAQ were defined and implemented in 2004.
- The interface between the ECS and the HLT has been:
  - Fully specified and agreed (December 2005)
  - Implemented in the ECS (January-February 2006)
  - To be tested at Point 2 (April 2006)
4 HLT proxy - States
[State diagram of the HLT proxy FSM]
- Stable states: OFF, INITIALIZED, CONFIGURED, READY, RUNNING, ERROR
- Intermediate states (<< intermediate state >>, exited by implicit transitions): RAMPING_UP_INITIALIZING, RAMPING_DOWN, CONFIGURING, ENGAGING, DISENGAGING
- Commands: initialize, shutdown, configure, reset, engage, disengage, start, stop
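The HLT proxy states listed above can be sketched as a small table-driven state machine. The state and command names are taken from the slide; the exact wiring of a few transitions (e.g. which states accept reset) is inferred from the diagram, and the dispatch logic is illustrative, not the actual SMI/FSM implementation.

```python
# Stable-state transitions: (state, command) -> next state.
# Intermediate states are exited by implicit transitions, modelled here
# by mapping each intermediate state straight to its successor.
TRANSITIONS = {
    ("OFF", "initialize"): "RAMPING_UP_INITIALIZING",
    ("INITIALIZED", "configure"): "CONFIGURING",
    ("INITIALIZED", "shutdown"): "RAMPING_DOWN",
    ("CONFIGURED", "engage"): "ENGAGING",
    ("CONFIGURED", "reset"): "INITIALIZED",
    ("READY", "disengage"): "DISENGAGING",
    ("READY", "start"): "RUNNING",
    ("RUNNING", "stop"): "READY",
    ("ERROR", "reset"): "OFF",
}

IMPLICIT = {  # intermediate state -> stable state reached implicitly
    "RAMPING_UP_INITIALIZING": "INITIALIZED",
    "CONFIGURING": "CONFIGURED",
    "ENGAGING": "READY",
    "DISENGAGING": "CONFIGURED",
    "RAMPING_DOWN": "OFF",
}

def send(state, command):
    """Apply a command, then follow any implicit transition."""
    nxt = TRANSITIONS.get((state, command))
    if nxt is None:
        return "ERROR"  # an unexpected command drives the proxy to ERROR
    return IMPLICIT.get(nxt, nxt)

state = "OFF"
for cmd in ("initialize", "configure", "engage", "start"):
    state = send(state, cmd)
print(state)  # -> RUNNING
```

The intermediate states matter in the real interface because the ECS must wait for the implicit transition to complete before issuing the next command; the table above collapses that wait into a single lookup.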
5 ECS / HLT interface
[Timeline figure: progress on the ECS / HLT interface from LHCC Comprehensive Review 5 to today]
6 Hardware installed at Point 2
- 6 SIUs
- Fibre SXL2-CR1
  - 20 fibres tested
  - round-trip attenuation < 4 dB
  - one way: 176 metres
- 2 + 1 LDCs
- Switch
- 1 GDC
[Diagram: in SXL2, one PC with 4 RCUs and one control PC with 2 RCUs, each RCU carrying an SIU; DDL fibres run to PX24/CR1, where 3 LDCs host 4x, 2x, and 1 DRORC; the LDCs feed 1 GDC and the TDS over the Event Building Network (Gigabit Ethernet); the HLT Farm sits in PX24/CR2; a server (DATE, ECS) and a Network Gateway connect to the General Purpose Network (GPN)]
- HW available for 2 sectors: 12 SIUs, 12 DRORCs, 12 DIUs
7 Introduction of the DATE infoLogger
- The infoLogger package was originally developed as a pure DATE component (DAQ only) to store and retrieve information and error messages.
- The requirements of the ECS, in terms of information and error messages, are very similar to those of the DAQ.
- To avoid developing a new ECS component (independent from the DATE infoLogger but very similar to it), the DATE developers have modified the infoLogger:
  - Making it more general
  - Available to other online systems (e.g. ECS)
  - With filtering possibilities (e.g. to retrieve only ECS messages)
- The present version of the ECS uses the DATE infoLogger:
  - Advantage: a lot of functionality with little effort
  - Disadvantage: the ECS depends on DATE
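The filtering idea above can be sketched as follows. This is an illustrative in-memory message store, not the actual DATE infoLogger API; the field names ("facility", "severity") are assumptions chosen to show how one retrieval path can serve several online systems.

```python
import time

class InfoLogger:
    """Toy message store with per-system filtering (sketch only)."""

    def __init__(self):
        self._messages = []

    def log(self, facility, severity, text):
        # Each message is tagged with its originating system ("facility"),
        # which is what makes later filtering possible.
        self._messages.append({"time": time.time(), "facility": facility,
                               "severity": severity, "text": text})

    def retrieve(self, facility=None, severity=None):
        """Return stored messages, optionally filtered by facility/severity."""
        return [m for m in self._messages
                if (facility is None or m["facility"] == facility)
                and (severity is None or m["severity"] == severity)]

log = InfoLogger()
log.log("DAQ", "info", "run 123 started")
log.log("ECS", "error", "detector not responding")
ecs_only = log.retrieve(facility="ECS")
print(len(ecs_only))  # -> 1
```

The generalisation described on the slide amounts to exactly this: one shared store, with the facility tag letting the ECS retrieve only its own messages.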
8 DATE infoLogger
9 MySQL
- The ECS configuration was initially stored in a database made of simple, flat ASCII files:
  - Not safe if accessed and modified by concurrent processes
  - To make it available on many machines, the file system containing the ASCII files must be exported to and mounted on many machines
- A MySQL database:
  - Removes the above problems
  - Does not increase the complexity of the system (MySQL is already required by DATE)
- The ECS now uses a MySQL database:
  - ASCII files are still possible for backward compatibility reasons
  - A database editor has been developed (available only with MySQL)
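The gain over flat files can be illustrated with a small sketch. Here sqlite3 stands in for MySQL so the example is self-contained; the table layout is a guess for illustration, not the real ECS schema. The point is that the database server arbitrates concurrent access inside transactions, which concurrently edited ASCII files cannot do safely.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ecs_config (
    partition TEXT, detector TEXT, parameter TEXT, value TEXT,
    PRIMARY KEY (partition, detector, parameter))""")

def set_param(partition, detector, parameter, value):
    # The write runs inside a transaction, so a concurrent reader never
    # sees a half-written entry (unlike editing a flat file in place).
    with conn:
        conn.execute("INSERT OR REPLACE INTO ecs_config VALUES (?,?,?,?)",
                     (partition, detector, parameter, value))

def get_param(partition, detector, parameter):
    row = conn.execute(
        "SELECT value FROM ecs_config "
        "WHERE partition=? AND detector=? AND parameter=?",
        (partition, detector, parameter)).fetchone()
    return row[0] if row else None

set_param("PHYSICS_1", "TPC", "active", "yes")
print(get_param("PHYSICS_1", "TPC", "active"))  # -> yes
```

The "available on many machines" point follows for free: clients connect to the database server over the network instead of mounting an exported file system.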
10 ECS configuration database editor
11 Interactive status Web page
- Displays information about partitions and detectors, including:
  - Partition structure (list of detectors active/inactive in a partition)
  - Status of the ECS components controlling partitions and detectors: Partition Control Agents (PCA) and Detector Control Agents (DCA)
  - Status of the Human Interfaces controlling PCAs and DCAs
- Relies on information stored in the ECS database and kept up to date by an ECS daemon
- Warns the user if the above information is not regularly updated (e.g. because the ECS daemon is no longer working)
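The staleness warning described above boils down to a heartbeat check, sketched below. The 10-second threshold and the dictionary standing in for the ECS database are illustrative assumptions, not the real implementation.

```python
import time

UPDATE_TIMEOUT = 10.0  # seconds; assumed value, not the real ECS setting

def daemon_heartbeat(db):
    # The ECS daemon periodically stamps the database with the current time.
    db["last_update"] = time.time()

def page_status(db, now=None):
    # The Web page compares the stamp's age against the timeout and warns
    # when the daemon has stopped refreshing it.
    now = time.time() if now is None else now
    age = now - db.get("last_update", 0.0)
    if age > UPDATE_TIMEOUT:
        return "WARNING: status information is %.0f s old" % age
    return "OK"

db = {}
daemon_heartbeat(db)
print(page_status(db))                        # -> OK
print(page_status(db, now=time.time() + 60))  # warns that the info is stale
```

This is why the page can distinguish "everything idle" from "daemon dead": absence of updates, not absence of activity, triggers the warning.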
12 Interactive status Web page
13 Calibration runs
- Every detector requires several types of calibration runs.
- Every type of calibration run consists of a sequence of operations that must be performed in a well-defined order.
- Every operation requires the execution of a program (or a set of programs) on the appropriate machines (DAQ or DCS machines).
- When the sequence of operations is defined and the programs implementing these operations exist, the ECS can steer the calibration runs, starting the different operations in the right order.
- During the last year, the framework to steer calibration runs has been introduced in the ECS:
  - The framework is ready for all the detectors
  - Logic procedures have been implemented for some detectors
  - Requests from the other detectors are expected soon
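The steering described above can be sketched as an ordered list of (machine, program) pairs executed strictly in sequence. The operation names and machine names below are made up for illustration, and run_on() is a stand-in for whatever remote-execution mechanism the real system uses.

```python
def run_on(machine, program, executed):
    # Stand-in for launching a program on a DAQ or DCS machine.
    executed.append((machine, program))
    return True  # pretend the program succeeded

def steer_calibration(sequence):
    """Run each operation in order; stop at the first failure."""
    executed = []
    for machine, program in sequence:
        if not run_on(machine, program, executed):
            break  # a failed step must not be followed by later steps
    return executed

# Hypothetical electronics-calibration sequence for one detector.
sequence = [
    ("dcs-pc", "set_calibration_voltages"),
    ("daq-ldc", "start_pedestal_readout"),
    ("daq-ldc", "compute_pedestals"),
    ("dcs-pc", "restore_nominal_voltages"),
]
done = steer_calibration(sequence)
print(len(done))  # -> 4
```

The framework/procedure split on the slide maps onto this sketch directly: the loop is the detector-independent framework, and the sequence list is the per-detector logic that each detector team must still provide.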
14 MUON_TRK Detector Control Agent
[Screenshot: "Start electronics calibration run" command]
15 MUON_TRK Detector Control Agent
[Screenshot: Electronics Calibration Run Parameters dialog]
16 MUON_TRK Electronics Calibration Run
[Screenshots of the run in progress, annotated with numbered steps 1-3]
17 HW: multi-screen PCs
- There is no ECS hardware:
  - The ECS project is pure spirit
  - No budget (and almost no manpower)
- The ECS components will run on PCs that are part of the DAQ system, the DAQ Service PCs (DS). These PCs are used to run:
  - DAQ Run Control
  - MySQL server
  - infoLogger server
  - DIM domain name server
- The ECS Human Interfaces require very large (or multiple) screens.
- Investigations and tests to implement multi-screen displays with commodity equipment have been successful.
- This experience will help when defining and setting up the ALICE Control Room.
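A multi-screen setup of the kind tested (slide 18) is typically enabled in the X server configuration. The fragment below is a minimal illustrative xorg.conf sketch; the identifiers and screen layout are placeholders, not the actual Point 2 configuration.

```
Section "ServerLayout"
    Identifier "MultiScreen"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    Option "Xinerama" "On"
EndSection
```

With Xinerama enabled, the X server merges the physical screens into one logical desktop, so the ECS Human Interfaces can span all monitors without any application changes.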
18 Multi-screen PCs
[Photo: PC with 2 additional Nvidia NVS 280 cards, running Linux with Xinerama]
19 ECS tests
- No test beams in 2005.
- Main test setup: the DAQ Extended Reference System:
  - 2 Local Trigger Units (LTU) + TTC modules
  - 2 DDL Data Generators (to emulate the FEE of 2 detectors)
  - DAQ system (including one DS where the ECS can run)
  - A filter PC (to emulate the HLT farm)
  - A DCS PC with a clone of the HMPID DCS
- All the improvements to the ECS components have been tested on the above setup.
20 Reference System (19-Dec-2005)
[Diagram: TTC crate (CCT VP325, pcald55) with 2x LTU, 2x TTCvi, 2x TTCex; DDL Data Generators (pcald40, pcald49) emulating the HMPID and MUON front-ends; a filter PC (pcald41); LDCs (pcald33-pcald36) and HLT-LDCs (pcald42, pcald48) connected via DDL; a Gigabit Ethernet switch linking the GDCs (pcald31, pcald32), the DS (pcald30), and the DCS machine (alicedcs01); a Fibre Channel switch connecting the DotHill (DH1, DH2) and Infortrend (IT1, IT2) storage units]
21 Information
- ALICE DAQ and ECS Users Guide (ALICE-INT-2005-15)
  - Part II: ALICE Experiment Control System Users Guide
- ICALEPCS 2005
  - "The ALICE Experiment Control System"
- 3 DATE courses:
  - 14-15 March 2005
  - 27-28 June 2005
  - 16-17 March 2006
- The DATE courses include an ECS chapter with:
  - A theoretical part
  - A practical part with demonstration
22 ECS status
- The existing version of the ECS contains the logic to concurrently handle groups of detectors (partitions) and to operate individual detectors.
- It is suitable for detector commissioning. Every detector can:
  - Perform STANDALONE runs with LTU, DCS, DAQ, and HLT
  - Perform calibration runs (if defined)
- It can steer additional calibration runs for more detectors:
  - Following the logic that will be defined by the detector teams
  - Using the programs that they will provide
- It requires extensions to:
  - Integrate the Central Trigger Processor
  - Correlate ALICE operations and the LHC status
- It must be tested as much as possible with real experimental setups.
23 Plans
- Comprehensive test at Point 2 (April 2006):
  - 1 or 2 TPC sectors
  - All the online systems (DCS, DAQ, TRG, HLT, ECS)
  - Will in particular test the ECS / HLT interface
- Support to the detector groups setting up STANDALONE runs with multiple online systems (starting now and continuing during the commissioning period).
- Introduction of more calibration runs for more detectors (as soon as defined).
- Integration of the Central Trigger Processor in the DAQ Extended Reference System (starting in June 2006, when a CTP will be available at CERN).