1
The LHC Control System
B. Frammery For the CERN - AB/CO Group
2
Content
  • A brief introduction
  • The hardware infrastructure
  • Machine timing & sequencing
  • Data management
  • Communications
  • Software frameworks
  • General services
  • Critical systems for LHC
  • The CERN Control Center
  • A brief status and a few issues

3
A brief INTRODUCTION
4
CERN machines
(LEP)
LHC
5
In 2003 - 2004
(LEP)
LHC
6
In 2005
(LEP)
LHC
7
Strategy
  • Develop new software and hardware infrastructures
  • For LHC
  • To be used & tested on all the new developments
  • To be spread over all the CERN accelerators at a
    later stage
  • Integrate industrial solutions as much as
    possible

Meaning that, meanwhile, the legacy controls
for LINAC2, the PSB, the PS and the SPS are to be
maintained
8
Hardware infrastructure
9
LHC control hardware infrastructure
  • The network: the Technical Network
  • Dedicated to accelerators & technical services
  • No direct connectivity to the outside world
  • Linked to the office network (the Public Network)
  • Security strategy to be deployed from 2006
  • Gigabit backbone
  • A 3-tier structural layout (a rough sketch follows below)
  • Resource tier (Front Ends for equipment)
  • Business tier (servers for general services)
  • Presentation tier (consoles for GUIs)

Oral TU3.4-30
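As a rough illustration of this 3-tier split (all interface and class names below are hypothetical, not the actual CERN APIs), a console GUI in the presentation tier talks only to a business-tier service, which delegates to a resource-tier front end:

```java
// Hypothetical sketch of the 3-tier layout; names are illustrative only
// and do not correspond to the real CERN control-system interfaces.
public class ThreeTierSketch {

    /** Resource tier: front-end computers exposing equipment access. */
    interface FrontEnd {
        double readSensor(String channel);
        void writeSetting(String channel, double value);
    }

    /** Business tier: application servers offering general services. */
    interface EquipmentService {
        double acquire(String deviceProperty);   // delegates to a FrontEnd
    }

    /** Presentation tier: console GUIs call only the business tier. */
    static void refreshConsoleDisplay(EquipmentService service) {
        double value = service.acquire("HYPOTHETICAL.DEVICE/Acquisition");
        System.out.println("Displayed value: " + value);
    }

    public static void main(String[] args) {
        FrontEnd fec = new FrontEnd() {
            public double readSensor(String channel) { return 42.0; } // stubbed hardware read
            public void writeSetting(String channel, double value) { /* stub */ }
        };
        EquipmentService service = property -> fec.readSensor(property);
        refreshConsoleDisplay(service);
    }
}
```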
10
The CERN Technical Network
LHC Local Points
CERN control rooms
Computer Center
CC
CCC
CERN Control Center
11
LHC controls architecture diagram
TCP/IP communication services
CERN GIGABIT ETHERNET TECHNICAL NETWORK
TCP/IP communication services
TCP/IP communication services
BEAM POSITION MONITORS, BEAM LOSS MONITORS, BEAM
INTERLOCKS, RF SYSTEMS, ETC
ACTUATORS AND SENSORS CRYOGENICS, VACUUM, ETC
QUENCH PROTECTION AGENTS, POWER CONVERTERS
FUNCTIONS GENERATORS,
ANALOGUE SIGNAL SYSTEM
LHC MACHINE
LHC MACHINE
12
LHC controls architecture diagram
TCP/IP communication services
CERN GIGABIT ETHERNET TECHNICAL NETWORK
TCP/IP communication services
TCP/IP communication services
All the front-end equipment is located in surface
buildings in non-radioactive areas (ease of
maintenance)
LHC MACHINE
LHC MACHINE
13
LHC controls architecture diagram
TCP/IP communication services
CERN GIGABIT ETHERNET TECHNICAL NETWORK
TCP/IP communication services
TIMING GENERATION
RT/LynxOS VME Front Ends
Linux/LynxOS PC Front Ends
cPCI Front Ends
PLCs
TCP/IP communication services
WorldFIP SEGMENT (1, 2.5 MBits/sec)
OPTICAL FIBERS
BEAM POSITION MONITORS, BEAM LOSS MONITORS, BEAM
INTERLOCKS, RF SYSTEMS, ETC
ACTUATORS AND SENSORS CRYOGENICS, VACUUM, ETC
QUENCH PROTECTION AGENTS, POWER CONVERTERS
FUNCTIONS GENERATORS,
ANALOGUE SIGNAL SYSTEM
LHC MACHINE
LHC MACHINE
14
LHC controls architecture diagram
TCP/IP communication services
Linux/HP ProLiant APPLICATION SERVERS
PVSS /Linux PC SCADA SERVERS
FILE SERVERS
CERN GIGABIT ETHERNET TECHNICAL NETWORK
TCP/IP communication services
TIMING GENERATION
RT/LynxOS VME Front Ends
Linux/LynxOS PC Front Ends
cPCI Front Ends
PLCs
TCP/IP communication services
WorldFIP SEGMENT (1, 2.5 MBits/sec)
OPTICAL FIBERS
BEAM POSITION MONITORS, BEAM LOSS MONITORS, BEAM
INTERLOCKS, RF SYSTEMS, ETC
ACTUATORS AND SENSORS CRYOGENICS, VACUUM, ETC
QUENCH PROTECTION AGENTS, POWER CONVERTERS
FUNCTIONS GENERATORS,
ANALOGUE SIGNAL SYSTEM
LHC MACHINE
LHC MACHINE
15
Machine Timing & sequencing
16
CERN machines
17
Timing & Sequencing (2)
18
Timing & Sequencing (3)
External events
Edited events
Basic Period 1200/900/600 ms
RS485 Timing Distribution
Central Timing Generator Module
40MHz PLL
GPS One pulse per Second (1 PPS)
Advanced (100 µs) 1 PPS
CERN UTC Time, Timing events, Telegrams
Synchronized 1 kHz
1 PPS
Smart clock PLL
Phase locked 40 MHz event encoding clock
10 MHz
(25ns steps)
Timing Receivers
Phase locked 10MHz
PSB
Control System
PS
Central Timing Generation
SPS
LHC
UTC time (NTP or GPS)
19
Timing & Sequencing (4)
  • The data that are distributed on the timing network:
  • LHC Telegram (Cycle-Id, Beam-Type, Target LHC
    Bunch-Number, Bucket-Number, Ring, CPS-Batches,
    Basic-Period number, Cycle-Tag, Particle-type);
    see the sketch below
  • Millisecond clock
  • The UTC time
  • Machine-events (Post-mortem trigger, warnings,
    beam dump, virtual mode events, ...)

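As a minimal sketch, the telegram could be modelled as a plain immutable data structure; the field names simply mirror the list above, while the class itself is hypothetical and not the real CERN timing library.

```java
// Hypothetical container for the LHC telegram fields listed on this slide;
// an illustrative sketch, not the real CERN timing API.
public class LhcTelegram {
    public final int cycleId;
    public final String beamType;
    public final int targetLhcBunchNumber;
    public final int bucketNumber;
    public final int ring;               // ring 1 or 2
    public final int cpsBatches;
    public final long basicPeriodNumber;
    public final String cycleTag;
    public final String particleType;

    public LhcTelegram(int cycleId, String beamType, int targetLhcBunchNumber,
                       int bucketNumber, int ring, int cpsBatches,
                       long basicPeriodNumber, String cycleTag, String particleType) {
        this.cycleId = cycleId;
        this.beamType = beamType;
        this.targetLhcBunchNumber = targetLhcBunchNumber;
        this.bucketNumber = bucketNumber;
        this.ring = ring;
        this.cpsBatches = cpsBatches;
        this.basicPeriodNumber = basicPeriodNumber;
        this.cycleTag = cycleTag;
        this.particleType = particleType;
    }

    @Override
    public String toString() {
        return "Cycle " + cycleId + " (" + particleType + ", ring " + ring
                + "), bucket " + bucketNumber + ", basic period " + basicPeriodNumber;
    }
}
```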
20
Data Management
21
Databases: the 4 domains of data
  • Physical equipment
  • Use of the general CERN MTF database for asset
    management

Serial Number
Physical Equipment
22
Databases: the 4 domains of data
Installed Equipt Type Optics Powering
Machine Layout
  • LHC machine description
  • LHC layout (mechanical, optics, electrical)
  • DC magnet powering

Equipment Catalogue
Serial Number
1612 electrical circuits, 80000 connections
Physical Equipment
23
Databases: the 4 domains of data
Controls Configuration
Installed Equipt Type Optics Powering
Machine Layout
Computer Address
Equipment Catalogue
MO4A.1-70
Serial Number
  • Controls Configuration
  • PS Model extended to LHC

Physical Equipment
24
Databases: the 4 domains of data
Controls Configuration
Installed Equipt Type Optics Powering
Machine Layout
Computer Address
Equipment Catalogue
>200000 signals
Operational Data
Serial Number
Settings Measurements Alarms Logging Post-Mortem
Physical Equipment
25
Databases: the 4 domains of data
Controls Configuration
Installed Equipt Type Optics Powering
Machine Layout
Computer Address
Consistent naming and identification scheme as
defined in the Quality Assurance Plan
Equipment Catalogue
Operational Data
Serial Number
Settings Measurements Alarms Logging Post-Mortem
Physical Equipment
26
Communications
27
The Controls MiddleWare (CMW)
  • Ensemble of protocols, Application Programming
    Interfaces (API) and software frameworks for
    communications.
  • Two conceptual models are supported:
  • the device access model (using CORBA). Typical
    use is between Java applications running in the
    middle tier and equipment servers running in the
    resource tier. Unique API for both Java and C++
    (a hedged sketch follows this list).
  • the messaging model (using the Java Message
    Service). Typical use is within the business tier
    or between the business tier and applications
    running in the presentation tier.

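A hedged sketch of the device access model only, with invented `Device` and `FakePowerConverter` names; the real CMW resolves devices over CORBA and exposes a different API.

```java
// Illustrative sketch of a device-access-model client. The interfaces are
// hypothetical stand-ins; the real CMW uses CORBA under a different API.
import java.util.HashMap;
import java.util.Map;

public class DeviceAccessSketch {

    /** A named device exposing get/set on named properties. */
    interface Device {
        Object get(String property);
        void set(String property, Object value);
    }

    /** Toy in-memory device standing in for an equipment server. */
    static class FakePowerConverter implements Device {
        private final Map<String, Object> properties = new HashMap<>();
        public Object get(String property) { return properties.get(property); }
        public void set(String property, Object value) { properties.put(property, value); }
    }

    public static void main(String[] args) {
        Device converter = new FakePowerConverter();       // normally resolved via the middleware
        converter.set("CurrentSetting", 1234.5);            // set a property value
        Object readBack = converter.get("CurrentSetting");  // synchronous read-back
        System.out.println("Read back: " + readBack);
    }
}
```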
28
Software frameworks
29
The software frameworks (1)
FR1.2-5O
  • Front-End Software Architecture (FESA)
  • Complete environment for Real-Time, model-driven
    control software, implemented in C++ for the
    LynxOS and Linux platforms
  • Java framework for accelerator controls
  • Uses J2EE application servers with lightweight
    containers
  • Plain Java objects (no EJB beans)
  • Applications can run (for test) in a 2-tier setup
  • Unified Java API for Parameter Control (JAPC) to
    access any kind of parameter (see the sketch
    below).

TU1.3-5O
TH1.5-8O
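In the spirit of JAPC's "one API for any kind of parameter" idea, a hedged sketch with invented `Parameter` and `SimpleParameter` types (not the real JAPC classes) might look like this:

```java
// Hedged sketch of a unified parameter API in the spirit of JAPC.
// Names are invented for illustration and are not the real JAPC classes.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ParameterSketch {

    /** Any controllable quantity, whatever the underlying device or tier. */
    interface Parameter {
        double getValue();
        void setValue(double value);
        void onChange(Consumer<Double> listener);   // subscription for monitoring
    }

    /** Simple in-memory parameter used as a stand-in for a real device parameter. */
    static class SimpleParameter implements Parameter {
        private double value;
        private final List<Consumer<Double>> listeners = new ArrayList<>();
        public double getValue() { return value; }
        public void setValue(double v) {
            value = v;
            listeners.forEach(l -> l.accept(v));    // notify subscribers
        }
        public void onChange(Consumer<Double> listener) { listeners.add(listener); }
    }

    public static void main(String[] args) {
        Parameter tune = new SimpleParameter();
        tune.onChange(v -> System.out.println("New value published: " + v));
        tune.setValue(0.31);
    }
}
```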
30
The software frameworks (2)
WE2.2-6I
  • UNified Industrial Control System (UNICOS)
  • Complete environment for designing, building and
    programming industrial control systems for the
    LHC.
  • Supervision layer: PVSS II (SCADA from ETM)

WE3A.2-60
UNICOS and the Java framework for accelerator
controls use the same graphical symbols and color
codes
31
GENERAL SERVICES
32
The Alarm System
  • LHC Alarm SERvice (LASER)

TH2.2-70
  • "Standard" 3-tier architecture
  • Java Message Service (JMS)
  • Subscription mechanism (sketched below)

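A minimal sketch of the JMS subscription mechanism, assuming a hypothetical topic name and text messages; a real client would use the LASER client API rather than raw JMS.

```java
// Hedged sketch of a JMS-based alarm subscription in the spirit of LASER.
// The topic name and message format are invented for illustration.
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;

public class AlarmSubscriberSketch {

    /** Subscribe to an alarm topic; the ConnectionFactory comes from the JMS provider in use. */
    public static void subscribe(ConnectionFactory factory) throws JMSException {
        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Topic alarmTopic = session.createTopic("HYPOTHETICAL.ALARM.TOPIC");
        MessageConsumer consumer = session.createConsumer(alarmTopic);
        consumer.setMessageListener(message -> {
            try {
                // Assume alarms arrive as text; real LASER messages are structured objects.
                System.out.println("Alarm received: " + ((TextMessage) message).getText());
            } catch (JMSException e) {
                e.printStackTrace();
            }
        });
        connection.start();   // begin delivery of published alarms
    }
}
```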
33
Logging
  • Several 10⁵ parameters will be logged
  • Every datum or setting is timestamped (UTC)
  • Parameters are logged (see the sketch below)
  • at regular intervals (down to 100 ms)
  • on request
  • on-change

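The three logging modes can be illustrated with the hedged sketch below, using invented names, a stand-in signal source and a simple deadband for the on-change case; the real logging service works differently.

```java
// Hedged illustration of the three logging modes mentioned above:
// periodic (down to 100 ms), on request, and on-change.
import java.time.Instant;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.DoubleSupplier;

public class LoggingSketch {

    static void log(String parameter, double value) {
        // Every logged value is timestamped in UTC, as stated on the slide.
        System.out.println(Instant.now() + " " + parameter + " = " + value);
    }

    public static void main(String[] args) throws InterruptedException {
        DoubleSupplier source = () -> Math.sin(System.nanoTime() * 1e-9); // stand-in signal

        // 1) Periodic logging at a fixed interval (here 100 ms).
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(
                () -> log("PERIODIC.PARAM", source.getAsDouble()), 0, 100, TimeUnit.MILLISECONDS);

        // 2) Logging on request: a one-off snapshot.
        log("ON.REQUEST.PARAM", source.getAsDouble());

        // 3) On-change logging with a deadband to suppress noise.
        double last = Double.NaN, deadband = 0.05;
        for (int i = 0; i < 50; i++) {
            double v = source.getAsDouble();
            if (Double.isNaN(last) || Math.abs(v - last) > deadband) {
                log("ON.CHANGE.PARAM", v);
                last = v;
            }
            Thread.sleep(10);
        }
        scheduler.shutdown();
    }
}
```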
34
Analogue signals
The ancestor
  • Open Analogue Signals Information System (OASIS)
  • To visualize and correlate, in real time,
    time-critical signals in the control room
  • ~500 signals for LHC with 50 MHz bandwidth
    (~1000 in PS/SPS)
  • Distributed cPCI system using analogue MPX and
    oscilloscope modules (Acqiris or other types)
    close to the equipment
  • Triggers through the timing network for precise
    time correlations
  • Standard 3-tier architecture.

TH3A.1-50
35
Core control application software (LSA)
TU1.3-5O
  • Normalized data model valid for:
  • Settings, measurements, optics parameters
  • Set of software modules for:
  • Optics definition
  • Setting generation & management
  • Trims (coherent global modifications of settings;
    a rough sketch follows below)
  • Set of generic applications

Developed together with OP, based on experience
with LEP and tested already for 2 new extractions
from SPS (CNGS, TI8)
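As a rough sketch of the trim idea, assuming an invented `applyTrim` helper and placeholder setting names, a trim applies one coherent modification to a whole group of settings while the previous values remain available as the rollback point; the real LSA model is considerably richer.

```java
// Hedged sketch of a "trim": one coherent modification applied consistently
// to a whole set of settings. Illustrative only.
import java.util.HashMap;
import java.util.Map;

public class TrimSketch {

    /** Apply a relative trim factor to every setting in a named group. */
    static Map<String, Double> applyTrim(Map<String, Double> settings, double factor) {
        Map<String, Double> trimmed = new HashMap<>();
        settings.forEach((name, value) -> trimmed.put(name, value * factor));
        return trimmed;   // the untouched input map acts as the rollback point
    }

    public static void main(String[] args) {
        Map<String, Double> quadCurrents = new HashMap<>();
        quadCurrents.put("HYPOTHETICAL.QUAD.1", 120.0);
        quadCurrents.put("HYPOTHETICAL.QUAD.2", 118.5);

        Map<String, Double> afterTrim = applyTrim(quadCurrents, 1.01); // +1% coherent change
        System.out.println("Before: " + quadCurrents);
        System.out.println("After : " + afterTrim);
    }
}
```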
36
Post Mortem
To take a snapshot of the LHC vital systems.
  • Automatic (typically when an interlock appears)
    or manual trigger (see the buffer sketch below)
  • No beam allowed if PM not ready
  • Capture of:
  • Logged data
  • Alarms (LASER)
  • Transient recorder signals (OASIS)
  • Fixed displays
  • Analysis
  • A few Gigabytes per Post Mortem capture
  • Structured sorting of causes & effects
  • Needed from October 2005 for Hardware
    commissioning
  • Continuous development effort for the years to
    come

TI8 extraction test in October 2004 already
proved the importance of a PM system
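A hedged sketch of the capture mechanism, assuming a simple circular buffer that is frozen into a snapshot when a trigger arrives; the real Post Mortem system is far more elaborate.

```java
// Hedged sketch of a post-mortem buffer: data are recorded continuously in a
// circular buffer and frozen into a snapshot when a trigger (typically an
// interlock) arrives. Illustrative only, not the real PM system.
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class PostMortemSketch {
    private final int capacity;
    private final Deque<Double> buffer = new ArrayDeque<>();

    PostMortemSketch(int capacity) { this.capacity = capacity; }

    /** Continuous recording: keep only the most recent samples. */
    void record(double sample) {
        if (buffer.size() == capacity) {
            buffer.removeFirst();
        }
        buffer.addLast(sample);
    }

    /** On a PM trigger, freeze the buffer contents as the snapshot to analyse. */
    List<Double> trigger() {
        return new ArrayList<>(buffer);
    }

    public static void main(String[] args) {
        PostMortemSketch pm = new PostMortemSketch(1000);
        for (int i = 0; i < 5000; i++) {
            pm.record(Math.random());          // stand-in for a monitored signal
        }
        List<Double> snapshot = pm.trigger();  // e.g. on an interlock
        System.out.println("Captured " + snapshot.size() + " samples for analysis");
    }
}
```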
37
Critical systems for LHC
38
Powering Interlock System (1)
  • For POWERING, the LHC is equivalent to 8 sectors

39
Powering Interlock System (1)
  • To protect 1612 electrical circuits with 10000
    superconducting magnets

40
Powering Interlock System (2)
PVSS Console and Server (monitoring & configuration)
PO2.036-3
Technical Network
Siemens PLC (process control & configuration)
Profibus
Hardware system
PC_PERMIT
QPS
Power Converter
QPS
Power Converter
QPS
Magnet / Quench Protection System
Power Converter
Patch Panels and Electronics
Power Converters
HW Current loops for connections of clients
PC_FAST_ABORT
CIRCUIT_QUENCH / MAGNET OVERTEMP
POWERING_FAILURE

Beam Permit
UPS
Beam Interlock system
AUG
41
Beam Interlock System (1)
  • Two independent hardware loops for "beam permit"
    signal transmission.
  • Connects the Beam Loss Monitors and many other
    systems to the Beam Dump request (permit logic
    sketched below).

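The permit logic itself can be sketched as follows; the user system names are placeholders, and the real system implements this in redundant hardware loops rather than software.

```java
// Hedged sketch of the beam-permit logic: the permit holds only while every
// connected user system asserts "beam permit"; any missing permit leads to a
// dump request. Illustrative only.
import java.util.Map;

public class BeamPermitSketch {

    /** True only if all user permits are present on this loop. */
    static boolean beamPermit(Map<String, Boolean> userPermits) {
        return userPermits.values().stream().allMatch(Boolean::booleanValue);
    }

    public static void main(String[] args) {
        Map<String, Boolean> loopA = Map.of(
                "BeamLossMonitors", true,
                "PoweringInterlocks", true,
                "Vacuum", false);             // one missing permit is enough

        if (!beamPermit(loopA)) {
            System.out.println("Beam permit lost -> request beam dump");
        }
    }
}
```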
42
Beam Interlock System (2)
PO2.031-3
Technical Network

User Interfaces (installed in Users rack)
1
Safe Beam Parameter Receiver
Safe Beam Par. (via Timing)
Test Monitoring Module
copper cable
2
Patching
3
Core module
Beam Permit Loops
F.O. interface
Beam Interlock Controller
43
Real-Time Feedback systems
  • LHC orbit feedback
  • 2000 Beam position parameters
  • 1000 steering dipoles
  • 10 Hz frequency
  • LHC tune feedback
  • Modest system: 4 parameters and some 30 PCs (up
    to 50 Hz?).
  • LHC Chromaticity feedback
  • Considered, but difficult to obtain reliable
    measurements

44
Orbit Feedback system
  • Centralized architecture (one correction
    iteration is sketched below)
  • > 100 VME crates involved
  • Through the Technical Network
  • Tests on the SPS in 2004 successful
  • Simulations show 25 Hz capability

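One feedback iteration can be sketched as a matrix-vector product with a proportional gain, using toy dimensions and an assumed pre-computed inverse response matrix; the real LHC orbit feedback works on roughly 2000 monitors and 1000 correctors, as listed above.

```java
// Hedged sketch of one orbit-feedback iteration: corrector settings are updated
// from BPM readings through a pre-computed inverse response matrix and a gain.
// Dimensions and numbers are placeholders, not the real LHC configuration.
public class OrbitFeedbackSketch {

    /** One correction step: delta = -gain * (Rinv * orbitError). */
    static double[] correctionStep(double[][] inverseResponse, double[] orbitError, double gain) {
        int nCorrectors = inverseResponse.length;
        double[] delta = new double[nCorrectors];
        for (int i = 0; i < nCorrectors; i++) {
            double sum = 0.0;
            for (int j = 0; j < orbitError.length; j++) {
                sum += inverseResponse[i][j] * orbitError[j];
            }
            delta[i] = -gain * sum;
        }
        return delta;
    }

    public static void main(String[] args) {
        // Toy sizes: 4 BPMs, 2 correctors (the LHC has ~2000 and ~1000).
        double[][] inverseResponse = { {0.1, 0.0, 0.2, 0.0}, {0.0, 0.1, 0.0, 0.2} };
        double[] orbitError = { 0.5, -0.3, 0.1, 0.0 };      // measured by the BPMs
        double[] kicks = correctionStep(inverseResponse, orbitError, 0.5);
        for (double k : kicks) {
            System.out.println("Corrector delta: " + k);
        }
        // In operation this step would repeat at the 10 Hz feedback rate.
    }
}
```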
45
Quench Protection System
PVSS Expert GUI
Retrieve and present data
LHC Logging
Post-mortem
Send data
Alarms (LASER)
PC Gateway
Power Interlocks
PVSS Data Server Supervision/Monitoring
WorldFIP
I2C and analogue signal channels
LHC superconducting magnets
46
Controls for cryogenics
  • 130 PLCs (Schneider & Siemens)
  • Application built on UNICOS framework

WE3A.2-60
47
Controls for cryogenics
Surface
48
Collimation System (1)
  • Compulsory to gain 3 orders of magnitude in
    performance beyond other hadron colliders.
  • 162 collimators when fully deployed
  • 5 degrees of freedom, 10 measurements of absolute
    and relative positions and gaps per collimator
  • Synchronous moves with 10 µm precision within a
    few tens of ms (function-driven motion sketched
    below), in relation with:
  • Local orbit
  • Beam loss measurements

PO2.016-2
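A hedged sketch of the function-driven motion idea, with placeholder times and gap values and a simple linear interpolation; the real collimator control uses its own function formats and drives the motors synchronously against machine timing.

```java
// Hedged sketch of function-driven collimator motion: each jaw follows a
// position-versus-time function, evaluated here by linear interpolation,
// so that all motors can move synchronously. Numbers are placeholders.
public class CollimatorFunctionSketch {

    /** Linear interpolation of a position function defined by (time, position) points. */
    static double positionAt(double[] times, double[] positions, double t) {
        if (t <= times[0]) return positions[0];
        for (int i = 1; i < times.length; i++) {
            if (t <= times[i]) {
                double frac = (t - times[i - 1]) / (times[i] - times[i - 1]);
                return positions[i - 1] + frac * (positions[i] - positions[i - 1]);
            }
        }
        return positions[positions.length - 1];
    }

    public static void main(String[] args) {
        double[] times = { 0.0, 10.0, 20.0 };        // seconds into the ramp
        double[] gapsMm = { 12.0, 6.0, 2.5 };        // placeholder gap settings
        for (double t = 0.0; t <= 20.0; t += 5.0) {
            System.out.printf("t = %4.1f s -> gap = %5.2f mm%n", t, positionAt(times, gapsMm, t));
        }
    }
}
```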
49
Collimation System (2)
Central Control Application
Functions of motor settings, warning levels and dump
levels versus time. Motor parameters (speed, ...).
Beam-loss driven functions.
Measurements. Post mortem. Warnings.
From MP channel: intensity, energy, β
BLMs BPM readings
Collimator Supervisory System
Function motor. Motor parameters.
Timing
Functions (motor, warning, dump level). Info and
post mortem.
Motor Drive Control
Warning & error levels. Info and post mortem.
Motor and switches.
STOP
Abort
Position Readout and Survey
All position sensors.
Environmental Survey System
Temperature sensors (jaw, cooling water, ...)
Vibration measurements, water flow rates, vacuum
pressure, radiation measurements, motor status
switches
Abort
50
The CERN CONTROL CENTER (CCC)
51
The CERN Control Center
  • A single control room for CERN to control
  • All accelerators
  • All technical services
  • Grown from the SPS (LEP) control room on the
    French CERN site (Prévessin)
  • Work started in November 2004; to be delivered in
    October 2005 and operational in February 2006
  • All CERN machines operated from the CCC in 2006

52
The CERN Control Center
The architect's drawing
53
The CERN Control Center
40 console modules, 16 large LCD displays
The architect's view
54
The CERN Control Center
Erich Keller
One of the 20 workplaces of the CCC (for 2
operators)
55
A brief STATUS of the LHC Control System

56
Status the basic infrastructure
Basic infrastructure | Conception | Implementation | Comments
Network | done | done | CERN security strategy to be applied
VME FECs | purchased | done | LEIR 100% installed, LHC Hardware Commissioning 50% installed
PC gateways | purchased | done | LHC Hardware Commissioning 50% installed
PLC FECs | purchased | done | Cryogenics 60% installed, Powering Interlock system 30% installed
WorldFIP | done | done | tunnel & surface buildings deployed 100%, qualified 35%
Remote reboot | done | done | installed in sectors 7-8 and 8-1
Servers | purchased | provisional installation | to be installed in CCC before Feb 2006
Consoles | equipment defined and purchased | to be delivered in Oct. 05 | to be installed Nov 2005 - March 2006 for CCC; installed in field control room UA83
Central Timing | done | done | to be installed in CCC before March 2006
Timing distribution receivers | done | done | modules installed in LHC Points 1, 7 & 8
57
Status the software components
TH4.2-10
Test opportunities / Control subsystems | Post Mortem | Logging | Timing | Alarms (LASER) | Powering Interlocks | Automated Test Procedures | Analogue Signals (OASIS) | CMW / FESA | PVSS / UNICOS | Application software / LSA core
TT40/TI8 extraction test | NO | YES | Partial | NO | YES | NO | YES | BOTH | BOTH | Both OK
LEIR beam commissioning | NO | YES | YES | YES | YES | NO | YES | BOTH | BOTH (vacuum) | Generic applics
1st QRL tests | NO | YES | NO | YES | NO | NO | NO | NO | YES | YES
QPS surface tests | YES | NO | NO | NO | NO | NO | NO | FESA | NO | NO
LSS8L tests | YES | YES | YES | YES | YES | YES | NO | BOTH | YES | Partial/OK
Large electrical circuit commissioning | YES | YES | YES | YES | YES | YES | NO | BOTH | YES | Partial/OK
SPS/TI2/CNGS | YES | YES | YES | YES | YES | NO | YES | BOTH | YES | Partial/OK
Tests in progress
Tests already done
58
Issues (1)
  • Basic Infrastructure
  • Security policy to be implemented on the
    Technical Network without jeopardizing the
    deployment of the consoles & servers.
  • Deployment of the new timing system on the
    pre-injectors.
  • Software
  • While generic applications and general services
    are on track, specific application programs for
    the LHC cannot yet be specified.
  • Software modules not tested at full scale.

59
Issues (2)
  • Hardware commissioning
  • The time available to commission the LHC is
    getting shorter and shorter.
  • Manpower is very limited to cover LHC
    installation, hardware commissioning and support
    for the operational machines
  • Beam commissioning
  • Some critical systems are quite late (e.g.
    collimation)
  • Strategy to be found to inject some beam despite
    all the security systems!!
  • The legacy software
  • To get the manpower for LHC, the existing
    controls infrastructures have been somewhat
    neglected.
  • The restart of the machines in 2006 will be
    difficult.

60
Conclusion
  • The basic LHC control system exists today.
  • There is a strong commitment by everyone to be
    ready to start LHC with beam in Summer 2007.
  • More news in October 2007

61
Thank you for your attention