
Detector Control Systems @ LHC
Matthias Richter, Department of Physics, University of Oslo
IRTG Lecture Week, Autumn 2012, Oct 18 2012
M. Richter (UiO) DCS @ LHC Oct 09 2012 1 / 39

Detectors in High Energy (Nuclear) Physics
Task: measure the trajectories and properties of particles liberated in a collision. The energy deposited in a medium allows one to determine the momentum and type of a particle.

Detection Techniques
- Gas detectors
- Calorimeters
- Silicon detectors
- Time-of-flight (TOF)
... just some examples

Early Detector Control
Rutherford experiment, 1909, by Geiger and Marsden; the Geiger counter.

Today's Experiments
A complex system, like a factory. Example: Compact Muon Solenoid (CMS)
- Dimensions: 21 m long, 15 m wide, 15 m high
- Magnetic field: 4 Tesla
- 12,000 tonnes of iron
- O(10^7) channels to be controlled and read out

The Large Hadron Collider (LHC)
Collider machine located at CERN, Geneva
- PbPb collisions: sqrt(s_NN) = 5.5 TeV, L = 10^27 cm^-2 s^-1
- pp collisions: sqrt(s) = 14 TeV, L = 10^34 cm^-2 s^-1 (at the ALICE IP: L = 5x10^30 cm^-2 s^-1)

LHC Dimensions
A tunnel of 26.6 km circumference.

LHC Experiments
ALICE, ATLAS, CMS, LHCb

Challenges in Today's Experiments
Example: CMS Silicon Strip Tracker
- 9,316,352 readout channels
- 15,148 modules
- 24,244 silicon sensors
Just one subsystem of CMS. This raises the questions:
- How do the control systems scale?
- How can the collaborative effort be coordinated?
- What are the requirements for support systems?
- What is the operational environment for sensors and devices?

Detector Control System Basics

Tasks of a Detector Control System
- Continuous, coherent, and safe operation of the detector
- Homogeneous interface to all subdetectors and to the technical infrastructure
- Ramping the detector up and down
- Configuration of electronics
- LV power distribution and steering
- HV power distribution and steering
- Gas systems
- Synchronization
- Monitoring
- Detector safety
In contrast to the detector online systems (data acquisition, trigger), the DCS is in operation at all times.

A Control Model
Control levels:
- Supervisory Layer - back-end (Controls Layer)
- Field Layer - front-end
Functional modules and devices are categorized into different DCS layers; the common part can be separated from device-specific implementations. Operation is organized in a process hierarchy.
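The separation of a common supervisory part from device-specific implementations can be sketched in a few lines. This is an illustrative toy model, not code from any real DCS framework; all class and device names are hypothetical.

```python
# Hypothetical sketch of a layered control hierarchy: a command issued at the
# supervisory layer fans out through control nodes down to field-layer devices.

class FieldDevice:
    """Field layer: device-specific part, talks to one piece of hardware."""
    def __init__(self, name):
        self.name = name
        self.state = "OFF"

    def execute(self, command):
        # A real implementation would access a bus or power-supply channel here.
        self.state = {"SWITCH_ON": "ON", "SWITCH_OFF": "OFF"}.get(command, self.state)

class ControlNode:
    """Supervisory layer: the common part, independent of any device type."""
    def __init__(self, name, children):
        self.name = name
        self.children = children

    def execute(self, command):
        for child in self.children:   # fan the command out to all children
            child.execute(command)

lv = FieldDevice("LV_channel_0")
hv = FieldDevice("HV_channel_0")
sector = ControlNode("sector_A0", [lv, hv])
top = ControlNode("detector", [sector])

top.execute("SWITCH_ON")
print(lv.state, hv.state)  # ON ON
```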

Finite State Machines
A mathematical model of computation used to design a sequence of actions and processes. The model consists of states and of transitions triggered by actions or conditions.
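A minimal finite state machine can be written as a table of allowed transitions. The states and actions below (OFF/STANDBY/READY, go_standby, ...) are a simplified illustrative set, not the exact state model of any LHC experiment.

```python
# Minimal FSM sketch: a transition table maps (state, action) -> new state;
# any pair not in the table is a forbidden transition.

TRANSITIONS = {
    ("OFF",     "go_standby"): "STANDBY",
    ("STANDBY", "go_ready"):   "READY",
    ("READY",   "go_standby"): "STANDBY",
    ("STANDBY", "go_off"):     "OFF",
}

class StateMachine:
    def __init__(self, initial="OFF"):
        self.state = initial

    def handle(self, action):
        key = (self.state, action)
        if key not in TRANSITIONS:
            raise ValueError(f"action {action!r} not allowed in state {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

fsm = StateMachine()
fsm.handle("go_standby")
fsm.handle("go_ready")
print(fsm.state)  # READY
```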

Interfaces
Interfaces ensure connectivity between modules of the system:
- Communication protocols
- Module states
- Bus systems
- Hardware connectors

Top-Level Control
SCADA - Supervisory Control And Data Acquisition
- User interface
- Overall control
- Visualization of the acquired data
- Archiving

Protection Systems: Interlocks
The concept of interlocks was developed for railway systems. An interlock is triggered on a condition that forbids certain actions, e.g. switching on and ramping up. An action can only be executed if all interlocks are inactive. Design the interlocking systems properly!
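The interlock rule stated above reduces to a simple predicate: an action is allowed only if every interlock is inactive. The interlock names below are illustrative.

```python
# Sketch of the interlock rule: an action may only be executed if all
# interlocks are inactive. Condition names are hypothetical examples.

def ramp_up_allowed(interlocks):
    """Return True only if every interlock is inactive (False)."""
    return not any(interlocks.values())

interlocks = {
    "cooling_failure": False,
    "gas_pressure_low": False,
    "magnet_quench": False,
}
print(ramp_up_allowed(interlocks))  # True

interlocks["gas_pressure_low"] = True  # a single active interlock blocks the action
print(ramp_up_allowed(interlocks))  # False
```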

Scalability of Detector Control Systems
The modularity and break-up of the system into functional units and state machines facilitate a distributed computing approach - a natural solution to the scalability issue:
- The DCS runs on a network of many nodes
- Finite state machines communicate via protocols
- Logic is distributed over all layers of the system, which allows a fast reaction time in the Field Layer
- The data rate to the archiving database can be adjusted
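One common way to adjust the archiving data rate is a deadband filter: a reading is only stored when it differs from the last archived value by more than a configurable threshold. This is an illustrative sketch of the idea, not the actual archiving logic of any SCADA product.

```python
# Deadband archiving sketch: store a value only when it moves by more than
# `threshold` with respect to the last archived value.

def deadband_archive(readings, threshold):
    archived = []
    last = None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            archived.append(value)
            last = value
    return archived

temps = [24.0, 24.1, 24.1, 24.9, 25.0, 27.3, 27.2]
print(deadband_archive(temps, threshold=0.5))  # [24.0, 24.9, 27.3]
```

Raising the threshold lowers the archived data rate at the cost of resolution; a threshold of zero archives every changed value.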

Detector Control Systems @ LHC

Conditions
- Radiation: some of the equipment is installed close to the beam pipe and interaction point; radiation can cause failure and damage
- Magnetic field: equipment inside magnets must be specifically suited
- Inaccessibility: equipment is not accessible for mechanical reasons and because of radiation during LHC operation
- Duration: detectors are operated for 10-20 years
- Collaborative development: many people and institutions involved

Joint Controls Project (JCOP)
Common controls and operation for the LHC experiments:
- JCOP Framework
- Detector Safety System (DSS)
- Gas Control System (GCS)
- Rack Monitoring and Control System (RCS)
The JCOP Framework is based on a commercial SCADA system.

Other Common LHC Control Systems
- Detector Safety System (DSS): primary goal is the protection of the experiments' equipment; detect critical situations and trigger appropriate actions
- Gas Control System: detectors require different types of gas, with equipment provided by contractors; joint gas control
- Rack Monitoring and Control: common infrastructure and control for rack installation
- Packaging and Installation: generalized deployment routines

Power Supply
Commercial devices; important manufacturers: CAEN, WIENER, ISEG, Schneider, Heinzinger. Communication entirely over OPC.

Power Supply Example
Example: CMS Tracker power supply
- One Power Supply Unit (PSU) for a group of 6-15 modules: a low-voltage supply (2.5 V and 1.25 V) and two high-voltage (600 V) lines
- A Power Supply Module (PSM) combines two PSUs
- One crate hosts 9 PSMs
- 6 crates are attached to a branch controller
- A mainframe controls 16 branch controllers
- 4 mainframes with 1944 power groups and 356 control groups
The mainframe communicates with the Detector Control System via a standardized protocol, OPC.
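The nominal capacity of this tree follows from multiplying the fan-out at each level; note that the quoted operational figures (1944 power groups, 356 control groups) lie below the full capacity.

```python
# Capacity of the power-supply tree described above: 4 mainframes, 16 branch
# controllers per mainframe, 6 crates per branch, 9 PSMs per crate, 2 PSUs per PSM.

mainframes = 4
branch_controllers = mainframes * 16   # 64
crates = branch_controllers * 6        # 384
psms = crates * 9                      # 3456
psus = psms * 2                        # 6912
print(branch_controllers, crates, psms, psus)  # 64 384 3456 6912
```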

Hardware Solutions in Detector Control Systems
Device-specific software and hardware implementations are realized in the Field (Device) Layer. Embedded systems are widely used, as they provide flexibility and remote control:
- Field-Programmable Gate Arrays (FPGA)
- ATLAS Embedded Local Monitor Board (ELMB)
- ALICE DCS board
- Microcontrollers

Communication Protocols
- OPC (OLE for Process Control): industry standard, mainly driven by the automation industry
- DIM (Distributed Information Management): developed and widely used at CERN
- MODBUS: serial communications protocol, industry standard for connecting industrial electronic devices
- CAN bus (Controller Area Network): communication protocol for microcontrollers
- DIP: CERN open-source package based on the message core of DIM

Communication Protocol: DIM
- Open-source communication framework developed at CERN
- Provides network-transparent inter-process communication for distributed and heterogeneous environments
- Server-client architecture: a server publishes services, clients can subscribe to any service in the system, and a server accepts commands sent by a client
- Connection details are hidden from the user; the framework takes care of byte ordering and the handling of complex data structures
- A dedicated DIM Name Server keeps track of all running clients, servers, and their services
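The publish/subscribe idea behind DIM can be mimicked in-process with a few lines. This is a conceptual sketch only: the real DIM is a C/C++ library with network transport and a standalone name server, and none of the names below come from its actual API.

```python
# In-process sketch of the DIM concept: a name server maps service names to
# subscribers; a server publishes a service, clients subscribe by name and
# receive every update. Service names here are hypothetical.

class NameServer:
    def __init__(self):
        self.services = {}   # service name -> list of subscriber callbacks

    def publish(self, service):
        self.services.setdefault(service, [])

    def subscribe(self, service, callback):
        self.services[service].append(callback)

    def update(self, service, value):
        for callback in self.services[service]:
            callback(value)

dns = NameServer()
dns.publish("HV/SECTOR0/VOLTAGE")                 # server side
received = []
dns.subscribe("HV/SECTOR0/VOLTAGE", received.append)  # client side
dns.update("HV/SECTOR0/VOLTAGE", 1450.0)          # server pushes a new reading
print(received)  # [1450.0]
```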

Some Specific Implementations

ATLAS Detector Control System
Only two layers: back-end and front-end. A general-purpose I/O module serves as the DCS front-end interface: the Embedded Local Monitor Board (ELMB).

LHCb Detector Control System
LHCb generalizes the online and control systems, all under the control of the Experiment Control System (ECS). Common software environment.

LHCb Software Layout
Mainly driven by one group, which develops everything from the DAQ system and event filter farm to the communication protocol and detector control.

ALICE Detector Control System
Four main systems under the control of the ECS: DCS, DAQ, HLT, and Trigger; four different groups develop and operate these systems.

ALICE DCS Finite State Machine
Finite state machines exist on different levels; states are grouped together and mapped to upper layers.
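Grouping child states and mapping them to an upper layer can be sketched as a priority rule: the parent reports the "worst" state among its children. The priority ordering below is illustrative, not the actual ALICE mapping.

```python
# Sketch of state aggregation: a parent's state is derived from its children's
# states via a worst-state-first priority list (names are hypothetical).

PRIORITY = ["ERROR", "RAMPING", "OFF", "READY"]  # worst state first

def summarize(child_states):
    """Map a collection of child states to one parent state."""
    for state in PRIORITY:
        if state in child_states:
            return state
    return "UNKNOWN"

print(summarize(["READY", "READY", "READY"]))    # READY
print(summarize(["READY", "RAMPING", "READY"]))  # RAMPING
print(summarize(["READY", "ERROR"]))             # ERROR
```

Applied recursively, this lets a single top-level node report the condition of thousands of channels.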

ALICE Time Projection Chamber
- Gas volume: 92 m^3; gas mixture: Ne/CO2 (90/10)
- Maximum electron drift time over the 250 cm drift length: 92 µs
- 72 readout chambers: MWPCs with cathode-pad readout
- 557,568 readout pads and FEE channels
- 4356 readout boards
- 216 Readout Control Units (RCU)
The Detector Control System includes the gas system, detector chambers, front-end electronics, and calibration system.

ALICE TPC Finite State Machine Model

DCS for ALICE TPC Front-End Electronics

DCS Board in the Field Layer
A single-board computer:
- Altera EPXA1 FPGA with a 32-bit ARM processor and 100k-gate PLD
- 8 MB Flash RAM (radiation tolerant)
- 32 MB SDRAM
- Ethernet interface, JTAG connector, analog-to-digital converter
It combines a Linux operating system with a Programmable Logic Device (PLD); device drivers form the abstraction layer between hardware and software. The board:
- provides a low-rate data readout path for debugging and monitoring purposes
- steers and controls the RCU motherboard independently of the data readout path
- updates the RCU firmware
- reads out the front-end cards and the RCU monitoring module

Future Development
All systems have been in production since 2008; the focus is on operation and maintenance. Upgrade plans:
- There will be two Long Shutdowns (LS) to upgrade the LHC to the originally foreseen energy
- Detectors and online systems will be upgraded to handle higher interaction rates
- Some of the detectors will get new front-ends, which requires new development in the field layer of the DCS
- No general redesign of supervisory-layer applications is planned
- Update of infrastructure

Answers
- How do the control systems scale? Through a distributed and modular system.
- How can the collaborative effort be coordinated? Through common projects for controls and abstraction, functional models, and interfaces and specifications; layers of abstraction reduce the need for in-depth knowledge of the underlying systems and functionality.
- What are the requirements for support systems? A common technical service division: cooling, power distribution, rack services, ventilation.
- What is the operational environment for sensors and devices? Radiation and magnetic field; high temperature stability is required; equipment is inaccessible most of the time, so remote control is required.

An Exciting Field, Combining...
- Firmware and software development
- Device drivers at the boundary between hardware and software
- Database functionality
- High-level abstraction and control
- Visualization and user interfaces
... it's there to get your hands on!
A smaller project to start with: a Geiger counter with a USB interface.