Short Introduction to DCS, JCOP Framework, PVSS. PVSS Architecture and Concept. JCOP Framework concepts and tools.


Hassan Shahzad, NCP

Contents
- Short introduction to DCS, the JCOP Framework, PVSS and FSM
- PVSS architecture and concepts
- JCOP Framework concepts and tools
- CMS Endcap RPC DCS

What is DCS?
DCS stands for Detector Control System. All large experiments require a control system that controls and monitors the experimental hardware and takes the actions necessary to avoid failures. At the LHC, an intelligent and powerful DCS controls and monitors the detectors installed in the experimental areas, ensuring the reliable and safe operation of the electronics.

Layers of DCS at CERN
- Supervision layer: supervisory application, JCOP FW (supervisory), PVSS, FSM, DB, etc., running on PCs (Windows, Linux).
- Communication layer: OPC, PVSS Comms, DIM, DIP.
- Front-end layer: front-end applications and device drivers for commercial devices (e.g. CAEN), UNICOS FW on PLCs, and other systems.

What is JCOP?
JCOP stands for Joint COntrols Project: a grouping of representatives from the four big Large Hadron Collider (LHC) experiments. It aims to reduce the overall manpower cost required to produce and run the experiment control systems.

What is PVSS?
PVSS (Prozessvisualisierungs- und Steuerungssystem) is the Supervisory Control And Data Acquisition (SCADA) system chosen by JCOP as the LHC control system software after an extensive evaluation. It is a commercial product from ETM, Austria. Since then, PVSS has been widely adopted across CERN. PVSS is a TOOL to build control systems; it is not a control system itself!


What is PVSS? (cont.)
PVSS has capabilities for:
- Device description: Data Points and Data Point elements
- Device access: OPC, drivers, etc.
- Alarm handling: generation, masking, etc.
- Alarm display, filtering and summarising
- Archiving, trending and logging
- User interface builder
- Access control

Cont.
Device data in the PVSS database is structured as so-called Data Points (DPs) of a predefined Data Point Type (DPT). PVSS allows devices to be modelled using these DPTs/DPs. In object-oriented terminology, a DPT is similar to a class, and a DP is similar to an object instantiated from that class.
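As an illustration, here is a minimal PVSS CTRL sketch of the class/object analogy. The DP name, and the DPT and element names (loosely modelled on the JCOP framework's CAEN component), are assumptions for illustration only:

    main()
    {
      float vMon;

      // Instantiate a Data Point (an "object") from an existing Data
      // Point Type (a "class"); both names are illustrative assumptions.
      dpCreate("RE_HV_D1_Ch001", "FwCaenChannel");

      // Write to and read from its Data Point elements.
      dpSet("RE_HV_D1_Ch001.settings.v0", 9200.0);   // target voltage [V]
      dpGet("RE_HV_D1_Ch001.actual.vMon", vMon);     // monitored voltage [V]
      DebugN("Monitored voltage:", vMon);
    }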

What is PVSS not?
PVSS II does not have tools specifically for:
- Abstract behaviour modelling (Finite State Machines)
- Automation and error recovery
- Expert systems
But FSM (SMI++) does.

What is FSM?
A Finite State Machine (FSM) is an abstract representation of your experiment. What state is it in? Is it taking data? Is it in standby? Is it broken? Is it switched off? What triggers it to move from one of these states to another? JCOP chose the State Management Interface (SMI++) product, which was originally developed for the DELPHI experiment. SMI++ is vital for controlling and recovering large experiments. A schematic example is sketched below.
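The sketch below is written in the style of SMI++'s State Manager Language (SML) to show the shape of such a state model; the object, state and action names are purely illustrative, not taken from any real experiment configuration:

    object: RPC_CHAMBER
       state: OFF
          action: SWITCH_ON
             do SWITCH_ON HV_CHANNEL
             move_to RAMPING
       state: RAMPING
          when ( HV_CHANNEL in_state ON ) move_to ON
          when ( HV_CHANNEL in_state ERROR ) move_to ERROR
       state: ON
          action: SWITCH_OFF
             do SWITCH_OFF HV_CHANNEL
             move_to OFF

Each state declares the commands it accepts and the conditions (when-clauses) that trigger automatic transitions, which is what makes automated recovery possible.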

What is the JCOP Framework?
A layer of software components produced in collaboration by the experiments. All these components work together and simplify the developer's job; the framework is particularly useful for large PVSS projects.

Experiments using PVSS
- LHC experiments: all LHC experiments have PVSS projects; PVSS has also been used for many years in test beams.
- Fixed-target experiments: COMPASS (helped in framework development, now in production), HARP, NA60, NA48/NA62, ...

PVSS Architecture and Concept

PVSS Architecture
PVSS has a highly distributed architecture. A PVSS application is composed of several processes, called Managers in PVSS nomenclature. These Managers communicate via a PVSS-specific protocol over TCP/IP. Managers subscribe to data, which is then sent to them only on change by the Event Manager, the heart of the system.

A Typical PVSS System
[Figure: layered manager diagram -- User Interface Layer (UIMs), Processing Layer (CTRL, API), Communication and Memory Layer (DM, EV), Driver Layer (Ds).]

PVSS Managers Overview
- The Event Manager (EVM) is responsible for all communications. It receives data from the Drivers (D) and sends it to the Database Manager to be stored in the database. It also maintains the process image in memory, i.e. the current value of all the data, and ensures the distribution of data to all Managers that have subscribed to it.
- The DataBase Manager (DBM) provides the interface to the (run-time) database.

Cont.
- User Interface Managers (UIM) can get device data from the database or send data to it.
- Ctrl Managers (CTRL) provide data processing as background processes by running a scripting language. This language is like C with extensions (see the sketch after this list).
- API Managers (API) allow users to write their own programs in C++, using the PVSS API (Application Programming Interface) to access the data in the database.
- Drivers (D) provide the interface to the devices to be controlled. These can be PVSS-provided drivers or user-made drivers.
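As a hedged illustration of the event-driven model, this CTRL sketch subscribes to one data point element; the Event Manager then invokes the callback only when the value changes. The DPE name and threshold are assumptions:

    // Callback: invoked by the Event Manager each time the subscribed
    // element changes value.
    void currentChanged(string dpe, float iMon)
    {
      if (iMon > 20.0)   // hypothetical warning threshold [uA]
        DebugN("High current on", dpe, ":", iMon, "uA");
    }

    main()
    {
      // Subscribe once; from now on data arrives only on change.
      dpConnect("currentChanged", "RE_HV_D1_Ch001.actual.iMon");
    }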

Types of PVSS Systems
A PVSS System is an application containing one Database Manager, one Event Manager, and any number of drivers, user interfaces, etc. PVSS Managers can run on Windows or Linux, and they can all run on the same machine or be spread across different machines (including mixed Windows and Linux environments). When the managers of one system run on different machines, this is called a PVSS Scattered System.

Cont.
For very large applications a single PVSS system may not be enough; in this case a PVSS Distributed System can be used. As shown in the figure below, a distributed system is built by adding a Distribution Manager (Dist) to each system and connecting them together. Hundreds of systems can be connected in this way.

Distributed System
[Figure: example distributed-system topologies -- two interconnected server systems, each with its own UI, CTRL, DB, EV, Driver and Dist managers; a single-machine station; and a redundant server pair with Redundancy Managers -- all connected over a TCP local area network.]

Distributed System
- The Distribution Manager provides the interface between systems.
- Event-driven communication leads to low network load: only requested data is transferred.
- Integrated connection monitoring.
- 130 systems have been interconnected at CERN.
A cross-system read might look like the sketch below.
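In CTRL, a data point on another system of a distributed system is addressed by prefixing its system name; the system and DPE names below are assumptions for illustration:

    main()
    {
      float iMon;
      // "RE_HV_Sys:" names the remote system; the Distribution Managers
      // route the request and the reply between the two systems.
      dpGet("RE_HV_Sys:RE_HV_D1_Ch001.actual.iMon", iMon);
      DebugN("Remote channel current:", iMon);
    }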

JCOP Framework Concepts and Tools

Aims of the JCOP Framework
- Reduce the development effort: reuse of components, hide complexity, facilitate integration.
- Reduce resources for maintenance: homogeneous control system, easy operation and maintenance.
- Provide a higher layer of abstraction: reduce the required knowledge of the underlying tools, interface for non-experts.
- Customise and extend industrial components.
- Modular/extensible, and as simple as possible.
Development is driven by the JCOP FW Working Group.


DCS Architecture
[Figure: the layered DCS architecture shown earlier -- supervision (supervisory application, JCOP FW, PVSS, FSM, DB on Windows/Linux PCs), communication (OPC, PVSS Comms, DIM, DIP), and front-end (front-end applications, device drivers, commercial devices such as CAEN, UNICOS FW, PLCs, other systems).]

Tools Provided by the JCOP Framework
- Device Editor and Navigator (DEN): the main user interface to the Framework; used for configuring devices and for user login; gives a high-level view of the experiment, including FSMs.
- Fw Tree View: provides a tree view of the whole project.
- fwGeneral: exception handling; panels to give messages to users.
- Help system: one HTML file for every panel; for library routines the help text is within the code itself.

Cont.
- Fw Config Library: hides the complexity of PVSS configs; optimised functions for mass configuration; exceptions are handled.
- Component Installation Tool.
- Trending: greatly simplifies PVSS trends.
- Configuration DB: Fw definitions stored in Oracle; static definitions (DEN configuration) and recipes (values, e.g. for physics / cosmics / shutdown states).

Cont.
- Mass Configuration: easy configuration of many channels with alerts, settings, etc. (see the sketch after this list).
- Access Control: hides the complexity of PVSS access control; defines privilege levels and role groups (operator, expert, etc.).
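To give an idea of what these tools save, here is a hand-rolled CTRL sketch of mass configuration: it enumerates every channel matching a pattern and writes the same setting to each. The DP pattern, type and element names are assumptions; the Framework tools wrap this kind of loop (plus alert configuration and error handling) behind a panel:

    main()
    {
      int i;

      // Enumerate every HV channel DP matching the pattern...
      dyn_string chans = dpNames("RE_HV_*", "FwCaenChannel");

      // ...and apply the same target voltage to each of them.
      for (i = 1; i <= dynlen(chans); i++)
        dpSet(chans[i] + ".settings.v0", 9200.0);
    }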


Brief Introduction to CMS and RPC
CMS (Compact Muon Solenoid) is one of the main experiments at the LHC. RPCs (Resistive Plate Chambers) are dedicated detectors used for the first-level muon trigger of CMS. Good RPC performance is essential for assigning a muon to the right bunch crossing at the LHC.
[Figure: the CMS experiment.]

RPC DCS
The RPC DCS has two subsystems: RPC Barrel and RPC Endcap. The RPC Barrel DCS was prepared by the Italian group, and the RPC Endcap DCS by the NCP HEP group in coordination with the Barrel DCS experts. The Endcap RPC system of the CMS detector comprises 432 double-gap chambers. The correct and safe operation of the RPC system requires a sophisticated online Detector Control System (DCS), able to monitor and control the RPC hardware devices.

RPC DCS
The RPC DCS has to assure the safe and correct operation of the subdetectors during the entire CMS lifetime (more than 10 years), detect abnormal and harmful situations, and take automatic protective actions to minimise consequential damage.

RPC Endcap DCS
The RPC Endcap DCS is subdivided into several subsystems: High Voltage (HV), Low Voltage (LV), environmental sensors (humidity and temperature), gas, and cooling. The NCP EHEP group designed and prepared the software applications for the control and monitoring of the HV and LV systems and the environmental sensors; all information is monitored and shared through the central DCS. The cooling and gas systems are instead developed centrally by the CMS DCS group.

The RPC Endcap Power System: High Voltage and Low Voltage
A large part of the RPC Endcap power system is located close to the detector, in particular inside the racks placed on the balconies around the Endcap wheel. Every RPC chamber is equipped with two independent HV channels (one per gap) and two LV channels. In addition, four LV channels are needed to supply each Link Board Box (LBB), which collects the data from each chamber, synchronises them, and sends them to the trigger and readout chain in the control room.

Cont.
In total, the Endcap RPC power system consists of 864 high-voltage channels (432 chambers × 2 gaps), 432 low-voltage channels for the front-end boards on the chambers, and 300 low-voltage channels for the link boards. The solution chosen by the RPC collaboration for the power system is based on the CAEN EASY (Embedded Assembly SYstem) project, which consists of radiation- and magnetic-field-tolerant electronics based on a master-slave architecture.

Cont.
The following CAEN hardware, purchased by the Government of Pakistan, is used for the RPC Endcap power system:

    CAEN Hardware   Short Description                      Quantity
    SY1527          HV mainframe                           1
    A3512N          HV board                               42
    A3485           AC-to-DC converter for the HV system   1
    A1676           Branch controller for the HV system    3
    EASY3000        HV crate for housing boards            11
    A3000FB         Fan units for EASY crates              11
    SY1527          LV mainframe                           1
    A3009N          LV board                               36
    A3486S          AC-to-DC converter for the LV system   6
    A1676           Branch controller for the LV system    4
    EASY3000S       LV crate for housing boards            12

ENVIRONMENTAL SENSOR NETWORK
The performance of the Endcap RPC detector is strongly related to temperature and humidity; in particular, the noise rate and the dark current of the chambers depend on these two parameters. For this reason, monitoring the gas temperature, the humidity, and the temperature outside the chambers is crucial.

Cont.
The environmental sensor network is composed of 72 sensors measuring the outside temperature of the RPCs, 48 relative-humidity sensors, and 24 water-temperature sensors (144 sensors in total). The temperature sensor is the AD592BN, made by Analog Devices. All of the above sensors are powered and read out by 8 CAEN ADC boards (A3801A), placed on the balconies around the detector.

RPC DCS Hardware Structure
[Figure: RPC DCS hardware structure, highlighting the parts purchased by Pakistan.]
All the RPC subsystems are handled and controlled by the RPC Supervisor, which gathers and summarises all the information and presents a simplified but coherent and general view to the end users.

RPC Endcap DCS Software
In accordance with the official CMS guidelines, all Endcap RPC DCS applications have been developed using the commercial ETM SCADA (Supervisory Control And Data Acquisition) software PVSS 3.6 and the standard Joint COntrols Project (JCOP) framework components. The DCS uses the finite state machine (FSM) approach, and its hierarchy follows the RPC subsystem structure (station, ring, chamber, channel, ...).

Cont.
[Figures: example of the chamber finite state machine; structure of the hierarchy tree of the RPC DCS.]

Cont.
The RPC DCS uses most of the functionality provided by the PVSS + JCOP software, such as the finite state machine, the GUI, the alarm handler, and the Oracle database interface.

Commissioning of the RE DCS
The NCP EHEP group started work on the DCS in January 2008, and the provisional DCS version was released in August 2008. This was a temporary setup in a Building 40 office, using three ordinary desktop computers (RE Supervisor, RE LV machine and RE HV machine). The setup was able to control and monitor the HV and LV systems of three Endcap discs, i.e. 432 HV channels and 216 LV channels.

Commissioning of the RE DCS
[Figure: an example user interface panel.]

Provisional DCS
The provisional DCS provides functionality such as:
- An easy user interface to the hardware in the form of a tree view.
- Powering a disc/ring/single RPC on or off via a UI panel.
- Setting voltage and current parameters for both the HV and LV systems (voltage, current, trip time, etc.) -- see the sketch after this list.
- Generating voltage-vs-current plots for each HV and LV channel of an RPC separately.
- Maintaining a history of currents and voltages, to analyse the behaviour of the hardware in detail.
- Generating alarms in unusual situations such as high current or high board temperature.
- Turning off the affected board/channel in a bad situation to avoid hardware failure, etc.
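A minimal CTRL sketch of such a parameter setting, assuming DPE names modelled on the JCOP CAEN component (names and values are illustrative, not the actual project configuration):

    main()
    {
      // Write the target voltage, current limit and trip time of one
      // HV channel in a single call.
      dpSet("RE_HV_D1_Ch001.settings.v0",       9200.0,   // volts
            "RE_HV_D1_Ch001.settings.i0",       10.0,     // microamps
            "RE_HV_D1_Ch001.settings.tripTime", 2.0);     // seconds
    }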

[Figure: global settings for a ring.]

Provisional DCS
From August 2008 until February 2009, the provisional DCS was used by CMS shifters around the clock. It helped the shifters greatly in monitoring and analysing in detail the behaviour of the RPC Endcap HV and LV systems. In the meantime, several updates were implemented to improve the user interface and make it more robust and secure. Extensive work was done on the supervisor system to integrate the Barrel and Endcap into one project -- a complicated task requiring continuous work and discussion between the Barrel and Endcap DCS experts.

Final RPC DCS
In February 2009, the final RPC DCS version was released, with a common supervisor for the RPC Endcap and Barrel. All RPC Endcap DCS computers are installed in the CMS central DCS control room and are on the CMS internal network. The RPC DCS has also been successfully integrated into the central DCS, publishing its state to it and receiving commands from it.

Cont.
The final DCS adds functionality on top of the provisional DCS, such as:
- Control and monitoring of the HV, LV and LBB channels and of the temperature and humidity sensors at disc level.
- Global monitoring of the six RPC Endcap stations: average and total temperature per disc, average and total current per disc.
- An HV scan utility (sketched below).
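For illustration, an HV scan in CTRL can take the shape below, assuming the same illustrative DPE names as in the earlier sketches:

    main()
    {
      float v, iMon;
      // Step the applied HV and record the monitored current at each
      // point, e.g. to produce current-vs-voltage plots.
      for (v = 8500.0; v <= 9600.0; v += 100.0)
      {
        dpSet("RE_HV_D1_Ch001.settings.v0", v);
        delay(60);   // let the channel ramp and settle (illustrative)
        dpGet("RE_HV_D1_Ch001.actual.iMon", iMon);
        DebugN("HV scan point:", v, "V ->", iMon, "uA");
      }
    }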

Cont.
- Integration of the gas system into the RPC Endcap DCS.
- A more user-friendly interface.
- Online monitoring of the temperature of hardware such as boards and power supplies.
- Emergency shutdown capability for the HV and LV of all stations.
After a short debugging phase, the system ran without problems for the entire test period. It proved able to properly manage the interruptions that occurred due to power failures and communication problems with the power supplies.

Cont.
The final DCS is more robust and secure than the provisional DCS and provides more detailed information about the hardware. The DCS proved to be a reliable tool for the safe and correct operation of the detectors, and trained shifters were able to operate the detector in an easy and safe way. Some DCS UI panels are shown in the next slides.

[Figures, one per slide: RPC Endcap DCS overview; global monitoring; HV channels for a disc; LBB channels for a disc; HV and LV settings for a disc; global monitoring for a disc; board and AC-to-DC converter temperatures; temperature and humidity for a disc; current plots at 9.2 kV for selected RPCs; temperature plots for one station; water-temperature sensor plots for one station.]

Conclusion
The design and development of the Endcap RPC detector control system is now finished, and shifters are using it around the clock. The good results obtained during the 2008 and 2009 global runs, and in the August 2009 CRAFT run operating the whole RPC Endcap system, demonstrated that the RPC DCS was well designed and able to run in a very stable and safe way over long periods. In conclusion, the RPC DCS, developed following the guidelines of the central DCS group, proved to be a reliable tool and is ready to become fully operational for winter 2009 and the whole of 2010, when CMS will begin to take data.
