The MICE Run Control System

Journal of Physics: Conference Series (Open Access), MICE-CONF-GEN-429. To cite this article: Pierrick Hanlet and the MICE collaboration 2014 J. Phys.: Conf. Ser. 513 012012.

Pierrick Hanlet, for the MICE collaboration
Illinois Institute of Technology, Chicago, IL 60616, USA
E-mail: hanlet@fnal.gov

Abstract. The Muon Ionization Cooling Experiment (MICE) is a demonstration experiment to prove the feasibility of cooling a beam of muons for use in a Neutrino Factory and/or Muon Collider. The MICE cooling channel is a section of a modified Study II cooling channel which will provide a 10% reduction in beam emittance. In order to ensure a reliable measurement, MICE will measure the beam emittance before and after the cooling channel at the level of 1%, i.e. a relative measurement of 0.001. This renders MICE a precision experiment which requires strict controls and monitoring of all experimental parameters in order to control systematic errors. The MICE Controls and Monitoring system is based on EPICS and integrates with the DAQ, data monitoring systems, and a configuration database. The new MICE Run Control has been developed to ensure proper sequencing of equipment and use of system resources to protect data quality. This paper describes the system, its implementation, and its performance during recent muon beam data collection.

1. Motivation
Muons for a neutrino factory or muon collider [1, 2] are produced as tertiary particles in the reaction p + N → π + X, with subsequent decay π → µν, and hence have too large an inherent emittance (beam volume in the 6D position-momentum phase space) for a cost-effective accelerator. They must therefore be cooled to reduce the beam spread both transversely and longitudinally. Because of the short muon lifetime, the only feasible technique is ionization cooling, which has so far been studied only in simulations. The international Muon Ionization Cooling Experiment (MICE) at the ISIS accelerator at Rutherford Appleton Laboratory (UK) will demonstrate muon ionization cooling with a variety of beam optics, muon momenta (140-240 MeV/c), and emittances. The results of these studies will be used to optimize neutrino factory and muon collider designs. MICE will measure a 10% reduction in beam emittance with a 1% resolution, making it a precision experiment with a 0.1% relative resolution. It is therefore imperative that the systematic errors be minimized and well understood. For this reason, as well as budget constraints, MICE is a staged experiment in which the parameters of the beam, detectors, tracking, and cooling channel components are studied in detail at each step.

2. MICE Description
To make the measurement, MICE will: (1) create a beam of muons, (2) identify the muons and reject other particles using particle physics techniques, (3) measure the muon emittance in tracking spectrometers, (4) cool the beam in low-Z absorbers, (5) restore the longitudinal component of the muon momenta, (6) measure the emittance downstream of the cooling channel, and (7) re-identify muons and reject events with electrons from decayed muons. A more complete description of MICE can be found in Ref. [3] and in the MICE technical design report [4].

Figure 1. MICE beamline schematic.
Figure 2. MICE cooling channel: two tracking spectrometers sandwiching a cooling channel of 3 AFC modules and 2 RFCC modules.

The muon beam is created using a titanium target which is dipped at 1 Hz, with an acceleration of 90 g, into the ISIS beam halo during the last 3 ms of the acceleration cycle. The pions produced in the collision are transported to the MICE Hall and momentum selected using conventional dipole and quadrupole magnets; here, a quadrupole triplet (Q1-3) and a dipole (D1). These pions decay into muons within the superconducting Decay Solenoid (DS), which serves both to increase the path length of the pions and to focus the pions and muons. The emerging muons are then momentum selected and transported to the cooling channel with a dipole (D2) and quadrupole triplets (Q4-6 and Q7-9); see Fig. 1. Particle identification (PID) is performed with two threshold Cherenkov counters and two time-of-flight scintillator hodoscopes (ToF0 and ToF1) upstream and downstream of the last triplet. Decayed muons are rejected using the last ToF plane (ToF2), the KLOE-light calorimeter (KL), and the electron-muon ranger (EMR) downstream of the cooling channel. As of the writing of this paper, data have been collected with all detectors but the EMR, though we are presently taking first data with this detector. The ToF detectors were calibrated to time resolutions of 51 ps/58 ps/52 ps for ToF0/ToF1/ToF2, respectively [5].

The final MICE cooling channel will consist of 3 Absorber/Focusing Coil stations (AFCs) interleaved with 2 RF/Coupling Coil stations (RFCCs). This cooling channel is sandwiched between two identical tracking spectrometers (TSs), each comprising a 5-coil superconducting solenoid magnet, or Spectrometer Solenoid (SS), and a tracker. Each tracker consists of 5 stations of 3 stereo-view planes of scintillating fibers, with 1400 fibers of 350 µm per plane. It is positioned inside the bore of the longest (1.3 m) coil, which provides a uniform 4 T field. The remaining SS coils serve to match the magnetic optics to that of the cooling channel. The trackers will be used to measure muon trajectories, and thus momenta, both upstream and downstream of the cooling channel. In this way, the particle emittance, which is calculated from an ensemble of individual measurements, will be measured before and after cooling, such that the difference between the two measurements directly measures the cooling effect. The full cooling channel is shown in Fig. 2.

As of the writing of this document, MICE is preparing to introduce the tracking spectrometers TS1 and TS2 and the first AFC module into the cooling channel. This is the MICE Step IV configuration. Therefore, following the present EMR run, MICE will go into a long construction period with the expectation of running Step IV in 2015.
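For concreteness, one common convention (not spelled out in this paper) computes the normalized transverse emittance of the muon ensemble from the covariance matrix of the phase-space coordinates measured by each tracker, so that the cooling effect appears as the fractional change in emittance between the upstream and downstream spectrometers, which MICE aims to resolve at the 0.001 level:

\[
  \varepsilon_{N\perp} \;=\; \frac{1}{m_\mu c}\,\bigl(\det\Sigma\bigr)^{1/4},
  \qquad
  \Sigma_{ij} \;=\; \langle u_i u_j\rangle - \langle u_i\rangle\langle u_j\rangle,
  \quad u = (x,\,p_x,\,y,\,p_y).
\]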

Figure 3. Functional description of EPICS.

3. MICE Controls & Monitoring
Since MICE is a precision experiment, it is imperative that we tightly control systematic errors, which is accomplished, in part, by carefully monitoring experimental parameters. MICE also has a wide variety of hardware components to be controlled and monitored. These considerations require a mature Controls and Monitoring (C&M) framework. The EPICS (Experimental Physics and Industrial Control System) platform [6] was chosen for all of MICE C&M because of its reliability, existing support for a wide variety of hardware devices, flexibility to add new hardware devices, large selection of existing user applications, and a world-wide support network. It is open-source software, accessible from [6].

The EPICS backbone is a local area network (LAN) to which hardware components are interfaced, via their drivers, with EPICS Input/Output Controllers (IOCs); see Fig. 3. The IOCs generate process variables (PVs) which carry the values of hardware parameters (e.g. pressure or temperature). Additional PVs can be derived in software. Further description of the PVs is provided by fields which increase functionality: e.g. scanning rates, engineering units, high and low alarm limits, and operating limits, to name a few. The PVs are then made available on the LAN, such that the IOC is a combination of computer, software, and server. Writing to a PV is the control part, and reading from a PV the monitoring part, of C&M. A wide variety of user interfaces to the EPICS IOCs operate through EPICS Channel Access (CA). In this way IOCs can interact to share information, hardware can be controlled and monitored with graphical user interfaces (GUIs), errors can be identified with alarm handlers, and relevant operating parameters can be archived.

3.1. Subsystems
For the purpose of C&M, MICE is divided into the following systems:
- Beamline: target, conventional beamline magnets, decay solenoid, proton absorber, moveable beamstop, diffuser, luminosity monitor.
- PID: GVa1, ToF0/1/2, Ckov A/B, KL, and EMR.
- Spectrometers (2): SS magnets and fiber trackers.
- AFCs (3): LH2 absorber module (or solid absorbers) and focus coils (FCs).
- RFCCs (2): four 201 MHz RF accelerating cavities and one superconducting solenoid coaxially surrounding the RF cavities.
- Environment & Services: temperatures, humidity, radiation, water and air flows, pressures, and leak monitoring.
- Data Acquisition and Electronics.
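As an illustration of the control (write) and monitoring (read) roles of PVs described in Section 3, the following minimal sketch uses pyepics, a Python Channel Access client. The PV names are hypothetical placeholders and do not reflect the actual MICE naming scheme.

    # Minimal Channel Access sketch (pyepics); PV names are invented for illustration.
    from epics import caget, caput, PV

    temperature = caget("MICE:HALL:TEMP")   # monitoring: read a PV's current value
    caput("MICE:TOF0:HV", 1450.0)           # control: write a setpoint PV
    caput("MICE:TOF0:HV.HIGH", 1500.0)      # fields extend a PV, e.g. its high alarm limit

    pv = PV("MICE:HALL:TEMP")               # a PV object subscribes for monitor updates
    print(pv.get(), pv.units)               # engineering units are served along with the value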

3.2. Controls Hardware
The larger systems (beamline magnets, decay solenoid, trackers, and target) have control systems built by a controls team at Daresbury Laboratory in the UK. Each IOC is a VME-based system with a Hytek processor running VxWorks. Sensor controllers are interfaced via RS232. CANbus is employed for interlocks and digital controls, while analog devices are monitored and controlled with VME-based ADCs and DACs. The SS and FC magnets are sufficiently complex to require two VME crates, and thus two IOCs, each. Presently, while the SS and FC magnets are being tested, standalone control systems have been deployed. These will be replaced by integrated racks when the magnets are installed in the experimental hall, as early as Fall 2013. The LH2 system, due to its explosive nature, is controlled by Omron PLCs and is completely self-contained; EPICS is used solely for remote monitoring of this system.

Other IOCs for MICE have been implemented on Linux PCs. These include the Ckov, radiation monitoring, high voltage for the PID detectors, proton absorber, beamstop, RF tuners, environment monitoring, air conditioning, LH2 monitoring, and computer/electronics heartbeat monitoring. These IOCs employ a variety of interfaces: serial RS232 and RS485, SNMP, and TCP/IP. Though the C&M hardware is built separately, requirements are defined by the subsystem owners. MICE is an international collaboration, with institutions from around the world providing subsystems and components. It is therefore the challenge of the C&M team to provide uniform control and interfaces to all of the apparatus.

3.3. Other EPICS Applications
Most of the MICE graphical user interfaces (GUIs) are based on EPICS edm, though there are some relic GUIs based on Qt. These are used for both remote control and monitoring, and employ features such as related displays, hidden buttons, and color-coded PVs to indicate alarms when parameters exceed their limits. Alarms are also made audible by the EPICS alarm handler (ALH), which is used extensively in the control room and with the stand-alone systems for SS and FC testing. In the alarm handlers, PVs are grouped for convenience. The ALH functionality of configurable flags (which are themselves PVs) allows these groups of alarmed PVs to be enabled/disabled on the fly. The purpose of the MICE ALH is to provide early notification that equipment is approaching a dangerous state, as well as to protect MICE data quality. It is important to note that the equipment interlocks serve to protect the equipment from damage; the ALH is meant to notify operators so as to prevent equipment from getting to the point of an interlock trip.

MICE also uses the EPICS Archiver to archive selected parameters, either at regular, selectable frequencies or when a change occurs whose magnitude exceeds a dead band. These data may later be used in corrections for data analyses or to help debug equipment.

Due to the international nature of MICE, collaborators around the globe need to be able to remotely monitor their equipment. The EPICS gateway is implemented in MICE to allow for this. The gateway is a secure means of allowing read-only, remote access to the values and fields of the PVs. This allows remote users to display the PVs with locally running EPICS applications, thereby reducing the bandwidth required for forwarding graphical displays.
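To make the dead-band archiving idea concrete, here is a small, purely illustrative sketch (again with pyepics); the PV name and dead-band value are invented, and the real EPICS Archiver is configured rather than hand-coded in this way.

    import time
    from epics import PV

    DEADBAND = 0.5              # archive only when the value moves by more than this amount
    last = {"value": None}      # last archived sample

    def archive_on_change(pvname=None, value=None, **kw):
        # Store a sample only if the change since the last archived sample exceeds the dead band.
        if last["value"] is None or abs(value - last["value"]) > DEADBAND:
            last["value"] = value
            print(time.strftime("%H:%M:%S"), pvname, value)   # stand-in for the archive write

    pv = PV("MICE:HALL:TEMP")            # hypothetical environment-temperature PV
    pv.add_callback(archive_on_change)   # invoked on every monitor update
    time.sleep(60)                       # let callbacks arrive for one minute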
4. MICE Run Control
MICE Run Control (RC) is a higher level of control which integrates the C&M systems for the Target, Environment, Beamline Magnets, DAQ Monitoring, PID, and cooling channel elements (for Step IV). Its purpose is to ensure: (1) experimental readiness of the apparatus, (2) that subsystems share resources appropriately, (3) that run settings are systematically and reliably set, (4) that run conditions/settings are reliably stored for each run, and (5) that run statistics are properly gathered and stored at the end of each run.

Figure 4. MICE Run Control algorithm.

The RC start-run sequence is to: (1) query the operator for run type and trigger conditions, (2) retrieve the beamline and cooling channel parameters from the configuration database (CDB), (3) set the beamline and cooling channel parameters, (4) wait for beamline and cooling channel readiness, and finally, when these are ready and as the run begins, (5) record the run parameters to the CDB. Once the run is underway, RC continuously monitors the DAQ, collecting statistics associated with the run (target, ISIS losses, triggers, scalers, etc.), sums these, and records them to the CDB at the end of the run. This algorithm is shown graphically in Fig. 4. In the figure, the RC responsibilities are shown in blue boxes, interfaces with the CDB in violet boxes, and user input in green boxes.

At the beginning of a run, the user selects a menu-driven run configuration (run type, trigger, beamline settings) and RC uses these inputs to query the CDB. C&M uses these parameters to set the beamline, detectors, and cooling channel elements after checking their present status. If after a specified period a system does not achieve the desired configuration, the user is presented with buttons to additional GUIs to intervene in setting the parameters; e.g. a magnet does not ramp up to current because the water flow was never turned on. Once all systems are ready, RC presents the user with an option to write a comment and then an option to start the DAQ. After the first trigger is accepted, RC reads all of the present run parameters and stores them in the CDB. Throughout the run, RC continuously checks for an end-of-run flag. When this is found, RC collects all of the target, DAQ, and scaler data, presents the user with an option to write an end-of-run comment, and stores these data in the CDB.

As seen in Fig. 4, at step (3) above, RC divides its responsibilities between the beamline, the PID detectors, and the cooling channel. Due to the complexity of the C&M systems for the cooling channel elements, checking for their readiness and ensuring appropriate control would require a multitude of checks. To address this challenge, the anticipated next version of RC will employ MICE state machines [7] to check for readiness of the cooling channel components, thus greatly simplifying its role. In this way, RC simply checks the states of the cooling channel elements rather than their details.
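The start-run sequence described above can be summarised in pseudocode. The sketch below is a self-contained illustration in Python; every name (the in-memory CDB, the readiness check, the run configuration keys) is a hypothetical stand-in and not the actual MICE Run Control implementation.

    import time

    FAKE_CDB = {("physics", "muon", "6pi-200"): {"D1_current": 120.0, "ToF0_HV": 1450.0}}

    def get_settings(run_type, trigger, beamline):
        # (2) retrieve beamline and cooling-channel parameters from the configuration database
        return FAKE_CDB[(run_type, trigger, beamline)]

    def apply_settings(settings):
        # (3) set the beamline and cooling-channel parameters via C&M
        for name, value in settings.items():
            print("setting", name, "=", value)

    def all_subsystems_ready():
        return True        # stand-in for the per-subsystem (or state-machine) checks

    def wait_until_ready(timeout_s=5.0, poll_s=0.5):
        # (4) poll subsystem readiness until all report OK or the timeout expires
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            if all_subsystems_ready():
                return True
            time.sleep(poll_s)
        return False

    def start_run(run_type, trigger, beamline):
        settings = get_settings(run_type, trigger, beamline)   # (1) operator choices, (2) CDB query
        apply_settings(settings)                               # (3)
        if not wait_until_ready():                             # (4)
            raise RuntimeError("a subsystem is not ready; operator intervention required")
        print("run started; recording read-back parameters to the CDB")   # (5)

    start_run("physics", "muon", "6pi-200")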

During the early stages of MICE (Step I, which consisted of only the muon beamline and PID), RC was successfully used to bring up standard run settings and to store configurations for each run. The standard settings included preset magnet currents for the conventional beamline magnets and the DS, as well as high voltage settings for the detectors. At the end of each run, the number of target activations, triggers, beam condition averages, and other scaler data were stored in the CDB. Later data analyses used these run conditions and run summary data.

5. Conclusion
The MICE Run Control has been developed as a high-level C&M tool to interface major subsystems and to ensure systematic and reliable operation of the Target, Beamline Magnets, DAQ Monitoring, PID, and cooling channel elements. In working with the state machines for the cooling channel components, RC will serve to safely operate the equipment and protect data quality for a high-precision measurement of the ionization cooling principle.

References
[1] C. Ankenbrandt et al., PRST-AB 2, 081001 (1999); M. M. Alsharo'a et al., PRST-AB 6, 081001 (2003).
[2] Feasibility Study-II of a Muon-Based Neutrino Source, eds. S. Ozaki et al., BNL-52623, http://www.cap.bnl.gov/mumustudyii/fs2-report.html.
[3] M. Bogomilov et al., The MICE Muon Beam on ISIS and the beam-line instrumentation of the Muon Ionization Cooling Experiment, JINST 7 (2012) P05009.
[4] G. Gregoire et al., MICE, an International Muon Ionization Cooling Experiment: Technical Reference Document (October 2004), http://www.isis.rl.ac.uk/archive/accelerator/mice/tr/mice Tech ref.html.
[5] R. Bertoni et al., Nucl. Instrum. Meth. A 615 (2010) 14.
[6] http://aps.anl.gov/epics
[7] P. Hanlet et al., State Machine Operation of the MICE Cooling Channel, Proc. Int. Conference on Computing in High Energy and Nuclear Physics 2013, J. Phys.: Conf. Ser. 447 (submitted).