Table of Contents

TB2009DataAnalysis
  Introduction
    People Involved in Analysis/Software
    Communication
  Data Preparation
  Data Analysis Tasks
  Software Tasks
  Core Analysis Package
    Checking out, installing and running the package
      Check out from SVN
      Setup environment to compile
      Compile the package
    Examples
      Example 1: Convert Raw BAT/MM data for a single run into rootuple (single-file mode)
      Example 2: Convert Raw BAT/MM data for all runs from logbook (batch mode)
      Example 3: How to access RunInfo from standalone ROOT
    Core Analysis Output
  User Analysis
    Setup the environment to checkout and compile user packages
    Check out
    Package Components
    User Modules
      Definition
      Invocation by the analysis package
      Use your modules

TB2009DataAnalysis

Introduction

Muon Atlas MicroMegas Analysis (MAMMA) is an analysis framework being developed to provide a coherent framework for analyzing the test beam data collected for the ATLAS Muon upgrade project (more information here: MuonMicromegas). Several steps are currently involved in converting raw data collected with the Bonn Atlas Telescope (BAT) and Micromegas (MM) data acquisition systems into rootuples that can be analyzed using ROOT. The scope of the analysis framework includes making this conversion of raw data easier.

People Involved in Analysis/Software

List of people involved and willing to contribute to the analysis (please add yourself if you are interested):

- W. Park (U. South Carolina)
- K. Nikolopoulos (Athens/BNL)
- V. Kaushik (U. Arizona)

Communication

For the analysis activities we have a hypernews forum that one can subscribe to. All discussions (and the nuts and bolts of the analysis) should be addressed in this forum:

hn-atlas-muon-micromegas-testbeam-analysis@cernspamnot.ch

Data Preparation

This stage involves converting the BAT data and the raw MM data into rootuples. The run information from the AtlasMMLogbook is read and stored for use in analysis by the RunInfo class. All the relevant information is contained in this class. In addition, the temperature, relative humidity and pressure information (MM only) is also read from the input and stored in the ntuples.

The BAT data consists of strip information for the four modules (labeled 1, 2, 3, 6), along with the charge on each strip, the strip ID, and the run and event numbers. Every BAT run number has a corresponding (and different) MM run number, which is stored in the RunInfo object. The MM raw data consists of the charge read out over several time samples for each strip and the run and event numbers, along with auxiliary information about the gas mixture used, the strip pitch and width, the inclination of the chamber with respect to the normal, and a comments field.

There are several issues with the raw data itself which need to be addressed before analysis can be performed; resolving them is also part of the data-preparation stage:

- BAT internal alignment
- Mapping
- Synchronization of BAT events with MM events
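To illustrate the BAT/MM run correspondence described above, here is a minimal sketch (not the actual RunInfo class; the class name and methods are made up for illustration) of how the logbook pairing of MM and BAT run numbers can be held and queried:

```python
# Illustrative sketch only, not package code: pairing BAT runs with their
# corresponding MM runs, as read from the logbook.
class RunInfoSketch:
    """Holds the BAT <-> MM run correspondence from the logbook."""

    def __init__(self, logbook_rows):
        # logbook_rows: iterable of (mm_run, bat_run) pairs
        self._mm_to_bat = {}
        self._bat_to_mm = {}
        for mm_run, bat_run in logbook_rows:
            self._mm_to_bat[mm_run] = bat_run
            self._bat_to_mm[bat_run] = mm_run

    def bat_run_for(self, mm_run):
        return self._mm_to_bat[mm_run]

    def mm_run_for(self, bat_run):
        return self._bat_to_mm[bat_run]


# Example, using run-number pairs that appear later on this page:
info = RunInfoSketch([(3448, 2711), (4744, 4017)])
print(info.bat_run_for(4744))  # -> 4017
print(info.mm_run_for(2711))   # -> 3448
```

The real RunInfo object carries much more (event counts, gas mixture, chamber geometry); the sketch only shows the run-number mapping.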

Data Analysis Tasks

- Convert all available BAT and MM runs into analysis ntuples
  - Strip data (charge measurements for time samples) for BAT/MM
  - Auxiliary data (event information, pressure, humidity, gas mixture, etc.)
- Alignment Studies
  - Internal BAT alignment
  - External alignment
- Mapping
- Synchronization of BAT/MM events
- Data Quality Monitoring
- Reconstruction
  - Fitting time samples and extracting amplitude/time for each strip
  - Clusterization: grouping strips into a cluster; charge and position measurement
  - Extrapolation of tracks from BAT to MM
  - Matching hits / finding track segments in MM
  - Determining the position resolution
  - TPC studies
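The clusterization step listed above can be sketched as follows. This is a toy illustration, not the package's implementation: strips with adjacent strip IDs are grouped into a cluster, and per cluster the size, summed charge and charge-weighted (center-of-gravity) position are computed, analogous to the mm_clsz, mm_clchgsum and mm_clcogpos variables described later.

```python
# Illustrative clusterization sketch (not the MAMMA implementation):
# group strips with consecutive strip IDs, then compute per-cluster size,
# charge sum and charge-weighted mean strip position.
def clusterize(strip_ids, strip_charges):
    """strip_ids: sorted list of fired strip IDs; strip_charges: matching charges."""
    clusters = []
    current = []  # list of (strip_id, charge) in the cluster being built
    for sid, q in zip(strip_ids, strip_charges):
        if current and sid != current[-1][0] + 1:
            clusters.append(current)  # gap in strip IDs: close the cluster
            current = []
        current.append((sid, q))
    if current:
        clusters.append(current)

    results = []
    for cl in clusters:
        qsum = sum(q for _, q in cl)
        cog = sum(sid * q for sid, q in cl) / qsum  # charge-weighted mean strip
        results.append({"size": len(cl), "charge_sum": qsum, "cog_pos": cog})
    return results


# Two fired regions: strips 10-12 and strip 40
print(clusterize([10, 11, 12, 40], [5.0, 10.0, 5.0, 3.0]))
```

With the charges above, the first cluster has size 3, total charge 20.0 and center of gravity at strip 11.0; the single strip 40 forms its own cluster.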

Software Tasks

- Documentation
  - Doxygen documentation of the code
  - Description of the information available in the ntuples
- Data-taking/Monitoring
  - Reading configuration from the GUI
  - Improving the look and feel of the existing monitoring GUI

Core Analysis Package

The MAMMA package is available for download at this SVN link.

Checking out, installing and running the package

The requirements are as follows:

- an AFS user account and access to one of the lxplus@cernspamnot.ch nodes
- Python (v2.5 or higher)
- ROOT (v5.20 or higher)

To set up the ROOT environment, do the following in a bash shell (or use the equivalent setenv commands in c-shell):

    export ROOTSYS=/afs/cern.ch/sw/lcg/app/releases/ROOT/5.22.00b/slc4_ia32_gcc3
    export PATH=$ROOTSYS/bin:$PATH
    export LD_LIBRARY_PATH=${ROOTSYS}/lib:$LD_LIBRARY_PATH
    export PYTHONPATH=${ROOTSYS}/lib:$PYTHONPATH

Once you log into the lxplus node with your username and password, you can request a scratch area (usually 100 MB is the minimum). We assume that you have at least 100 MB in your user area for this purpose.

Check out from SVN

    cd mydir
    export SVNGRP=svn+ssh://svn.cern.ch/reps/atlasgrp
    svn co $SVNGRP/Institutes/Arizona/micromegas/trunk micromegas

At this stage you will see the package being checked out; the output looks something like this:

    [12:46]tb2009> svn co $SVNGRP/mumegas
    A    micromegas/input
    A    micromegas/mmcore/python/mamma.py
    ...
    A    micromegas/Makefile
    Checked out revision 3436.

Setup environment to compile

    cd micromegas
    source setup.(c)sh
    export PATH=$PYTHONDIR/bin:$PATH
    python

At this stage the output looks like this:

    [1:21]micromegas> python
    Python 2.5 (r25:51908, Oct 18 2007, 16:26:11)
    [GCC 3.4.6 20060404 (Red Hat 3.4.6-8)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>>

Compile the package

    (g)make

At this stage the output looks like this:

    [12:46]micromegas> gmake
    mamma --> Making RunInfo.d
    mamma --> Making TdcDecoder.d
    mamma --> Making NTupleHandler.d
    mamma --> Making mplib.d
    mamma --> Making mp.d
    ...
    mamma --> Linking /afs/cern.ch/user/v/venkat/scratch0/tb2009/mumegas/bin/mumegas.exe
    mamma --> Making Ana.d
    mamma --> Compiling src/Ana.cxx
    mamma --> Generating dictionary src/mumuser_dict.cxx
    mamma --> Compiling src/mumuser_dict.cxx
    mamma --> Making shared library: /afs/cern.ch/user/v/venkat/scratch0/tb2009/mumegas/lib/libmumuser.so
    mamma --> All OK

Start the application program:

    mamma -m    [this starts the monitoring GUI]
    mamma -h    [this displays the available switches]

The output looks like this:

    [12:47]micromegas> mamma
    +------------------------------------------------------+
      Muon ATLAS MicroMegas Analysis
      Usage: mamma [opts] args
      Help : mamma --help or -h
    +------------------------------------------------------+

Examples

Example 1: Convert Raw BAT/MM data for a single run into rootuple (single-file mode)

To run this example, you need two input files (the BAT and MM raw data), the logbook and the weather information. All the input parameters are specified using a configuration file, which is provided in input/test.config.

    mamma -c input/test.config

The output looks like this:

    --> mumegas.exe : Switching to batch mode
    --> mumegas.exe : Reading configuration from : input/test.config
    --> mumegas.exe : Dumping configuration
    --------------------------------------------------------------------------
    mmega.batfile     : /afs/cern.ch/user/v/venkat/scratch0/tb2009/run2711.bdt
    mmega.mumfile     : /afs/cern.ch/user/v/venkat/scratch0/tb2009/run3448.raw
    mmega.rootfile    : input/tb09_mm3448_bat2711.root
    mmega.runinfo     : input/mm_run_book.csv
    mmega.weatherinfo : input/mm_temp.txt

    rootfile =
    bdtfullpath = /afs/cern.ch/user/v/venkat/scratch0/tb2009/run2711.bdt
    End of Run
    Events counted: 10041
    10041 events processed in total. Done.
    PHOS MONITOR RECEIVED START OF RUN 3448.
    PHOS MONITOR RECEIVED START OF RUN 3448.
    PHOS MONITOR RECEIVED END OF RUN 3448.
    Mamma: MONITOR RUN 3448
    10044 events processed.
    10045 events processed.
    DONE - now exiting

Example 2: Convert Raw BAT/MM data for all runs from logbook (batch mode)

This job uses a different configuration, in which the paths to the input files are specified; the program loops over all the runs (which are read and stored from the logbook) and creates the output rootuples in the specified destination. The runs are labeled TB_MM001234_BAT005678_sync00XX.root. This job option is not for users interested in analysis, but for those involved in data preparation.

Example 3: How to access RunInfo from standalone ROOT

Set up MAMMA the usual way. Then you may access the RunInfo in the following way:

    root [0] gSystem->Load("libHist.so");
    root [2] gSystem->Load("libmmutil.so");
    root [3] gSystem->Load("libmmcore.so");
    root [4] TFile *_file0 = TFile::Open("SimpleFastReco_refit_before_4744.root")
    root [5] .ls
    TFile**         SimpleFastReco_refit_before_4744.root
     TFile*         SimpleFastReco_refit_before_4744.root
      KEY: TTree    ntp;1           ntp: combined mm and bat ntuple
      KEY: TTree    RunInfo4744MM;1 RunInfo for MM Run: 4744 and BAT Run 4017
    root [6] RunInfo4744MM->Scan()
    ************************************************************************************************
    *    Row   * runinfo.m * runinfo.b * runinfo.m * runinfo.b * runinfo.s * runinfo.e * runinfo.s *
    ************************************************************************************************
    *        0 *      4744 *      4017 *     10053 *     10053 *      -999 *     ALTRO *       R13 *
    ************************************************************************************************
    (Long64_t)1
    root [7] .q

Core Analysis Output

The analysis output is a ROOT ntuple containing the combined information (BAT and micromegas) along with the run information for each run. The contents of the ntuple are tabulated below. It is assumed that most of the data analysis tasks listed above will be performed on this combined ntuple. A user analysis framework (similar to the core framework) has been developed and is available for use. As the analysis tasks proceed and mature, it is envisaged that they will be included in the core framework. The user analysis part is described in the next section.

    Sl.  Datatype        Variable        Comment
    1    Int_t           bat_run         BAT run number
    2    vector<int>     bat_id1x        strip index in layer 1
    3    vector<int>     bat_q1x         charge of the strip in layer 1
    4    vector<int>     bat_id2x        strip index in layer 2
    5    vector<int>     bat_q2x         charge of the strip in layer 2
    6    vector<int>     bat_id3x        strip index in layer 3
    7    vector<int>     bat_q3x         charge of the strip in layer 3
    8    vector<int>     bat_id6x        strip index in layer 6
    9    vector<int>     bat_q6x         charge of the strip in layer 6
    10   vector<int>     bat_id1y        strip index in layer 1 (y)
    11   vector<int>     bat_q1y         charge of the strip in layer 1 (y)
    12   vector<int>     bat_id2y        strip index in layer 2 (y)
    13   vector<int>     bat_q2y         charge of the strip in layer 2 (y)
    14   vector<int>     bat_id3y        strip index in layer 3 (y)
    15   vector<int>     bat_q3y         charge of the strip in layer 3 (y)
    16   vector<int>     bat_id6y        strip index in layer 6 (y)
    17   vector<int>     bat_q6y         charge of the strip in layer 6 (y)
    18   Int_t           mm_run          MM run number
    19   Int_t           mm_evt          No. of MM events in this run
    20   Float_t         mm_time         amplitude from gamma_2 fit for time channel from HIGHGAIN
    21   Float_t         mm_timex        sampling time at maximum amplitude from gamma_2 fit for time channel from HIGHGAIN
    22   Float_t         mm_time2        amplitude from gamma_2 fit for time channel from LOWGAIN
    23   Float_t         mm_timex2       sampling time at maximum amplitude from gamma_2 fit for time channel from LOWGAIN
    24   Int_t           mm_qmaxstrid    strip Id (after mapping) with highest charge in the event
    25   Float_t         mm_maxstrq      amplitude of the strip Id with highest charge
    26   Float_t         mm_maxstrt      time of the strip Id with highest charge
    28   vector<int>     mm_higain       1: high gain, 0: low gain
    29   vector<int>     mm_addr         channel address ranging from 1-128
    30   vector<int>     mm_strid        strip Id for channel after mapping
    31   vector<int>     mm_qmaxsample   maximum sample charge before fit
    32   vector<int>     mm_tqmaxsample  sampling time at maximum before fit
    33   vector<float>   mm_ped          pedestal
    34   vector<float>   mm_noise        noise (RMS)
    35   vector<float>   mm_qmaxfit      amplitude from gamma_2 fit
    36   vector<float>   mm_tqmaxfit     sampling time at maximum amplitude
    37   vector<float>   mm_tau          tau parameter of gamma_2 function (width at half-max)
    38   vector<float>   mm_thalf        half-rising time
    39   Int_t           mm_nclu         No. of clusters per event
    40   vector<int>     mm_clsz         cluster size
    41   vector<int>     mm_clstr0       strip index of the beginning of a cluster
    42   vector<int>     mm_clpkid       cluster peak strip id
    43   vector<float>   mm_clmnpos      cluster position from strip-id mean
    44   vector<float>   mm_clcogpos     cluster charge-weighted position (mean position)
    45   vector<float>   mm_clchgsum     cluster charge sum
    46   vector<float>   mm_cltime       time from weighted average of strip times
    47   vector<float>   mm_clpkchg      cluster peak charge
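As an illustration of the "before fit" quantities in the table (mm_qmaxsample and mm_tqmaxsample), here is a minimal sketch of extracting, for one strip's time samples, the maximum sample charge and the sampling time at that maximum. The function name and the 25 ns sampling step are assumptions for the example, not taken from the package:

```python
# Illustrative sketch (not package code): for one strip's list of time
# samples, find the maximum sample charge and the sampling time at which
# it occurs, as in mm_qmaxsample / mm_tqmaxsample before any gamma_2 fit.
def max_sample(samples, t0=0.0, dt=25.0):
    """samples: ADC counts per time sample; dt: assumed sampling step (ns)."""
    imax = max(range(len(samples)), key=lambda i: samples[i])
    return samples[imax], t0 + imax * dt


q, t = max_sample([2, 8, 31, 54, 40, 17, 6])
print(q, t)  # the largest sample is 54, in the fourth sampling bin
```

The gamma_2 fit quantities (mm_qmaxfit, mm_tqmaxfit) refine these raw values by fitting the pulse shape across samples.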

User Analysis

It is assumed that users know how to set up their environment to run ROOT (5.22 or higher), Python (2.5 or higher) and a compatible gcc (g++) compiler. User analysis involves analyzing the output of the core analysis (the ROOT tuple). The following operating systems / architectures are currently supported; anyone wanting to use others will have to try it themselves.

    Operating System   Architecture              Compiler
    SLC 4, 5           i386, i686 (32, 64 bit)   gcc34, gcc43

Setup the environment to checkout and compile user packages

Download and save the configuration script dpdenv.slc5.i686.gcc43.sh to a directory of your choice. Everything will be installed under this base directory (called your work area). Download this script here.

    cd your_work_area
    source dpdenv.slc5.i686.gcc43.sh

Check out

Check out the core and user packages:

    svn co -q svn+ssh://venkat@svn.cern.ch/reps/atlasgrp/Institutes/Arizona/dpdanalysis/core
    svn co -q svn+ssh://venkat@svn.cern.ch/reps/atlasgrp/Institutes/Arizona/dpdanalysis/mmuser

Note 1: Replace venkat with your AFS user name.
Note 2: You might have to authenticate yourself (using kinit) before you check out the packages.

Set up the environment:

    source ./dpdroot_setup.sh

If the file does not exist, download/copy it to the base directory.

Check whether Makefile.mmuser exists in the base directory. If not, then do:

    cp mmuser/dpd.root/Makefile ./Makefile.mmuser

Compile the package:

    source ./setup.sh
    make -f Makefile.mmuser

Package Components

Both executables are in the path if the environment is set up properly. Run them with the --help option for more information.

    bin/checkdpd root_file

prints out the characteristics (variable names and their types) of the contents of root_file.

    bin/rundpd -f list_file -c config_file

runs the analysis on the files specified in the text file list_file. The default configuration file is mmuser/config/mmuser.config (use: -c mmuser/config/mmuser.config).

User Modules

Definition

A user module is a new class, here called CSample, defined in the following files (referenced from the base directory of the work area):

- ./mmuser/src/sample.cxx : implementation file (Download)
- ./mmuser/mmuser/sample.h : header file (Download)

The following class declaration defines the minimum requirements for a module:

    class CSample : public DPDAnalysisEvent {
    public:
      CSample(const std::string& name);
      CSample();
      ~CSample() {;}
      void LinkData(DPDInput *data = 0);
      int ProcessEvent(DPDInput *data = 0, Long64_t entry = 1, bool islast = false);
      void Begin();
      void End();
    private:
      std::string m_name;
      Int_t mm_run;
      ClassDef(CSample, 1)
    };

where:

    CSample(const std::string& name);
        // constructor, to be called from test.cxx with a meaningful name
    void Begin();
        // put all your initialization in this function; called at the beginning of the analysis
    void LinkData(DPDInput *data = 0);
        // link the root file data with your member variables here
    int ProcessEvent(DPDInput *data = 0, Long64_t entry = 1, bool islast = false);
        // called for each event; do your processing here
    void End();
        // called at the end of the analysis (i.e. save your file here)

To pass messages to the console, use the m_log member (inherited from DPDAnalysisEvent):

    m_log << INFO << "Begin user analysis. " << Endl;

Invocation by the analysis package

The module is added to the analysis package by modifying two files.

The first is ./mmuser/exec/test.cxx, where the analysis module(s) are pushed back into a vector. The order in which they are pushed back is the order in which they are executed. Search for the declaration of

    std::vector<DPDAnalysis*> my;

Create a CSample instance and add it to the list (change q2 to the next available number):

    CSample q2("samplemodule");
    my.push_back(&q2);

The second file, ./mmuser/mmuser/mmuser_linkdef.h, is used to inform ROOT of your class CSample. Add a line like the one below:

    // user analysis
    #pragma link C++ class MMBDT+;
    #pragma link C++ class CSample+;

Finally, recompile the analysis package.

Use your modules

To actually use the module, it needs to be specified in the configuration file in use. Here we change the default: ./mmuser/config/mmuser.config

Add the module name, as pushed back into the vector, to the list of values of the azdpd.analysis key:

    ## name(s) of analysis modules to
    ## run over tree(s) on an event-by-event
    ## basis. use comma separated list
    azdpd.analysis: MMBDT, samplemodule

In the same file you specify the names of the trees to be linked to the analysis:

    ## name(s) of the tree(s) to link
    ## use comma separated list
    azdpd.treenames: ntp

Major updates:
-- VenkateshKaushik - 04 Aug 2009

Responsible: VenkateshKaushik
Last reviewed by: Never reviewed

dpdenv.slc5.i686.gcc43.sh.txt: script to set up the environment to run user analysis on lxplus machines (SLC5, gcc43)

This topic: Atlas > TB2009DataAnalysis
Topic revision: r9 - 05 Oct 2010 - MarcinByszewski