CMX (Common Merger extension module)
Y. Ermoline for the CMX collaboration
Preliminary Design Review, Stockholm, 29 June 2011


Outline
- Current L1 Calorimeter trigger system
- Possible improvement to maintain trigger quality
- Topology information in the real-time data path
- CMX functional requirements
- Project specification overview
- Development schedule
- Technical aspects: CMM/CMX differences, FPGA & links
- CMX modes of operation
- CMX firmware development
- MSU test stand

Current L1 Calorimeter trigger system
- Jets and em/tau clusters are identified in different subsystems: 4 Cluster Processor crates and 2 Jet/Energy Processor crates.
- The real-time path carries counts of identified objects meeting specified thresholds.
- Region-of-Interest (RoI) topology information is read out only on L1Accept.

Possible improvement to maintain trigger quality
Add topology information to the real-time data path.
Examples using local topology information (single calo quadrant):
- identify spatial overlap between e/tau clusters and jets,
- use the local jet Et sum to estimate the energy of an overlapping e/tau object (requires jet energies to be added to the real-time data path).
Examples using global topology:
- non back-to-back jets,
- rapidity gaps,
- invariant or transverse mass calculations,
- jet sphericity.
Requires an upgraded CMM and a Topology Processor. Simulation study in progress (see talk on Monday).

Topology information in real-time data path
[Slide diagram: calorimeter signals (~7200) -> analogue sums -> Receivers -> PreProcessor (124 modules, digitized ET) -> over the backplane to the Cluster Processor (56 modules, merging in 8 modules) and Jet/Energy processor (32 modules, merging in 4 modules) -> CTP; readout data to Readout Drivers (14 + 6 modules); region-of-interest data; L1 Muon; new Topology Processor inserted before the CTP.]
- Add RoI positions to the real-time data path, enabling new algorithms based on event topology (in the new Topological Processor).
- Modify firmware in the processor modules to increase the data-transfer rate over the crate backplane (40 Mbit/s -> 160 Mbit/s).
- Replace the merging modules (CMM) with upgraded hardware (CMX).
- Add the new Topological Processor (TP).
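The backplane upgrade on this slide is easiest to read per bunch crossing: at the LHC's nominal 40 MHz crossing rate, 160 Mbit/s means four bits per crossing on each backplane line instead of one. A minimal arithmetic sketch; the 40 MHz crossing rate is the only assumption beyond the slide:

```python
# Backplane data-rate upgrade: 40 Mbit/s -> 160 Mbit/s per backplane line.
BC_RATE_HZ = 40e6  # nominal LHC bunch-crossing rate (assumption, not on the slide)

def bits_per_crossing(line_rate_bps: float) -> float:
    """Bits carried on one backplane line during one bunch crossing."""
    return line_rate_bps / BC_RATE_HZ

old_bits = bits_per_crossing(40e6)   # current CMM-era rate
new_bits = bits_per_crossing(160e6)  # upgraded CMX-era rate
print(old_bits, new_bits)  # 1.0 4.0
```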

CMX functional requirements
Backward compatibility:
- be designed to fit in the CMM positions in the processor crates,
- inherit all main logical components, electrical interfaces, the programming model and the data formats of the current CMM,
- be able to implement all the different versions of the CMM FPGA logic, adapted to the new hardware.
Data source for the topological processor:
- receive extra data from the upgraded processor modules over the crate backplane at a higher data-transfer rate (160 Mbit/s),
- transmit data to the TP via multi-fibre optical ribbon link(s), optionally with electro-optical data replication using the available spare transmitters,
- transmit the extra data from the upgraded processor modules to the L1Calo DAQ and RoI Read-Out Drivers (RODs).
"Insurance policy" option against the unforeseen:
- optional standalone mode; may require (unnecessary) extra complexity, which has to be weighed against the benefits.

CMX project specification overview
Version 0.7 of the project specification is available at: http://ermoline.web.cern.ch/ermoline//
This document specifies: functional requirements, CMM/CMX differences, and technical aspects of the implementation. The engineering solutions will be reflected in the subsequent detailed hardware and firmware specifications.
Comments from Jim, Uli, Ian, Sam, Dan, Philippe, Hal and Chip have been added to the document.
Next steps:
- Jul 2011 - Jan 2012: preliminary design study, engineering specification, design documentation, test rig checked out at MSU
- Feb 2012 - Sep 2012: prototype design and test
- Sep 2012: Production Readiness Review

CMX development schedule
2011: project and engineering specifications
- CMX project Preliminary Design Review (this week)
- preliminary design studies
- test rig installed and checked out at MSU
2012: prototype design and fabrication
- CMX schematics and PCB layout
- Production Readiness Review
- prototype fabrication, CMM firmware ported to the CMX
- basic tests for backward compatibility in the test rig at MSU
2013: prototype testing/installation/commissioning, final fabrication
- full prototype tests in the test rig at CERN
- CMX firmware development and test
- tests in the L1Calo system during the shutdown
- fabricate and assemble the full set of modules
2014: final commissioning in the L1Calo trigger system in USA15

Technical aspects (1): CMM/CMX differences
Main modifications to the CMM hardware:
- replacement of the obsolete FPGA devices by new parts to receive data at 160 Mbit/s from the backplane,
- transmission and reception of data via multi-fibre optical ribbon links using transceivers in the FPGA,
- implementation of the G-Link protocol in firmware,
- implementation of the multi-fibre optical ribbon links.

Technical aspects (2): FPGA & links
The new FPGA or FPGAs for the CMX board shall provide sufficient:
- I/O pins, compatible with the L1Calo system backplane signal levels; ~640 pins for all external interfaces of the two original CMM FPGAs,
- high-speed serial transceivers for data transmission and reception; minimum 8 to 18 transmitters (TP, RODs), optionally fan-out and reception,
- internal logic resources (logic blocks and memories); Virtex-6 vs. Virtex-E: ~2x LUTs, 4x FFs, 10x RAM, ~2.5x faster.
G-Link implementation:
- original part obsolete: G-Link transmitter chips (HDMP 1022) -> G-Link protocol emulation in the FPGA, using GTX transmitters at 960 Mbit/s,
- transceiver: Infineon V23818-M305-B57 -> Avago AFBR-57M5APZ.
Multi-fibre (12-fibre) optical ribbon links:
- GTX and GTH Virtex-6 FPGA transceivers,
- parallel fibre modules: SNAP12 or Avago (-> compatible with the TP).
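With 12-fibre ribbon modules (SNAP12 / Avago), the slide's transmitter counts translate directly into a ribbon count. A trivial sketch of that ceiling division, assuming one transmitter per fibre:

```python
import math

FIBRES_PER_RIBBON = 12  # SNAP12 / Avago parallel-fibre modules (from the slide)

def ribbons_needed(n_links: int) -> int:
    """12-fibre ribbons needed to carry a given number of serial links."""
    return math.ceil(n_links / FIBRES_PER_RIBBON)

# The slide's minimum of 8 and maximum of 18 transmitters (TP, RODs):
print(ribbons_needed(8), ribbons_needed(18))  # 1 2
```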

CMX modes of operation overview
[Slide diagrams: for each mode, a CMX with VME/TCM interfaces, processor inputs over the backplane, RoI/DAQ G-Link outputs, a CTP output at 40 MHz, and merger links to the other CMX modules.]
Backward-compatible mode (CMM emulation): the CMM firmware ported to the CMX hardware; inputs from the processors at 40 MHz; looks like a CMM in the current system; no optical links to the TP.
Upgrade mode: inputs from the upgraded processors at 160 Mbit/s; new data format; data processing/reduction to fit the links into a single TP module; data replication (fan-out) to multiple TPs.
Standalone mode (optional): the "insurance policy" option; data reception from the other CMX modules, with the CMX/TP role taken within the system.

Backward-compatible mode: 2 FPGA transmitters for the DAQ and RoI G-Links.

Upgrade mode with data replication / fan-out
[Slide diagram: 12 CMX modules in 4 CP crates and 2 JP crates drive 12-fibre optical ribbon links through an optical splitter to N TP modules (slices) in the TP crate.]
- 12 transmitters per CMX, or 4-6 with data processing,
- 4-6 transmitters for the DAQ and RoI G-Links (not shown),
- spare transmitters (out of 72) for data replication / fan-out to N TPs,
- 54-64 with data processing.
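One possible reading of the transmitter budget on this slide, sketched as arithmetic. The specific numbers are assumptions of mine, not the slide's conclusion: 72 transmitters in total, the upper bound of 6 reserved for the DAQ/RoI G-Links, and one full 12-fibre ribbon per replicated TP copy (i.e. no data reduction):

```python
TOTAL_TX = 72            # optical transmitters per CMX (from the slide)
DAQ_ROI_TX = 6           # upper bound of the "4-6" DAQ/RoI G-Link transmitters (assumption)
FIBRES_PER_TP_COPY = 12  # one 12-fibre ribbon per TP copy, no data reduction (assumption)

spare_tx = TOTAL_TX - DAQ_ROI_TX
max_tp_copies = spare_tx // FIBRES_PER_TP_COPY  # TP copies one CMX could feed
print(spare_tx, max_tp_copies)  # 66 5
```

With data processing (4-6 fibres per copy, as the slide mentions) the same spare budget would stretch to correspondingly more copies.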

Standalone mode (optional)
[Slide diagram: 12 CMX modules in 4 CP crates and 2 JP crates, connected to the other CMX modules via passive fibre re-grouping, or via passive optical fan-out and re-grouping.]
- The CMX may be used without a TP: the role of the topological processor can be taken by one (or several) CMX module(s) in the system.
- Requires inter-CMX communication: data fan-out and re-grouping.

CMX firmware development
Two largely non-overlapping activities with different sub-sets of people.
Porting the existing CMM firmware to the new hardware - MSU (CMX), RAL/Stockholm (CMM) + ?
- new FPGA selection, I/O pin allocation, signal levels, clock distribution,
- new G-Link implementation in the FPGA (will also be used in the upgrade modes),
- test firmware in the test-rig hardware, no VHDL test-benches.
New firmware for the upgrade modes of operation - MSU (CMX), Mainz (TP, JEM), Birmingham (CPM), RAL (ROD) + ?
- new interface development, data transfer CMX -> TP (MSU),
- algorithm development for the TP (Mainz); optionally, applicability to the CMX (MSU),
- test-benches: data source for the CMX from the upgraded CPM (Birmingham) and JEM (Mainz); data source for the TP from the CMX (MSU) [also for the ROD and CTP?],
- data files for the test-benches: from the simulation software and MC.

MSU test stand
Proposed MSU test stand: CPM/JEM and TTC crates at CERN.
The test rig will be required:
- to acquire initial knowledge of CMM module operation,
- to develop and test the CMX.
Initially assembled at CERN, tested and then sent to MSU:
- hardware (without DCS),
- online software,
- online simulation.
Hardware available -> focus on software.
Testing procedure:
- backplane data transfer,
- optical & LVDS links (2nd),
- ROD connection -> at CERN.

Back-up slides

CMX commissioning schedule