HCAL TPG and Readout


HCAL TPG and Readout
CMS HCAL Readout Status, CERN
Drew Baden, University of Maryland, March 2002
http://macdrew.physics.umd.edu/cms/
See also: http://tgrassi.home.cern.ch/~tgrassi/hcal/
CMS/CERN, March 2002, HCAL TriDAS

Readout Crate Changes

- Outside world: slow monitoring and controls via a Bit3 PCI/VME interface (a known quantity); TTC fiber; front-end fibers; Gbit Ethernet at 1.6 Gb.
- 3U rack computer: dual-processor Dell with a fast PCI bus. Chris Tully to take responsibility; I'm keeping track of M&S for operations. Not yet added to the WBS (I think).
- UIC/HRC replaced by the TTC/Clock Fanout (the HRC did some of the above).
  [Crate diagram: Bit3, Fanout, DCC(s), and HTRs in the VME crate, with links to the DAQ and the Calorimeter Regional Trigger]
- Clock fanout is critical for VME crate functions: fanout to all HTRs.
- Fewer HTRs per crate: mapping considerations; dual-width DCC considerations.
- DAQ: 20 m copper links at 1 Gb/s. Not yet settled; I don't like it, but we'll see.

HCAL TRIGGER and READOUT Card

- No functional changes since Dec 2001.
- I/O on the front panel:
  - Inputs, raw data: 16 digital serial fibers from the QIE, 3 HCAL channels per fiber = 48 HCAL channels.
  - Inputs, timing (clock, orbit marker, etc.): PECL.
  - Outputs: DAQ data to the DCC over two connectors running LVDS.
  - TPG (Trigger Primitive Generator, HCAL tower info to L1) via P2/P3: an aux card holds the Tx daughterboards; via shielded twisted pair/Vitesse.
- FPGA logic implements:
  - Level 1 path: trigger primitive preparation and transmission to Level 1.
  - Level 2/DAQ path: buffering for the Level 1 decision (no filtering or crossing determination necessary) and transmission to the DCC for Level 2/DAQ readout.
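The front-panel channel count above is simple arithmetic (16 fibers x 3 channels); a minimal sketch, with the caveat that the fiber/channel index scheme shown is purely illustrative, not the real cabling map:

```python
# Each HTR receives 16 digital serial fibers, each carrying 3 HCAL channels.
FIBERS_PER_HTR = 16
CHANNELS_PER_FIBER = 3

def htr_channels():
    """Enumerate (fiber, channel-on-fiber) pairs for one HTR card.
    The index scheme is illustrative only."""
    return [(f, c) for f in range(FIBERS_PER_HTR)
                   for c in range(CHANNELS_PER_FIBER)]

print(len(htr_channels()))  # 48 HCAL channels per HTR
```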

Demonstrator Status

- Demonstrator: 6U HTR plus front-end emulator.
  - Data with LHC structure, clock; 800 Mbps HP G-Links work like a champ; dual LCs.
  - This system is working. The FEE sends the clock to the HTR, bypassing the TTC.
  - HCAL FNAL source-calibration studies in progress.
  - Backup boards for the '02 testbeam: decision taken 3/02 on this (more below). We anticipate abandoning this card for the testbeam.
- DCC: full 9U implementation. FEE, HTR, DCC, S-Link, CPU chain working.
- Will NOT demonstrate HTR firmware functionality as planned: the move to 1.6 Gbps costs engineering time. Firmware under development now.
  [Photos: 6U HTR demonstrator, 6U FEE]

HTR Dense Scheme

- Throughput: 17 Gb/s. Latency: +2 T_CK.
  [9U board diagram: two halves, each with 8 FE fibers (24 QIE channels) into an optical Rx (8 LC) feeding a Xilinx Virtex-II FPGA; the FPGAs drive six SLB-PMC mezzanines (trigger output: 48 trigger towers) and two LVDS Tx links to the DCC; backplane connectors P1, P2, P3]

Dense HTR

- The dense (48-channel) scheme is now the baseline.
- Money: fewer boards!
- Programmable logic vs. hardware:
  - Avoid hardware MUXes; maintain synchronicity.
  - A single FPGA per 8 channels handles both the L1/TPG and the L1A/DCC processing.
- Next-generation FPGAs will have deserializers built in (Xilinx Virtex-II Pro and Altera Stratix announced):
  - Saves $500/board, roughly $100k total.
  - ~20 connections to the deserializer reduced to 1 connection at 1.6 GHz; a single clock would serve 8 deserializers.
  - We probably won't get any of these chips until summer '02, so the schedule may not permit. We will keep our eye on this.
- 48 channels x 18 HTRs: the LVDS Tx traffic to the DCC exceeds the DCC input bandwidth, so we need 2 DCCs per crate (but fewer crates).
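A rough link-count sketch of why the dense scheme forces two DCCs per crate. The slide states only the conclusion; the assumptions that each dense HTR drives two LVDS links (one per FPGA half, as the board diagram suggests) and that a DCC accepts 18 LVDS inputs (6 PC-MIP cards x 3 receivers, from the DCC slide) are mine:

```python
import math

HTRS_PER_CRATE = 18
LVDS_LINKS_PER_HTR = 2   # assumption: one LVDS Tx per FPGA half on the dense board
DCC_INPUTS = 18          # 6 PC-MIP cards x 3 LVDS receivers each

links = HTRS_PER_CRATE * LVDS_LINKS_PER_HTR   # links arriving at the DCC(s)
dccs_needed = math.ceil(links / DCC_INPUTS)   # 36 links / 18 inputs -> 2 DCCs

print(links, dccs_needed)  # 36 links, 2 DCCs per crate
```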

Prototype Status

- TPG transmission changed from SLB mezzanine cards to a backplane aux card; this solves the mechanical problems of routing the large cables to Wesley.
- 1.6 GHz link: wider traces, improved ground planes, power filtering, etc.; deserializer RefClock fanout.
- TTC daughterboard changed to the TTC ASIC.
- Fixed the TI deserializer footprint problem; clocking fixes.
- Next iteration estimate: submit in 2 weeks; stuffed and returned by April 1.
  [Photo of the old design: FPGA + 8 deserializers, output to the DCC, VME FPGA, TTC and clock distribution, dual-LC fiber connector]

Current Hardware Status: HTR

- Dual LC (Stratos) receivers, 2 deserializers. The 1.6 GHz link is the hardest part.
- Made a "LinkOnly" board (internal use only): 2 dual LCs feeding 4 TI deserializers.
  - TI TLK2501 transceiver: 8B/10B decoding, 2.5 V, 80 MHz frame clock, 20 bits per frame.
  - This board works. We know how to do the link now.
- Did not test the tracker NGK option.
  [Photos: NGK tracker transceiver; dual LC (Stratos) transceiver]
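The TLK2501 figures above imply the usual 8B/10B overhead; a minimal sketch of the arithmetic (the 80 MHz and 20-bit figures are from the slide; the payload rate is derived, not quoted):

```python
FRAME_CLOCK_HZ = 80e6   # TLK2501 frame clock
BITS_PER_FRAME = 20     # two 10-bit 8B/10B code groups per frame

line_rate = FRAME_CLOCK_HZ * BITS_PER_FRAME   # bits on the fiber per second
payload_rate = line_rate * 8 / 10             # 8B/10B carries 8 data bits per 10 line bits

print(line_rate / 1e9, payload_rate / 1e9)  # 1.6 Gb/s line rate, 1.28 Gb/s payload
```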

HTR Issues

- Optical link:
  - Stratos LCs work well, are available, and are not very expensive; they will probably get cheaper.
  - Tracker solution? We think no; this option appears to be dead:
    - NGK/Optobahn not responsive, and the time scale for the HTR is this summer.
    - The tracker group has kept us at arm's length with respect to vendors.
    - We anticipate much ado about getting quotes and signing orders; the schedule risk is too great.
    - The savings is approximately $50/channel ($150k overall). We expect LCs to get cheaper; will the NGK?
- Clocking:
  - The jitter requirements are surprising: the refclk needs to be 80 MHz +/- ~30 kHz to lock and stay locked.
  - This is because we are using a transceiver, not a receiver; TI does not have a receiver. This is Gigabit Ethernet, so it's meant for two-way links.
  - We can implement the refclk in 2 ways: an onboard crystal, or a PECL clock fanout. We will have both on the next iteration, the board that will be in the testbeam in summer '02.
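The +/-30 kHz lock window on an 80 MHz refclk works out to roughly +/-375 ppm; a one-line check (the frequency figures are from the slide, the ppm conversion is mine):

```python
REFCLK_HZ = 80e6
LOCK_WINDOW_HZ = 30e3   # +/- tolerance quoted for the TLK2501 refclk

ppm = LOCK_WINDOW_HZ / REFCLK_HZ * 1e6
print(round(ppm))  # 375: the refclk must hold about +/-375 ppm to stay locked
```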

HTR Clocking

- The TTC provides the input clock for the VME crate modules.
- Clocks needed:
  - DCC: not critical.
  - HTR: the 16 deserializers need an 80 MHz clock with ~40 ps pk-pk jitter; TPG transmission needs a 40 MHz clock with ~100 ps pk-pk jitter; the pipeline needs a 40 MHz clock synchronous with data transmission.
- Options to eliminate:
  - 80 MHz crystal (eliminates 1 MUX).
  - TTC Fanout Board clock to the deserializers (eliminates one 1-to-2 fanout and 1 MUX).
- We will see what we learn at the '02 testbeam.
  [Clocking diagram: TTC Fanout Board (TTCrx, 1-to-8 PECL fanouts to the HTRs) feeding the HTR board; on the HTR, a MUX selects between the TTC clock and an 80 MHz LVPECL crystal, then a 1-to-2 fanout and a 1-to-8 fanout drive the deserializers at 80 MHz and, divided by 2, the 40 MHz clock/BC0 to the SLB board]

HTR Schedule

- Prototype: 4/1/02.
- Testbeam phase:
  - Checkout: 2-3 FTE months (April '02), with FPGA coding ongoing in parallel. OK?
  - Fix problems: 2 FTE weeks (May '02); the next iteration requires less checkout time.
  - Next checkout, if needed: 1 month.
  - Testbeam run: May/June '02. Cards at CERN July 1 at the latest.
- Production phase:
  - Prepare production prototype: 1 FTE month (April '02).
  - Checkout production prototype: 1 FTE month (May '02).
  - Prep for the next pre-production run: 1 month.
  - Production run: Fall '02 (pre-production Jun-Aug '02); production checkout: 3-6 FTE months.
  - FPGA coding: 6 months?
- Total: 1.5 FTE years, around $180k. Integration: ongoing; difficult to say.

Current Project Timeline

- 2000-2004: Demonstrator Project, then 1.6 GHz link, pre-production, and production. Installation begins between March and Sept 2003.
- FNAL source calibration: done. Test beam: Jun-Sep. Slice test to follow.
- STILL SOME UNCERTAINTIES:
  - Virtex-II Pro or Altera Stratix
  - Global clocking scheme
  - Clock jitter

DATA CONCENTRATOR CARD

- Motherboard/daughterboard design:
  - VME motherboard accommodates the PCI interfaces (to PMC and PC-MIP) and the VME interface. In production (all parts in house).
  - PC-MIP cards for data input: 3 LVDS inputs per card, 6 cards per DCC (= 18 inputs), taking data from 18 HTR cards. Engineering R&D courtesy of D. In production (purchasing underway).
  - PMC logic board: buffers 1000 events. Logic mezzanine card for event building, monitoring, and error checking; giant Xilinx Virtex-II 1000 (XC2V1000).
- Outputs: S-Link64 to TPG/DCC and DAQ; transmission to L2/DAQ via S-Link; TTCrx; fast signals (busy, overflow, etc.) to the TTS.

Current Status: DCC Motherboard

- VME motherboard production starting: 5 prototypes in hand for CMS; all production parts bought; PCB/assembly order ~May '02.

Current Status: DCC Logic Board and LRBs

- PC-MIP link receiver:
  - Design approved, except for a change to RJ-45 connectors for the links.
  - Final prototype PCBs on order; production parts on order; production to start ~June '02.
- Logic board final prototype:
  - Decisions pending about S-Link data width and card location.
  - Expect the final PCB design late CY 2002; production in early 2003, driven by final decisions about functionality.

HCAL TIMING FANOUT Module

- Fanout of TTC info: both TTC channels fan out to each HTR and DCC.
- Separate fanout of clock/BC0 for TPG synchronization (da Silva scheme).
- Single-width VME module.

Testbeam Clocking

- The only sane thing is to run the entire setup from a single high-purity clock.
- The TTC input is single-ended LEMO, so the source of our clock had better be clean.
- Chris will make us a 6U VME board: 35 MHz crystal, care taken on the transmitter. It will sit in the same VME crate as the TTC.
- Can also put in 30 MHz, 37 MHz, and 40 MHz, jumper-selected, for playing around.
- We will have the same high-quality clock for Tx and Rx; the HTR will have crystals as a backup, just in case.
  [Diagram: our clock, single-ended, into the TTC system; fiber to the readout crate; front-end fiber data, QIE to HTR]

Cost to Completion: M&S

- Not much change in DCC, VME rack, or VME crate unit costs.
- HTR cost increases by ~7% ($320/board):
  - $100/board for quality requirements on the traces (constant-impedance lines needed).
  - $100/board for clock circuitry (crystal, PECL parts, etc.).
  - $120/board for LCs (the old estimate was based on quads, but we're going with duals).
- Addition of the HTR backplane card to support the 48-channel HTR: a net savings.
- Cost decreases will surely come, but we don't know them now: LCs will only go down in price, and the TI deserializers are transceivers; receivers will be cheaper, and TI will have to compete with FPGAs with built-in deserializers.
- HRC replaced by TTC Fanout + Bit3 + crate CPU.
- Mapping really constrains us:
  - Some HTRs will not be full of SLBs, but each HTR still requires 1 SLB transition card.
  - Some crates will not be full of HTRs: the original cost had up to 18 HTR/crate, now it's around 14.
  - This results in a few more crates than the 9/01 cost estimate.

Cost to Completion: M&S (cont.)

  Item                   9/01 unit       9/01 ($k)   3/02 unit        3/02 ($k)   Comment
  HTR                    $4.8k           1043        $5.1k            1128        Includes LED cards plus 7% increase per card
  Crates                 $5.5k           72          no change        88          3 more crates from mapping considerations
  Racks                  $3k             18          no change        24          2 more racks, 2 crates/rack
  DCC                    $5k             150         no change        180         3 more crates, 2 DCC/crate
  SLB+TPG                $100+$100+$0    110         $100+$100+$300   176         SLB transition boards, already added to 2.1.7.15
  HRC+SLB_hex            $5k             85          (dropped)        --          Old project, changed to Fanout
  TTC Fanout             --              --          $1.5k            40          Current UIC project, based on best guess for Fanout
  Bit3 (VME+PCI+fiber)   --              --          $3.8k/crate     61          Not accounted for in HCAL TriDAS Lehman 2000
  Rack computer          --              --          $2.9k/comp      23          Not accounted for in HCAL TriDAS Lehman 2000
  Total                                  1478                         1720

The $242k difference breaks down as:
- HTR additions: $85k
- SLB transition card added: $66k
- Bit3/rack computer: $84k (never costed before?)
- More racks/crates due to mapping: $52k
- TTC Fanout, change of task: $40k
- Total reductions: $85k (HRC change of task)
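A quick sanity check that the quoted $242k difference is self-consistent with the itemized changes (all figures straight from the slide):

```python
# Itemized changes from the 9/01 to the 3/02 M&S estimate, in $k.
deltas = {
    "HTR additions":                 85,
    "SLB transition card":           66,
    "Bit3 / rack computer (new)":    84,
    "More racks/crates (mapping)":   52,
    "TTC Fanout (change of task)":   40,
    "HRC dropped (change of task)": -85,
}

total_delta = sum(deltas.values())
print(total_delta)  # 242, matching $1720k - $1478k
```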

HTR Itemized Cost (Appendix)

Costs as of April 2001 (Lehman '01 slide); 12 fibers/card, 3 channels/fiber:

  Item                            Cost/board   Changes since then
  FPGA (4 per card)               $750         4 x $750 = $3k, now 2 x $1.2k = $2.4k
  PC board (from D0 project)      $200         Tullio doesn't believe it; up to $500
  Fab & assembly                  $200         Tullio doesn't believe it; up to $400
  Fiber receiver (PAROLI)         $480         $640 for 16 dual LCs vs. $580 for 16
  Deserializer receivers          $430
  Vitesse/LVDS                    $50
  Connectors (no P3!)             $200
  Misc (FIFOs, MUX, VME, etc.)    $400         TTC/clock circuitry added since then
  Total                           $2,710       x2 = $5,420 vs. $5,120 without reductions

Additional M&S Costs

- Test stands, never accounted for: ~$29k per crate. The proposal calls for 3 test stands, $87k total:
  - 1 at FNAL
  - 1 at UMD/BU/Princeton
  - 1 at CERN (used for the testbeam and then moved to the USCMS electronics area)
- Per test stand:
    1 VME crate           $5,500
    1 VME rack computer   $2,900
    1 Bit3 w/fiber        $3,820
    2 HTR                 $10,252
    1 TTC fanout          $1,500
    1 DCC                 $5,000
    Total                 $28,972
- EM Trigger: Rohlf estimates 1 crate ($5.5k), 2 DCCs ($10k), and 12 HTRs ($62k); if we need an additional Bit3 and CPU, it's another $6.7k. The total would be $84k.
- Testbeam: we will be producing HTRs that may or may not be final versions. If final, no problem; if not, additional costs: around 6 HTRs including 2 spares, roughly $40k.
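The per-stand bill of materials above can be cross-checked against the quoted ~$29k/crate and $87k-for-three figures (a minimal sketch; item costs are taken from the slide):

```python
# Per-test-stand items and costs from the slide, in dollars.
test_stand = {
    "VME crate":         5500,
    "VME rack computer": 2900,
    "Bit3 w/fiber":      3820,
    "2 HTR boards":      10252,
    "TTC fanout":        1500,
    "DCC":               5000,
}

per_stand = sum(test_stand.values())
print(per_stand, 3 * per_stand)  # 28972 per stand; 86916 (~$87k) for three
```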

Cost to Complete: Effort

- BU/UIC: difficult to predict.
- UMD: 1.5 FTE years (previous slide).
- BU: 1.5 FTE years:
  - Testbeam extra work: 3 FTE months.
  - Finish the DCC prototype (FPGA code): 4 FTE months.
  - DCC final prototype: 6 FTE months (S-Link64, 2-slot or transition board).
  - Test engineering: 2 FTE months.
- UIC: 1 FTE engineer, who should be finished with the TTC fanout by Fall '02; FNAL might want to keep him around to help with system clock issues.

Project Status Summary

- HTR (Maryland):
  - 6U demonstrator: done.
  - Testbeam effort: OK if no disasters, but integration is going to be a big effort and will interfere with the production effort.
  - Production effort: FPGA coding ongoing; will work on the HTR in parallel with the testbeam effort.
- TTC fanout: first board assembled and tested; next iteration underway; will be OK for the testbeam.
- Crate CPU issues: Chris Tully is playing with the Bit3 and a Dell 3U rack-mounted dual CPU.
- DCC:
  - 9U motherboard done; the PCI meets the 33 MHz spec; production run beginning.
  - Link receiver cards: built, tested, done.
  - Logic board underway; the only issues are FPGA coding and backplane card vs. double-width.