ALICE inner tracking system readout electronics prototype testing with the CERN "Giga Bit Transceiver"



Published by IOP Publishing for Sissa Medialab

Topical Workshop on Electronics for Particle Physics, 26–30 September 2016, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany

Received: November 14, 2016
Accepted: December 17, 2016
Published: December 28, 2016

ALICE inner tracking system readout electronics prototype testing with the CERN Giga Bit Transceiver

J. Schambach,(a,1) M.J. Rossewij,(b) K.M. Sielewicz,(c,d) G. Aglieri Rinella,(c) M. Bonora,(c,e) J. Ferencei,(f) P. Giubilato,(c) and T. Vanat,(f,g) on behalf of the ALICE collaboration

(a) Department of Physics, The University of Texas at Austin, 2515 Speedway, Austin, TX, U.S.A.
(b) Department of Physics, Scientific Instrumentation, Utrecht University, Sorbonnelaan 4, Utrecht, Netherlands
(c) EP Department, CERN, CH-1211 Geneva 23, Switzerland
(d) The Institute of Electronic Systems, Warsaw University of Technology, Nowowiejska 15/19, Warsaw, Poland
(e) Department of Computer Sciences, University of Salzburg, Jakob-Haringer-Str. 2, Salzburg, Austria
(f) Department of Digital Design, Czech Technical University in Prague, Thákurova 9, Praha, Czech Republic
(g) Department of Nuclear Spectroscopy, Nuclear Physics Institute ASCR, Řež 130, Řež, Czech Republic

E-mail: Joachim.Schambach@cern.ch

Abstract: The ALICE Collaboration is preparing a major detector upgrade for LHC Run 3, which includes the construction of a new silicon-pixel-based Inner Tracking System (ITS). The ITS readout system consists of 192 readout boards to control the sensors and their power system, receive triggers, and deliver sensor data to the DAQ. To prototype various aspects of this readout system, an FPGA-based carrier board and an associated FMC daughter card containing the CERN Gigabit Transceiver (GBT) chipset have been developed. This contribution describes laboratory and radiation testing results obtained with this prototype board set.

Keywords: Data acquisition circuits; Digital electronic circuits; Front-end electronics for detector readout; Radiation-hard electronics

(1) Corresponding author.

© CERN 2016 for the benefit of the ALICE collaboration, published under the terms of the Creative Commons Attribution 3.0 License by IOP Publishing Ltd and Sissa Medialab srl. Any further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation and DOI.

doi:10.1088/1748-0221/11/12/C12074

Contents

1 Introduction
2 ALICE ITS upgrade
3 ITS upgrade readout system architecture
4 Readout Unit (RU) prototype and GBTxFMC
5 GBT testing

1 Introduction

The ALICE (A Large Ion Collider Experiment) experiment studies strongly interacting hadronic matter using nucleus-nucleus, proton-nucleus, and proton-proton collisions at the CERN Large Hadron Collider (LHC). During the LHC Long Shutdown 2 (LS2) in the years 2019/20, the ALICE detector will be upgraded to make high-precision measurements of rare probes (such as charm and bottom decays) at low pT, while sampling the full 50 kHz Pb-Pb interaction rate foreseen after the shutdown with a new trigger-less readout mode of various upgraded sub-detectors. The Inner Tracking System (ITS) upgrade will replace the current ITS with a new seven-layer silicon pixel detector based on Monolithic Active Pixel Sensor (MAPS) technology. It will provide more precise displaced-vertex reconstruction and low-pT tracking by reducing the material budget by a factor of 4 compared to the current ITS and by using a reduced-diameter beam pipe, which allows the innermost layer to be closer to the interaction vertex; it will also provide continuous readout to sample the full interaction rate.

2 ALICE ITS upgrade

The upgraded ALICE ITS design [1] employs seven layers of silicon MAPS sensors configured in two separate barrels, as shown in figure 1. The basic sensing units of the ITS are 30 mm × 15 mm MAPS sensors with 29.24 µm × 26.88 µm pixels, specifically developed for the ALICE ITS and implemented in 0.18 µm TowerJazz CMOS technology [2, 3]. The sensors are thinned to 50 µm in the inner barrel and 100 µm in the outer barrel. A total of 24,120 of these sensors are distributed over 7 concentric layers with radii from 22 mm to 400 mm, subdivided azimuthally into units called staves with lengths from 29 cm in the innermost 3 layers to 150 cm in the outermost 2 layers. Together they provide a total detection area of 10 m², segmented into more than 12.5 billion pixels. The inner barrel consists of three layers of staves of the same architecture. The outer barrel consists of two intermediate layers and two outer layers, each with distinct stave architectures. On the inner barrel, each stave consists of 9 sensors with individual differential data lines running at 1.2 Gbps, a shared bi-directional differential control and monitoring line, and a shared differential clock line.

Figure 2. Inner Layer (top) and Outer Layer (bottom) stave architecture and connectivity.

On the outer barrel staves, sensors are combined into modules consisting of 2 rows of 7 sensors, where 1 master sensor interfaces with 6 slave sensors on a flex printed circuit (FPC) carrying the data, control, monitoring, and clock signals. The two master sensors of each module each provide a differential data line running at 400 Mbps. These modules are arranged in two rows of 4 modules on the staves of the middle two layers and in two rows of 7 modules on the outermost two layers. The clock and control/monitoring lines are shared between the master sensors of the modules on a stave. A schematic view of these two stave architectures is shown in figure 2. There are a total of 192 staves: 48 in the three inner barrel layers, 54 in the two middle layers, and 90 in the two outer layers.

Figure 1. Layout of the upgraded Inner Tracking System [1].
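As a cross-check, the sensor count, detection area, and per-stave peak link bandwidth follow directly from the stave geometry quoted above. The short sketch below (Python, with all inputs taken from the text) reproduces the 24,120-sensor total and the roughly 10 m² area; the pixel count is only an upper-bound estimate, since part of each sensor is occupied by periphery circuitry rather than pixel matrix.

```python
# Cross-check of the ITS numbers quoted in the text (all inputs from the paper).
INNER_STAVES, MIDDLE_STAVES, OUTER_STAVES = 48, 54, 90

sensors_inner_stave = 9                       # 9 sensors, one 1.2 Gbps link each
sensors_per_module = 2 * 7                    # module = 2 rows of 7 sensors
modules_middle_stave = 2 * 4                  # 2 rows of 4 modules
modules_outer_stave = 2 * 7                   # 2 rows of 7 modules

sensors = (INNER_STAVES * sensors_inner_stave
           + MIDDLE_STAVES * modules_middle_stave * sensors_per_module
           + OUTER_STAVES * modules_outer_stave * sensors_per_module)
print(sensors)                                # 24120, as quoted

area_m2 = sensors * (30e-3 * 15e-3)           # 30 mm x 15 mm per sensor
print(round(area_m2, 1))                      # ~10.9 m^2, i.e. the quoted "10 m^2"

# Upper bound on the pixel count (ignores the sensor periphery region):
pixels_per_sensor = (30e3 / 29.24) * (15e3 / 26.88)
print(f"{sensors * pixels_per_sensor:.2e}")   # ~1.4e10, consistent with ">12.5 billion"

# Peak aggregate data bandwidth per stave (Gbps):
print(sensors_inner_stave * 1.2)              # inner stave:  10.8
print(modules_middle_stave * 2 * 0.4)         # middle stave:  6.4 (2 masters/module at 400 Mbps)
print(modules_outer_stave * 2 * 0.4)          # outer stave:  11.2
```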

3 ITS upgrade readout system architecture

The ITS upgrade requires a new readout system designed to read the ITS data at rates of up to 100 kHz for Pb-Pb collisions and 400 kHz for pp collisions, including a safety factor of 2. The readout system connects to the sensor data, control, and clock lines on the detector side, receives trigger and control information from the Central Trigger Processor (CTP) of the ALICE trigger system, and delivers sensor data over (bi-directional) optical fibers to the Common Readout Unit (CRU) of the ALICE data acquisition system in the counting room. The readout system also needs to fulfil ancillary functions such as controlling and monitoring the power system in order to quickly detect and recover from latch-up states in the sensors. The current design for this readout system (shown in figure 3) foresees modular Readout Units (RUs), each connected to one stave, resulting in a total of 192 RUs for the ITS. The RUs will be located 5 m from the end of the staves in the experimental hall. This location is characterized by a radiation environment resulting in a total ionizing dose of < 10 krad and a high-energy hadron flux (capable of causing single-event upsets in the electronics) of 1 kHz/cm². The RUs thus need to be designed radiation tolerant while being able to handle the high-speed data links of the sensors. The design of the RUs consists of an FPGA to handle the data collection and formatting as well as the control of the sensors, and a radiation-hard fiber-optic transmission chipset, the Gigabit Transceiver Optical Link (GBT), developed by the GBT collaboration for the LHC experiment upgrades [4, 5]. The GBT chipset consists of the GBTx serializer/deserializer ASIC, the VTRx/VTTx optical transceiver/transmitter modules, and the GBT-SCA slow control ASIC.

Figure 3. Architecture of the ITS Readout System.

4 Readout Unit (RU) prototype and GBTxFMC

In order to evaluate various aspects of this design, including the radiation tolerance, a prototype RU has been designed consisting of a Kintex-7 FPGA based carrier board ("RUv0a") and an FMC (FPGA Mezzanine Card, [6]) that accommodates the GBT chipset ("GBTxFMC"). The separation into a carrier card and an FMC allowed for concurrent development by different collaborating groups and also provides the flexibility to use the GBTxFMC with other users and with carrier boards containing different FPGAs from various manufacturers. The RUv0a prototype board is described in more detail in [7].

Figure 4. Prototype RU consisting of the Kintex-7 based carrier board and the FMC accommodating the GBT chipset.

The overall architecture of the prototype system, illustrating the division into the two cards and the connections between them, is shown in figure 4. The GBTxFMC is a single-width FMC accommodating the GBTx and GBT-SCA chips, and a VTRx and VTXx module. The VTRx provides the bi-directional 4.8 Gbps line-rate electrical/optical conversion and is connected to the GBTx. The VTXx can be either a VTRx or a VTTx, whose gigabit data signals are connected directly to the FMC connector with the intention that they be driven by the FPGA transceivers. Figure 5 shows a picture of the GBTxFMC illustrating the various GBT components placed on this board. The VTRx laser diode power can be controlled from the GBTx via I²C. The VTXx laser diode power can be controlled by the SCA master I²C. Depending on jumper settings, these I²C buses can also be routed to the carrier board via the FMC connector. The SCA JTAG master interface is routed to the FMC interface with the intention to support remote (re)configuration and scrubbing of the carrier board FPGA. Scrubbing refers to an error correction technique that periodically inspects the configuration memory of the FPGA for errors caused by single-event upsets due to the radiation environment at the location of the RUs and then replaces the detected errors with a copy of the configuration data kept in an external memory. The SCA is connected to the GBTx by the EC e-links. To support applications where the SCA is located more remotely (e.g. on the carrier board), the EC e-link can be connected to the FMC connector by placing bypass resistors and removing the SCA. All GBTx e-links involved in the 320 Mbps transfer (including the ones used additionally in GBT wide-bus mode) are routed through the FMC-LPC (Low Pin Count) pins, providing the full GBTx bandwidth when using the GBTxFMC with an LPC carrier board. Within each e-link group, all links are length matched to within 25 ps. Between the groups, the length mismatch of the individual e-links is at most 170 ps.
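The statement that the LPC pins already carry the full GBTx bandwidth can be made concrete using the GBT frame structure described in [4, 5]: one 120-bit frame is transmitted every 25 ns (4.8 Gbps line rate), of which 80 bits are user data in the standard FEC-protected mode and 112 bits in wide-bus mode. The numbers in the sketch below are these standard GBT parameters, not values taken from this paper; it works out how many e-links at each rate are needed for the full payload, which is also the logic behind the 160 Mbps and 80 Mbps routing choices discussed next.

```python
# E-link rate vs. GBT payload, per the GBT frame structure of [4, 5]:
# a 120-bit frame every 25 ns, 80 user bits in standard (FEC) mode, 112 in wide-bus mode.
FRAME_PERIOD_NS = 25
USER_BITS = {"standard": 80, "wide-bus": 112}

for rate_mbps in (320, 160, 80):
    bits_per_frame = rate_mbps * FRAME_PERIOD_NS / 1000   # bits per e-link per frame
    for mode, payload in USER_BITS.items():
        print(f"{rate_mbps} Mbps, {mode}: "
              f"{bits_per_frame:.0f} bits/frame per e-link -> "
              f"{payload / bits_per_frame:.0f} e-links for the full payload")

# 320 Mbps: 8 bits/frame -> 10 e-links (standard) / 14 (wide-bus)  -- routed on the LPC pins
# 160 Mbps: 4 bits/frame -> 20 / 28                                -- extra links on the HPC pins
#  80 Mbps: 2 bits/frame -> 40 / 56; only half are routed, hence half the bandwidth
```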

The additional e-links needed to run all e-links at 160 Mbps are routed on the HPC (High Pin Count) connector pins. The additional e-links needed for operation at 80 Mbps are not routed in this design, so at 80 Mbps only half the GBT bandwidth is available. Wherever possible, the mapping of the e-links on the FMC is compatible with the CERN e-link FMC board [10]. All e-links go through the FMC A-bank, which is normally associated with the FPGA I/O banks powered by Vadj (= 2.5 V on the GBTxFMC). Most FPGAs (including the Kintex-7) support LVDS with an I/O bank voltage of 2.5 V. The Single-Ended (SE) I/Os go through the B-bank (which always implies an HPC FMC), normally associated with the VIO_B rail (= 1.5 V on the GBTxFMC). The single-ended signals are LVCMOS at 1.5 V. The GBTx SE signals are RESETB, RxDataValid, TxDataValid, and GBTxI2C. The SCA SE signals connected are SCRESET_B, JTAG, SCI2C, 2 I²C master lines, and 6 of the 32 GPIO. The GBTxFMC provides an 8-pin header that allows direct connection of the CERN GBT group's USB-I2C interface dongle [10] for configuration of the GBTx registers. The GBTxFMC follows the FMC standard wherever possible but lacks full compliance due to the absence of an I²C PROM, which is supposed to contain the FMC hardware definition information. Moreover, VIO_B is an output rail from the carrier in the FMC standard, whereas on the GBTxFMC it is used as a 1.5 V input rail. It must also be noted that the optional JTAG pins are operated in the reverse direction on this board.

Figure 5. Picture of the GBTxFMC bottom side showing the various GBT components.

5 GBT testing

The combination of the RU prototype cards RUv0a and GBTxFMC allowed testing both the connections between the GBTx and the FPGA as well as some of the interfaces provided by the GBT-SCA. One of the first aspects investigated with this system was the electrical connection between the GBTx and the FPGA. As shown in figure 4, the ten 320 Mbps e-links of the GBTx on the GBTxFMC and their associated e-link clocks are connected to the FPGA on the RUv0a both with and without level translators, to two FPGA I/O banks at different voltages. Bit-error tests on these two connection schemes showed that the level translators are not needed to interface with the Kintex-7 I/Os. The RUv0a board also includes an SFP+ optical transceiver module driven by FPGA transceiver lines to allow implementation of the GBT protocol in the FPGA ("GBT-FPGA", [8]), thus enabling the emulation of the counting-house electronics on the same board.
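The PC-side control loop for a bit-error test such as the one mentioned above can be sketched as follows. This is a minimal illustration only: the `UsbInterface` helper and the register addresses are hypothetical stand-ins, since the paper does not specify the actual USB register map of the RUv0a firmware; the 95% confidence-level upper limit for an error-free run uses the standard "rule of three".

```python
import time

class UsbInterface:
    """Placeholder for the USB link to the RUv0a firmware (register read/write)."""
    def write(self, addr: int, value: int) -> None:
        pass                                # a real implementation would issue a USB transfer
    def read(self, addr: int) -> int:
        return 0                            # a real implementation would return the register value

PATTERN_GEN_CTRL = 0x10   # hypothetical address: enable/select the firmware test pattern
ERROR_COUNTER    = 0x20   # hypothetical address: pattern-checker error counter
LINK_RATE_BPS    = 320e6  # one 320 Mbps e-link

def run_ber_test(usb: UsbInterface, duration_s: float) -> None:
    usb.write(PATTERN_GEN_CTRL, 1)          # start the firmware pattern generator
    start = time.time()
    while time.time() - start < duration_s:
        time.sleep(1.0)                     # counters were read out about once per second
        errors = usb.read(ERROR_COUNTER)
        print(f"t={time.time() - start:6.1f} s  errors={errors}")
    bits = LINK_RATE_BPS * duration_s
    # If no errors were seen, quote a 95% C.L. upper limit on the bit-error rate.
    print(f"BER upper limit (95% C.L., if error-free): {3.0 / bits:.2e}")

run_ber_test(UsbInterface(), duration_s=10.0)
```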

Figure 6. Block diagram of the GBT test setup with two RUv0a boards, one running the GBT-FPGA firmware, the other interfaced to the GBTx.

The GBT-FPGA firmware can either be part of the firmware that also controls the GBTx or run on a second RUv0a. Both the GBT-FPGA firmware and the GBTx control firmware are connected to a USB interface on the RUv0a, which allows control and monitoring of these firmware modules from a PC. An I²C firmware module allows one to configure and monitor the GBTx chip via USB. A block diagram of the GBT test setup with the firmware implemented in two different boards is shown in figure 6. Data for either the GBTx or the GBT-FPGA core are provided by a firmware pattern generator implementing various fixed and variable test patterns. The GBTx also includes an internal pattern generator that can be used as a data source as an alternative to the firmware pattern generator. Configuration of the GBTx furthermore allows the received data to be looped back as a source for the GBTx transmitter. The received data in either system is then checked by a firmware module; pattern errors are logged by counters in firmware that can be read out over USB to the controlling PC.

In addition to data transmission tests, this test setup was also used to test various interfaces of the GBT-SCA. On the RUv0a, the implemented GBT-SCA interfaces are the JTAG channel, which is connected to the FPGA configuration port; 6 of the 32 GPIO pins, which are connected to bi-directional I/Os on the FPGA; and I²C, connected to FPGA I/Os controlled by a firmware I²C slave module. A firmware interface to the GBT-FPGA core was developed to control the various GBT-SCA interface channels through the GBT-FPGA via the USB connection to the PC. Another firmware module interfaces the GPIO bits in the FPGA to the USB for both control and monitoring. The GPIO bits could thus either be set via the GBT-FPGA core communicating with the SCA chip through the GBTx and then verified at the FPGA, or vice versa. The JTAG channel is connected to the configuration port of the FPGA with the intention to load FPGA firmware over the GBT link and perform partial reconfiguration. To verify proper functioning of the JTAG channel, we performed a JTAG_ID transaction from the PC to the FPGA, using the GBT-SCA as the JTAG master. The I²C channel was verified using a simple I²C slave firmware in the FPGA that responds to read and write transactions programmed via the USB interface to the GBT-SCA.
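The two-direction GPIO check described above can be summarized in a short sketch. The four helper functions are hypothetical stand-ins for the actual firmware/USB accesses, which the paper does not detail; the only assumption taken from the text is that 6 of the 32 GBT-SCA GPIO pins are wired to the FPGA.

```python
# Sketch of the GPIO loop-back check: drive from one side, read back on the other.
WIRED_GPIO_BITS = range(6)   # 6 of the 32 GBT-SCA GPIO pins are connected on the RUv0a

def sca_gpio_write(bit: int, value: int) -> None:
    """Drive one GBT-SCA GPIO pin via GBT-FPGA core -> GBTx -> EC e-link -> SCA (stub)."""
    ...

def sca_gpio_read(bit: int) -> int:
    """Read back one GBT-SCA GPIO pin through the same path (stub)."""
    ...

def fpga_gpio_write(bit: int, value: int) -> None:
    """Drive the corresponding FPGA I/O through the USB register interface (stub)."""
    ...

def fpga_gpio_read(bit: int) -> int:
    """Read the corresponding FPGA I/O through the USB register interface (stub)."""
    ...

def gpio_loopback_test() -> bool:
    ok = True
    for bit in WIRED_GPIO_BITS:
        for value in (0, 1):
            sca_gpio_write(bit, value)              # SCA drives, FPGA observes
            ok &= (fpga_gpio_read(bit) == value)
            fpga_gpio_write(bit, value)             # FPGA drives, SCA observes
            ok &= (sca_gpio_read(bit) == value)
    return ok
```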

Figure 7. Plots of the number of FEC and SEU corrections vs. time. In the top plot, the GBTx is running in "transceiver" mode; in the bottom plot, the GBTx is running in "TX only" mode.

The configuration shown in figure 6 was also used to test the behavior of the GBTx chip under irradiation. In this case, the board with the GBT-FPGA core was placed in an area shielded from the radiation, while the board with the GBTx was mounted behind an aluminum plate with a hole that exposed only the GBTx chip to the beam, shielding the FPGA and the other components on the board. The irradiation experiments were conducted at the cyclotron of the Nuclear Physics Institute of the Academy of Sciences of the Czech Republic in Řež near Prague. The machine provides a proton beam with an energy range from 6 to 37 MeV and equivalent proton fluxes from 10⁴ to 10¹⁴ cm⁻² s⁻¹, over a uniform area of about 2.5 × 2.5 cm² [9]. Measurements were performed with a flux of 1 × 10⁸ cm⁻² s⁻¹ at a proton energy of 36 MeV at the cyclotron's pipe output, which results in an energy of 32 MeV incident on the metal case of the GBTx chip. The error counters of both pattern checkers were read out about once per second and logged on the control PC. The USB interface also allowed the readout of various GBTx internal monitoring registers, including the number of Forward Error Corrections (FEC) performed by the GBTx ASIC on the received data and the number of SEU corrections in the GBTx configuration registers. The plots in figure 7 show two representative examples of the number of FEC and SEU corrections for the GBTx vs. time for two different running modes of the GBTx (note that the GBTx monitoring registers are not resettable and are only 8 bits wide, so the counts start at some arbitrary value and wrap at 256).
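Because these monitoring registers are free-running 8-bit counters, the cumulative number of corrections plotted in figure 7 has to be reconstructed by unwrapping consecutive raw readings modulo 256. A minimal sketch of that unwrapping (Python, with made-up example values) is shown below; it assumes the roughly 1 Hz readout is fast enough that fewer than 256 corrections occur between two readings.

```python
# The GBTx FEC and SEU counters are 8-bit, free-running and not resettable,
# so consecutive readings must be unwrapped modulo 256.
def accumulate(readings):
    """Turn a sequence of raw 8-bit counter readings into cumulative counts."""
    total = 0
    previous = None
    for raw in readings:
        if previous is not None:
            total += (raw - previous) % 256   # handles the wrap at 256
        previous = raw
        yield total

# Example: a counter that starts at an arbitrary value (203) and wraps past 255.
print(list(accumulate([203, 203, 251, 7, 7, 30])))   # [0, 0, 48, 60, 60, 83]
```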

In the top plot, the GBTx is running in transceiver mode and the plotted registers are read from the irradiated GBTx; the test patterns are generated on the board outside the radiation area. In the bottom plot, the GBTx in the beam is running in transmitter mode; the test patterns transmitted in this case are generated by the internal pattern generator of the irradiated GBTx. A second GBTx outside the radiation area receives the test patterns, and the number of FEC corrections is read from this GBTx, while the SEU corrections are read from the irradiated GBTx. The pattern checkers registered no bit errors in either case, thus verifying the effectiveness of the FEC performed in the GBTx logic. Comparing the number of FEC corrections per unit time in the two plots shows that the receiver section of the GBTx is much more sensitive to radiation than the transmitter section, while the number of SEU corrections per unit time is about the same for the two configurations of the GBTx.

Acknowledgments

This work was supported in part by the U.S. Department of Energy, Office of Nuclear Science, under Contract Numbers DE-AC02-05CH11231 and DE-SC0013391. Measurements at the Nuclear Physics Institute of the Academy of Sciences of the Czech Republic in Řež (CANAM infrastructure) were supported by MŠMT project No. LM2015056.

References

[1] ALICE collaboration, Design report for the upgrade of the ALICE Inner Tracking System, J. Phys. G 41 (2014) 087002 [CERN-LHCC-2013-024] [ALICE-TDR-017].
[2] G. Aglieri Rinella, The ALPIDE pixel sensor chip for the upgrade of the ALICE Inner Tracking System, Nucl. Instrum. Meth. A, in press.
[3] M. Mager, ALPIDE, the monolithic active pixel sensor for the ALICE ITS upgrade, Nucl. Instrum. Meth. A 824 (2016) 434.
[4] P. Moreira et al., The GBT, a proposed architecture for multi-Gb/s data transmission in high energy physics, talk given at the Topical Workshop on Electronics for Particle Physics (TWEPP 2007), September 3–7, Prague, Czech Republic (2007).
[5] P. Moreira et al., The GBT project, talk given at the Topical Workshop on Electronics for Particle Physics (TWEPP 2009), September 21–25, Paris, France (2009).
[6] FMC, FPGA Mezzanine Card, ANSI/VITA 57.1 standard, http://www.vita.com/fmc.
[7] K.M. Sielewicz et al., Prototype readout electronics for the upgraded ALICE Inner Tracking System, talk given at the Topical Workshop on Electronics for Particle Physics (TWEPP 2016), September 26–30, Karlsruhe, Germany (2016).
[8] M. Barros Marin et al., The GBT-FPGA core: features and challenges, 2015 JINST 10 C03021.
[9] Nuclear Physics Institute, ASCR, Department of Accelerators, http://accs.ujf.cas.cz/.
[10] S. Baron, Versatile Link Demo Board (VLDB), Specifications 1.01 (2015).