The Compact Muon Solenoid Experiment. CMS Note. Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland


Available on CMS information server CMS NOTE 2002/xxx

The Compact Muon Solenoid Experiment
CMS Note
Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland

Draft of 1 January 2002

Link System and Crate Layout of the RPC Pattern Comparator Trigger

K. Banzuzi, H. Katajisto, E. Tuominen, D. Ungaro
Helsinki Institute of Physics, Helsinki, Finland
J. Królikowski a), M.I. Kudla a)
Warsaw University, Warsaw, Poland
K. Pozniak a)
Warsaw University of Technology, Warsaw, Poland
M. Górski b), G. Wrochna b), P. Zalewski b)
Soltan Institute for Nuclear Studies, Warsaw, Poland

Abstract
The baseline crate and link layout is described in CMS IN 2000/044 and in the Trigger TDR. In the present paper an attempt is made to improve on the baseline design by minimizing the number of crates as well as the number and the lengths of interconnections.

a. Partially supported by the Polish Committee for Scientific Research under grant KBN 115/E-343/SPUB/P-03/DZ 8/99.
b. Partially supported by the Polish Committee for Scientific Research under grant KBN 621/E-78/SPUB/P-03/DZ 5/99.

1 RPC geometry

1.1. Coordinate systems
The absolute CMS coordinate system is defined with respect to the LHC ring. The X axis points towards the centre of the ring, the Y axis points up and the Z axis is defined so as to form a right-handed coordinate system. The absolute azimuth φ runs from the X axis (at 0°) towards the Y axis.

1.2. Barrel
The twelve-fold symmetry of the CMS magnet yoke naturally leads to the partitioning of the barrel RPC system into 12 sectors. We call them physical sectors, as opposed to the logical trigger sectors discussed in Sec. 2.3. The sectors are centred at n·30°. They are numbered from 1 to 12 anticlockwise (the same direction as φ), starting from φ = 0°. Chambers in each sector are perpendicular to the radial direction at the sector centre.
There are four muon stations, which in the case of RPC are denoted R1, R2, R3 and R4. Stations R3 and R4 contain one (double gap) RPC layer each. Stations R1 and R2 contain two RPC layers. In this paper we distinguish them by the subscripts "in" and "out" for the inner and outer layer of each station respectively. Each RPC in R3 and R4 (except R4/4 and R4/10) is divided in φ into two parts.
The barrel consists of 5 wheels. RPC strips in the reference plane of the trigger are divided in 3 inside each wheel. This is required to ensure proper trigger segmentation in η (see Sec. 2.2). As the reference plane we have chosen the outer R2 layer in wheels -2, +2 and the inner R2 layer in wheels -1, 0, +1 (see Fig. 1). All other RPCs are divided in 2 inside each wheel.

Figure 1: R2 layout. The reference plane is divided in 3 in each wheel.

1.3. Endcap
There are four endcap muon stations, which in the case of RPC are denoted RE1, RE2, RE3 and RE4. It is foreseen to upgrade RE2 with a second plane, called in this paper RE5. Each station contains one (double gap) RPC layer. Each station is divided in R into 3 chambers denoted RE*/1, RE*/2 and RE*/3.
Chambers RE2/1, RE3/1 and RE4/1 cover 20° in φ, whereas all other RE chambers cover 10°. Each chamber is equipped with strips further divided in η in 2, 3 or 4, as shown in Fig. 2. In this paper these divisions are denoted by the letters a, b, c and d.

Figure 2: Endcap RPC naming convention.
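The sector conventions above lend themselves to a small worked example. The following sketch (Python; the rounding convention exactly at sector boundaries is our assumption, not specified in the text) maps an absolute azimuth to the physical barrel sector number:

```python
def physical_sector(phi_deg: float) -> int:
    """Physical barrel sector (1..12) for an absolute azimuth in degrees.

    Assumes sector n is centred at (n - 1) * 30 deg and sectors are
    numbered anticlockwise (the direction of increasing phi) from phi = 0.
    """
    # index of the nearest 30-degree sector centre, wrapped to 0..11
    centre = round((phi_deg % 360) / 30) % 12
    return centre + 1

assert physical_sector(0) == 1      # sector 1 is centred on the X axis
assert physical_sector(30) == 2
assert physical_sector(350) == 1    # within 15 deg of phi = 0
```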

2 Trigger segmentation

2.1. Requirements
Segmentation of the trigger is a result of the RPC segmentation into strips. It has to obey the following requirements:
- coverage up to |η| = 2.1
- 4 RPC planes to be crossed by each track (as far as possible)
- strips of Δφ = 5/16° and Δη ≈ 0.1, rectangular in the barrel and trapezoidal in the endcaps
- trigger towers defined by R2 in in wheels -1, 0, +1, by R2 out in wheels -2, +2 (Fig. 1) and by RE2 in the endcap (Fig. 3a).
In addition there are specific requirements for the endcap:
- each strip connected to 1-2 trigger η-towers (Fig. 3b)
- each trigger tower connected to 1-2 strips in η (Fig. 3c)
- continuous active area in φ (overlap of frames, Fig. 3d).

Figure 3: Requirements on the RPC geometry and trigger segmentation in the endcap.

2.2. Trigger η-towers
Trigger segmentation in η which fulfils the above requirements is shown in Fig. 4. The logical plane numbers in each tower are denoted close to the RPCs. Note that in most of the towers the logical and physical plane numbers are the same. Only in towers 6, 7 and 8 is the logical plane assignment different from the physical one.

Figure 4: Geometry of RPC strips and trigger towers.
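As a quick consistency check of the granularity requirement (plain arithmetic only), a strip pitch of 5/16° gives exactly 96 strips per 30° sector, which matches the 96 channels served by one Link Board in Sec. 3.1:

```python
from fractions import Fraction

strip_pitch = Fraction(5, 16)      # strip width in phi, degrees
sector = Fraction(30)              # one trigger sector in phi, degrees

strips_per_sector = sector / strip_pitch
assert strips_per_sector == 96     # one Link Board = 96 channels
```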

2.3. Trigger φ-sectors
Trigger logic is naturally divided into 12 sectors, 30° each. One sector of one η-tower is treated by one Trigger Board. The relation of these logical sectors to the physical chambers has been optimized taking into account physics performance, technical feasibility and overall cost of the system (Fig. 5).

Figure 5: Layout and numbering of detector (physical) and trigger (logical) 30° sectors in φ.

Each Pattern Comparator (PAC) chip uses 8 strips in the reference plane. Because of the track bending the number of strips from other planes connected to a PAC is larger. This means that some strips are shared by neighbouring PACs. Hence, at sector boundaries some strips are shared by two sectors. In other words, each Trigger Board, housing 12 PAC chips, must receive data from 30° plus some overlap. The exact values of the overlap needed in the barrel are given in Table 1. In some sectors of the barrel the number of strips connected to a PAC is slightly smaller than the one given in Table 1 because of dead areas due to the mechanical construction of the detector. The overlaps in the endcap are equal to the low-p_T ones in the barrel. In any case the margin required in the endcap is not larger than ±10°.
It was found that the optimal layout of logical trigger sectors is with centres at φ = 20° + n·30°, i.e. the first sector ranges from 5° to 35°. This means that the logical trigger sectors are shifted with respect to the physical sectors (see Fig. 5) by 20°. This way each trigger sector receives signals from two physical sectors and each physical sector sends signals to two trigger sectors.

Table 1: Overlap of logical trigger sectors in the barrel. There are 12 PACs per 30° sector.
logical plane | physical chamber | PAC input (in strips) | low-p_T sector opening | strips/sector | physical chamber | PAC input (in strips) | high-p_T sector opening | strips/sector
      1       | R1 in            |                       |                        |               | R1 in            |                       |                         |
      2       | R2 ref           |                       |                        |               | R2 ref           |                       |                         |
      3       | R2 nref          |                       |                        |               | R3               |                       |                         |
      4       | R1 out           |                       |                        |               | R4               |                       |                         |
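The shifted logical segmentation can be made concrete with a short sketch (Python; the wrap-around handling is our assumption). Each logical sector spans one physical sector boundary, so it is fed by exactly two physical sectors:

```python
def logical_sector(phi_deg: float) -> int:
    """Logical (trigger) sector 1..12; sector n is centred at
    20 + (n - 1) * 30 deg, so sector 1 spans 5..35 deg."""
    return int(((phi_deg - 5) % 360) // 30) + 1

def feeding_physical_sectors(n: int) -> tuple:
    """The two physical sectors feeding logical sector n (each logical
    sector is shifted by 20 deg and straddles a physical boundary)."""
    return (n, n % 12 + 1)

assert logical_sector(20) == 1               # centre of logical sector 1
assert logical_sector(50) == 2
assert feeding_physical_sectors(1) == (1, 2)
assert feeding_physical_sectors(12) == (12, 1)
```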

Trigger sectors centred at φ = 20° + n·30° are also very well matched to the RPC geometry in the endcaps, as can be seen from Fig. 6. In the endcaps, chambers RE2/1, RE3/1 and RE4/1 cover Δφ = 20°. All others cover Δφ = 10°. They are connected to optical links in modules of 30°. The trigger sectors are marked with 10° margins. One can see that it is enough to connect one link to each trigger sector in RE2 and two links to each trigger sector in the other stations.

Figure 6: Logical trigger sectors and physical endcap chambers.

3 Link System layout
The Link System layout was presented in [2] and [5]. Since then several modifications have been proposed for chambers RE*/1. Below we describe the current baseline.

3.1. Link Boards
RPC signals from Front End Boards (FEB) are sent through LVDS twisted pair cables to Link Boards (LB). Each Link Board receives data from up to 6 FEBs = 96 channels. A Slave Link Board (SLB) compresses the data and sends them through an LVDS twisted pair cable to a Master Link Board (MLB). Each Master Link Board is connected to no more than 2 Slave Link Boards. The MLB compresses its own data, merges them with the SLB data and transmits them to the Counting Room through an optical fiber. In the Counting Room the data from each fiber are converted again to LVDS and split between several Trigger Boards (TB). Decompression on a TB is performed by a Link Demultiplexer (LDMUX).
Special Large Link Boards (LLB) are foreseen for RE1/1. Each LLB receives data from 8 FEBs = 128 channels. There are two optical fibers coming out of each group of three LLBs (compare Fig. 8 and Fig. 9).
In both barrel and endcaps, each link covers 30° in φ. Each link is sent to two trigger sectors (in φ), except for RE2 (the reference plane in the endcap), whose data go to only one sector (see Fig. 6). Each TB needs data from two physical sectors, except for RE2, which needs data from one sector only.
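The channel bookkeeping of the Link Board tree can be summarized in a few lines (a sketch; the 16 channels per FEB are implied by 6 FEBs = 96 channels):

```python
CHANNELS_PER_FEB = 16            # 6 FEBs = 96 channels per Link Board
FEBS_PER_LB = 6
CHANNELS_PER_LB = CHANNELS_PER_FEB * FEBS_PER_LB
MAX_SLB_PER_MLB = 2              # an MLB merges its data with <= 2 SLBs

def channels_per_fibre(n_slaves: int) -> int:
    """Channels carried on one optical fibre: the MLB's own 96 channels
    plus those of its 0, 1 or 2 Slave Link Boards."""
    if not 0 <= n_slaves <= MAX_SLB_PER_MLB:
        raise ValueError("an MLB serves at most 2 SLBs")
    return (1 + n_slaves) * CHANNELS_PER_LB

assert CHANNELS_PER_LB == 96
assert channels_per_fibre(2) == 288     # fully loaded fibre
```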

3.2. Barrel links
RPC strips from one physical 30° sector of one barrel wheel are connected to 5 optical fibres, denoted A, B, C, D, E respectively, as shown in Fig. 7. The total number of barrel links is 5 links × 12 sectors × 5 wheels = 300.

Figure 7: Barrel RPC links in one physical 30° sector of one wheel.

Links A, B and C require 1 MLB + 2 SLB each. Links D and E require 1 MLB + 1 SLB each, except for sectors 4 (CMS top) and 10 (CMS bottom), which require 1 MLB + 2 SLB. This is because R4/4 consists of 4 chambers, each having 3 FEBs (12 FEBs in total), and R4/10 consists of 2 chambers, each having 5 FEBs (10 FEBs in total). All other R4 chambers have at most 6 FEBs, which can be handled by a single LB. Thus we have:

SLB / sector / wheel = 3×2 + 2×2 = 10 (sectors 4 and 10), or 3×2 + 2×1 = 8 (other 10 sectors)
SLB / wheel = 2 sectors × 10 + 10 sectors × 8 = 100
Total barrel SLB = 5 wheels × 100 = 500
Total barrel MLB = 5 wheels × 12 sectors × 5 MLB = 300.

The Link Boards in each sector are housed in Link Boxes. In each box in sectors 4 and 10 there are 5 MLB + 10 SLB = 15 LB in total. In other sectors there are 5 MLB + 8 SLB = 13 LB in each box. The inventory of the Link System in the barrel is given in Table 2.

Table 2: Inventory of the Link System in the barrel

                                         | Link Boxes | MLB | SLB | MLB+SLB
sectors 4, 10                            |     2      | 10  | 20  |   30
sectors 1, 2, 3, 5, 6, 7, 8, 9, 11, 12   |    10      | 50  | 80  |  130
per wheel                                |    12      | 60  | 100 |  160
barrel                                   |    60      | 300 | 500 |  800
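The barrel counting above can be reproduced directly (a sketch, with all inputs taken from the text):

```python
WHEELS, SECTORS = 5, 12
LINKS = 5                              # links A..E per sector per wheel

# links A, B, C: 2 SLBs each; links D, E: 1 SLB,
# except 2 SLBs in sectors 4 and 10
slb_per_sector = {s: 3 * 2 + 2 * (2 if s in (4, 10) else 1)
                  for s in range(1, SECTORS + 1)}

slb_per_wheel = sum(slb_per_sector.values())
assert slb_per_wheel == 100

assert WHEELS * slb_per_wheel == 500           # total barrel SLB
assert WHEELS * SECTORS * LINKS == 300         # total barrel MLB = fibres
```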

3.3. Endcap links
RE chambers are connected to optical links in the way shown in Fig. 8.

Figure 8: Endcap RPC Link Boards. The absolute φ position should be read from Fig. 6. (MLB = Master Link Board, 96 strips; SLB = Slave Link Board, 96 strips; 0-2 SLB + 1 MLB → 1 fibre; LMLB = Large MLB, 128 strips; LSLB = Large SLB, 96 strips; 2 LSLB + 1 LMLB → 2 fibres.)

Figure 9: Endcap RPC links.

The Special Large Link Boards (LLB) for RE1/1 are placed in the corner between the barrel and the endcap hadron calorimeter. They are housed in 6 Link Boxes per endcap, distributed every 60° in φ. There are 6 LLBs per box. Two of them are Masters (LMLB), each sending data to 2 optical fibers. Each LMLB receives data from 2 LSLBs.
The RE1/2,3 and RE2 chambers are mounted on the opposite surfaces of the same iron disk YE1. Therefore they share common Link Boxes. There are 12 such boxes in each endcap, each box serving 30° and housing 6 MLB + 10 SLB = 16 Link Boards.
The RE3 chambers, like the RE4 ones, are connected to 6 Link Boxes in each endcap, each box serving 60° and housing 6 MLB + 10 SLB = 16 Link Boards.

If the RE5 chambers are installed, they should be connected to 12 Link Boxes in each endcap, each box serving 30° and housing 4 MLB + 6 SLB = 10 Link Boards.
The number of endcap Link Boards is given in Table 3 and Table 4.

Table 3: Inventory of Large Link Boards for RE1/1.

        |          per box            |          per 2 endcaps
        | φ   | LMLB | LSLB | LMLB+LSLB | boxes | LMLB | LSLB | LMLB+LSLB
RE1/1   | 60° |  2   |  4   |     6     |  12   |  24  |  48  |    72

Table 4: Inventory of Link Boards in the endcap.

                  |        per box        |       per 2 endcaps
                  | φ   | MLB | SLB | MLB+SLB | boxes | MLB | SLB | MLB+SLB
RE1/2,3+RE2       | 30° |  6  | 10  |   16    |  24   | 144 | 240 |  384
RE3               | 60° |  6  | 10  |   16    |  12   |  72 | 120 |  192
RE4               | 60° |  6  | 10  |   16    |  12   |  72 | 120 |  192
(RE5)             | 30° |  4  |  6  |   10    |  24   |  96 | 144 |  240
total without RE5 |     |     |     |         |  48   | 288 | 480 |  768
total with RE5    |     |     |     |         |  72   | 384 | 624 | 1008

The overall inventory of the Link System is given in Table 5. The numbers of boards and links without RE5 are given together with those including RE5 (in parentheses).

Table 5: Link System overall inventory (in parentheses - with RE5).

        |    MLB    | LMLB |    SLB     | LSLB |   fibers
barrel  |    300    |  -   |    500     |  -   |    300
endcap  | 288 (384) |  24  | 480 (624)  |  48  | 336 (432)
total   | 588 (684) |  24  | 980 (1124) |  48  | 636 (732)
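The endcap inventory can be cross-checked in the same way (a sketch reproducing the bookkeeping of Tables 3-5; the fibre count per 2 endcaps, one fibre per MLB plus the Large-Link-Board fibres, is derived rather than quoted):

```python
# station -> (Link Boxes per 2 endcaps, MLB per box, SLB per box)
inventory = {
    "RE1/2,3+RE2": (24, 6, 10),    # 12 boxes per endcap, 30 deg each
    "RE3":         (12, 6, 10),    # 6 boxes per endcap, 60 deg each
    "RE4":         (12, 6, 10),
}
re5 = {"(RE5)": (24, 4, 6)}        # only if RE5 is installed

def totals(inv):
    mlb = sum(boxes * m for boxes, m, s in inv.values())
    slb = sum(boxes * s for boxes, m, s in inv.values())
    return mlb, slb

assert totals(inventory) == (288, 480)              # without RE5
assert totals({**inventory, **re5}) == (384, 624)   # with RE5

# RE1/1 Large Link Boards: 6 boxes per endcap, 2 LMLB per box,
# each LMLB driving 2 fibres
lmlb = 2 * 6 * 2                   # per 2 endcaps
llb_fibres = 2 * lmlb
assert llb_fibres == 48
assert 288 + llb_fibres == 336     # endcap fibres without RE5 (432 with)
```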

3.4. Naming convention
A new naming convention is proposed which is more suitable for a VHDL language description. It uses only digits and letters and avoids signs like +, - or /. The barrel links are named swc, where
s - detector side: N - negative, M - middle, P - positive
w - wheel number: 0, 1, 2
c - link: A, B, C, D, E
The endcap links are named Esdrp, where
s - detector side: N - negative, P - positive
d - disk number: 1, 2, 3, 4, (5)
r - RPC ring number: 1, 2, 3 (in this convention the chambers are named REd/r, e.g. RE1/1)
p - strips radial id: ab, cd (see Fig. 2).

Figure 10: New naming convention for the RPC links.
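The convention is easy to mechanize, which is the point of making it VHDL-friendly. A small sketch (the helper names are ours) generating the link names:

```python
def barrel_link(side: str, wheel: int, link: str) -> str:
    """Barrel link name 'swc': side N/M/P, wheel 0-2, link A-E."""
    assert side in "NMP" and wheel in (0, 1, 2) and link in "ABCDE"
    return f"{side}{wheel}{link}"

def endcap_link(side: str, disk: int, ring: int, radial: str = "") -> str:
    """Endcap link name 'Esdrp': side N/P, disk 1-5, ring 1-3,
    optional radial strip id 'ab' or 'cd'."""
    assert side in "NP" and disk in range(1, 6) and ring in (1, 2, 3)
    assert radial in ("", "ab", "cd")
    return f"E{side}{disk}{ring}{radial}"

assert barrel_link("P", 1, "D") == "P1D"
assert endcap_link("N", 2, 1, "cd") == "EN21cd"
assert endcap_link("P", 3, 2) == "EP32"
```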

4 Layout of trigger crates

4.1. TDR version - φ-rings
A single Trigger Board (TB) houses 12 PAC processors and covers Δη × Δφ = 1 η-tower × 30°. A single Trigger Crate (TC) houses 12 Trigger Boards of the same η-tower and thus covers a full ring of Δη × Δφ = 1 η-tower × 360°. The number of crates is equal to 33 (the number of η-towers).
The major drawback of this solution is the difficult connection of optical links to the Trigger Boards. Splitter Boards receive data from the optical fibers and distribute them to different Trigger Boards. One link has to be split to up to 2 sectors in φ and up to 4 towers in η (the worst cases are the barrel links A1, B1, A2), i.e. up to 2 × 4 = 8 Trigger Boards located in 4 different crates.

4.2. New layout - η-wedges
In order to overcome the drawbacks mentioned in the previous section a new layout of Trigger Crates was proposed. Each Trigger Crate now houses the Trigger Boards of the same 30° sector in φ. Technology advances make it possible for the Trigger Board to house 24 PACs instead of 12 and to cover a twice larger area: Δη × Δφ = 2 η-towers × 30°. In such a case the full wedge of Δη × Δφ = 33 η-towers × 30° is covered by a single Trigger Crate housing 17 TBs.
Trigger (logical) wedges are shifted by 20° with respect to the physical ones, and each Trigger Crate needs to receive data from two physical wedges. It is possible to arrange the Splitter Boards into 12 crates, each covering one 30° physical wedge. The system would consist of 6 racks, each housing two Trigger Crates and one or two Splitter Crates (SC), as shown in Fig. 11. The signals from each Splitter Crate should be distributed to the Trigger Crates in the same rack and to the Trigger Crates in one neighbouring rack. In order to close the φ-ring without very long connections one has to install the racks in the following order: 1, 12, 2, 11, 3, 10, 4, 9, 5, 8, 6, 7. Then the longest SC-TC connection would have a length of 2 rack spacings (see Fig. 11). The number of copper connections could be largely reduced if each link is optically split into two.
In such a case all copper connections stay within a single rack. One Splitter Crate would receive 53 (61) links from a given detector wedge (Table 6), and 45 from the neighbour. In the table, ×5 stands for the 5 wheels, whereas ×2 stands for the 2 endcaps.

Table 6: Links per trigger wedge (Δη × Δφ = 33 η-towers × 30°).

links                   |  barrel  |   RE1   |   RE2   |   RE3   |   RE4   |   (RE5)   |  total
from detector           | 5×5 = 25 | 4×2 = 8 | 4×2 = 8 | 3×2 = 6 | 3×2 = 6 | (4×2 = 8) | 53 (61)
shared with φ-neighbour |    25    |    8    |    -    |    6    |    6    |     -     |   45
total                   |    50    |   16    |    8    |   12    |   12    |    (8)    | 98 (106)

Connections between the links and the Trigger Boards are listed in Table 7. It can be seen that the maximum number of links received by a Trigger Board is 14. Table 7 can be considered as a map of the Trigger Crate backplane.

4.3. Combined Trigger/Splitter Crate options
Copper connections between the crates could be completely eliminated if the same crate could house both Trigger Boards and Splitter Boards. This could be achieved in three different ways.
The Splitter Boards and Trigger Boards can occupy alternate positions in a Trigger Crate. The SC-TC connection would be provided by the backplane. The main drawback of this possibility is that there are not enough slots in one crate, and one should either have 24 Trigger Crates with double Trigger Boards (24 PACs) or 12 Trigger Crates with Trigger Boards having 48 PACs each (a rather unfeasible option).
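The link totals quoted for one wedge can be cross-checked as follows (a sketch; the per-station split, e.g. 5 links × 5 wheels = 25 for the barrel, is inferred from the ×5/×2 factors and the stated totals, and only the RE2 reference-plane links, plus RE5 if present, are not shared with the φ-neighbour):

```python
# links entering one trigger wedge (33 eta-towers x 30 deg), per station
from_detector = {"barrel": 5 * 5,   # 5 links x 5 wheels
                 "RE1": 4 * 2,      # 4 links x 2 endcaps
                 "RE2": 4 * 2,
                 "RE3": 3 * 2,
                 "RE4": 3 * 2}
re5 = 4 * 2                          # only if RE5 is installed

own = sum(from_detector.values())
assert own == 53                                  # 61 with RE5

# every link except the RE2 (reference-plane) ones is also sent
# to the neighbouring wedge
shared = own - from_detector["RE2"]
assert shared == 45

assert own + shared == 98                         # per Splitter Crate
assert own + re5 + shared == 106                  # with RE5
```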

Figure 11: Layout of racks for the 12 Trigger Crates option.

Another possibility is to combine the functionality of the Trigger and Splitter Boards. This would make the Trigger Board more complicated, but the backplane traffic would be reduced.
Yet another possibility is to replace the backplane by a middle-plane, with Trigger Boards inserted from the front and Splitter Boards connected to the rear of the crate. This option seems to be the most attractive and we consider it in greater detail in the following sections. Combined Trigger/Splitter Boards may require larger (12 U?) crates. Assuming 2 such crates per rack, one can arrange the 12 Trigger Crates in 6 racks as shown in Fig. 11, understanding that each pair of splitter and trigger boxes corresponds to a single crate.

4.4. Switch option
The major challenge of the previous option is the complicated backplane. It could be simplified if the Splitter Board could partially unpack the data and direct them in separate streams to each Trigger Board. This would require rather complicated buffer handling, which may result in increased latency and limited bandwidth. For that reason this option does not look very realistic.

4.5. Data distribution through the middle-plane

4.5.1. Connectors
The new design of the Link Board uses 1.6 Gbit/s links. This bandwidth is equivalent to 40 bits transmitted at 40 MHz. Due to the DC-balanced coding only 8/10 of the bits can be used for data, hence 32 bits of data can be transmitted every LHC clock cycle. In order to reduce the number of interconnections one can transmit the data from the Splitter Boards to the Trigger Boards sending 8-bit words on LVDS lines at 160 MHz. In addition, one needs to send an 80 MHz clock recovered from the link and possibly a Data Marker indicating the end of a data frame.
Thus, calculating the size of the connectors, one has to count 20 pins (2 per bit) for each link. Each Trigger Board receives data from up to 14 links. This corresponds to 280 pins. For example, one can use for this purpose two SCSI connectors, each having 5 × 32 = 160 pins.
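The bandwidth and pin arithmetic above can be summarized in a few lines (the 10 signals per link, i.e. 8 data lines plus clock plus marker, follow from the text; the pin totals are derived):

```python
LINK_RATE = 1.6e9                 # bit/s per optical link
LHC_CLOCK = 40e6                  # Hz

assert LINK_RATE / LHC_CLOCK == 40          # raw bits per bunch crossing
assert 40 * 8 // 10 == 32                   # after 8b/10b DC-balanced coding

# Splitter -> Trigger Board: 8-bit words at 160 MHz over LVDS,
# plus a recovered 80 MHz clock and a Data Marker line
signals_per_link = 8 + 1 + 1
pins_per_link = 2 * signals_per_link        # 2 pins per LVDS pair
assert pins_per_link == 20

assert 8 * 160e6 == 32 * LHC_CLOCK          # LVDS lines match link payload
assert 14 * pins_per_link == 280            # <= two 160-pin SCSI connectors
```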

Table 7: Connection of links to Trigger Boards. Trigger planes from the neighbour wedge are marked with *. Trigger Board 0 covers tower 0, Trigger Board 2 covers towers 2 and 1, etc. Links connected to the DAQ at a given TB are written in bold, whereas the links received from the Splitter Board at a given location have a grey background. The special cases of links *2A, *2B and EP32 are described in the text.

TRIGGER BOARDS
S1 (logical planes 1, 5, 6):
  EN11ab N2B N2B N1B N1B N0B N0B P1A P1A P2A P2A EP12 EP12 EP11ab
  EN11cd EN11cd EN12 EN12 N2A N2A N1A N1A P0B P0B P1B P1B P2B P2B EP11cd EP11cd
S2 (logical planes 3, 4):
  EN31 EN31 EN43 EN43 EN13 N2D N2D N0E N0E P1D P2D P2D EP13 EP43 EP43 EP31 EP31
  EN42 EN42 EN42 N2E N2E N1D P0E P0E P2E P2E EP42 EP42 EP42
  EN41 EN32 EN33 EN33 N2B N2A N1E N1E P1E P1E P2A P2B EP33 EP33 EP32 EP41
S3 (reference plane 2):
  EN32 EN23 EN23 M0C M0C M0C EP23 EP23 EP32
  EN21ab EN21cd EN22 EN22 N2C N2C N1C N1C P1C P1C P2C P2C EP22 EP22 EP21cd EP21ab
S4 (r. plane 2* & l. plane 5):
  EN51ab EN51cd EN52 EN52 N2C N2C N1C N1C P1C P1C P2C P2C EP52 EP52 EP51cd EP51ab
  EN32 EN53 EN53 M0C M0C M0C EP53 EP53 EP32
S5 (logical planes 3*, 4*):
  EN41 EN32 EN33 EN33 N2B N2A N1E N1E P1E P1E P2A P2B EP33 EP33 EP32 EP41
  EN42 EN42 EN42 N2E N2E N1D M0E M0E P2E P2E EP42 EP42 EP42
  EN31 EN31 EN43 EN43 EN13 N2D N2D M0D M0D P1D P2D P2D EP13 EP43 EP43 EP31 EP31
S6 (logical planes 1*, 5*, 6*):
  EN11cd EN11cd EN12 EN12 N2A N2A N1A N1A P0B P0B P1B P1B P2B P2B EP11cd EP11cd
  EN11ab N2B N2B N1B N1B N0B N0B P1A P1A P2A P2A EP12 EP12 EP11ab

input links: 6-7 per slot
R/O links: 3-4 per board
connectors: 10-14 links per board
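Table 7 is, in effect, a static fan-out map from links to Trigger Board slots. A toy validator for such a map is sketched below (only the M0C entry is taken from the text; the consistency rules encoded here, 1-3 destinations per link and at most 14 input links per board, are the ones quoted in the surrounding sections):

```python
from collections import Counter

def check_middle_plane(fanout: dict) -> None:
    """Sanity rules for a candidate middle-plane map
    {link: (splitter input slot, Trigger Board slots served)}:
    each link feeds 1-3 boards and no board exceeds 14 input links."""
    per_board = Counter()
    for link, (slot, boards) in fanout.items():
        assert 1 <= len(boards) <= 3, f"{link}: a link feeds 1-3 boards"
        per_board.update(boards)
    for board, n in per_board.items():
        assert n <= 14, f"board {board} exceeds 14 links"

# link M0C enters at splitter slot 0 and serves boards -2, 0 and 2
check_middle_plane({"M0C": (0, [-2, 0, 2])})
```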

The Splitter Boards in one crate receive in total 98 (106) links (see Table 6). This can be done by 14 Splitter Boards; each receives 8 links and transfers the corresponding data to the Trigger Boards. The required 160 pins nicely fit into a single SCSI connector. Alternatively one can use 17 Splitter Boards with 7 links, i.e. 140 pins, each. On the backplane most of the signals will be split and sent to 2 or 3 destinations (see Table 7).

4.5.2. Connection table
An example of how to connect the links to the Trigger Boards is given in Table 7. This table can be seen as a layout of the middle-plane connecting the Splitter Boards with the Trigger Boards. The table was designed in the following way.
Each column of the table corresponds to one Trigger Board. Trigger Board number n covers towers n and n-1 for n>0, towers n and n+1 for n<0, and tower 0 for n=0. One can see from Fig. 10 which links must be connected for each tower. For example, Trigger Board 2, covering towers 1 and 2, must receive the 7 links P1A, P0B, P1D, P0E, P1E, M0C and P1C. All those links must appear in column 2 of the table. The total number of links connected to each Trigger Board is listed in the row titled "connectors". One can see that this number varies from 10 to 14.
In order to cover a full logical φ-sector one has to bring signals from two physical φ-sectors (see Sec. 2.3). Therefore each Trigger Board must receive a double set of links. For example, column 2 must contain 2 × 7 = 14 links. There is one exception at RE2, which is the reference plane in the endcap. Its physical segmentation in φ matches the logical one, so the links EN21ab, EN21cd, EN22, EN23, EP21ab, EP21cd, EP22 and EP23 do not need to be doubled. For example, link EP21cd appears only once, in column 12. The same is valid for chamber RE5, if it exists (links EN51ab, EN51cd, EN52, EN53, EP51ab, EP51cd, EP52, EP53).
In most of the cases one link has to be connected to 2 or 3 Trigger Boards.
One can imagine that the signals from such a link are received by one of the Splitter Boards and distributed to the Trigger Boards through the middle-plane. The possible location of the input from the Splitter is indicated by a grey background. For example, link M0C is received from the Splitter in slot 0 and distributed to Trigger Boards -2, 0 and 2. The total number of links coming to the middle-plane from a Splitter in a given location is listed in the row titled "input links". The locations of the inputs were chosen to distribute the signals uniformly between the slots. One can see that the number of inputs per slot is always 6 or 7.
The data coming to the Trigger Boards should also be recorded by the DAQ system. It is enough that the data from each link are recorded once, i.e. only at one Trigger Board. We have chosen to do it at the same locations as those where the signals are received from the Splitters. The locations selected for readout are shown in bold. Obviously, in each crate it is enough to read the data from only one φ-sector. The adjacent sector will be read out in another crate. (Therefore, not all grey fields have bold text.) The number of links read out at each Trigger Board is listed in the row titled "R/O links". It is always equal to 3 or 4.

4.5.3. LDMUX connections
One can see from Table 7 that the data are coming in 6 almost separate streams (denoted S1-S6). They can be treated on each Trigger Board by 6 LDMUX chips or can be grouped into 4, 3 or 2 LDMUXes. Link EP32 (EN32) in board 12 (-12) in fact belongs to stream S2, but it was put on the S3 bus because of lack of space. If the streams S2 and S3 are not connected to the same LDMUX there should be a connection between the two LDMUXes on the Trigger Board. If, in turn, the streams S1 and S2 are not coming to the same LDMUX, the links P2A (TB 6) and P2B (TB 8) received in S1 should be transferred also to S2.
This can be done on the backplane or again by a connection between the corresponding LDMUXes.

4.5.4. Readout
The link system is optimized in such a way that the average data rate through each link is similar. The Splitter Crates send the data from a given link to several Trigger Boards (see Table 7). The data transferred by each link must be read only once. Table 7 shows in bold which link is read by which Trigger Board. The LDMUX circuit consists of a Synchronizer_RLDMUX FPGA and an LDMUX FPGA. The Synchronizer_RLDMUX FPGA synchronizes the data coming from the links and passes them to the LDMUX FPGA. In addition, the Synchronizer_RLDMUX FPGA reads out the data from a selected link (one or two). By default the link indicated in Table 7 is chosen, but for

test purposes any of the connected links can be read out.

4.6. Ghost-busting and sorting
The wedge geometry is somewhat more difficult for ghost-busting and sorting. In the ring geometry described in the Trigger TDR each η-tower is first treated separately for the full φ angle (360°). At this step ghost-busting requires only local information exchange between neighbouring φ-segments. Only the next step of ghost-busting and sorting between different η-towers requires information from remote detector regions, which needs to be supplemented by the address (η, φ).
In the case of the wedge geometry only one sector (Δφ = 30°) of one tower can be treated locally. The next steps, sorting between towers and between sectors, should be based on addresses. This is less convenient, because it requires more interconnections and slightly more complicated logic. Details need to be worked out, but it does not seem to be a major obstacle.

References
[1] Study of Detailed Geometry of Barrel RPC Strips, CMS IN 2000/044.
[2] Layout of the Link System for the RPC Pattern Comparator Trigger, CMS IN 2000/043.
[3] RPC System Geometry Simulated in CMSIM and ORCA 4.2, CMS IN 2000/054.
[4] K. Banzuzi et al., Optical link developments for the CMS RPC, Proceedings of the Fifth Workshop on Electronics for LHC Experiments, Snowmass, Colorado, September 20-24.
[5] The Level-1 Trigger Technical Design Report, CERN/LHCC.
[6] M. Huhtinen, Optimization of the CMS forward shielding, CMS Note 2000/.


SVT detector Electronics Status SVT detector Electronics Status On behalf of the SVT community Mauro Citterio INFN Milano Overview: - SVT design status - F.E. chips - Electronic design - Hit rates and data volumes 1 SVT Design Detectors:

More information

Forward Time-of-Flight Detector Efficiency for CLAS12

Forward Time-of-Flight Detector Efficiency for CLAS12 Forward Time-of-Flight Detector Efficiency for CLAS12 D.S. Carman, Jefferson Laboratory ftof eff.tex May 29, 2014 Abstract This document details an absolute hit efficiency study of the FTOF panel-1a and

More information

Track-Finder Test Results and VME Backplane R&D. D.Acosta University of Florida

Track-Finder Test Results and VME Backplane R&D. D.Acosta University of Florida Track-Finder Test Results and VME Backplane R&D D.Acosta University of Florida 1 Technical Design Report Trigger TDR is completed! A large amount effort went not only into the 630 pages, but into CSC Track-Finder

More information

RESPONSIBILITIES CIEMAT, MADRID HEPHY, VIENNA INFN, PADOVA INFN, BOLOGNA INFN, TORINO U. AUTONOMA, MADRID & LV, HV PS SYSTEMS.

RESPONSIBILITIES CIEMAT, MADRID HEPHY, VIENNA INFN, PADOVA INFN, BOLOGNA INFN, TORINO U. AUTONOMA, MADRID & LV, HV PS SYSTEMS. 2 RESPONSIBILITIES CIEMAT, MADRID HEPHY, VIENNA INFN, PADOVA INFN, BOLOGNA INFN, TORINO U. AUTONOMA, MADRID & LV, HV PS SYSTEMS 4 5 2 Crates w/bp 2 TIM 3 ROS-25 3 TRG SC 2 RO PP 6 SC crate: 3 units 2

More information

The CLICdp Optimization Process

The CLICdp Optimization Process ILDOptWS, Feb, 2016 A. Sailer: The CLICdp Optimization Process 1/17 The CLICdp Optimization Process André Sailer (CERN-EP-LCD) On Behalf of the CLICdp Collaboration ILD Software and Optimisation Workshop

More information

The Compact Muon Solenoid Experiment. Conference Report. Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland

The Compact Muon Solenoid Experiment. Conference Report. Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland Available on CMS information server CMS CR -2008/100 The Compact Muon Solenoid Experiment Conference Report Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland 02 December 2008 (v2, 03 December 2008)

More information

CMX (Common Merger extension module) Y. Ermoline for CMX collaboration Preliminary Design Review, Stockholm, 29 June 2011

CMX (Common Merger extension module) Y. Ermoline for CMX collaboration Preliminary Design Review, Stockholm, 29 June 2011 (Common Merger extension module) Y. Ermoline for collaboration Preliminary Design Review, Stockholm, 29 June 2011 Outline Current L1 Calorimeter trigger system Possible improvement to maintain trigger

More information

DAQ and ECS Ethernet cabling in UXA85

DAQ and ECS Ethernet cabling in UXA85 DAQ and ECS Ethernet cabling in UXA85 LHCb Technical Note Issue: 2 Revision: 4 Reference: EDMS 497862/2 Created: September 15, 2004 Last modified: 22/04/2005 13:55:50 Prepared By: Approved By: LHCb DAQ

More information

The Sorting Processor Project

The Sorting Processor Project CMS TN/95-028 6 March, 1995 The Sorting Processor Project G.De Robertis and A.Ranieri Sezione INN di Bari, Italy I.M.Kudla Institute of Experimental Physics,Warsaw University, Poland G.Wrochna on leave

More information

Electronics on the detector Mechanical constraints: Fixing the module on the PM base.

Electronics on the detector Mechanical constraints: Fixing the module on the PM base. PID meeting Mechanical implementation ti Electronics architecture SNATS upgrade proposal Christophe Beigbeder PID meeting 1 Electronics is split in two parts : - one directly mounted on the PM base receiving

More information

1. INTRODUCTION 2. MUON RECONSTRUCTION IN ATLAS. A. Formica DAPNIA/SEDI, CEA/Saclay, Gif-sur-Yvette CEDEX, France

1. INTRODUCTION 2. MUON RECONSTRUCTION IN ATLAS. A. Formica DAPNIA/SEDI, CEA/Saclay, Gif-sur-Yvette CEDEX, France &+(3/D-ROOD&DOLIRUQLD0DUFK 1 Design, implementation and deployment of the Saclay muon reconstruction algorithms (Muonbox/y) in the Athena software framework of the ATLAS experiment A. Formica DAPNIA/SEDI,

More information

I/O Choices for the ATLAS. Insertable B Layer (IBL) Abstract. Contact Person: A. Grillo

I/O Choices for the ATLAS. Insertable B Layer (IBL) Abstract. Contact Person: A. Grillo I/O Choices for the ATLAS Insertable B Layer (IBL) ATLAS Upgrade Document No: Institute Document No. Created: 14/12/2008 Page: 1 of 2 Modified: 8/01/2009 Rev. No.: 1.00 Abstract The ATLAS Pixel System

More information

Optimisation Studies for the CLIC Vertex-Detector Geometry

Optimisation Studies for the CLIC Vertex-Detector Geometry CLICdp-Note04-002 4 July 204 Optimisation Studies for the CLIC Vertex-Detector Geometry Niloufar Alipour Tehrani, Philipp Roloff CERN, Switzerland, ETH Zürich, Switzerland Abstract An improved CLIC detector

More information

SoLID GEM Detectors in US

SoLID GEM Detectors in US SoLID GEM Detectors in US Kondo Gnanvo University of Virginia SoLID Collaboration Meeting @ JLab, 05/07/2016 Outline Overview of SoLID GEM Trackers Design Optimization Large Area GEMs for PRad in Hall

More information

Update on PRad GEMs, Readout Electronics & DAQ

Update on PRad GEMs, Readout Electronics & DAQ Update on PRad GEMs, Readout Electronics & DAQ Kondo Gnanvo University of Virginia, Charlottesville, VA Outline PRad GEMs update Upgrade of SRS electronics Integration into JLab DAQ system Cosmic tests

More information

Quad Module Hybrid Development for the ATLAS Pixel Layer Upgrade

Quad Module Hybrid Development for the ATLAS Pixel Layer Upgrade Quad Module Hybrid Development for the ATLAS Pixel Layer Upgrade Lawrence Berkeley National Lab E-mail: kedunne@lbl.gov Maurice Garcia-Sciveres, Timon Heim Lawrence Berkeley National Lab, Berkeley, USA

More information

Trigger Layout and Responsibilities

Trigger Layout and Responsibilities CMS EMU TRIGGER ELECTRONICS B. Paul Padley Rice University February 1999 Trigger Layout and Responsibilities Basic Requirements z Latency: < 3.2 us z Fully pipelined synchronous architecture, dead time

More information

V. Karimäki, T. Lampén, F.-P. Schilling, The HIP algorithm for Track Based Alignment and its Application to the CMS Pixel Detector, CMS Note

V. Karimäki, T. Lampén, F.-P. Schilling, The HIP algorithm for Track Based Alignment and its Application to the CMS Pixel Detector, CMS Note VI V. Karimäki, T. Lampén, F.-P. Schilling, The HIP algorithm for Track Based Alignment and its Application to the CMS Pixel Detector, CMS Note 26/18, CERN, Geneva, Switzerland, 1pp., Copyright (26) by

More information

Copyright 2014 Society of Photo-Optical Instrumentation Engineers. This paper was published in Proceedings of SPIE (Proc. SPIE Vol.

Copyright 2014 Society of Photo-Optical Instrumentation Engineers. This paper was published in Proceedings of SPIE (Proc. SPIE Vol. Copyright 2014 Society of Photo-Optical Instrumentation Engineers. This paper was published in Proceedings of SPIE (Proc. SPIE Vol. 9290, 929025, DOI: http://dx.doi.org/10.1117/12.2073380 ) and is made

More information

ATLAS ITk Layout Design and Optimisation

ATLAS ITk Layout Design and Optimisation ATLAS ITk Layout Design and Optimisation Noemi Calace noemi.calace@cern.ch On behalf of the ATLAS Collaboration 3rd ECFA High Luminosity LHC Experiments Workshop 3-6 October 2016 Aix-Les-Bains Overview

More information

S-LINK: A Prototype of the ATLAS Read-out Link

S-LINK: A Prototype of the ATLAS Read-out Link : A Prototype of the ATLAS Read-out Link Erik van der Bij, Robert McLaren, Zoltán Meggyesi EP-Division CERN, CH-1211 Geneva 23 Abstract The ATLAS data acquisition system needs over 1500 read-out links

More information

USCMS HCAL FERU: Front End Readout Unit. Drew Baden University of Maryland February 2000

USCMS HCAL FERU: Front End Readout Unit. Drew Baden University of Maryland February 2000 USCMS HCAL FERU: Front End Readout Unit Drew Baden University of Maryland February 2000 HCAL Front-End Readout Unit Joint effort between: University of Maryland Drew Baden (Level 3 Manager) Boston University

More information

The CMS alignment challenge

The CMS alignment challenge The CMS alignment challenge M. Weber a for the CMS Collaboration a I. Physikalisches Institut B, RWTH Aachen, Germany Abstract The CMS tracking detectors are of unprecedented complexity: 66 million pixel

More information

CSC Trigger Motherboard

CSC Trigger Motherboard CSC Trigger Motherboard Functions of TMB Tests: Performance at summer 2003 test beam Radiation, magnetic fields, etc. Plans for TMB production and testing 1 Cathode LCT CSC Trigger Requirements Identify

More information

RT2016 Phase-I Trigger Readout Electronics Upgrade for the ATLAS Liquid-Argon Calorimeters

RT2016 Phase-I Trigger Readout Electronics Upgrade for the ATLAS Liquid-Argon Calorimeters RT2016 Phase-I Trigger Readout Electronics Upgrade for the ATLAS Liquid-Argon Calorimeters Nicolas Chevillot (LAPP/CNRS-IN2P3) on behalf of the ATLAS Liquid Argon Calorimeter Group 1 Plan Context Front-end

More information

Level-1 Data Driver Card of the ATLAS New Small Wheel Upgrade Compatible with the Phase II 1 MHz Readout

Level-1 Data Driver Card of the ATLAS New Small Wheel Upgrade Compatible with the Phase II 1 MHz Readout Level-1 Data Driver Card of the ATLAS New Small Wheel Upgrade Compatible with the Phase II 1 MHz Readout Panagiotis Gkountoumis National Technical University of Athens Brookhaven National Laboratory On

More information

The LHCb upgrade. Outline: Present LHCb detector and trigger LHCb upgrade main drivers Overview of the sub-detector modifications Conclusions

The LHCb upgrade. Outline: Present LHCb detector and trigger LHCb upgrade main drivers Overview of the sub-detector modifications Conclusions The LHCb upgrade Burkhard Schmidt for the LHCb Collaboration Outline: Present LHCb detector and trigger LHCb upgrade main drivers Overview of the sub-detector modifications Conclusions OT IT coverage 1.9

More information

CMS FPGA Based Tracklet Approach for L1 Track Finding

CMS FPGA Based Tracklet Approach for L1 Track Finding CMS FPGA Based Tracklet Approach for L1 Track Finding Anders Ryd (Cornell University) On behalf of the CMS Tracklet Group Presented at AWLC June 29, 2017 Anders Ryd Cornell University FPGA Based L1 Tracking

More information

THE ALFA TRIGGER SIMULATOR

THE ALFA TRIGGER SIMULATOR Vol. 46 (2015) ACTA PHYSICA POLONICA B No 7 THE ALFA TRIGGER SIMULATOR B. Dziedzic Tadeusz Kościuszko Cracow University of Technology, Institute of Physics Podchorążych 1, 30-084 Kraków, Poland K. Korcyl

More information

Simulating the RF Shield for the VELO Upgrade

Simulating the RF Shield for the VELO Upgrade LHCb-PUB-- March 7, Simulating the RF Shield for the VELO Upgrade T. Head, T. Ketel, D. Vieira. Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, Brazil European Organization for Nuclear Research

More information

Detector Control LHC

Detector Control LHC Detector Control Systems @ LHC Matthias Richter Department of Physics, University of Oslo IRTG Lecture week Autumn 2012 Oct 18 2012 M. Richter (UiO) DCS @ LHC Oct 09 2012 1 / 39 Detectors in High Energy

More information

Physics CMS Muon High Level Trigger: Level 3 reconstruction algorithm development and optimization

Physics CMS Muon High Level Trigger: Level 3 reconstruction algorithm development and optimization Scientifica Acta 2, No. 2, 74 79 (28) Physics CMS Muon High Level Trigger: Level 3 reconstruction algorithm development and optimization Alessandro Grelli Dipartimento di Fisica Nucleare e Teorica, Università

More information

Frontend Control Electronics for the LHCb upgrade Hardware realization and test

Frontend Control Electronics for the LHCb upgrade Hardware realization and test First Prototype of the muon Frontend Control Electronics for the LHCb upgrade Hardware realization and test V. Bocci, G. Chiodi, P. Fresch et al. International Conference on Technology and Instrumentation

More information

Data Acquisition in Particle Physics Experiments. Ing. Giuseppe De Robertis INFN Sez. Di Bari

Data Acquisition in Particle Physics Experiments. Ing. Giuseppe De Robertis INFN Sez. Di Bari Data Acquisition in Particle Physics Experiments Ing. Giuseppe De Robertis INFN Sez. Di Bari Outline DAQ systems Theory of operation Case of a large experiment (CMS) Example of readout GEM detectors for

More information

Results of Radiation Test of the Cathode Front-end Board for CMS Endcap Muon Chambers

Results of Radiation Test of the Cathode Front-end Board for CMS Endcap Muon Chambers Results of Radiation Test of the Cathode Front-end Board for CMS Endcap Muon Chambers B. Bylsma 1, L.S. Durkin 1, J. Gu 1, T.Y. Ling 1, M. Tripathi 2 1 Department of Physics, Ohio State University, Columbus,

More information

Muon Port Card Upgrade Status May 2013

Muon Port Card Upgrade Status May 2013 CSC Endcap Muon Port Card and Muon Sorter Upgrade Status May 2013 MPC Upgrade Requirements Be able to deliver all 18 trigger primitives from the EMU peripheral crate to the upgraded Sector Processor Preserve

More information

ATLAS TDAQ RoI Builder and the Level 2 Supervisor system

ATLAS TDAQ RoI Builder and the Level 2 Supervisor system ATLAS TDAQ RoI Builder and the Level 2 Supervisor system R. E. Blair 1, J. Dawson 1, G. Drake 1, W. Haberichter 1, J. Schlereth 1, M. Abolins 2, Y. Ermoline 2, B. G. Pope 2 1 Argonne National Laboratory,

More information

A synchronous architecture for the L0 muon trigger

A synchronous architecture for the L0 muon trigger CPPM, IN2P3 CNRS et Université d Aix Marseille II A synchronous architecture for the L0 muon trigger LHCb Technical Note Issue: 1 Revision: 1 Reference: LHCb 2001 010 Created: January 23, 2001 Last modified:

More information

The MROD. The MDT Precision Chambers ROD. Adriaan König University of Nijmegen. 5 October nd ATLAS ROD Workshop 1

The MROD. The MDT Precision Chambers ROD. Adriaan König University of Nijmegen. 5 October nd ATLAS ROD Workshop 1 The MROD The MDT Precision Chambers ROD Adriaan König University of Nijmegen 5 October 2000 2nd ATLAS ROD Workshop 1 Contents System Overview MROD-0 Prototype MROD-1 Prototype Performance Study FE Parameter

More information

Updated impact parameter resolutions of the ATLAS Inner Detector

Updated impact parameter resolutions of the ATLAS Inner Detector Updated impact parameter resolutions of the ATLAS Inner Detector ATLAS Internal Note Inner Detector 27.09.2000 ATL-INDET-2000-020 06/10/2000 Szymon Gadomski, CERN 1 Abstract The layout of the ATLAS pixel

More information

WBS Trigger. Wesley H. Smith, U. Wisconsin CMS Trigger Project Manager. DOE/NSF Status Review November 20, 2003

WBS Trigger. Wesley H. Smith, U. Wisconsin CMS Trigger Project Manager. DOE/NSF Status Review November 20, 2003 WBS 3.1 - Trigger Wesley H. Smith, U. Wisconsin CMS Trigger Project Manager DOE/NSF Status Review November 20, 2003 This talk is available on: http://hep.wisc.edu/wsmith/cms/trig_lehman_nov03.pdf US CMS

More information

The First Integration Test of the ATLAS End-Cap Muon Level 1 Trigger System

The First Integration Test of the ATLAS End-Cap Muon Level 1 Trigger System 864 IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 50, NO. 4, AUGUST 2003 The First Integration Test of the ATLAS End-Cap Muon Level 1 Trigger System K. Hasuko, H. Kano, Y. Matsumoto, Y. Nakamura, H. Sakamoto,

More information

Velo readout board RB3. Common L1 board (ROB)

Velo readout board RB3. Common L1 board (ROB) Velo readout board RB3 Testing... Common L1 board (ROB) Specifying Federica Legger 10 February 2003 1 Summary LHCb Detectors Online (Trigger, DAQ) VELO (detector and Readout chain) L1 electronics for VELO

More information

The CMS L1 Global Trigger Offline Software

The CMS L1 Global Trigger Offline Software The CMS L1 Global Offline Software Vasile Mihai Ghete Institute for High Energy Physics, Vienna, Austria Seminar 08-09 June 2009, HEPHY Vienna CMS experiment Tracker pixel detector: 3 barrel layers, 2

More information

LASER INTERFEROMETER GRAVITATIONAL WAVE OBSERVATORY -LIGO-

LASER INTERFEROMETER GRAVITATIONAL WAVE OBSERVATORY -LIGO- LASER INTERFEROMETER GRAVITATIONAL WAVE OBSERVATORY -LIGO- CALIFORNIA INSTITUTE OF TECHNOLOGY MASSACHUSETTS INSTITUTE OF TECHNOLOGY Document Type DCC Number July 7, 2005 AdvLigo CDS Discussion Paper R.

More information

Prototyping of large structures for the Phase-II upgrade of the pixel detector of the ATLAS experiment

Prototyping of large structures for the Phase-II upgrade of the pixel detector of the ATLAS experiment Prototyping of large structures for the Phase-II upgrade of the pixel detector of the ATLAS experiment Diego Alvarez Feito CERN EP-DT On Behalf of the ATLAS Collaboration 2017 IEEE NSS and MIC 26/10/2017

More information

PROGRESS ON ADF BOARD DESIGN

PROGRESS ON ADF BOARD DESIGN PROGRESS ON ADF BOARD DESIGN Denis Calvet calvet@hep.saclay.cea.fr CEA Saclay, 91191 Gif-sur-Yvette CEDEX, France Saclay, 16 May 2002 PLAN ANALOG SPLITTER ADF BOARD AND CRATES DIGITAL FILTER SCL INTERFACE

More information

A real time electronics emulator with realistic data generation for reception tests of the CMS ECAL front-end boards

A real time electronics emulator with realistic data generation for reception tests of the CMS ECAL front-end boards Available on CMS information server CMS CR 2005/029 November 4 th, 2005 A real time electronics emulator with realistic data generation for reception tests of the CMS ECAL front-end s T. Romanteau Ph.

More information

Data Acquisition in High Speed Ethernet & Fibre Channel Avionics Systems

Data Acquisition in High Speed Ethernet & Fibre Channel Avionics Systems Data Acquisition in High Speed Ethernet & Fibre Channel Avionics Systems Troy Troshynski Avionics Interface Technologies (A Division of Teradyne) Omaha, NE U.S.A. troyt@aviftech.com http://www.aviftech.com/aggregator

More information

Forward Time-of-Flight Geometry for CLAS12

Forward Time-of-Flight Geometry for CLAS12 Forward Time-of-Flight Geometry for CLAS12 D.S. Carman, Jefferson Laboratory ftof geom.tex April 13, 2016 Abstract This document details the nominal geometry for the CLAS12 Forward Time-of- Flight System

More information

The ALICE TPC Readout Control Unit 10th Workshop on Electronics for LHC and future Experiments September 2004, BOSTON, USA

The ALICE TPC Readout Control Unit 10th Workshop on Electronics for LHC and future Experiments September 2004, BOSTON, USA Carmen González Gutierrez (CERN PH/ED) The ALICE TPC Readout Control Unit 10th Workshop on Electronics for LHC and future Experiments 13 17 September 2004, BOSTON, USA Outline: 9 System overview 9 Readout

More information

Upgrade of the ATLAS Level-1 Trigger with event topology information

Upgrade of the ATLAS Level-1 Trigger with event topology information Upgrade of the ATLAS Level-1 Trigger with event topology information E. Simioni 1, S. Artz 1, B. Bauß 1, V. Büscher 1, K. Jakobi 1, A. Kaluza 1, C. Kahra 1, M. Palka 2, A. Reiß 1, J. Schäffer 1, U. Schäfer

More information

PETsys SiPM Readout System

PETsys SiPM Readout System SiPM Readout System FEB/A_v2 FEB/S FEB/I The SiPM Readout System is designed to read a large number of SiPM photo-sensor pixels in applications where a high data rate and excellent time resolution is required.

More information

Status Report of the ATLAS SCT Optical Links.

Status Report of the ATLAS SCT Optical Links. Status Report of the ATLAS SCT Optical Links. D.G.Charlton, J.D.Dowell, R.J.Homer, P.Jovanovic, T.J. McMahon, G.Mahout J.A.Wilson School of Physics and Astronomy, University of Birmingham, Birmingham B15

More information

1 MHz Readout. LHCb Technical Note. Artur Barczyk, Guido Haefeli, Richard Jacobsson, Beat Jost, and Niko Neufeld. Revision: 1.0

1 MHz Readout. LHCb Technical Note. Artur Barczyk, Guido Haefeli, Richard Jacobsson, Beat Jost, and Niko Neufeld. Revision: 1.0 1 MHz Readout LHCb Technical Note Issue: Final Revision: 1.0 Reference: LHCb 2005 62 Created: 9 March, 2005 Last modified: 7 September 2005 Prepared By: Artur Barczyk, Guido Haefeli, Richard Jacobsson,

More information

Control slice prototypes for the ALICE TPC detector

Control slice prototypes for the ALICE TPC detector Control slice prototypes for the ALICE TPC detector S.Popescu 1, 3, A.Augustinus 1, L.Jirdén 1, U.Frankenfeld 2, H.Sann 2 1 CERN, Geneva, Switzerland, 2 GSI, Darmstadt, Germany, 3 NIPN E, Bucharest, Romania

More information

Vertex Detector Electronics: ODE to ECS Interface

Vertex Detector Electronics: ODE to ECS Interface Vertex Detector Electronics: ODE to ECS Interface LHCb Technical Note Issue: 1 Revision: 0 Reference: LHCb 2000-012 VELO Created: 1 February 2000 Last modified: 20 March 2000 Prepared By: Yuri Ermoline

More information

The Compact Muon Solenoid Experiment. Conference Report. Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland

The Compact Muon Solenoid Experiment. Conference Report. Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland Available on CMS information server CMS CR -2017/188 The Compact Muon Solenoid Experiment Conference Report Mailing address: CMS CERN, CH-1211 GENEVA 23, Switzerland 29 June 2017 (v2, 07 July 2017) Common

More information

New slow-control FPGA IP for GBT based system and status update of the GBT-FPGA project

New slow-control FPGA IP for GBT based system and status update of the GBT-FPGA project New slow-control FPGA IP for GBT based system and status update of the GBT-FPGA project 1 CERN Geneva CH-1211, Switzerland E-mail: julian.mendez@cern.ch Sophie Baron a, Pedro Vicente Leitao b CERN Geneva

More information

L1 and Subsequent Triggers

L1 and Subsequent Triggers April 8, 2003 L1 and Subsequent Triggers Abstract During the last year the scope of the L1 trigger has changed rather drastically compared to the TP. This note aims at summarising the changes, both in

More information

PoS(High-pT physics09)036

PoS(High-pT physics09)036 Triggering on Jets and D 0 in HLT at ALICE 1 University of Bergen Allegaten 55, 5007 Bergen, Norway E-mail: st05886@alf.uib.no The High Level Trigger (HLT) of the ALICE experiment is designed to perform

More information

The WaveDAQ system: Picosecond measurements with channels

The WaveDAQ system: Picosecond measurements with channels Stefan Ritt :: Muon Physics :: Paul Scherrer Institute The WaveDAQ system: Picosecond measurements with 10 000 channels Workshop on pico-second photon sensors, Kansas City, Sept. 2016 0.2-2 ns DRS4 Chip

More information

Technical Specification of LHC instrumentation VME crates Back plane, power supplies and transition modules

Technical Specification of LHC instrumentation VME crates Back plane, power supplies and transition modules Group Code.: SL/BI EDMS No.: 365170 LHC Project document No.: XXXX The Large Hadron Collider Project IT-XXXX/LHC/LHC Technical Specification of LHC instrumentation VME crates Back plane, power supplies

More information

Module Performance Report. ATLAS Calorimeter Level-1 Trigger- Common Merger Module. Version February-2005

Module Performance Report. ATLAS Calorimeter Level-1 Trigger- Common Merger Module. Version February-2005 Module Performance Report ATLAS Calorimeter Level-1 Trigger- Common Merger Module B. M. Barnett, I. P. Brawn, C N P Gee Version 1.0 23 February-2005 Table of Contents 1 Scope...3 2 Measured Performance...3

More information

The Phase-2 ATLAS ITk Pixel Upgrade

The Phase-2 ATLAS ITk Pixel Upgrade The Phase-2 ATLAS ITk Pixel Upgrade T. Flick (University of Wuppertal) - on behalf of the ATLAS collaboration 14th Topical Seminar on Innovative Particle and Radiation Detectors () 03.-06. October 2016

More information

ATLAS, CMS and LHCb Trigger systems for flavour physics

ATLAS, CMS and LHCb Trigger systems for flavour physics ATLAS, CMS and LHCb Trigger systems for flavour physics Università degli Studi di Bologna and INFN E-mail: guiducci@bo.infn.it The trigger systems of the LHC detectors play a crucial role in determining

More information

Short Introduction to DCS, JCOP Framework, PVSS. PVSS Architecture and Concept. JCOP Framework concepts and tools.

Short Introduction to DCS, JCOP Framework, PVSS. PVSS Architecture and Concept. JCOP Framework concepts and tools. Hassan Shahzad, NCP Contents Short Introduction to DCS, JCOP Framework, PVSS and FSM. PVSS Architecture and Concept. JCOP Framework concepts and tools. CMS Endcap RPC DCS. 2 What is DCS DCS stands for

More information

PoS(EPS-HEP2017)523. The CMS trigger in Run 2. Mia Tosi CERN

PoS(EPS-HEP2017)523. The CMS trigger in Run 2. Mia Tosi CERN CERN E-mail: mia.tosi@cern.ch During its second period of operation (Run 2) which started in 2015, the LHC will reach a peak instantaneous luminosity of approximately 2 10 34 cm 2 s 1 with an average pile-up

More information

CMS Calorimeter Trigger Phase I upgrade

CMS Calorimeter Trigger Phase I upgrade Journal of Instrumentation OPEN ACCESS CMS Calorimeter Trigger Phase I upgrade To cite this article: P Klabbers et al View the article online for updates and enhancements. Related content - CMS level-1

More information

Results of a Sliced System Test for the ATLAS End-cap Muon Level-1 Trigger

Results of a Sliced System Test for the ATLAS End-cap Muon Level-1 Trigger Results of a Sliced System Test for the ATLAS End-cap Muon Level-1 Trigger H.Kano*, K.Hasuko, Y.Matsumoto, Y.Nakamura, *Corresponding Author: kano@icepp.s.u-tokyo.ac.jp ICEPP, University of Tokyo, 7-3-1

More information

A generic firmware core to drive the Front-End GBT-SCAs for the LHCb upgrade

A generic firmware core to drive the Front-End GBT-SCAs for the LHCb upgrade Journal of Instrumentation OPEN ACCESS A generic firmware core to drive the Front-End GBT-SCAs for the LHCb upgrade Recent citations - The Versatile Link Demo Board (VLDB) R. Martín Lesma et al To cite

More information

Performance of the ATLAS Inner Detector at the LHC

Performance of the ATLAS Inner Detector at the LHC Performance of the ALAS Inner Detector at the LHC hijs Cornelissen for the ALAS Collaboration Bergische Universität Wuppertal, Gaußstraße 2, 4297 Wuppertal, Germany E-mail: thijs.cornelissen@cern.ch Abstract.

More information

Real-time Analysis with the ALICE High Level Trigger.

Real-time Analysis with the ALICE High Level Trigger. Real-time Analysis with the ALICE High Level Trigger C. Loizides 1,3, V.Lindenstruth 2, D.Röhrich 3, B.Skaali 4, T.Steinbeck 2, R. Stock 1, H. TilsnerK.Ullaland 3, A.Vestbø 3 and T.Vik 4 for the ALICE

More information

A LVL2 Zero Suppression Algorithm for TRT Data

A LVL2 Zero Suppression Algorithm for TRT Data A LVL2 Zero Suppression Algorithm for TRT Data R. Scholte,R.Slopsema,B.vanEijk, N. Ellis, J. Vermeulen May 5, 22 Abstract In the ATLAS experiment B-physics studies will be conducted at low and intermediate

More information

SoLID GEM Detectors in US

SoLID GEM Detectors in US SoLID GEM Detectors in US Kondo Gnanvo University of Virginia SoLID Collaboration Meeting @ Jlab, 01/13/2016 Outline Overview of SoLID GEM Trackers Large area GEM R&D @ UVa Update on APV25 Electronics

More information

Upgrading the ATLAS Tile Calorimeter electronics

Upgrading the ATLAS Tile Calorimeter electronics ITIM Upgrading the ATLAS Tile Calorimeter electronics Gabriel Popeneciu, on behalf of the ATLAS Tile Calorimeter System INCDTIM Cluj Napoca, Romania Gabriel Popeneciu PANIC 2014, Hamburg 26th August 2014

More information

The ASDEX Upgrade UTDC and DIO cards - A family of PCI/cPCI devices for Real-Time DAQ under Solaris

The ASDEX Upgrade UTDC and DIO cards - A family of PCI/cPCI devices for Real-Time DAQ under Solaris The ASDEX Upgrade UTDC and DIO cards - A family of PCI/cPCI devices for Real-Time DAQ under Solaris A. Lohs a, K. Behler a,*, G. Raupp, Unlimited Computer Systems b, ASDEX Upgrade Team a a Max-Planck-Institut

More information

A High Performance Bus Communication Architecture through Bus Splitting

A High Performance Bus Communication Architecture through Bus Splitting A High Performance Communication Architecture through Splitting Ruibing Lu and Cheng-Kok Koh School of Electrical and Computer Engineering Purdue University,West Lafayette, IN, 797, USA {lur, chengkok}@ecn.purdue.edu

More information