ESNET Requirements for Physics Research at the SSCL


SSCL-SR-1222
June 1993
Distribution Category: 0

L. Cormell
T. Johnson

ESNET Requirements for Physics Research at the SSCL

Superconducting Super Collider Laboratory

Disclaimer Notice

This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof.

Superconducting Super Collider Laboratory is an equal opportunity employer.

DISCLAIMER Portions of this document may be illegible in electronic image products. Images are produced from the best available original document.

SSCL-SR-1222

ESNET Requirements for Physics Research at the SSCL

L. Cormell and T. Johnson

Superconducting Super Collider Laboratory*
2550 Beckleymeade Ave.
Dallas, TX 75237

June 1993

*Operated by the Universities Research Association, Inc., for the U.S. Department of Energy under Contract No. DE-AC35-89ER40486.

1.0 INTRODUCTION AND PURPOSE

High energy physics (HEP) research at the Superconducting Super Collider Laboratory (SSCL) is a highly collaborative affair. Scientists participating in SSC research come from a worldwide distribution of institutions. The Solenoid Detector Collaboration (SDC) currently has more than 10 members from 20 countries. Likewise, the Gamma, Electron, Muon (GEM) collaboration's members number more than oo from 17 countries. Roughly half of the collaborators on these experiments are from outside the U.S. Communications in general, and data transmission in particular, are crucial to the success of the collaborations and to the ultimate success of the SSC. The bulk of data transmission to and from the Laboratory is over the Energy Science NETwork (ESNET). The location of the SSCL and its current connections to the ESNET backbone by four T1 lines are shown in Figure 1. The purpose of this document is to describe the anticipated network capacity needed to provide adequate communication among these widespread collaborations.*

Figure 1. ESNET Connections to the SSC Laboratory (located in Ellis County, Texas).

*Another set of data communications requirements deals with the need to transmit detector data locally within the SSCL site. These requirements are described in another document.
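As a point of reference for the capacity discussion that follows, the combined bandwidth of the four T1 connections shown in Figure 1 can be worked out directly. This is a minimal sketch using the standard DS1 line rate of 1.544 Mbps; the function name is ours, not from this report:

```python
# Aggregate raw capacity of N parallel T1 lines (standard DS1 rate).
T1_MBPS = 1.544

def aggregate_mbps(n_lines: int, line_rate_mbps: float = T1_MBPS) -> float:
    """Total raw capacity of n parallel lines, in Mbps."""
    return n_lines * line_rate_mbps

print(f"4 x T1 = {aggregate_mbps(4):.3f} Mbps")  # 6.176 Mbps total
```

Roughly 6 Mbps of aggregate capacity frames the comparison against the several-Gbps detector rates discussed below.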

2.0 BACKGROUND

Wide area network (WAN) requirements of the collaborations vary from simple interactive sessions such as email on the low end of the spectrum to the transmission of large amounts of bulk data from the SSCL to the collaborations' regional analysis centers. The GEM and SDC detectors are being designed to collect and store data at a rate of several gigabits per second (Gbps). The collaborations will analyze much of the detector data at the SSCL and therefore will not require a broadcast of the many petabytes of data to each of the collaborating institutions. Nevertheless, the transmission of reduced samples of data to several regional analysis centers represents a major bandwidth requirement after the SSC becomes operational. At this time, several sources of network transmission over ESNET can be identified, and models for the anticipated ramp-up of the usage can be constructed. The current and anticipated uses of ESNET include:

- Interactive sessions. These sessions currently include users logged into the Physics Detector Simulation Facility (PDSF) at the SSCL. Some of these sessions are X-window based and some are simply line-mode ASCII transmissions. Interactive sessions include activities such as editing, debugging, and email.

- Studio videoconferencing. Since the beginning of the SSCL, videoconferencing has been a key form of communication for its distributed collaborations. The Laboratory and its collaborators have established eight videoconferencing studios to date. The savings in travel costs and the increased communication capabilities have shown that videoconferencing is very worthwhile.

- Desktop videoconferencing. Vendors of workstations and PCs are now providing inexpensive video boards and interfaces. With the addition of inexpensive CCD cameras, desktop videoconferencing becomes an extremely attractive alternative to travel/commuting and studio videoconferencing. The usage of desktop videoconferencing coupled with multimedia capabilities and shared white boards should greatly enhance the effectiveness of collaborating scientists throughout the 1990s.

- Multimedia communication. The transmission of digitally recorded graphics, video, and audio segments for memos, reports, and mail is a powerful method of communication. Essentially all computer vendors are now providing multimedia capabilities for their equipment.

- Regional analysis centers. Laboratories such as Fermilab (FNAL), Argonne (ANL), Oak Ridge (ORNL), and Brookhaven (BNL) are major collaborators on SDC and GEM. A considerable amount of analysis will take place at these labs and at associated universities. Some large fraction of the many petabytes (10^15 bytes) of SSC data will be transmitted to these centers.

- Remote control centers. SDC (and possibly GEM) is planning to provide real-time access to the online monitoring, control, and calibration systems of the experiment at several remote centers.

- Test beam data. Detector prototypes will be tested in 1995 and 1996 at Fermilab and at other labs. It is expected that data will be recorded at rates of about 3-5 Mbytes/s, requiring the storage of many terabytes of data. It is planned to transmit this data over ESNET to the SSCL for storage and analysis.

- Other experimental data. Physicists from the SSCL are collaborating on both the CDF and D0 experiments at Fermilab. Summary data from both experiments is routinely transmitted to the SSCL. Also, physicists in the SSCL Computing Department generate D0 Monte Carlo data that is transmitted to Fermilab.
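The scale of the "many petabytes" cited above follows directly from the detector rates: a detector writing a few gigabits per second over a running year accumulates petabytes. A small illustrative calculation; the 2 Gbps rate and the round-the-clock duty cycle are example values, not figures from this report:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~3.15e7 s

def petabytes_per_year(rate_gbps: float, duty_cycle: float = 1.0) -> float:
    """Data volume in petabytes accumulated over one year at a given rate."""
    bits = rate_gbps * 1e9 * SECONDS_PER_YEAR * duty_cycle
    return bits / 8 / 1e15  # bits -> bytes -> petabytes

# e.g. a sustained 2 Gbps stream, running continuously:
print(f"{petabytes_per_year(2.0):.1f} PB/year")  # ~7.9 PB/year
```

Even with a realistic duty cycle well below 100%, the yearly volume lands in the petabyte range, which is why only reduced samples can be shipped to the regional centers.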

Each of these uses of ESNET is expected to grow at a rapid rate and could easily saturate the current capacity of the T1 backbone. Ever-increasing bandwidths will be required throughout the development and operation of the SSC. Delays in the improvement of ESNET to T3, OC1, and OC12 capabilities would be a tremendous detriment to the capability of producing timely, high-quality research from the SSC.

3.0 MODEL OF NETWORK USAGE

It is anticipated that the SSC requirements for ESNET will grow rapidly from current rates of 1 Mbps or so to rates in excess of several Gbps. To plan for those needs and to determine the impact they may have on the national information infrastructure, we have developed a model that attempts to estimate the growth over a period of years. The model is based on the uses of the network described in the previous section, the number of participating scientists and their distribution, and the predicted data collection and analysis requirements for SSC detectors.

3.1 Assumptions

A number of simplifying assumptions have to be made in order to predict the growth of network usage. These include:

- Schedule. The main factors in determining the schedule for large data-transmission requirements are the start-up of test beam activities at FNAL and at the SSCL, and the beginning of SSC operation. These dates are assumed to be FY 1995, FY 1997, and FY 2000, respectively. Other requirements, such as desktop televideo, simply start now and increase with time.

- Media. It is assumed that all of the data transmission described in this note will be over ESNET or follow-on networks in the national information infrastructure, such as NREN. While it is conceivable that some of this traffic may be carried by commercial or FTS telephone circuits, that assumption is not made here.

- Interactive usage. User accounts on the PDSF currently number about 500 and on SSCVX1 about 1500. Nearly one-half of these accounts are external users.
The number of external accounts is expected to grow to about 1500-2000. Typically, about 20% of the users are actively using the systems during the day. Much of the network activity is simple command-line input and transmission of small amounts of data, but an ever-increasing amount of the traffic is X-window based. It is assumed that eventually all interactive usage will be X-based, requiring a bandwidth of 0 KB/s per session.

- Studio videoconferencing. It is assumed that studio videoconferencing traffic will grow linearly or slightly faster throughout the development of the SSC. It is anticipated that the bandwidth per channel (128 Kbps) will remain constant and that video quality will be improved over the years by improved compression algorithms.

- Desktop videoconferencing. It is expected that traffic from desktop videoconferencing will grow exponentially for the next few years. Depending upon the frame rate and the compression algorithms being used, a two-way session may require a bandwidth of 300 Mbps. It is assumed that this mode of communication will eventually become the preferred mode rather than telephone or email.

- Multimedia. Multimedia usage is likely to grow exponentially for the next few years but increase at a slower rate of growth thereafter. A typical multimedia transmission might be a memo with an enclosed video, graphics, or audio clip of perhaps 10 MB of data. Eventually several thousand people are likely to transmit multimedia messages. Multimedia transmissions are likely to replace or become what is currently email communication.

- Regional centers. It is assumed that about % of the detector data collected at the SSCL will be transmitted to the collaborations' regional centers.

- Remote control centers. It is assumed that data equivalent to about X-window sessions and 5 televideo sessions will be transmitted to each of four regional control centers during the operation of the SSC.

3.2 Predictions

The predicted usage and growth of ESNET by the SSCL and its collaborators is summarized in Table 1 and represented graphically in Figure 2. In this model the network traffic is eventually dominated by transmission of data to the regional analysis centers and the remote control centers following the start-up of the SSC experimental program. Second to those combined uses is desktop videoconferencing, which is assumed to be a key factor in providing the daily communication necessary to plan, design, configure, install, and run the experiments. Transmission of data from the test beams to the SSCL represents the dominant requirement in the near term (FY 1995-97).

TABLE 1. ESTIMATED ESNET BANDWIDTH REQUIREMENTS (Mbps), 1994-2002. Rows: Interactive Sessions; Test Beam and Test Bed; Studio Videoconferencing; Desktop Videoconferencing; Multimedia; Bulk Data-Regional Centers; Remote Control Centers; Other Experiments; Totals. [The individual cell values are illegible in the source; the totals grow from a few Mbps in 1994 to 7305 Mbps in 2002.]
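The per-category growth assumptions of Section 3.1 (linear growth for studio videoconferencing; exponential growth with an eventual ceiling for desktop videoconferencing and multimedia) can be sketched as a toy model. The starting values, growth factors, and cap below are illustrative placeholders, not the numbers behind Table 1:

```python
def linear_growth(start: float, step: float, years: int) -> list[float]:
    """Bandwidth per year under constant additive growth (Mbps)."""
    return [start + step * y for y in range(years)]

def capped_exponential_growth(start: float, factor: float, years: int,
                              cap: float) -> list[float]:
    """Bandwidth per year under multiplicative growth, saturating at `cap`."""
    out, value = [], start
    for _ in range(years):
        out.append(min(value, cap))
        value *= factor
    return out

# Illustrative FY 1994-2002 profiles (9 years):
studio = linear_growth(1.0, 2.0, 9)                    # grows linearly
desktop = capped_exponential_growth(1.0, 2.0, 9, 250)  # doubles, then saturates
totals = [s + d for s, d in zip(studio, desktop)]
```

Summing such per-category profiles year by year is all that is needed to reproduce the shape of a table like Table 1: a doubling category dominates the totals within a few years, until its cap is reached.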

Figure 2. Physics Communications Requirements in Mbps (stacked by category: Interactive Sessions, Test Beam and Test Bed, Studio Videoconferencing, Desktop Videoconferencing, Multimedia, Bulk Data-Regional Centers, Remote Control Centers, Other Experiments).

4.0 CONCLUSIONS

The SSCL currently has four T1 connections at 1.5 Mbps each to ESNET, as shown in Figure 1. It is likely that T1 bandwidth will meet the communications needs for perhaps the next year or so. Early in 1995, both GEM and SDC are scheduled to begin test beam operation at FNAL. The bulk of this data will be transmitted back to the computing ranch at the SSCL for storage and analysis. At that time it is anticipated that the Laboratory, and in particular the Physics Research Division, will need the equivalent of three or more T3 lines at 45 Mbps each. At least one of these will be dedicated to the Fermilab test beams. By 1997 the bandwidth requirement is expected to grow to the OC12 level (622 Mbps). Based on this model, it is anticipated that early in the next century the Laboratory will require a network backbone capable of supporting traffic in excess of OC48 rates (2.4 Gbps).
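The jump from T1 to T3 and beyond in these conclusions can be made concrete by estimating how long a given data volume takes to move over each link type. A sketch using standard line rates; the 1 TB volume and the assumption of a link running flat out are illustrative, not figures from this report:

```python
# Standard line rates, in Mbps.
RATES_MBPS = {"T1": 1.544, "T3": 44.736, "OC12": 622.08, "OC48": 2488.32}

def transfer_days(terabytes: float, rate_mbps: float) -> float:
    """Days needed to move `terabytes` (decimal TB) over a saturated link."""
    bits = terabytes * 1e12 * 8          # TB -> bits
    seconds = bits / (rate_mbps * 1e6)
    return seconds / 86400

for name, rate in RATES_MBPS.items():
    print(f"1 TB over {name}: {transfer_days(1.0, rate):7.2f} days")
```

A terabyte that occupies a T1 line for roughly two months fits in about two days on a T3, which is why the test beam program alone motivates a dedicated T3.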