CU Boulder Research Cyberinfrastructure Plan
Nathaniel Greer
Advanced information technology provides for the creation of robust new tools, organized and coordinated seamlessly, allowing the free flow of information, ideas, and results. Fully realizing this goal requires resources that span from the individual faculty member, through medium-scale campus-layer resources, to large national centers such as the National Science Foundation-funded XSEDE and the Department of Energy Leadership Computing Facilities. This complex mix of advanced computing resources, people, and capabilities is sometimes referred to as cyberinfrastructure (CI), which we define as follows: cyberinfrastructure consists of computational systems, data and information management, advanced instruments, visualization environments, and people, all linked together by software and advanced networks to improve scholarly productivity and enable knowledge breakthroughs and discoveries not otherwise possible [2]. The mission of the research computing group is to provide leadership in developing, deploying, and operating such an integrated CI, allowing CU Boulder to achieve further preeminence as a research university.

Research Computing Data Center Facility

A centrally managed data center will reduce campus energy costs and carbon footprint by offering a more energy-efficient alternative to locally maintained server farms (currently estimated at about 40 throughout CU Boulder), as well as housing mission-critical IT infrastructure. In turn, this will recapture office and lab space for teaching and research. Requirements for the data center:

1. Support high power density. Standard server racks will now draw about 30 kW; power density can double or triple for GPU-based racks.
2. Support next-generation power infrastructure, e.g. 480 VAC and possibly DC systems.
3. Provide electrical failover and redundancy.

Centralized Research Storage

Personnel (1 FTE): 1 Storage Engineer

Notes:
1. The Blueprint for the Digital University ( ) was used to guide this document.
2. Developing Coherent Cyberinfrastructure from Campus to the National Facilities ( )
Initial storage investment: 2 PB raw (1 PB at each location), about $1,000,000, funded through the NSF MRI PetaLibrary grant. Servers: $80,000 for CIFS and NFS gateway servers.

Background and technical description

Reliable, high-performance, professionally managed raw data storage is a fundamental building block of CU Boulder's research cyberinfrastructure. Simply stated, while users of computing (e.g., via their own clusters, through a campus condo cluster, an NSF computing center, or commercial clouds) can tolerate periods of downtime or limited availability, raw storage must be highly reliable and persistently available. The research computing group is currently working on an implementation of a scalable (multi-petabyte) storage infrastructure with different service levels, because different researchers can tolerate different levels of risk in storing their data. We will provide the following services:

- Non-replicated, RAID-protected storage
- Replicated storage
- Replicated storage with tape backup

At the most fundamental level, our storage will provide access to the following types of clients:

- Authenticated workstations and laptops on the CU network
- Laboratory- and department-owned clusters
- Primary/secondary storage for data-intensive instruments
- Higher-level data preservation services (see the section on data management)
- Very high performance shared resource facilities, such as the high-performance computing facility

Data storage requirements are clearly not uniform across campus, researchers, or labs. CU research storage with stable base funding provides the critical infrastructure, know-how, monitoring, and maintenance for a defined volume of storage that can grow over time. If, for example, each faculty member were allocated 1 TB of long-term storage this year, at the end of 10 years that number could grow to 20 TB per faculty member. However, this does not answer the question: how does a single researcher store 100 TB today?
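The growth figure above (1 TB per faculty member reaching 20 TB in ten years) implies a compound annual growth rate that can be checked directly. A minimal sketch, not part of the plan itself:

```python
# Implied compound annual growth rate for per-faculty storage:
# growing from 1 TB to 20 TB over 10 years.
start_tb, end_tb, years = 1, 20, 10
annual_growth = (end_tb / start_tb) ** (1 / years)  # ~1.35x, i.e. ~35%/year

# Verify: compounding that rate for 10 years recovers the target allocation.
allocation = start_tb
for _ in range(years):
    allocation *= annual_growth

print(round(annual_growth, 3))  # 1.349
print(round(allocation, 1))     # 20.0
```

A roughly 35% annual growth in per-researcher storage demand is why a fixed allocation alone cannot serve the heaviest users, motivating the condo model described next.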
To solve this issue, we will operate CU research storage as a storage condo. By that we mean the following: researchers could write extramural grants to fund the acquisition of physical storage to meet their needs, plus a fraction of the administration, security, and monitoring costs that scale above the basic personnel costs outlined above. Determining appropriate cost recovery rates would be part of the final governance, but the research computing advisory committee will consider at least two cost scenarios:

- Condo storage (above the basic allocation) has a lifetime limited to the warranty of the project-purchased storage.
- Condo storage has a lifetime equivalent to the core storage (unlimited).
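The two cost scenarios above could be compared with a back-of-the-envelope model. The prices, upkeep share, and 5-year warranty below are illustrative assumptions for the sketch, not proposed CU rates:

```python
# Hypothetical comparison of the two condo cost-recovery scenarios.
# All dollar figures and lifetimes are illustrative assumptions.
hardware_cost_per_tb = 500.0   # assumed one-time purchase price, $/TB
annual_upkeep_per_tb = 40.0    # assumed admin/monitoring share, $/TB/year

def cost_per_tb(years_of_service):
    """Total cost of holding 1 TB for a given number of years."""
    return hardware_cost_per_tb + annual_upkeep_per_tb * years_of_service

# Scenario 1: lifetime limited to an assumed 5-year hardware warranty.
warranty_cost = cost_per_tb(5)

# Scenario 2: "unlimited" lifetime approximated as repeated 5-year
# refresh cycles over a 20-year horizon (hardware rebought each cycle).
horizon, cycle = 20, 5
unlimited_cost = (horizon // cycle) * hardware_cost_per_tb \
    + annual_upkeep_per_tb * horizon

print(warranty_cost)   # 700.0
print(unlimited_cost)  # 2800.0
```

Under these assumptions, perpetual retention costs roughly four times the warranty-limited rate per terabyte, which is the trade-off the advisory committee would need to price.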
In other words, costing should be calculated on the basis of both a limited lifetime and an infinite lifetime for long-term preservation of large-scale research data.

Research Data Management (institutional stewardship of research data)

Background

Members of the CU Boulder research community routinely produce large amounts of data that need to be stored, analyzed, and preserved. These research data sets and their derivative output (e.g., publications, visualizations, etc.) represent the intellectual capital of the University; they have inherent and enduring value and must be preserved and made readily accessible for reuse by future researchers. Today's interdisciplinary research challenges cannot be addressed without the ability to combine data from disparate disciplines. Researchers need to know: (1) what relevant data exist, (2) how to retrieve them, (3) how to combine them, and (4) how to mine and analyze them using the latest data mining, analysis, and visualization tools. Granting agencies understand this fundamental scientific need and are increasingly making it a condition of funding that researchers have a plan for preserving their data and for making them discoverable and available for reuse by other researchers. To keep CU Boulder competitive, the research computing group will develop baseline data services that respond to these new realities. The proposed CU Research Data Archive is a suite of three core services designed to support the needs of modern researchers:

- Data Analysis and Visualization
- Data Curation
- Data Discovery and Integration

These services complement each other and provide a horizontal stack of data services covering both active contemporary use and preservation for future use.

Data Analysis and Visualization

Baseline data analysis and visualization services will be provided to all CU researchers as one of the core services of the CU Research Data Archive.
Discovering the knowledge buried in large data collections is a team effort that includes the researchers who created or gathered the data, the staff who host the data, and the specialists who can analyze and visualize them. There are several aspects to this work:

- Data migration, upload, metadata creation, and management: bringing data into active disk areas where they can be accessed, synthesized, and used.
- Interface creation: adding front ends for either the data owners or their designated audiences to access and manipulate the data.
- Data analysis and mining: providing services that use advanced statistical and database processes to create usable data sets out of raw data.
- Database/management tools selection (Oracle, MySQL, SRB, etc.): helping data owners and users understand the options at their disposal and choose the most appropriate tools for their needs.
- Distributed data management: working with data owners and researchers who have data scattered across different sources and locations, synthesizing them into a more coherent working environment.
- Database application tuning and optimization: providing ongoing advanced database support for a myriad of activities.
- Schema design and SQL query tuning: helping with advanced data-searching services for a wide variety of data.

These tasks are all necessarily active in nature, and involve researchers and service providers working directly with the data on a nearly continuous basis. Only by doing this can they give users the ability to organize, process, and manage large quantities of research data into collections for data-driven discovery. The visualization services at CU will provide users with a wide range of tools and services to aid their scientific research.

Data Curation

Data curation encompasses the following three concepts:

- Curation: the activity of managing and promoting the use of data from their creation, to ensure they are fit for contemporary use and available for discovery and reuse. For dynamic data sets this may mean continuous updating or monitoring to keep them fit for future research. Higher levels of curation can involve maintaining links and annotations with published materials.
- Archiving: a curation activity that ensures that data are properly selected, appraised, stored, and made accessible, and that the logical and physical integrity of the data, including security and authenticity, are maintained over time.
- Preservation: an archiving activity in which specific items or collections are maintained over time so that they can be accessed and remain viable in subsequent technology environments.
It is important to note that archiving and preservation are subsets of the larger curation process, which is much broader, planned, and interactive. Data curation is critically important for a research institution because it provides two vital services needed to ensure data longevity:

- Data are not merely stored, but are preserved to overcome the technical obsolescence inherent and unavoidable in any storage system.
- Data are documented in such a way that they can be linked in scientific publications and meet the requirements of funding agencies.

Staff of the CU Libraries and CU research computing would jointly provide the curation service component of the CU Research Data Archive. The CU Libraries would provide curatorial
oversight, bibliographic control, and integration services. CU research computing staff would provide the back-end technology services needed to actively maintain the data and the storage systems holding them. Staff from both organizations will provide the metadata services necessary to ensure that data remain discoverable and accessible. The data themselves would be housed on campus, in the CU campus storage facility.

It should be noted that this is merely the first level of storage needed. For true long-term preservation, it is essential to plan for storage that is not on campus; otherwise, data always depend on a single point of failure and are highly vulnerable. Baseline investments are required to establish geographically distributed replicas of the data. Since data are inextricably dependent on a mediating technological infrastructure, and subject to loss from environmental, organizational, or technological disruptions, vital campus research data must be replicated at two or more remote sites that are geographically, organizationally, and technically independent of each other. The entire enterprise must be anchored by a reliable and predictable baseline source of revenue, as even a temporary interruption of proactive curation activities can lead to irreparable loss. For this reason, another layer of service is required that stores exact duplicates of the data offsite. Research computing is working with NCAR to enable this type of storage.

Data Discovery and Integration

The Research Data Depot would provide a portal to facilitate the discovery of, and access to, the research data held in the Research Data Archive. This service would include facilities for the registration and description of collections, services to support the submission of collections, and assistance with the use, reuse, and amalgamation of data sets for further research.
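The replication requirement (independent remote copies, verified so that corruption in transit or at rest is caught) can be sketched in outline. The site paths below are hypothetical placeholders, not actual CU or NCAR mount points:

```python
# Sketch of checksum-verified replication to independent remote sites.
# "replica_site_a"/"replica_site_b" are hypothetical mount points.
import hashlib
import pathlib
import shutil

REMOTE_SITES = ["replica_site_a", "replica_site_b"]

def sha256(path):
    """Stream a file through SHA-256 in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def replicate(source):
    """Copy a file to every remote site and verify each copy by checksum."""
    src = pathlib.Path(source)
    expected = sha256(src)
    for site in REMOTE_SITES:
        dest = pathlib.Path(site) / src.name
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)
        if sha256(dest) != expected:  # catch corruption during the copy
            raise IOError(f"checksum mismatch at {site}")
    return expected
```

In production this role would be filled by a dedicated replication service rather than ad hoc copies, but the invariant is the same: no replica counts until its checksum matches the original.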
The portal would assign persistent identifiers to each data collection; provide the ability to search across all registered collections; and link each collection to its author(s), to the resultant analyses and visualizations, and to its published output through integration of portal content with traditional library discovery tools and databases. Where appropriate, the contents of the CU Research Data Archive would be offered for harvesting and crawling by external discovery tools such as Google or disciplinary content aggregators.

Research Computing Network

Personnel: 1 FTE in support of the network.

Background
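The portal's core bookkeeping (minting a persistent identifier and linking a collection to its authors and published output) might look like the following in outline. The "cu-data" identifier prefix and the example DOI are placeholders, not a real handle or DOI scheme:

```python
# Minimal sketch of the Research Data Depot registry: each collection
# gets a persistent identifier and links to authors and publications.
# The "cu-data:" prefix is a placeholder, not an actual identifier scheme.
import uuid

registry = {}

def register_collection(title, authors, publications=()):
    """Mint a persistent identifier and record the collection's links."""
    pid = f"cu-data:{uuid.uuid4()}"
    registry[pid] = {
        "title": title,
        "authors": list(authors),
        "publications": list(publications),
    }
    return pid

def search(term):
    """Search across all registered collections by title keyword."""
    return [pid for pid, rec in registry.items()
            if term.lower() in rec["title"].lower()]

pid = register_collection("Boulder Creek hydrology observations",
                          ["A. Researcher"],
                          ["doi:10.0000/example"])  # placeholder DOI
print(search("hydrology") == [pid])  # True
```

A production portal would back this with durable storage and a real identifier service (e.g. handles or DOIs), but the data model — identifier, description, author links, publication links — is the one described above.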
The current CU Boulder network is a 1 Gb/s (gigabit per second, a mere 10^9 bits per second) fiber core with Cat 5 or higher wiring throughout campus. We have created a 10 Gb/s research computing core to connect the supercomputer to storage, bring individual dedicated 10 Gb/s circuits to various locations as needed, and create a 40 Gb/s circuit between the supercomputer and the storage to be located at NCAR. CU Boulder participates in I2 (Internet2, the higher education, government, and vendor research computing consortium) and is an active member of regional gigapops (high-speed networking points of presence) and other networks, as described in the following subsections. The NSO will be able to get access to fiber to support all connectivity needs within the Boulder area and to the national networking infrastructure. CU Boulder's research computing group will provide all campus researchers with a leading-edge network that meets their needs and facilitates collaboration, high-performance data exchange, access to colocation facilities, remote mounts to storage, and real-time communications. To that end, we are currently building a Research Cyberinfrastructure Network (RCN) that will be used by every research lab whose requirements go beyond the standard production network. The RCN will complement the standard production network and will be designed for ultra-high performance; it should be the first campus environment for implementing newer technologies before they are adopted into the standard production network. The funding and access philosophy should aim to encourage usage of the network.

Computational Resources

Supercomputer

The Dell supercomputer is currently listed as the 31st-fastest computer in the world on top500.org. The machine was set up in Texas, where Michael Obert ran the High Performance Linpack benchmark, a program that solves a very large system of equations, to get a performance of teraflops, or trillions of floating-point operations per second.
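The practical difference between the 1 Gb/s production core and the 10 and 40 Gb/s research circuits is easiest to see as the time to move a fixed data volume. A quick sketch at line rate, ignoring protocol overhead:

```python
# Ideal transfer time for 1 TB at the link speeds mentioned above.
# Line rate only; real throughput is lower due to protocol overhead.
def transfer_hours(terabytes, gigabits_per_second):
    bits = terabytes * 8 * 10**12              # 1 TB = 8e12 bits (decimal TB)
    seconds = bits / (gigabits_per_second * 10**9)
    return seconds / 3600

for gbps in (1, 10, 40):
    print(f"{gbps:>2} Gb/s: {transfer_hours(1, gbps):.2f} h")
# 1 Gb/s takes about 2.22 h per TB; 40 Gb/s cuts that to ~3.3 minutes.
```

At multi-terabyte data set sizes, this is the difference between overnight transfers and interactive workflows, which is why the RCN targets dedicated high-speed circuits for data-intensive labs.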
The computer consists of 342 blade chassis of cloud-edge servers; each chassis holds 4 servers, for a total of 1,368 servers. Each server has 2 sockets with six CPU cores per socket, for a total of 16,416 cores.

Condo model

We are working on a pilot project with the Institute of Cognitive Sciences (ICS). The current approach is to incentivize ICS to shut down its existing resource in the CINC building and move to a condo cluster managed by Research Computing (RC). We are developing, with ICS, policies that will move researchers from their local resources to the centralized RC resources.
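The core count follows directly from the chassis arithmetic above; a one-line check:

```python
# Core count of the supercomputer from its stated configuration.
chassis, servers_per_chassis = 342, 4
sockets_per_server, cores_per_socket = 2, 6

servers = chassis * servers_per_chassis              # 342 * 4
cores = servers * sockets_per_server * cores_per_socket

print(servers, cores)  # 1368 16416
```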
More informationData Virtualization Implementation Methodology and Best Practices
White Paper Data Virtualization Implementation Methodology and Best Practices INTRODUCTION Cisco s proven Data Virtualization Implementation Methodology and Best Practices is compiled from our successful
More informationUniversity of British Columbia Library. Persistent Digital Collections Implementation Plan. Final project report Summary version
University of British Columbia Library Persistent Digital Collections Implementation Plan Final project report Summary version May 16, 2012 Prepared by 1. Introduction In 2011 Artefactual Systems Inc.
More informationNetwork Performance, Security and Reliability Assessment
Network Performance, Security and Reliability Assessment Presented to: CLIENT NAME OMITTED Drafted by: Verteks Consulting, Inc. 2102 SW 20 th Place, Suite 602 Ocala, Fl 34474 352-401-0909 ASSESSMENT SCORECARD
More informationEMC Backup and Recovery for Microsoft Exchange 2007
EMC Backup and Recovery for Microsoft Exchange 2007 Enabled by EMC CLARiiON CX4-120, Replication Manager, and Hyper-V on Windows Server 2008 using iscsi Reference Architecture Copyright 2009 EMC Corporation.
More informationEMC Virtual Infrastructure for Microsoft SharePoint Server 2010 Enabled by EMC CLARiiON and VMware vsphere 4
EMC Virtual Infrastructure for Microsoft SharePoint Server 2010 Enabled by EMC CLARiiON and VMware vsphere 4 A Detailed Review EMC Information Infrastructure Solutions Abstract Customers are looking for
More informationOracle Database and Application Solutions
Oracle Database and Application Solutions Overview The success of Oracle s products is based on three principles: Simplify Enterprises must increase the speed of information delivery with Integrated Systems,
More informationIntroduction to Grid Computing
Milestone 2 Include the names of the papers You only have a page be selective about what you include Be specific; summarize the authors contributions, not just what the paper is about. You might be able
More informationBig Data 2015: Sponsor and Participants Research Event ""
Big Data 2015: Sponsor and Participants Research Event "" Center for Large-scale Data Systems Research, CLDS! San Diego Supercomputer Center! UC San Diego! Agenda" Welcome and introductions! SDSC: Who
More informationTransform your bottom line: 5G Fixed Wireless Access
Transform your bottom line: 5G Fixed Wireless Access Transform Your Bottom Line: 5G Fixed Wireless Access 1 Seizing the opportunity of 5G with Fixed Wireless Access To get a sense of the future of broadband,
More informationSimplifying Downtime Prevention for Industrial Plants. A Guide to the Five Most Common Deployment Approaches
Simplifying Downtime Prevention for Industrial Plants A Guide to the Five Most Common Deployment Approaches Simplifying Downtime Prevention for Industrial Plants: A Guide to the Five Most Common Deployment
More informationMySQL for Database Administrators Ed 4
Oracle University Contact Us: (09) 5494 1551 MySQL for Database Administrators Ed 4 Duration: 5 Days What you will learn The MySQL for Database Administrators course teaches DBAs and other database professionals
More informationSurvey of Research Data Management Practices at the University of Pretoria
Survey of Research Data Management Practices at the University of Pretoria Undertaken by the Department of Library Services in order to improve research practices at the University Unisa Library Open Access
More informationAdvanced Solutions of Microsoft SharePoint Server 2013 Course Contact Hours
Advanced Solutions of Microsoft SharePoint Server 2013 Course 20332 36 Contact Hours Course Overview This course examines how to plan, configure, and manage a Microsoft SharePoint Server 2013 environment.
More informationData Curation Profile Human Genomics
Data Curation Profile Human Genomics Profile Author Profile Author Institution Name Contact J. Carlson N. Brown Purdue University J. Carlson, jrcarlso@purdue.edu Date of Creation October 27, 2009 Date
More informationChain of Preservation Model Diagrams and Definitions
International Research on Permanent Authentic Records in Electronic Systems (InterPARES) 2: Experiential, Interactive and Dynamic Records APPENDIX 14 Chain of Preservation Model Diagrams and Definitions
More informationIT Town Hall Meeting
IT Town Hall Meeting Scott F. Midkiff Vice President for Information Technology and CIO Professor of Electrical and Computer Engineering Virginia Tech midkiff@vt.edu October 20, 2015 10/20/2015 IT Town
More informationKroll Ontrack VMware Forum. Survey and Report
Kroll Ontrack VMware Forum Survey and Report Contents I. Defining Cloud and Adoption 4 II. Risks 6 III. Challenging Recoveries with Loss 7 IV. Questions to Ask Prior to Engaging in Cloud storage Solutions
More informationSurveillance Dell EMC Storage with IndigoVision Control Center
Surveillance Dell EMC Storage with IndigoVision Control Center Sizing Guide H14832 REV 1.1 Copyright 2016-2017 Dell Inc. or its subsidiaries. All rights reserved. Published May 2016 Dell believes the information
More informationReducing Costs in the Data Center Comparing Costs and Benefits of Leading Data Protection Technologies
Reducing Costs in the Data Center Comparing Costs and Benefits of Leading Data Protection Technologies November 2007 Reducing Costs in the Data Center Table of Contents The Increasingly Costly Data Center...1
More informationNational Data Sharing and Accessibility Policy-2012 (NDSAP-2012)
National Data Sharing and Accessibility Policy-2012 (NDSAP-2012) Department of Science & Technology Ministry of science & Technology Government of India Government of India Ministry of Science & Technology
More informationReal-time Protection for Microsoft Hyper-V
Real-time Protection for Microsoft Hyper-V Introduction Computer virtualization has come a long way in a very short time, triggered primarily by the rapid rate of customer adoption. Moving resources to
More informationIntegrated Data Management:
Integrated Data Management: An Innovative Approach to Improved IT Productivity and Flexibility VS Joshi, Ravi Chalaka, Network Appliance, April 2006, TR-3467 Executive Summary Typical IT organizations
More informationDell EMC Storage with the Avigilon Control Center System
Dell EMC Storage with the Avigilon Control Center System Surveillance July 2018 H15398.4 Sizing Guide Abstract This guide helps you understand the benefits of using a Dell EMC storage solution with Avigilon
More informationBlackPearl Customer Created Clients Using Free & Open Source Tools
BlackPearl Customer Created Clients Using Free & Open Source Tools December 2017 Contents A B S T R A C T... 3 I N T R O D U C T I O N... 3 B U L D I N G A C U S T O M E R C R E A T E D C L I E N T...
More informationVeritas NetBackup Appliance Family OVERVIEW BROCHURE
Veritas NetBackup Appliance Family OVERVIEW BROCHURE Veritas NETBACKUP APPLIANCES Veritas understands the shifting needs of the data center and offers NetBackup Appliances as a way for customers to simplify
More informationEnterpriseLink Benefits
EnterpriseLink Benefits GGY a Moody s Analytics Company 5001 Yonge Street Suite 1300 Toronto, ON M2N 6P6 Phone: 416-250-6777 Toll free: 1-877-GGY-AXIS Fax: 416-250-6776 Email: axis@ggy.com Web: www.ggy.com
More informationBy 2014, World-Wide file based
By 2014, World-Wide file based Organizations that need to manage large quantities of data continually face an ongoing challenge: managing and storing more data. As your storage increases, you encounter
More informationWHITE PAPER Cloud FastPath: A Highly Secure Data Transfer Solution
WHITE PAPER Cloud FastPath: A Highly Secure Data Transfer Solution Tervela helps companies move large volumes of sensitive data safely and securely over network distances great and small. We have been
More informationVeritas InfoScale Enterprise for Oracle Real Application Clusters (RAC)
Veritas InfoScale Enterprise for Oracle Real Application Clusters (RAC) Manageability and availability for Oracle RAC databases Overview Veritas InfoScale Enterprise for Oracle Real Application Clusters
More informationPUT DATA PROTECTION WHERE YOU NEED IT
from Carbonite PUT DATA PROTECTION WHERE YOU NEED IT Flexibility is your best friend when it comes to choosing a backup plan. For a backup solution to be considered flexible, it needs to satisfy several
More informationScientific data processing at global scale The LHC Computing Grid. fabio hernandez
Scientific data processing at global scale The LHC Computing Grid Chengdu (China), July 5th 2011 Who I am 2 Computing science background Working in the field of computing for high-energy physics since
More informationConnecticut Department of Department of Administrative Services and the Broadband Technology Opportunity Program (BTOP) 8/20/2012 1
Connecticut Department of Department of Administrative Services and the Broadband Technology Opportunity Program (BTOP) 8/20/2012 1 Presentation Overview What is BTOP? Making BTOP work for our state What
More informationarcserve r16.5 Hybrid data protection
arcserve r16.5 Hybrid data protection Whether you re protecting the data center, remote offices or desktop resources, you need a solution that helps you meet today s demanding service-level agreements
More informationArchiving, Backup, and Recovery for Complete the Promise of Virtualisation Unified information management for enterprise Windows environments
Archiving, Backup, and Recovery for Complete the Promise of Virtualisation Unified information management for enterprise Windows environments The explosion of unstructured information It is estimated that
More information