1 Chemistry Packages at CHPC Anita M. Orendt Center for High Performance Computing Fall 2011

2 Purpose of Presentation: Identify the computational chemistry software and related tools currently available at CHPC; present a brief overview of these packages; present how to access the packages at CHPC. Next presentation: Gaussian09 & GaussView, 17 November 2011.

3 Brief Overview of CHPC Resources. Computational clusters: sanddunearch, updraft and ember. Home directory: NFS mounted on all clusters, /uufs/chpc.utah.edu/common/home/<unid>, generally not backed up (there are exceptions). Scratch systems: /scratch/serial (all clusters, 16 TB, NFS), /scratch/general (updraft only, 3.5 TB, NFS), /scratch/ibrix/chpc_gen (updraft and ember, 60 TB), and /scratch/local on the compute nodes (60 GB sanddunearch, 200 GB updraft, 400 GB ember). Applications: /uufs/arches/sys/pkg, /uufs/chpc.utah.edu/sys/pkg, and /uufs/cluster.arches/sys/pkg where cluster = sanddunearch, updraft or ember.

4 Sand Dune Arch: 156 nodes/624 cores, Infiniband and GigE; owner nodes (Turret Arch, Telluride, Meteorology). Updraft: 256 nodes/2048 cores, Infiniband and GigE; 85 nodes general usage. Ember: about 334 nodes/4008 cores, Infiniband and GigE; 53 nodes general usage; GPU nodes (6 CHPC). Administrative nodes; NFS home directories; NFS scratch systems (serial on all clusters, general on updraft); IBRIX /scratch/ibrix/chpc_gen on ember and updraft.

5 Getting Started at CHPC. Account application is an online process; your username is your unid, with the password administered by campus. Interactive nodes: two CHPC-owned nodes per cluster (cluster.chpc.utah.edu) with round-robin access to divide the load. CHPC environment scripts are placed in your account when it is created. A getting started guide is available on the CHPC web pages. Problems can be reported through the problem reporting system or by email.
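For example, logging in to one of the ember interactive nodes from a Linux or Mac terminal looks like the following (the unid shown is a placeholder):

    ssh u0123456@ember.chpc.utah.edu
    # add -X if you need X11 forwarding for graphical tools such as GaussView
    ssh -X u0123456@ember.chpc.utah.edu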

6 Interactive Node Usage. Interactive nodes are for prepping/testing of input files, analyzing results, compilations, debugging, data transfer, etc. No running of jobs (15 min MAX CPU), and no jobs of any length that negatively impact the ability of other users to get work done (e.g., heavy CPU and/or memory usage, heavy I/O).

7 Batch System. All use of the compute nodes goes through a batch system using Torque (PBS) with Maui (Moab) for scheduling. Typical directives: #PBS -S /bin/csh, #PBS -A account, #PBS -l walltime=24:00:00,nodes=2:ppn=8. Use qsub -I to get interactive use of a compute node (-X if X11 forwarding is needed). Walltime limits: 72 hours on sanddunearch, 24 hours on updraft, 72 hours on ember (long qos by request).
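Put together, a minimal batch script following these directives might look like the sketch below; the job name, account name, input file, and executable line are placeholders, not a specific CHPC application:

    #PBS -S /bin/csh
    #PBS -N testjob
    #PBS -A myaccount
    #PBS -l walltime=24:00:00,nodes=2:ppn=8
    # run from the directory the job was submitted from
    cd $PBS_O_WORKDIR
    # placeholder command; replace with the application being run
    ./my_program < input.dat > output.log

Submit the script with qsub scriptname and monitor it with qstat (or showq).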

8 Allocation Procedure. The allocation process is now handled within jira (pid=10130&issuetype=12). Allocations are on a calendar-quarter basis; the next requests are due about Dec 1 for Jan-Mar 2012. Allocations for the different clusters are kept separate. Sanddunearch is no longer being allocated (all freecycle). One allocation per research group, accessible by all members of the group.

9 Out of Allocation Options. If your group did not request any allocation, you can get a quick allocation (limited to 5000 units) on one of the clusters (pid=10130&issuetype=13). You can run in freecycle mode on sanddunearch; once a job starts it will run until it is done or the requested walltime is reached. You can also run as smithp-guest on the smithp nodes of ember, or freecycle on CHPC nodes on either ember or updraft (these jobs are preemptable).

10 Security Policies (1). No clear-text passwords; use ssh and scp. Do not share your account under any circumstances. Don't leave your terminal unattended while logged into your account. Do not introduce classified or sensitive work onto CHPC systems. Use a good password and protect it; see gate.acs.utah.edu for tips on good passwords.

11 Security Policies (2). Do not try to break passwords, tamper with files, look into anyone else's directory, etc.; your privileges do not extend beyond your own directory. Do not distribute or copy privileged data or software. Report suspicions to CHPC (security@chpc.utah.edu). Please see /docs/policies/security.html for more details.

12 Access to Interactive Nodes. From a Windows machine you need an ssh client such as PuTTY or Xshell 4. For X windowing you also need a tool to display graphical interfaces such as GaussView, ECCE, or AutoDockTools: Xming (Mesa version), XLiveCD, or XWin32 (free through OSL).

13 Default Login Scripts. CHPC maintains default login scripts that set up the necessary environment for batch commands and many of the programs to work: /docs/manuals/getting_started/code/chpc.tcshrc and /docs/manuals/getting_started/code/chpc.bashrc, located in your home directory as .tcshrc or .bashrc. You can comment out setups for packages you do not use. The default ones provided have the chemistry package setups commented out; remove the # at the start of the line to enable them. You can customize by creating a .aliases file that is sourced at the end of the CHPC script; do not customize the .tcshrc/.bashrc file itself, as you will occasionally need to get a new version.
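For instance, in a tcsh setup this amounts to removing the leading # from the package lines you need and keeping personal additions in ~/.aliases; the OpenBabel setup line is taken from later in this presentation, and the alias is only an illustration:

    # in ~/.tcshrc: remove the leading # from the setups you want, e.g.
    source /uufs/chpc.utah.edu/sys/openbabel/etc/babel.csh

    # in ~/.aliases (sourced at the end of the CHPC script): personal customizations
    alias qu 'qstat -u $USER'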

14 Location of Installations. There is a three-layer system for installations: /uufs/chpc.utah.edu/sys/pkg is accessible on all clusters and CHPC-supported linux desktops; /uufs/arches/sys/pkg is accessible on all clusters, but not desktops; /uufs/cluster.arches/sys/pkg gives single-cluster access and is used for builds of packages customized to some feature (usually the MPI) of the given cluster.

15 Software - General Information. Available on the CHPC web pages; reach it from the CHPC home page -> Software Documentation -> More (for the complete list), or directly at /docs/manuals/software. Pages are available for most packages, with information on licensing restrictions, example batch scripts, and where to get more information on a specific package. They also have useful information on running jobs (scaling, issues, etc.) for some applications.

16 Chemistry Packages. Quantum/ab initio codes for molecules; molecular mechanics/dynamics; plane wave codes for solids; support packages (visualization, docking, conversion of file formats, structure database).

17 Electronic Structure Packages. Gaussian03/09, NWChem, GAMESS, Molpro (serial only), Dalton (serial only), Orca, Psi, SIESTA (licensed by an individual research group, but others can be added to the license at no charge), CFOUR.

18 Molecular Mechanics/Dynamics. Amber, Gromacs, NAMD, LAMMPS, CHARMM (licensed by research group).

19 Support Packages. GaussView, Molden, VMD, NBOView, Babel (OpenBabel), Dock and AutoDock, Cambridge Structural Database, and ECCE (special case: installed on carbon.chpc.utah.edu; talk to me first if you want to use it).

20 GaussView. Molecular builder and viewer for Gaussian input/output files. CHPC provides a campus license for the linux version: for Gaussian03 the standard is version 4, for Gaussian09 the standard is version 5. The Chemistry Department has a campus license for GV5 for Windows; you can buy into that license. Access with gv & provided you have uncommented the Gaussian setup in the standard .tcshrc. DO NOT submit jobs from within GaussView; instead create and save the input file and use the batch system.

21 NBOView. Package to view NBO orbitals; works only with G03 (not G09). See the NBO website for more information. How to use at CHPC: add /uufs/chpc.utah.edu/sys/pkg/nboview/nboview1.1 to your path, then start with nboview.

22 Molden. Another program for viewing molecular/electronic structures; version 4.7. Works with GAMESS, Gaussian, and Molpro; supports plots of electronic density, MOs, etc. More information is on the Molden website. How to use at CHPC: make sure your path includes /uufs/chpc.utah.edu/sys/pkg/molden/std, then run molden &.
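In a tcsh session, using the path given above, that looks like:

    set path = ( $path /uufs/chpc.utah.edu/sys/pkg/molden/std )
    molden &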

23 VMD. Visualization, mainly for MM/MD; reads a number of different file formats. Information is on the VMD website. You can either install it on your own desktop (Windows/Mac/linux versions are available) or use the CHPC installation at /uufs/chpc.utah.edu/sys/pkg/vmd/vmd-std/bin/vmd; the latest version (1.9) is installed.

24 ECCE - Extensible Computational Chemistry Environment. Package developed at EMSL at PNNL: a set of modules to manage computational chemistry jobs (Gaussian, NWChem) from start (including building the system) to finish (analyzing the output), all from your desktop system. Installed on carbon.chpc.utah.edu. User accounts need setup; see me if interested, as carbon is not always kept running.

25 OpenBabel. Tool to interconvert structure files between a number of formats used in molecular modeling. To run: source /uufs/chpc.utah.edu/sys/openbabel/etc/babel.csh (or uncomment the line in your .tcshrc), then babel -i<input-type> <infile> -o<output-type> <outfile>. Use babel -H to see the usage format, options, and input/output file types.
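For example, converting a PDB structure to MOL2 format would look like the following (the file names are placeholders):

    source /uufs/chpc.utah.edu/sys/openbabel/etc/babel.csh
    # convert mystructure.pdb (PDB format) to mystructure.mol2 (MOL2 format)
    babel -ipdb mystructure.pdb -omol2 mystructure.mol2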

26 Dock/AutoDock. Programs to look at the binding of a small molecule within the active site of a receptor, usually a macromolecule. Dock version 6.4 is installed (6.5 is current); information is on the Dock website. source /uufs/chpc.utah.edu/sys/pkg/dock/etc/dock.csh, then dock6.mpi to start (needs arguments). AutoDock version 4.2 is installed; information is on the AutoDock website. source /uufs/chpc.utah.edu/sys/pkg/autodock/etc/autodock.csh, then autodock4 (with the proper arguments) or autogrid4. We also have AutoDockTools, a GUI interface, installed; start it with adt &.
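Strung together, a session might look like the sketch below; the input and output file names are placeholders:

    # Dock
    source /uufs/chpc.utah.edu/sys/pkg/dock/etc/dock.csh
    dock6.mpi -i dock.in -o dock.out

    # AutoDock: grid calculation, then docking
    source /uufs/chpc.utah.edu/sys/pkg/autodock/etc/autodock.csh
    autogrid4 -p receptor.gpf -l receptor.glg
    autodock4 -p ligand.dpf -l ligand.dlg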

27 Cambridge Structural Database. Moved from the library to CHPC in summer 2006; additions to the database are made every 3-4 months, and the package is updated annually (see the CCDC website for information). You need a CHPC account to use it, and from a PC you need Xterm/X windowing software (PuTTY/Xming work well). To start a session on any of the interactive nodes: source /uufs/chpc.utah.edu/sys/pkg/csd/std/cambridge/etc/csd.csh (you can uncomment the line in .tcshrc to do this upon login), then cq & to start ConQuest (the search engine) or mercury & to start Mercury (the crystal structure viewer). The first time you use it on a given computer you will be asked to confirm licensing and need to provide the site/license codes (840/7E92A7).

28 Amber. Molecular mechanics/dynamics package; the current version is Amber 11. Basic information on getting started: /docs/manuals/software/amber.html. For more information see the Amber website; the Amber Mail Reflector archive and mailing list are very useful! Different parallel builds are available for each cluster to make use of the different interconnects. We also have a GPU version; initial tests show impressive performance gains.
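A parallel Amber batch job generally follows the pattern sketched below; the account name, setup-script path, process count, and file names are all placeholders (the cluster-specific builds live in different locations, so check the amber.html page for the actual paths and example scripts):

    #PBS -S /bin/csh
    #PBS -A myaccount
    #PBS -l walltime=24:00:00,nodes=2:ppn=8
    cd $PBS_O_WORKDIR
    # source the Amber setup for the cluster being used (placeholder path)
    source /path/to/amber11/setup.csh
    # run parallel sander (pmemd.MPI is analogous); file names are placeholders
    mpirun -np 16 sander.MPI -O -i md.in -o md.out -p prmtop -c inpcrd -r restrt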

29 Other MM/MD Packages. All are built for the individual clusters. Gromacs: recently updated; the installed version has cluster-optimized builds of both single and double precision. LAMMPS: installations need updating if there is interest. NAMD: version 2.6 on sanddunearch (2.7 is current). NWChem also has MM/MD capabilities.

30 CHARMM. The package is licensed per research group and for a specific version. If interested, get a license and we can do an installation such that only your group has access. See the CHARMM website for details.

31 Gaussian03/09. Commercial electronic structure package; see www.gaussian.com for information and the User's Guide. The last installed revision of G03 is E.01, at /uufs/chpc.utah.edu/sys/pkg/gaussian03/std; it has been updated to include the latest NBO5. Version B.01 of G09 is installed (C.01 was just announced), at /uufs/chpc.utah.edu/sys/pkg/gaussian09/amd64 and /uufs/chpc.utah.edu/sys/pkg/gaussian09/em64t. For information on accessing the CHPC installation see /docs/manuals/software/g03.html.
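As a rough sketch of a G09 batch job (details will be covered in the Gaussian09/GaussView presentation and on the g03.html page): the account, scratch directory choice, and input file name below are placeholders, and the Gaussian environment is assumed to have been set up through the uncommented line in your .tcshrc:

    #PBS -S /bin/csh
    #PBS -A myaccount
    #PBS -l walltime=24:00:00,nodes=1:ppn=8
    cd $PBS_O_WORKDIR
    # point Gaussian scratch at one of the scratch file systems (example location)
    setenv GAUSS_SCRDIR /scratch/serial/$USER
    g09 < myjob.com > myjob.log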

32 NWChem. Package developed at PNNL to work on massively parallel systems; the goal is computational chemistry solutions that are scalable with respect to both chemical system size and MPP hardware size. Has quantum mechanics, molecular mechanics/dynamics, quantum molecular dynamics, and plane waves for periodic systems. Version 5.1.1 (with Python) is installed at /uufs/chpc.utah.edu/sys/pkg/nwchem/nwchem-5.1.1/bin/linux64. To run: source the appropriate etc/nwchem.csh; you must have a $HOME/pdir directory if running parallel. More information and an example batch script are at /docs/manuals/software/nwchem.html.
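The core of a parallel run might look like the sketch below; the location of the etc/nwchem.csh setup script, the mpirun invocation, and the input file name are assumptions, so use the example batch script on the nwchem.html page for the cluster-specific form:

    # source the setup script for this cluster (assumed location under the install tree)
    source /uufs/chpc.utah.edu/sys/pkg/nwchem/nwchem-5.1.1/etc/nwchem.csh
    # required for parallel runs
    mkdir -p $HOME/pdir
    # placeholder input file
    mpirun -np 16 nwchem myjob.nw > myjob.out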

33 GAMESS - General Atomic and Molecular Electronic Structure System. Another option for most ab initio quantum calculations. The 22 February 2006 version is installed at /uufs/chpc.utah.edu/sys/pkg/gamess/gamess; it can be updated if needed. See the GAMESS website for information on usage and capabilities. It can run both parallel and serial. For information on accessing the CHPC installation see /docs/manuals/software/gamess.html.
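GAMESS jobs are normally driven through the rungms wrapper script that ships with the package; a minimal serial run under that assumption looks like the following (the version tag and input name are placeholders, and the CHPC-specific invocation is on the gamess.html page):

    set GMSDIR = /uufs/chpc.utah.edu/sys/pkg/gamess/gamess   # installation directory from above
    # run input file myjob.inp with version tag 00 on 1 core
    $GMSDIR/rungms myjob 00 1 > myjob.log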

34 Other Quantum Codes. These are mainly codes that are specialized in terms of functionality. Dalton (serial build only): focus on property calculations at the HF, DFT, MCSCF, and CC levels of theory; version 2.0 is installed, and Dalton2011 was recently released. Molpro (serial build only): emphasis on highly accurate calculation methods. Also Psi, Orca, and CFOUR.

35 Plane Wave Codes. Plane wave codes for the study of systems under periodic boundary conditions (PBC). NWChem. VASP: ab initio QM/MD, licensed per research group. Quantum Espresso: understands crystal space groups and has a GIPAW module to do NMR calculations; J-ICE and XCrySDen can be used to view results. Also CPMD and BigDFT.

36 Schrodinger Suite. Commercial package geared for chemical modeling and simulation in the pharmaceutical field, specifically drug discovery. The interface is Maestro (free for academia); the calculation codes include Jaguar (quantum), Macromodel (MM), and QSite (QM/MM), and it interfaces with Desmond for MD. Tools for structure-based drug design: docking, ligand design, binding affinities, screening libraries. We have a one-year license (Oct 2011); it is token based, so the number of concurrent uses is limited. It was purchased by two groups in Medicinal Chemistry. If you are interested in using it, come talk to me.

37 Finally. Let us know if there is some other package that does something our current packages do not; we can look into the possibility of getting it. Factors: cost, hardware/OS requirements, licensing issues. Any questions, contact me: anita.orendt@utah.edu, Office: 422 INSCC.
