ICS-ACI System Basics

ICS-ACI System Basics
Adam W. Lavely, Ph.D.
Fall 2017
Slides available: goo.gl/ss9itf
awl5173 ICS@PSU

Contents
1 Overview
2 HPC Overview
3 Getting Started on ACI
4 Moving On

Goals for Today
Goal 1: Learn about HPC in general
Goal 2: Learn about ICS-ACI
Goal 3: Learn how to get more information
Notes:
Expect to be overwhelmed; experience is the best teacher
Learn terminology and basic concepts
Spend time playing/learning
Replicate an analysis

High Performance Computing
Compute that is large:
Large numbers of processors: meteorology, nuclear physics, fluid dynamics
Large numbers of runs: genomics, astronomy, structural design
Large memory requirements: financial predictions, molecular analysis
Large storage requirements: big data analysis, machine learning
Large runtimes: optimization, poor coding

High Performance Computing - 2
HPC does not improve every single computational process!
Your Goal: Do research.
Your Task: Find the tools that make your research best/fastest.
Options:
Build or find a tool specific to your problem
Find a multi-purpose (very capable) tool

HPC @ PSU: Clusters
Internal Resources: ICS-ACI is a very capable tool
ACI-B: Batch (high/std/basic memory)
ACI-I: Interactive
Being Rolled Out: Gateways, GPUs
ACI Partitions (CyberLAMP)
Hosted resources (LIGO)
External Resources: ICS is more than ACI
NSF XSEDE
Blue Waters through GLCPC
DoE INCITE

HPC @ PSU: ICS People
ICS Engagement Team:
Adam Lavely; PhD in Aerospace Engineering
Christopher Blanton; PhD in Chemistry
Charles Pavloski; PhD in Meteorology
ICS iask Staff & GAs: from 7 departments across 4 colleges
ICS Staff & GAs: help organizing large proposals
Other (?)
If nothing else, ICS can point you in the right direction.

Getting an Account
Sign up for an account: https://accounts.aci.ics.psu.edu
Non-Faculty Notes:
List a faculty sponsor by their PSU ID
Implicit approval is granted after 2 business days
Your sponsor can grant explicit approval for quicker access
Faculty Notes:
Email iask@ics.psu.edu to grant explicit approval

System Overview
ACI-I: Interactive System (aci-i.aci.ics.psu.edu)
Process on the shared interactive nodes
Available to anybody for use
ACI-B: Batch System (aci-b.aci.ics.psu.edu)
Log in to a head node and submit jobs to compute nodes
Groups can purchase allocations or use the open queue
Both systems share a common file-system.

ACI-I
Log on: either ssh or EoD
Interactive Nodes:
Placed on a node based on load
Run your processes there
Named aci-ic-*
Limits on shared resources: 4 processors, 12 CPU hours per process, 48 GB resident memory
Use this for: debugging, visualization, interaction

ACI-B
Log on: you must ssh (no EoD)
Log-in Nodes:
Placed based on current load
Used for submitting jobs, not running processes
Named aci-lgn-*
Compute Nodes:
Jobs wait until the requested resources are available
Run on dedicated (not shared) resources
You get what you ask for in your submission script
Use this for: production runs; high memory/processor/time usage

File-systems
Home: /storage/home/userid
10 GB, backed up
Cannot be shared
Store important things here
Work: /gpfs/work/userid
128 GB, backed up
Can be shared
Scratch: /gpfs/scratch/userid
1 million file limit (no size limit), not backed up
Files older than 30 days are purged
Run things from here

File-systems - 2
Notes:
Work and scratch should have links in your home directory
Groups can purchase group storage: similar to Work space, but in 5 TB increments
Groups can purchase archival storage: back-up data for the long term
Data is available on datamgr.aci.ics.psu.edu for transfers
Don't leave data on scratch!
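Since scratch files older than 30 days are purged, it helps to check which of your files are at risk before the purge does it for you. A minimal sketch of the idea using a throwaway local directory; on ACI you would point SCRATCH at /gpfs/scratch/$USER instead:

```shell
# Demo of a purge check against a local stand-in for scratch.
SCRATCH=$(mktemp -d)
touch -t 202001010000 "$SCRATCH/old_result.dat"   # file with an old timestamp
touch "$SCRATCH/fresh_result.dat"                 # file modified just now

# List regular files not modified in 30+ days -- the ones a
# 30-day purge policy would remove:
find "$SCRATCH" -type f -mtime +30
```

Only old_result.dat is listed; fresh_result.dat is safe for another 30 days.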

Connecting to Clusters
Ways to Connect: goo.gl/wdkqdi
ssh in a terminal program (mac/linux)
PuTTY or another terminal emulator (any OS, including Windows)
Exceed onDemand (any OS, to ACI-I only)
Connect with ssh: ssh <userid>@<host>
SSH to ACI-B:
$ ssh awl5173@aci-b.aci.ics.psu.edu
SSH to ACI-I:
$ ssh awl5173@aci-i.aci.ics.psu.edu
Notes: a) Use your own username; b) the password doesn't show as you type

Exceed onDemand
Exceed onDemand is a GUI program for accessing ACI-I: goo.gl/z5jqf2
Connecting with EoD:
Hostname: aci-i.aci.ics.psu.edu
PSU User ID and password (with ICS account)
Two-factor authentication
Xconfig: Desktop_Mode_1280x1024.cfg
Xstart: Gnome_Desktop.xs
Notes:
You can have one session open at a time
You can come back to a session (quit, don't log out)
You have the GUI interface, but still need to use the command line

EoD: Finding the Terminal
Applications > System Tools > Terminal

Getting Started: Terminal
Terminal: all functionality via the command line
No difference between an ssh and an EoD terminal
Things to know:
Google is your best friend
The command-line framework, man, and banana (the next three slides)

Command-line framework
Command Prompt: created by the environment with some information
Command: the command to run
Flags: the flags (options) that alter the command's default behavior
Output: what the computer responds with (possibly nothing)
2nd Command Prompt: shows back up when the process has finished
Find total and available allocation hours:
$ mam-list-funds -h
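The same anatomy applies to any command, not just the ACI-specific mam-list-funds. A sketch using the standard du tool (the directory name here is made up for illustration):

```shell
# Anatomy of a command line:  prompt, command, flags, argument, output.
d=$(mktemp -d)                 # a throwaway directory to measure
echo "hello" > "$d/file.txt"

# command: du   flags: -s (summary only) -h (human-readable sizes)
# argument: the directory; output: a size followed by the path
du -sh "$d"
# ...then the prompt returns, showing the process has finished.
```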

man
Most commands come with a manual.
View the manual for the du command:
$ man du
Use q to exit, arrows to move up and down.
Lots of info: skimming man pages is a good skill to develop.

banana
Most commands give a list of options if an improper flag is used.
Find total and available allocation hours:
$ mam-list-funds --banana
Output (on ACI-B):
Unknown option: banana
Usage: mam-list-funds [[-f] *fund_id*] [-A | -I] [-n *fund_name*]
  [-X, --extension *property*=*value*]... [-u *user_name*] [-g *group_name*]
  [-a *account_name*] [-o *organization_name*] [-c *class_name*]
  [-m *machine_name*] [--filter *filter_name*=*filter_value*]...
  [--filter-type ExactMatch | Exclusive | NonExclusive] [--full]
  [--show *attribute_name*,...] [--long] [--wide]
  [--format csv | raw | standard] [--hours] [--debug] [--site *site_name*]
  [--help] [--man] [--quiet] [--version] [--about]
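The improper-flag trick works on standard Linux tools too, not only mam-list-funds. A sketch with ls:

```shell
# Deliberately pass a flag ls does not understand; ls replies with an
# error (and usually a pointer to its usage text) and exits nonzero.
ls --banana 2>&1 || echo "ls rejected the unknown flag"
```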

Basic Commands
List the contents of the current directory:
$ ls
Print the name of the current directory:
$ pwd
Change directory:
$ cd scratch
Copy a file:
$ cp logfile logfile_13sept2017
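The four commands above combine into a typical first session. A sketch in a throwaway directory (the file names are illustrative, echoing the slide):

```shell
d=$(mktemp -d) && cd "$d"          # stand-in for your work directory
printf 'run output\n' > logfile    # pretend this came from a job
cp logfile logfile_13sept2017      # keep a dated copy, as on the slide
pwd                                # print where we are
ls                                 # lists: logfile  logfile_13sept2017
```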

Helpful Command Line Hints
Other commands you may find useful: history, mv, rm, mkdir, find, grep, awk, id, du, clear, env, ssh, more
Special Characters:
~ is your home directory
. means here
.. means up one directory
* is the wildcard: * for all files or *.png for all png files
| is pipe (send the output to another command)
> means write command output to a file ( ls > log.ls )
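The wildcard, pipe, and redirect combine naturally. A small sketch (the file names are made up):

```shell
d=$(mktemp -d) && cd "$d"
touch a.png b.png notes.txt

ls *.png                 # wildcard: only a.png and b.png match
ls *.png | wc -l         # pipe: counts the matches (2)
ls > log.ls              # redirect: write the listing into log.ls
cat log.ls               # the file now holds the directory listing
```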

Modules
Modules are used to allow multiple versions of the same software to live on the system software stack. Load the module of the software you wish to use.
Show currently available modules:
$ module avail
Search for a module:
$ module spider vasp
Load a module:
$ module load ansys/18.1

Modules - 2
ICS uses module families:
Modules built with a compiler module are only available with that compiler loaded
The same goes for parallelization modules
See modules in the GCC family:
$ module avail
$ module load gcc/5.3.1
$ module avail
Other module commands you may find useful: module list, module purge, module show <modulename>

Transferring Data
Methods to transfer files to/from ACI:
Command line: scp, rsync, sftp, etc. (use datamgr.aci.ics.psu.edu)
Local programs: WinSCP, Filezilla, etc.
Online methods: Box, Dropbox (use EoD's Firefox for the online interface; no syncing capability)
Data-specific programs: Globus, Aspera
Copy a file to ACI-B:
$ scp lfile awl5173@aci-b.aci.ics.psu.edu:~/work/.

Connecting with WinSCP/Filezilla:
File protocol: SCP
Host name: datamgr.aci.ics.psu.edu
PSU user name and password
Port 22
Be ready for 2FA
Drag and drop between your computer and the cluster

Submitting a Job on ACI-B
Submission script:
PBS directives
  Give the requested resources to the scheduler
  Start with #PBS
  Appear only at the beginning of the script
Commands
  Same as command line operation
  Have access to PBS variables
  Load modules, go to the correct location, run the code
Submitting a job:
$ qsub subscript.pbs

Sample Matlab Submission Script
#!/bin/bash
#PBS -l nodes=1:ppn=1
#PBS -l walltime=5:00
#PBS -A open

# Get started
echo "Job started on `hostname` at `date`"

# Load in matlab
module purge
module load matlab/r2016a

# Go to the correct place
cd $PBS_O_WORKDIR

# Run the job itself
matlab-bin -nodisplay -nosplash < runthis.m > log.matlabrun

# Finish up
echo "Job ended at `date`"

Getting Help
ICS Documents: ics.psu.edu
Onboarding documentation
Expanding support
iask: https://iask.aci.ics.psu.edu
Email: iask@ics.psu.edu
Phone: 54275
Google:
Linux commands & command line tutorials
Other batch systems (TACC, OSC)

ICS Training Series
Seminar 2: October 20, 27
Submitting jobs: run the same matlab script using the GUI, in batch mode, and as a job
Compiling simple codes: parallel hello world
Allocation usage: basic mam usage for users in allocations
Intro to parallelization: distributed vs. shared memory
Data moving: Globus, rsync
Seminars 1, 2 & 3: Jan/Feb 2018

Seminar 2: Requirements
A laptop with Exceed onDemand installed (the seminar will be interactive)
An account on ACI
Basic terminal skills: be comfortable with basic terminal commands; we will be learning mam-*, qsub, qstat
Ability to use a terminal text editor: vi, emacs, or another of your choice

Fin
Thank you for your time!
iask: https://iask.aci.ics.psu.edu
Email: iask@ics.psu.edu
Phone: 54275