This document will go through the steps to run a WRF model for South Idaho for a 3-day forecast. Other WRF runs will be similar, but not the same.


Running WRF on R1

Introduction

This document will go through the steps to run a WRF model for South Idaho for a 3-day forecast. Other WRF runs will be similar, but not the same.

Runtime Notes

Note: the daily modeling process has been scripted and can be found in /cm/shared/scripts/builds/wrf/gcc/3.6/run_scripts

Setting up your environment

module load gcc/4.8.4 openmpi/gcc pbspro
export WRF_BASE_DIR=<PATH TO WRF INSTALL>
export WRF_VERSION=3.7
export WRF_RUN_DIR=$WRF_BASE_DIR/$WRF_VERSION
source $WRF_RUN_DIR/WPS/WRF_Set_Env.bash
export Run_Date=`date +"%Y%m%d"`
export Log_File=$WRF_RUN_DIR/DAILY_LOGS/WRF_$Run_Date.log
export WORKING_DATE=`date +"%Y-%m-%d"`

Getting GFS data

cd $WRF_RUN_DIR/WPS_GEOG
mkdir $WRF_RUN_DIR/WPS_GEOG/S-Idaho
cd S-Idaho
/cm/shared/scripts/builds/wrf/gcc/3.6/run_scripts/wrf_get_gfs_parallel.sh ${Run_Date}

This will call a script to get the GFS data in parallel.

Setting up your WPS and namelist.wps

If you haven't already done so, back up your WPS directory as a template:

cd $WRF_RUN_DIR
cp -r WPS WPS-Template
cd $WRF_RUN_DIR/WPS
cp /cm/shared/scripts/builds/wrf/gcc/3.6/namelists/namelist.wps-2-domain-72hr-template namelist.wps
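These exports are needed in every session (and in any cron job that drives the daily run), so it can be convenient to keep them in a single sourceable file. Below is a minimal sketch, assuming the module names and directory layout shown above; WRF_BASE_DIR is a placeholder you must point at your own install, and WRF_Set_Env.bash is the script shipped with this particular setup.

#!/bin/bash
# wrf_env.bash -- minimal sketch of the environment setup above.
# Usage: source wrf_env.bash

module load gcc/4.8.4 openmpi/gcc pbspro

export WRF_BASE_DIR=${WRF_BASE_DIR:-/path/to/wrf/install}   # placeholder; set to your install
export WRF_VERSION=3.7
export WRF_RUN_DIR=$WRF_BASE_DIR/$WRF_VERSION

source $WRF_RUN_DIR/WPS/WRF_Set_Env.bash

# Dates used throughout the daily run
export Run_Date=$(date +"%Y%m%d")        # e.g. 20150723
export WORKING_DATE=$(date +"%Y-%m-%d")  # e.g. 2015-07-23

# Per-day log file
mkdir -p $WRF_RUN_DIR/DAILY_LOGS
export Log_File=$WRF_RUN_DIR/DAILY_LOGS/WRF_$Run_Date.log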

This will copy a template namelist.wps that has the necessary settings to do a daily forecast. Below is a finished namelist.wps; the areas you will need to edit are highlighted (the start/end dates, the domain location, and the geography/table paths).

&share
 wrf_core = 'ARW',
 max_dom = 2,
 start_date = '2015-07-23_06:00:00','2015-07-23_06:00:00',
 end_date = '2015-07-26_06:00:00','2015-07-26_06:00:00',
 interval_seconds = 10800,
 io_form_geogrid = 2,
 debug_level = 0
/

&geogrid
 parent_id = 0,1,
 parent_grid_ratio = 1,3,
 i_parent_start = 1,30,
 j_parent_start = 1,18,
 e_we = 120,211,
 e_sn = 80,142,
 geog_data_res = '30s','30s',
 dx = 9000,
 dy = 9000,
 map_proj = 'lambert',
 ref_lat = ,
 ref_lon = ,
 truelat1 = ,
 truelat2 = ,
 stand_lon = ,
 ref_x = 60.0,
 ref_y = 40.0,
 geog_data_path = '/cm/home/ggrentz/wrf/3.7/wps_geog',
 opt_geogrid_tbl_path = '/cm/home/ggrentz/wrf/3.7/wps/geogrid',
/

&ungrib
 out_format = 'WPS',
 prefix = 'GFS1',
/

&metgrid
 fg_name = 'GFS1',
 io_form_metgrid = 2,
 opt_metgrid_tbl_path = '/cm/home/ggrentz/wrf/3.7/wps/metgrid',
/

WPS: now you are ready to run geogrid:

./geogrid.exe

Link the ungrib data:

ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
./link_grib.csh $WRF_RUN_DIR/WPS_GEOG/S-Idaho/

Run ungrib:

./ungrib.exe

Run metgrid:

./metgrid.exe

WRFV3: if you haven't already, back up your WRFV3/run directory:
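For the daily forecast the only namelist.wps entries that normally change are the start and end dates. A minimal sketch of advancing them automatically with sed is shown below; it assumes namelist.wps is formatted as in the listing above (one entry per line, 06 UTC start, 72-hour run). The production run scripts presumably do something equivalent.

# Move the start date to today at 06 UTC and the end date 3 days later.
START=$(date +"%Y-%m-%d")_06:00:00
END=$(date -d "+3 days" +"%Y-%m-%d")_06:00:00
sed -i "s/^ start_date.*/ start_date = '${START}','${START}',/" namelist.wps
sed -i "s/^ end_date.*/ end_date   = '${END}','${END}',/" namelist.wps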

cp -r $WRF_RUN_DIR/WRFV3/run $WRF_RUN_DIR/WRFV3/run-Template
cd $WRF_RUN_DIR/WRFV3/run

Now you will create a namelist.input file. Below is an example of an edited /cm/shared/scripts/builds/wrf/gcc/3.6/namelists/namelist.input-2-domain-72hr-Template file; the areas that need editing are highlighted.

&time_control
 run_days = 3,
 run_hours = 0,
 run_minutes = 0,
 run_seconds = 0,
 start_year = 2015, 2015,
 start_month = 07, 07,
 start_day = 23, 23,
 start_hour = 06, 06,
 start_minute = 00, 00,
 start_second = 00, 00,
 end_year = 2015, 2015,
 end_month = 07, 07,
 end_day = 26, 26,
 end_hour = 06, 06,
 end_minute = 00, 00,
 end_second = 00, 00,
 interval_seconds = 10800,
 input_from_file = .true., .true.,
 history_interval = 60, 60,
 frames_per_outfile = 1000, 1000,
 restart = .false.,
 restart_interval = 1440,
 io_form_history = 2,
 io_form_restart = 2,
 io_form_input = 2,
 io_form_boundary = 2,
 debug_level = 0,
/

&domains
 time_step = 54,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 2,
 e_we = 120, 211,
 e_sn = 80, 142,
 e_vert = 40, 40,
 p_top_requested = 5000,
 num_metgrid_levels = 27,
 num_metgrid_soil_levels = 4,
 dx = 9000, 3000,
 dy = 9000, 3000,
 grid_id = 1, 2,
 parent_id = 1, 1,
 i_parent_start = 1, 30,
 j_parent_start = 1, 18,
 parent_grid_ratio = 1, 3,
 parent_time_step_ratio = 1, 3,
 feedback = 0,
 smooth_option = 0,
/

&physics
 mp_physics = 8, 8,
 ra_lw_physics = 3, 3,
 ra_sw_physics = 3, 3,
 radt = 9, 9,
 sf_sfclay_physics = 2, 2,
 sf_surface_physics = 2, 2,
 bl_pbl_physics = 2, 2,
 bldt = 0, 0,
 cu_physics = 1, 0,
 cudt = 5, 5,
 surface_input_source = 1,
 num_soil_layers = 4,
 sf_urban_physics = 0, 0,
/

&dynamics
 w_damping = 0,
 diff_opt = 1, 1,
 km_opt = 4, 4,
 diff_6th_opt = 0, 0,
 diff_6th_factor = 0.12, 0.12,
 base_temp = 290.,
 damp_opt = 0,
 zdamp = 5000., 5000.,
 dampcoef = 0.2, 0.2,
 khdif = 0, 0,
 kvdif = 0, 0,
 non_hydrostatic = .true., .true.,
 moist_adv_opt = 1, 1,
 scalar_adv_opt = 1, 1,
/

&bdy_control
 spec_bdy_width = 5,
 spec_zone = 1,
 relax_zone = 4,
 specified = .true., .false.,
 nested = .false., .true.,
/

&grib2
/

&namelist_quilt
 nio_tasks_per_group = 0,
 nio_groups = 1,
/

Link the metgrid data and run real.exe:

ln -sf ../../WPS/met_em.d0* .
./real.exe

Now you are ready to run wrf.exe. Below is a PBS submit script for running wrf.exe on the R1 cluster.

#!/bin/bash
# Submit command is: qsub PBS_WRF-cpu.bash
# Set the job name
#PBS -N WRFDaily
# Specify a queue
#PBS -q cpu_amd_smp
#PBS -j oe
# Specify the resources you would like
#PBS -l select=4:ncpus=16:mpiprocs=16:mem=24gb
##PBS -l select=64
#PBS -S /bin/bash
#PBS -v PATH,LD_LIBRARY_PATH

# Use the current working directory
cd $PBS_O_WORKDIR

# Load source system default modules
echo "Sourcing (loading) default modules \n"
. /etc/profile.d/modules.sh
source ~/WRF/3.7/WPS/WRF_Set_Env.bash

# Add modules you need
echo "Adding required job modules \n"
module add shared
module add pbspro
module add gcc
module add openmpi/gcc

# Output which nodes the job used
echo "================================================================"
echo "PBS_NODEFILE:"
cat $PBS_NODEFILE
echo "================================================================"
echo JOB_ID=$PBS_JOBID
echo QUEUE=$PBS_QUEUE
echo PBS_O_HOME=$PBS_O_HOME
echo PBS_O_LOGNAME=$PBS_O_LOGNAME
echo PBS_O_PATH=$PBS_O_PATH
echo PBS_O_SHELL=$PBS_O_SHELL
echo PBS_O_HOST=$PBS_O_HOST
echo PBS_WORKDIR=$PBS_O_WORKDIR
echo PBS_ENVIRONMENT=$PBS_ENVIRONMENT
echo MPI_HOME=$MPI_HOME
echo LOADEDMODULES=$LOADEDMODULES

# Use mpirun to process the job
echo "***** Starting the MPI process \n"
mpirun -v wrf.exe

Post-Processing

Submit the script to the cluster:

qsub PBS_WRF-cpu.bash

Now you should have two wrfout files in your $WRF_RUN_DIR/WRFV3/run directory. With these you can generate images of the model.
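Once the job is submitted you can watch its progress and confirm that the run finished before starting the plotting. A short sketch, assuming the job was submitted with the script above and that you are in $WRF_RUN_DIR/WRFV3/run (the "SUCCESS COMPLETE WRF" line is what WRF writes to its rsl logs on a clean finish):

qstat -u $USER                        # is WRFDaily still queued or running?
tail -f rsl.out.0000                  # follow the lead MPI rank's log
grep "SUCCESS COMPLETE WRF" rsl.out.0000 && echo "run finished cleanly"
ls -lh wrfout_d0*                     # expect one wrfout file per domain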

unset NCARG_ROOT
cp /cm/data1/aflores/wrf-Scripts/Run_Scripts/ncl_plotting_scripts/wrf_Temperature.ncl.template wrf_temperature.ncl
sed -i 's Working-Date '$WORKING_DATE' g' wrf_temperature.ncl
sed -i 's _06: _'$Offset': g' wrf_temperature.ncl
mkdir output_$WORKING_DATE
mkdir output_$WORKING_DATE/temp
module load ncarg ffmpeg
ncl wrf_temperature.ncl

($Offset in the second sed is the run's start hour, e.g. 06.)

Here you can stitch together the images created to make a video:

cd output_$WORKING_DATE/temp
ffmpeg -r 8 -i frame%05d.png -c:v libx264 -r 8 -pix_fmt yuv420p -s 1920x1080 output.mp4
ffmpeg -r 8 -i frame%05d.png -c:v libtheora -r 8 -pix_fmt yuv420p -s 1920x1080 -q:v 5 output.ogg
mv -f output.mp4 /cm/data1/aflores/s-idaho3day/wrf_${WORKING_DATE}_Temperature.mp4
mv -f output.ogg /cm/data1/aflores/s-idaho3day/wrf_${WORKING_DATE}_Temperature.ogg

Now you should have an MP4 and an OGG movie file of the temperature over South Idaho for the next 3 days.
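If you want movies for more than one field, the same copy / sed / ncl / ffmpeg sequence can be wrapped in a loop. The sketch below is illustrative only: it assumes hypothetical templates named wrf_<Variable>.ncl.template alongside the temperature one (only wrf_Temperature.ncl.template is confirmed above) and that each script writes its frames to output_$WORKING_DATE/temp.

TEMPLATE_DIR=/cm/data1/aflores/wrf-Scripts/Run_Scripts/ncl_plotting_scripts
OUT_DIR=/cm/data1/aflores/s-idaho3day

for VAR in Temperature Precipitation Wind; do   # variable list is hypothetical
    cp $TEMPLATE_DIR/wrf_${VAR}.ncl.template wrf_${VAR}.ncl
    sed -i 's Working-Date '$WORKING_DATE' g' wrf_${VAR}.ncl
    ncl wrf_${VAR}.ncl
    cd output_$WORKING_DATE/temp
    ffmpeg -r 8 -i frame%05d.png -c:v libx264 -r 8 -pix_fmt yuv420p -s 1920x1080 output.mp4
    mv -f output.mp4 $OUT_DIR/wrf_${WORKING_DATE}_${VAR}.mp4
    rm -f frame*.png                  # clear the frames before the next variable
    cd ../..
done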
