Mechanics of running HWRF

1 Mechanics of running HWRF
HWRF v3.5b Tutorial, Taipei, Taiwan, May 22, 2014
Ligia Bernardet
NOAA ESRL Global Systems Division, Boulder, CO
University of Colorado CIRES, Boulder, CO

2 Overview of this lecture
- Datasets and fix files needed for running HWRF
- Location of source code and compiled executables
- Location and overview of scripts
- Tasks involved in an HWRF run
- Scripts, functions, and wrappers
- Definition of global variables to be used by scripts: paths to input and output, initialization time, storm, etc.
- Namelists and tables
- Location of output files
- Submission of jobs on the TTFRI machine

3 Datasets and fix files
Dataset | Subdirectory | File name
TC Vitals | Tcvitals | syndat_tcvitals.$yyyy
A decks | abdecks | a$basin$sid$yyyy.dat
B decks | abdecks | b$basin$sid$yyyy.dat
GFS analysis & forecast | GFS/gridded/$YYYY$MM$DD$HH | gfs.$yyyy$mm$dd$hh.pgrbf$fff
GFS track | GFS/gridded/$YYYY$MM$DD$HH | avn.$yyyy$mm$dd$HH.cyclone.trackatcfunix
UPP fix | fix/upp | *
UPP coefficients | fix/upp/hwrfspccoeff, fix/upp/hwrftaucoeff | *
WPS fix | wps_geog | landuse_2m, topo_2m, etc.
Location: /lfs/tutorial/hwrf/datasets

4 Structure of compiled source
Component (subdirectory): executables
- WPSV3: geogrid.exe, ungrib.exe, metgrid.exe
- WRFV3 (main): real_nmm.exe, wrf.exe
- WRFV3_Idealized (main): ideal.exe, wrf.exe
- UPP (bin): copygb.exe, ndate.exe, unipost.exe
- gfdlvortextracker (trk_exec): hwrf_gettrk.exe, hwrf_tave.exe, hwrf_vint.exe
- hwrfutilities (exec): diffwrf_3dvar.exe, grbindex.exe, hwrf_anl_4x_step2.exe, hwrf_anl_bogus_10m.exe, hwrf_anl_cs_10m.exe, hwrf_bin_io.exe, hwrf_create_nest_1x_10m.exe, hwrf_create_trak_fnl.exe, hwrf_create_trak_guess.exe, hwrf_data_flag.exe, hwrf_inter_2to1.exe, hwrf_inter_2to2.exe, hwrf_inter_2to6.exe, hwrf_inter_4to2.exe, hwrf_inter_4to6.exe, hwrf_merge_nest_4x_step12_3n.exe, hwrf_pert_ct1.exe, hwrf_prep.exe, hwrf_readtdrstmid.exe, hwrf_readtdrtime.exe, hwrf_split1.exe, hwrf_swcorner_dynamic.exe, hwrf_wrfout_newtime.exe, wgrib.exe
Location: /lfs/tutorial/$user/hwrf_v3.5b/sorc

5 Scripts, wrappers, and functions
- The scripts (ksh93) to run HWRF are located in hwrf-utilities/scripts
- The scripts use a library of functions located in hwrf-utilities/scripts/funcs
- The wrappers are located in hwrf-utilities/wrapper_scripts
- The scripts need to receive global variables with information about the initialization date, which storm to run, etc. Those are passed by the wrappers.
- A generic automation system could be used to drive these scripts (automation is not covered in this tutorial)
Location: /lfs/tutorial/$user/hwrf_v3.5b/sorc/hwrf-utilities

6 Example of a wrapper: hwrf_wrapper
Every wrapper is a ksh93 script that
- Sets global variables (e.g., storm, initialization time, directories for input/output)
- Sets local variables, which are specific to the script that will be called
- Calls a script

    #!/usr/bin/env ksh93

    # Global definitions of environment variables
    . PATH_TO_GLOBAL/global_vars.ksh

    # Local definitions of environment variables
    export WRF_MODE=main
    export WRF_CORES=${HWRF_COUPLED_CORES}
    # export DEBUG=1

    ${HWRF_SCRIPTS}/wrf.ksh

7 Example of a script: hwrfdomain.ksh

    # Main
    function main {
      typeset errno

      # Init the function library
      . ${HWRF_SCRIPTS}/funcs/init
      set -e

      # Check to see if all the variables are set
      check_vars DOMAIN_DATA START_TIME

      # Create a working directory (and cd into it)
      create_work_dir ${DOMAIN_DATA}/messages

      # Create the tcvital and tcvitals.as files
      lat_lon=( $( tcvitals ) )

      # Create the storm centre file
      storm_centre ${lat_lon[@]}

Functions used only once are located in the script that uses them (e.g., storm_centre).
Functions used more than once are located in a library of functions (e.g., check_vars and create_work_dir).
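
The function library itself is not reproduced on the slide. Purely as an illustration of the kind of helper it provides (an assumption, not the actual hwrf-utilities code), a minimal check_vars in ksh93 could look like this:

    # Minimal sketch of a check_vars-style helper (illustrative only;
    # the real hwrf-utilities library may differ).
    function check_vars {
      typeset var value
      for var in "$@"; do
        eval "value=\${${var}}"   # indirect expansion of the variable name
        if [[ -z ${value} ]]; then
          echo "ERROR: required variable ${var} is not set" >&2
          exit 1
        fi
      done
    }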

8 Tasks, wrappers, and scripts I
Task | Wrapper | Script
Determine domain location based on TC Vitals | hwrfdomain_wrapper | hwrfdomain.ksh
Run WPS geogrid | geogrid_wrapper | geogrid.ksh
Run WPS ungrib | ungrib_wrapper | ungrib.ksh
Run WPS metgrid | metgrid_wrapper | metgrid.ksh
Run real | real_wrapper | real.ksh
Run WRF Analysis (90 s) | wrfanalysis_wrapper | wrf.ksh
Run WRF Ghost (90 s) | wrfghost_wrapper | wrf.ksh
Run postprocessor and tracker on WRF Analysis output to find location of vortex in GFS | track_analysis_wrapper | track_analysis.ksh
Location: /lfs/tutorial/$user/hwrf_v3.5b/sorc/hwrf-utilities
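
Assuming each wrapper exits with a nonzero status on failure, the initialization tasks in the table above can be chained interactively with a small driver like the sketch below (illustrative, not part of the HWRF release; $user denotes your tutorial account as elsewhere in this lecture):

    #!/usr/bin/env ksh93
    # Illustrative driver: run the initialization wrappers in order,
    # stopping at the first failure.
    WRAPPER_DIR=/lfs/tutorial/$user/hwrf_v3.5b/hwrf-utilities/wrapper_scripts
    for w in hwrfdomain_wrapper geogrid_wrapper ungrib_wrapper metgrid_wrapper \
             real_wrapper wrfanalysis_wrapper wrfghost_wrapper track_analysis_wrapper
    do
      print "Running ${w}"
      ${WRAPPER_DIR}/${w} || { print -u2 "${w} failed"; exit 1; }
    done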

9 Tasks, wrappers, and scripts II
Task | Wrapper | Script
Run the vortex initialization (stage 1) | relocate1_wrapper | relocate_stage1_3d.ksh
Run the vortex initialization (stage 2) | relocate2_wrapper | relocate_stage2_3d.ksh
Run the vortex initialization (stage 3) | relocate3_wrapper | relocate_stage3_3d.ksh
Merge information on various domains to create IC for WRF | merge_wrapper | merge.ksh
Run main forecast | hwrf_wrapper | wrf.ksh
Run postprocessor | unipost_wrapper | unipost.ksh
Run external vortex tracker | tracker_wrapper | tracker.ksh
Create plots | rungrads_wrapper | run_grads
Location: /lfs/tutorial/$user/hwrf_v3.5b/sorc/hwrf-utilities

10 Defining global variables for scripts I
Wrappers source global_vars.ksh to pass global variables to the scripts. Example values for Soulik 07W (2013):
- START_TIME: Initialization time
- START_TIME_MINUS6: Initialization time minus 6 hr
- FCST_LENGTH = 12: Length of forecast (in hr)
- FCST_INTERVAL = 6: Time between consecutive HWRF cycles (hr)
- STORM_NAME = SOULIK: Storm name
- SID = 07W: Storm ID (L = Atlantic; E = E Pacific; W = W Pacific)
- BASIN = WP: Basin (AL, EP, WP, IO)
Location: /lfs/tutorial/$user/hwrf_v3.5b/sorc/hwrf-utilities/wrapper_scripts/global_vars.ksh
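
For orientation, these entries live in global_vars.ksh as ordinary ksh export statements. A fragment in that style for the Soulik case might read as follows (the initialization time shown is an assumed example value, not taken from the slide):

    # Fragment in the style of wrapper_scripts/global_vars.ksh (illustrative)
    export START_TIME=2013070800      # assumed example value, format YYYYMMDDHH
    export FCST_LENGTH=12             # forecast length in hours
    export FCST_INTERVAL=6            # hours between consecutive HWRF cycles
    export STORM_NAME=SOULIK
    export SID=07W
    export BASIN=WP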

11 Defining global variables for scripts II
Example values for Soulik 07W (2013):
- RUN_PREP_HYB = F: T = use global model data in binary format; F = use global model data in GRIB format
- UPP_PROD_SAT = T: T = UPP outputs synthetic satellite fields
- BKG_MODE = GFS: Select GFS or GDAS for initial conditions
- RUN_GSI = F, RUN_GSI_WRFINPUT = F, RUN_GSI_WRFGHOST = F, INNER_CORE_DA = 0: Disable data assimilation
A full description of global_vars.ksh can be found in the Appendix of the HWRF v3.5a Users Guide.

12 Defining global variables for scripts III
Example values for Soulik 07W (2013):
- HWRF_SRC_DIR = /lfs/tutorial/$user/hwrf_v3.5b/sorc: Path to HWRF source code
- HWRF_DATA_DIR = /lfs/tutorial/hwrf/datasets: Path to HWRF input datasets
- CYCLE_DATA = ${HWRF_OUTPUT_DIR}/${SID}/${START_TIME_MINUS6}: Path to previous cycle output
- DOMAIN_DATA = ${HWRF_OUTPUT_DIR}/${SID}/${START_TIME}: Path to current cycle output
- GFS_DIR = ${HWRF_DATA_DIR}/GFS: Path to top-level GFS input
- GFS_GRIDDED_DIR = ${GFS_DIR}/gridded: Path to GFS gridded data
- TCVITALS = ${HWRF_DATA_DIR}/Tcvitals: Path to TC Vitals files
- GEOG_DATA_PATH = ${HWRF_DATA_DIR}/wps_geog: Path to geographical fix data
- GRADS_BIN = /lfs/tutorial/hwrf/bin: Path to GrADS binaries
- GADDIR = /lfs/tutorial/hwrf/grads/data: Path to GrADS tables

13 Defining global variables for scripts IV
Example values for Soulik 07W (2013):
- MPIRUN = mpiexec: Command to run parallel code
- WRF_ANAL_CORES = 12: Number of cores to run WRF Analysis
- WRF_GHOST_CORES = 12: Number of cores to run WRF Ghost
- HWRF_FCST_CORES = 202: Number of cores to run the HWRF forecast
- GEOGRID_CORES = 12: Number of cores to run geogrid
- METGRID_CORES = 12: Number of cores to run metgrid
- REAL_CORES = 1: Number of cores to run real
- UNI_CORES = 12: Number of cores to run UPP
- ATCFNAME = HCOM: ATCF identifier for the HWRF forecast
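
These variables come together when a run script launches an executable. A schematic of that step, assuming WRF_MODE takes values such as main, analysis, and ghost (this is not the literal wrf.ksh code), is:

    # Schematic: pick the core count for the requested WRF mode and launch
    # the executable with the MPIRUN command (illustrative only).
    case ${WRF_MODE} in
      analysis) ncores=${WRF_ANAL_CORES}  ;;
      ghost)    ncores=${WRF_GHOST_CORES} ;;
      main)     ncores=${HWRF_FCST_CORES} ;;
    esac
    ${MPIRUN} -np ${ncores} ./wrf.exe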

14 Namelists and tables
- hwrf_cntrl.*: Namelists for UPP
- hwrf_namelist.input: Template namelist for WRF runs (ghost, analysis, main)
- hwrf_namelist.wps: Template namelist for WPS (geogrid, ungrib, metgrid)
- hwrf_co2_trans, hwrf_eta_micro_lookup.dat, hwrf_etampnew_data, hwrf_etampnew_data_dbl, hwrf_genparm.tbl, hwrf_landuse.tbl, hwrf_rrtm_data, hwrf_rrtm_data_dbl, hwrf_soilparm.tbl, hwrf_tr49t67, hwrf_tr49t85, hwrf_tr67t85, hwrf_vegparm.tbl: Tables used for WRF runs
- hwrf_grib_reduce.parms: List of variables used to thin the input GFS file to expedite processing
- hwrf_storm_*: Tables used for vortex relocation
Location: /lfs/tutorial/$user/hwrf_v3.5b/sorc/hwrf-utilities/parm

15 WRF namelist template
Some variables are directly defined in the template namelist. Other variables (those with @ in front) are populated when the scripts run:

    @[start:0:4],
    @[start:4:2],
    @[start:6:2],

    num_soil_layers = 4,
    mp_physics      = 85, 85, 85,
    ra_lw_physics   = 98, 98, 98,
    ra_sw_physics   = 98, 98, 98,

The scripts
- Read hwrf-utilities/parm/hwrf_namelist.input
- Modify the namelist according to the global variables defined
- Write the namelist in the working directory
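
As a concrete illustration of how the @[start:...] placeholders could be filled from START_TIME, here is a sketch under the assumption that START_TIME has the form YYYYMMDDHH; the actual HWRF scripts may do this differently:

    #!/usr/bin/env ksh93
    # Sketch: expand the date placeholders in the namelist template.
    START_TIME=2013070800              # assumed example value, YYYYMMDDHH
    yyyy=${START_TIME:0:4}             # @[start:0:4] -> year
    mm=${START_TIME:4:2}               # @[start:4:2] -> month
    dd=${START_TIME:6:2}               # @[start:6:2] -> day
    sed -e "s/@\[start:0:4]/${yyyy}/g" \
        -e "s/@\[start:4:2]/${mm}/g" \
        -e "s/@\[start:6:2]/${dd}/g" \
        hwrf_namelist.input > namelist.input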

16 Output directory structure
Location: /lfs/tutorial/${user}/hwrf_v3.5b/results/$sid/$yyyymmddhh
Subdirectory | Creator script
messages | hwrfdomain.ksh
geoprd | geogrid.ksh
ungribprd/$yyyymmddhh | ungrib.ksh
metgridprd/$yyyymmddhh | metgrid.ksh
realprd_gfs | real.ksh
wrfghostprd/$yyyymmddhh | wrf.ksh run in Ghost mode
wrfanalysisprd/$yyyymmddhh | wrf.ksh run in Analysis mode
trkanalysisprd/$yyyymmddhh | track_analysis.ksh
relocateprd/$yyyymmddhh | relocate*.ksh
mergeprd | merge.ksh
wrfprd | wrf.ksh run in main mode
postprd | run_unipost
gvtprd | tracker.ksh
Some subdirectories have a date string appended. For this tutorial, it is the same as the initialization time. This is done because some data assimilation configurations need to process data for other valid times as well.
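
After a task completes, a quick sanity check is to list its product subdirectory under the current cycle's output tree (DOMAIN_DATA from the global variables), for example:

    # Illustrative checks of the output tree for the current cycle
    ls ${DOMAIN_DATA}             # should show messages, geoprd, ungribprd, ...
    ls ${DOMAIN_DATA}/geoprd      # geogrid products
    ls ${DOMAIN_DATA}/wrfprd      # main forecast output and logs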

17 What you learned in this lecture
- The source code has many components: WPS, WRF, UPP, hwrf-utilities, GFDL vortex tracker
- Several tasks are needed to do an end-to-end HWRF run
- hwrf-utilities scripts, functions, and wrappers are used to run HWRF
- Wrappers source global_vars.ksh (paths, initialization time, storm, etc.), define local variables, and call scripts
- File structure for input and output files
- Next: how to submit a job on the TTFRI machine

18 Practical session instructions: how to log in to the TTFRI computer
- Open 2 login windows on your laptop: one for editing and one for running
  - Linux: select "Open Terminal" from the right mouse button menu
  - OS X: start the Terminal and XQuartz applications from the /Applications/Utilities folder
  - Windows: start the XSession application from the Start menu and then PuTTY
- Open an X-enabled window to a login node
- Log in to the supercomputer: ssh -X user_name@
- At the prompt, enter your temporary password

19 How to submit wrapper scripts
Define your wrapper directory:

    setenv WRAPPER_DIR /lfs/tutorial/$user/hwrf_v3.5b/hwrf-utilities/wrapper_scripts

or

    export WRAPPER_DIR=/lfs/tutorial/$user/hwrf_v3.5b/hwrf-utilities/wrapper_scripts

Set the options for the job:

    bsub_options='-l procs=16,walltime=6:00:00,vmem=30g -N hwrf -o hwrf.out -e hwrf.err -S /bin/bash -k oe -z'

Submit the job:

    bsub $bsub_options ${WRAPPER_DIR}/geogrid_wrapper

Descriptions of batch system options:
- -l  resources needed to run the script
- -N  job name
- -o  standard output file name
- -e  standard error file name
- -S  use bash
- -k  keep standard error and output
- -z  suppress the job ID from being written to standard output
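
Putting the pieces together, a complete geogrid submission from a bash login shell could look like the sketch below; the batch options are the ones listed on this slide, and anything host-specific should be treated as an assumption for the TTFRI system:

    # Illustrative end-to-end submission (bash syntax)
    export WRAPPER_DIR=/lfs/tutorial/$user/hwrf_v3.5b/hwrf-utilities/wrapper_scripts
    bsub_options='-l procs=16,walltime=6:00:00,vmem=30g -N hwrf -o hwrf.out -e hwrf.err -S /bin/bash -k oe -z'
    bsub $bsub_options ${WRAPPER_DIR}/geogrid_wrapper
    # When the job finishes, hwrf.out and hwrf.err (named by -o and -e above)
    # in the submission directory hold the standard output and error.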

20 Thank you for your interest!
For more information, you can
- Ask questions during the tutorial
- Contact me later: ligia.bernardet@noaa.gov
- Reach our user helpdesk: wrfhelp@ucar.edu
- Visit our website:
  - HWRF v3.5a Users Guide (HWRF_v3.5a_Users_Guide.pdf)
  - HWRF v3.5a Scientific Documentation (scientific_documents/hwrfv3.5a_scientificdoc.pdf)
  - WRF-NMM V3 Users Guide (users_guide_nmm_chap1-7upp.pdf)
