Updated 2019-09-17

Example Job: Python using the hive-all queue

Overview

  • This guide provides a full walkthrough of loading software and submitting a job on hive
  • We'll load anaconda3 and run a simple python script, submitting it to the hive-all queue

The Python Script

  • While logged into hive, use a text editor such as vim to create the following python script, and name it test.py:
#simple test script
result = 2 ** 2
print("Result of 2 ^ 2: {}".format(result))
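If you'd prefer to create the file without opening an editor, a shell heredoc works too, and it's worth sanity-checking the script on the login node before queueing it. This is an optional sketch; python3 is used for the check so it works even before anaconda3 is loaded (on hive, plain python works after loading anaconda3):

```shell
# Optional: create test.py non-interactively with a heredoc
# (same content as the script above)
cat > test.py <<'EOF'
#simple test script
result = 2 ** 2
print("Result of 2 ^ 2: {}".format(result))
EOF

# Quick sanity check before submitting the job
python3 test.py
```

This should print Result of 2 ^ 2: 4, confirming the script works before you queue it.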

Part 1: The PBS Script

Warning

When creating a PBS script, make sure to include a blank line at the end or else the last line in the script may not be executed.

#PBS -N hivePythonExample       # job name
#PBS -l nodes=1:ppn=4           # number of nodes and cores per node required
#PBS -l pmem=1gb                # memory per core
#PBS -l walltime=15:00          # duration of the job (ex: 15 min)
#PBS -q hive-all                # queue name (where job is submitted)
#PBS -j oe                      # combine output and error messages into 1 file
#PBS -o hivePythonExample.out   # output file name

cd $PBS_O_WORKDIR 

module load anaconda3/2019.07

python test.py

  • Similar to the python script above, use a text editor like vim to create the above PBS script. Name it pythonExample.pbs, and make sure the PBS script and the python script are in the same folder
  • All lines starting with #PBS are PBS directives, which specify information about the job, such as what resources it requires. Each directive is explained in the comments in the script above
  • cd $PBS_O_WORKDIR moves into the directory from which you submitted the PBS script. $PBS_O_WORKDIR is simply a variable that holds the path to that directory. Both the PBS script and the python script should be in this directory.
  • module load anaconda3/2019.07 loads anaconda3, which includes python
  • python test.py runs the python script
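The Warning above (a blank final line) is easy to get right, and to verify, from the shell. This optional sketch recreates the PBS script with a heredoc and then checks the last byte; the single-quoted 'EOF' keeps $PBS_O_WORKDIR from being expanded while the file is written, and the inline comments from the annotated version above are omitted here:

```shell
# Optional: recreate the PBS script non-interactively. The blank line
# before EOF gives the file the trailing blank line the Warning asks for.
cat > pythonExample.pbs <<'EOF'
#PBS -N hivePythonExample
#PBS -l nodes=1:ppn=4
#PBS -l pmem=1gb
#PBS -l walltime=15:00
#PBS -q hive-all
#PBS -j oe
#PBS -o hivePythonExample.out

cd $PBS_O_WORKDIR

module load anaconda3/2019.07

python test.py

EOF

# Verify the last byte is a newline: $(...) strips trailing newlines,
# so an empty result means the file ends correctly
if [ -z "$(tail -c 1 pythonExample.pbs)" ]; then
    echo "trailing newline OK"
else
    echo "missing trailing newline - add a blank final line"
fi
```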

Part 2: Submit Job and Check Status

  • Make sure you're in the directory that contains the PBS script and the python script
  • Submit as normal with qsub <pbs script name>; in this case, qsub pythonExample.pbs
  • Check job status with qstat -u <gtusername3> -n, replacing <gtusername3> with your GT username
  • You can delete the job with qdel 22182721, replacing the number with the job ID returned after running qsub
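One convenience worth knowing: qsub prints the new job's ID on stdout, so you can capture it in a shell variable instead of copying it by hand. Since qsub only exists on the cluster, the capture line in this sketch is shown commented out and simulated with a made-up ID in the same format:

```shell
# On hive you would capture the real ID like this:
# jobid="$(qsub pythonExample.pbs)"
jobid="136.sched-hive.pace.gatech.edu"   # made-up ID, same format qsub returns
echo "Submitted as $jobid"

# Then reuse it (uncomment on hive):
# qstat -u <gtusername3> -n    # check status (replace <gtusername3>)
# qdel "$jobid"                # delete the job if needed
```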

Part 3: Collecting Results

  • In the directory where you submitted the PBS script, you should see a hivePythonExample.out file, which contains the results of the job. Use cat hivePythonExample.out or open the file in a text editor to take a look.
  • hivePythonExample.out should look something like this:
---------------------------------------
Begin PBS Prologue Wed Sep 11 15:53:34 EDT 2019
Job ID:     136.sched-hive.pace.gatech.edu
User ID:    shollister7
Job name:   hivePythonExample
Queue:      hive-all
End PBS Prologue Wed Sep 11 15:53:34 EDT 2019
---------------------------------------
Result of 2 ^ 2: 4
---------------------------------------
Begin PBS Epilogue Wed Sep 11 15:53:35 EDT 2019
Job ID:     136.sched-hive.pace.gatech.edu
User ID:    shollister7
Job name:   hivePythonExample
Resources:  nodes=1:ppn=4,pmem=1gb,walltime=01:00:00,neednodes=1:ppn=4
Rsrc Used:  cput=00:00:00,vmem=0kb,walltime=00:00:01,mem=0kb,energy_used=0
Queue:      hive-all
Nodes:     
atl1-1-01-010-14.pace.gatech.edu
End PBS Epilogue Wed Sep 11 15:53:35 EDT 2019
---------------------------------------
  • Congratulations! You have successfully loaded and run software on hive!
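If you want just your program's output without the prologue/epilogue wrapper, a small shell pipeline can strip it. This is a hedged sketch, not a PACE tool: the marker strings and dashed separators are taken from the sample output above and may differ elsewhere. A short sample file stands in for the real one here:

```shell
# Abbreviated version of the sample .out file above, for demonstration
cat > sample.out <<'EOF'
---------------------------------------
Begin PBS Prologue Wed Sep 11 15:53:34 EDT 2019
Job name:   hivePythonExample
End PBS Prologue Wed Sep 11 15:53:34 EDT 2019
---------------------------------------
Result of 2 ^ 2: 4
---------------------------------------
Begin PBS Epilogue Wed Sep 11 15:53:35 EDT 2019
End PBS Epilogue Wed Sep 11 15:53:35 EDT 2019
---------------------------------------
EOF

# Keep the region between the prologue's end and the epilogue's start,
# then drop the marker lines and dashed separators
sed -n '/^End PBS Prologue/,/^Begin PBS Epilogue/p' sample.out \
  | grep -v -e '^End PBS Prologue' -e '^Begin PBS Epilogue' -e '^-\{5,\}'
```

For the sample file this prints just Result of 2 ^ 2: 4; on hive, run the same pipeline on hivePythonExample.out instead of sample.out.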


This material is based upon work supported by the National Science Foundation under grant number 1828187. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.