Updated 2023-03-31

Run OpenFOAM on the Cluster

Overview

  • OpenFOAM is an open-source C++ CFD toolbox released under the GPL. This offering is supported by OpenCFD Ltd, producer and distributor of the OpenFOAM software via www.openfoam.com, and owner of the OPENFOAM trademark. OpenCFD Ltd has been developing and releasing OpenFOAM since its debut in 2004.
  • This guide will cover how to run OpenFOAM on the Cluster.

Walkthrough: Run OpenFOAM on the Cluster

  • This walkthrough covers a simple example of an sbatch job running the pitzDaily example model.
  • The openfoam_tutorial folder can be found here
  • The SBATCH script can be found here
  • You can transfer the files to your account on the cluster to follow along. The file transfer guide may be helpful.

Part 1: The SBATCH Script

#!/bin/bash
#SBATCH -JopenfoamTest
#SBATCH -A [Account]
#SBATCH -N4 --ntasks-per-node=2
#SBATCH -t15
#SBATCH -qinferno
#SBATCH -oReport-%j.out

cd $SLURM_SUBMIT_DIR
module load gcc/10.3.0
module load mvapich2/2.3.6
module load openfoam/2112-mva2-d2qaeb

chmod 700 -R pitzDaily && cd pitzDaily
blockMesh
decomposePar
srun simpleFoam -parallel
  • The #SBATCH directives are standard, requesting 15 minutes of walltime and 4 nodes with 2 tasks per node. More on #SBATCH directives can be found in the Using Slurm on Phoenix Guide.
  • $SLURM_SUBMIT_DIR is a variable that holds the directory you submit the SBATCH script from. Make sure the SBATCH script is in the same folder as the pitzDaily folder.
  • blockMesh creates parametric meshes with grading and curved edges.
  • decomposePar decomposes the mesh and fields of a case for parallel execution.
  • srun simpleFoam -parallel runs the steady-state solver for incompressible, turbulent flow.
  • Output files will also appear in this directory.
  • To see which OpenFOAM versions are available and which modules they require, run module spider openfoam, then load the ones you need.
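decomposePar reads the case's system/decomposeParDict. The tutorial folder already provides one, but if you change the node or task counts, numberOfSubdomains must match the total MPI rank count. A minimal sketch for the 8 ranks (4 nodes × 2 tasks per node) requested above, assuming the scotch decomposition method:

```
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// Must equal the total MPI rank count: 4 nodes x 2 tasks per node = 8
numberOfSubdomains  8;

// scotch needs no per-direction coefficients; simple/hierarchical would
method              scotch;
```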

Part 2: Submit Job and Check Status

  • Make sure you're in the directory that contains the SBATCH Script and the pitzDaily folder.
  • Submit as normal with sbatch <script name>; in this case, sbatch openfoam.sbatch.
  • Check job status with squeue --job <jobID>, replacing <jobID> with the job ID returned by sbatch.
  • You can cancel the job with scancel <jobID>, again using the job ID returned by sbatch.
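Capturing the job ID in a shell variable saves copying it by hand. A minimal sketch; the echo below stands in for an actual submission so the parsing is visible (807247 is the job ID from the sample report later in this guide):

```shell
# sbatch prints "Submitted batch job <jobID>" on success; the ID is the
# fourth field. (sbatch --parsable prints the ID alone.) In a real session:
#   JOBID=$(sbatch openfoam.sbatch | awk '{print $4}')
JOBID=$(echo "Submitted batch job 807247" | awk '{print $4}')
echo "$JOBID"    # 807247

# With the ID captured:
#   squeue --job "$JOBID"   # check status
#   scancel "$JOBID"        # cancel if needed
```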

Part 3: Collecting Results

  • In the directory where you submitted the SBATCH script, you should see a Report-<jobID>.out file.
  • Report-<jobID>.out can be found here
  • Report-<jobID>.out should look like this:
--------------------------------------
Begin Slurm Prolog: Feb-24-2023 10:49:00
Job ID:    807247
User ID:   svangala3
Account:   phx-pace-staff
Job name:  openfoamTest
Partition: cpu-small
QOS:       inferno
---------------------------------------
-------------------------------------------------------------------------------
The following dependent module(s) are not currently loaded: mpfr/4.1.0-32gcbv (required by: mpc/1.2.1-zoh6w2, gcc/10.3.0-o57x6h)
-------------------------------------------------------------------------------

The following have been reloaded with a version change:
  1) mpfr/4.1.0-32gcbv => mpfr/4.1.0-tvlqhw

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2112                                  |
|   \\  /    A nd           | Website:  www.openfoam.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : _14aeaf8dab-20211220 OPENFOAM=2112 version=v2112
Arch   : "LSB;label=32;scalar=64"
Exec   : blockMesh
Date   : Feb 24 2023
Time   : 10:49:04
Host   : atl1-1-02-014-22-2.pace.gatech.edu
PID    : 235200
I/O    : uncollated
Case   : /storage/coda1/pace-admins/svangala3/documentation/site_files/docs/slurm-software/test_directory/pitzDaily
nProcs : 1
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Creating block mesh from "system/blockMeshDict"
Creating block edges
No non-planar block faces defined
Creating topology blocks

Creating topology patches - from boundary section

Creating block mesh topology - scaling/transform applied later

Check topology

        Basic statistics
                Number of internal faces : 5
                Number of boundary faces : 20
                Number of defined boundary faces : 20
                Number of undefined boundary faces : 0
        Checking patch -> block consistency

Creating block offsets
Creating merge list (topological search)...
Deleting polyMesh directory "constant/polyMesh"

Creating polyMesh from blockMesh
Creating patches
Creating cells
Creating points with scale (0.001 0.001 0.001)
    Block 0 cell size :
        i : 0.00158284 .. 0.000791418
        j : 0.000318841 .. 0.000420268
        k : 0.001 .. 0.001

    Block 1 cell size :
        i : 0.000528387 .. 0.00211355
        j : 0.00112889 .. 0.000360188
        k : 0.001 .. 0.001

    Block 2 cell size :
        i : 0.000528387 .. 0.00211355
        j : 0.000318841 .. 0.000420268
        k : 0.001 .. 0.001

    Block 3 cell size :
        i : 0.0020578 .. 0.00514451
        j : 0.000940741 .. 0.000940741
        k : 0.001 .. 0.001

    Block 4 cell size :
        i : 0.0020466 .. 0.00511651
        j : 0.00112889 .. 0.000257962
        k : 0.001 .. 0.001


There are no merge patch pairs

Writing polyMesh with 0 cellZones
----------------
Mesh Information
----------------
  boundingBox: (-0.0206 -0.0254 -0.0005) (0.29 0.0254 0.0005)
  nPoints: 25012
  nCells: 12225
  nFaces: 49180
  nInternalFaces: 24170
----------------
Patches
----------------
  patch 0 (start: 24170 size: 30) name: inlet
  patch 1 (start: 24200 size: 57) name: outlet
  patch 2 (start: 24257 size: 223) name: upperWall
  patch 3 (start: 24480 size: 250) name: lowerWall
  patch 4 (start: 24730 size: 24450) name: frontAndBack

End

/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2112                                  |
|   \\  /    A nd           | Website:  www.openfoam.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : _14aeaf8dab-20211220 OPENFOAM=2112 version=v2112
Arch   : "LSB;label=32;scalar=64"
Exec   : decomposePar
Date   : Feb 24 2023
Time   : 10:49:05
Host   : atl1-1-02-014-22-2.pace.gatech.edu
PID    : 235218
I/O    : uncollated
Case   : /storage/coda1/pace-admins/svangala3/documentation/site_files/docs/slurm-software/test_directory/pitzDaily
nProcs : 1
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations
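One note on the parallel results: decomposePar splits the case into per-rank processorN/ directories, and the parallel solver writes its time steps there. The sketch below simulates that layout locally and counts the rank directories as a sanity check; the reconstructPar step itself is only meaningful on the cluster, inside the real case, with the OpenFOAM module loaded:

```shell
# Simulate the per-rank layout left behind by decomposePar and the
# parallel simpleFoam run (8 ranks were requested in the script above),
# then count the processor directories as a sanity check.
mkdir -p demo_case/processor{0..7}
NPROCS=$(ls -d demo_case/processor* | wc -l)
echo "$NPROCS"    # 8
rm -rf demo_case

# On the cluster, inside the real pitzDaily directory, the per-rank
# results can be merged back into top-level time directories with:
#   reconstructPar -latestTime
```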
  • After the result files are produced, you can move them off the cluster; refer to the file transfer guide for help.
  • Congratulations! You have successfully run OpenFOAM on the cluster.