Updated 2023-05-25

Storage on ICE

Each user receives both a home directory and a scratch directory. Local disk on compute nodes is accessible inside a compute job.


Running pace-quota will report on utilization of your home and scratch storage allocations.
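For example, from any login node (a sketch; the exact report format of pace-quota is not shown here, and the du command is standard GNU coreutils):

```shell
# Check home and scratch utilization against your quotas.
pace-quota

# If you are near a quota, find the largest directories under home.
du -h --max-depth=1 ~ | sort -hr | head
```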


Home Directories

Upon login, you will be placed in your home directory, which is available from all login and compute nodes. Home directories provide a 15 GB storage quota. A snapshot is taken daily in case data needs to be retrieved after accidental file loss.

Files in home directories are deleted one year after a user loses access to ICE or last logs in.

Home directories are located on OIT's centralized NetApp storage service.

Users in need of more than 15 GB of storage are encouraged to use scratch or shared directories. If these solutions will not work, instructors and TAs may request additional home directory space on behalf of themselves or students in their course.


Scratch

Each user is allocated a scratch directory as well, with a 100 GB storage quota on a faster parallel filesystem.

Scratch is not backed up. Any files lost from scratch are permanently gone. In addition, at the end of each semester, all files in scratch directories not touched in 120 days are deleted.

Scratch is hosted on a Lustre parallel filesystem with an InfiniBand network connection to login and compute nodes. Scratch provides faster performance than home directories and is ideal for computations requiring fast networked storage.

Users in need of more than 100 GB of storage are encouraged to use shared directories where possible. Instructors and TAs may request additional scratch space for themselves if needed, but quota exceptions for students are not available on scratch.

Local Disk

Every ICE compute node has local disk storage available for temporary use in a job, which is automatically cleared upon job completion. Some applications can benefit from this storage for faster I/O than network storage (home and scratch). Most ICE CPU nodes and some GPU nodes have large NVMe local disks, while a few have SAS storage. See ICE resources for details.

  • Use the ${TMPDIR} variable in your Slurm script or interactive session to access the temporary directory for your job on local disk, which is automatically created for every job.
  • When requesting a partial node, guarantee availability of local disk space with #SBATCH --tmp=<size>[units, default MB].
  • To request a node with SAS storage, add #SBATCH -C localSAS.
  • To request a node with NVMe storage, add #SBATCH -C localNVMe.
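Putting these options together, a job script might stage data through local disk like this (a sketch; the application name, file names, sizes, and the ~/scratch path are illustrative assumptions, not site defaults):

```shell
#!/bin/bash
#SBATCH -J local-disk-example        # job name (placeholder)
#SBATCH -N 1 --ntasks-per-node=4     # example partial-node request
#SBATCH --tmp=20G                    # guarantee 20 GB of local disk space
#SBATCH -C localNVMe                 # request a node with NVMe local storage
#SBATCH -t 00:30:00

# Stage input from networked storage to the job's local-disk directory,
# run there for faster I/O, then copy results back before the job ends.
cp ~/scratch/input.dat "${TMPDIR}/"
cd "${TMPDIR}"
./my_app input.dat                   # placeholder application
cp output.dat ~/scratch/             # ${TMPDIR} is cleared at job completion
```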

Shared Directories

Courses may request a shared directory on ICE. These shared directories can be used for collaborative work or for distributing course materials and/or software. Instructors or TAs should discuss preferred access permissions on shared directories with PACE, to ensure the directory supports the intended use: course-wide collaboration, group assignments, or distribution only. Shared directories are located on a separate NetApp, and files in them do not count towards individual user quotas.

Some courses may request to have their shared directories placed on the Lustre parallel filesystem for faster performance. These shared directories have no backup, so they are best used for data that could be retrieved from another location if it needed to be recreated. Files in Lustre/scratch shared directories count towards the scratch quota of the user who owns them, even though they are located outside the user's scratch directory.

File Transfer

Transfer files to/from ICE via SCP (Mac/Linux or Windows) or SFTP. Small file transfers can also be performed via the "Upload" and "Download" buttons in the "Files" tab of Open OnDemand.
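For example, from a Mac or Linux terminal (the hostname and username below are illustrative assumptions; substitute your own GT account and the ICE login address):

```shell
# Copy a local file to your ICE home directory.
scp mydata.csv gtuser3@login-ice.pace.gatech.edu:~/

# Copy a results directory from ICE scratch back to the local machine.
scp -r gtuser3@login-ice.pace.gatech.edu:~/scratch/results ./

# Open an interactive SFTP session instead.
sftp gtuser3@login-ice.pace.gatech.edu
```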

Data from PACE-ICE and COC-ICE Prior to May 2023

Students, instructors, and TAs who used ICE prior to May 2023 accessed its predecessors, PACE-ICE and/or COC-ICE. For anyone with a current GT affiliation as of May 2023, PACE has copied their home directories from the old systems to the new ICE home directory. If your old data is not present and is needed, please contact PACE.

Old shared directories can be made available in the new shared directory location at the request of the course instructor or a TA. Data in old shared directories not requested by May 1, 2025, will be deleted.