Updated 2019-06-14

Accessing PACE Clusters

Can my non-GT collaborator access my cluster?

  • Yes, but a guest account is required. These accounts are created on Passport, generally by the faculty member sponsoring the remote collaborator. Once these credentials are created, PACE staff can enable access to PACE clusters just as we do for other GT accounts.


When creating these credentials, be sure to enable "basic authentication." Guest accounts may only be enabled for a limited duration, but the sponsor may renew them on the Passport site.

  • More details on guest accounts can be found on the Passport site.
  • This mechanism can also be used to extend the access of student collaborators after they graduate.
  • To access PACE clusters from off campus, you must use the GT VPN.

What do you recommend for file transfer between my local machine and clusters?

Please refer to the File Transfer with Globus section. While Globus is recommended, you may use any of the methods found in the Storage and File Transfer section.

Can I access PACE clusters from off-campus when I am at home?

Yes, but you must use the GT VPN.

How do I access head and compute nodes?

  • Head nodes are accessible via SSH from on-campus networks (and in some cases only from specific networks or IP addresses). If you are off campus, make sure you have an active VPN connection first.
  • Compute nodes are only accessible from other compute nodes and head nodes, via either SSH or RSH. However, you will only be able to log in to a given compute node if you have an active job running on that node. If the scheduler detects that your jobs on a given compute node have completed, any "leftover" or "derelict" processes still running on that node are subject to termination.
  • RSH is a passwordless protocol, and should require no user action to work. If you encounter trouble using RSH, please contact PACE support.
  • SSH encryption keys are normally created with your user account and eliminate the need to supply a password when logging in via SSH. Run ls -al ~/.ssh to check whether you already have a preconfigured key pair. If you do not, you may either file a remedy ticket or follow the procedure below:
  • Starting out with no .ssh directory, you can do the following (just hitting Return when ssh-keygen prompts you for a response):
[user@force ~]$ ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/nv/hp16/user/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /nv/hp16/user/.ssh/id_rsa.
Your public key has been saved in /nv/hp16/user/.ssh/id_rsa.pub.
The key fingerprint is:
86:f1:d7:54:f4:05:8e:d7:68:26:0b:47:16:86:2e:db user@joe.pace.gatech.edu

[user@force ~]$ cd ~/.ssh

[user@force .ssh]$ ls
id_rsa  id_rsa.pub

[user@force .ssh]$ cat id_rsa.pub >> authorized_keys2

[user@force .ssh]$ ls
authorized_keys2  id_rsa  id_rsa.pub
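SSH silently ignores key files whose permissions are too open, so if passwordless login still fails after the steps above, it is worth checking the conventional permissions. The commands below are a sketch (run on the head node; the localhost test assumes the node permits SSH back to itself):

```shell
# SSH ignores keys when permissions are too permissive; these are
# the conventional settings:
chmod 700 ~/.ssh
chmod 600 ~/.ssh/id_rsa ~/.ssh/authorized_keys2
chmod 644 ~/.ssh/id_rsa.pub

# Verify passwordless login by SSHing back to the same host;
# BatchMode makes ssh fail instead of prompting for a password:
ssh -o BatchMode=yes localhost true && echo "key-based login OK"
```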

How do I use cluster GUI apps?

  • Include the -Y option (trusted X11 forwarding) when you SSH into the head node, as shown here:
ssh -Y someuser3@login-s.pace.gatech.edu
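For the forwarded windows to appear, you also need an X server running on your local machine (Linux desktops include one; macOS and Windows users typically install one, e.g. XQuartz or Xming). A sketch of the workflow, with an illustrative check:

```shell
# Trusted X11 forwarding; -X (untrusted) also works for many apps
# but applies X11 SECURITY-extension restrictions:
ssh -Y someuser3@login-s.pace.gatech.edu

# Once logged in, the SSH session should set DISPLAY automatically;
# an empty value means forwarding is not active:
echo "$DISPLAY"

# Then launch the GUI application; a lightweight clock such as
# xclock is a quick way to confirm forwarding works:
# xclock &
```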