Overview
Welcome to the LPCT Cluster, an HPC platform featuring a heterogeneous architecture with x86 CPUs and NVIDIA GPUs.
Cluster Specifications
- Compute Nodes: Close to 250 nodes
- Accelerated Nodes: About 100 nodes equipped with NVIDIA GPUs
- Networking: 5 high-speed interconnect switches
- Storage: Over 1 petabyte (PB) of disk space mounted via NFS
Frontend Nodes
Login Nodes
The cluster provides 4 login nodes:
lpct-login1.pct.site.univ-lorraine.fr
lpct-login2.pct.site.univ-lorraine.fr
lpct-login3.pct.site.univ-lorraine.fr
lpct-login4.pct.site.univ-lorraine.fr
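For example, a command-line session on a login node can be opened with SSH (USER here is a placeholder for your cluster username):
$ ssh USER@lpct-login1.pct.site.univ-lorraine.fr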
Visualization Nodes
Dedicated nodes for graphical applications with GPU acceleration:
- 2 nodes with 2× NVIDIA RTX A5000:
lpct-visu1.lpct.site.univ-lorraine.fr
lpct-visu2.lpct.site.univ-lorraine.fr
- 1 node with 4× NVIDIA GeForce GTX 1070:
ebam-login.pct.site.univ-lorraine.fr
- 1 node with 4× NVIDIA GeForce GTX TITAN:
lia-login.pct.site.univ-lorraine.fr
Access is available via:
- SSH: For command-line sessions
- TurboVNC: For graphical applications (Graphical Session Guide)
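As a sketch, a shell on a visualization node can be opened over SSH with X11 forwarding (described in the next section); USER is a placeholder for your cluster username:
$ ssh -X USER@lpct-visu1.lpct.site.univ-lorraine.fr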
Graphical Session
X11 Forwarding
The simplest way to access a graphical interface on the cluster is via X11 forwarding.
Use the -X flag with SSH:
$ ssh -X USER@lpct-login
For better performance, enable compression by adding the -C flag:
$ ssh -XC USER@lpct-login
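Once connected, graphical programs launched from the remote shell open on your local display. As a quick check (assuming a simple X client such as xclock is installed on the login node, which may not be the case):
$ xclock &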
TurboVNC
For higher-performance remote desktop access, use TurboVNC.
Note that only sessions secured through SSH are allowed. The required TurboVNC client version is 3.2, which can be installed from the TurboVNC release website.
Setting up a TurboVNC session
- On the login node, run:
$ /opt/TurboVNC/bin/vncpasswd
and choose your VNC password.
- Then run:
$ /opt/TurboVNC/bin/vncserver
to start a new session (an example with an explicit resolution follows these steps).
- A message like this should appear:
Desktop 'TurboVNC: lpct-login:n (USER)' started on display lpct-login:n
where the number n indicates the X display number of the TurboVNC session.
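As mentioned in the second step, the server accepts the usual vncserver options. For instance, assuming the standard -geometry option is supported by the installed TurboVNC version, a session with an explicit resolution can be started with:
$ /opt/TurboVNC/bin/vncserver -geometry 1920x1080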
Managing TurboVNC sessions
- To see all your VNC sessions, run:
$ /opt/TurboVNC/bin/vncserver -list
- To connect to a running session, run the following command on your local machine (a worked example follows this list):
$ /opt/TurboVNC/bin/vncviewer -tunnel USER@lpct-login:n
You will be prompted to enter your login password, followed by your VNC password.
Note that this is equivalent to running:
$ ssh -L <5900+n>:localhost:<5900+n> USER@lpct-login
and then, in a second terminal:
$ /opt/TurboVNC/bin/vncviewer localhost:n
The first method uses TurboVNC's internal SSH library to create the tunnel automatically. The second method uses your system's native SSH client to create the tunnel manually, giving you more explicit control over the SSH connection.
- To kill a session, use:
$ /opt/TurboVNC/bin/vncserver -kill :n
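As a worked example of the manual tunnelling method above, suppose your session runs on display :2 (a hypothetical value); the corresponding VNC port is 5900 + 2 = 5902:
$ ssh -L 5902:localhost:5902 USER@lpct-login
and then, in a second terminal on your local machine:
$ /opt/TurboVNC/bin/vncviewer localhost:2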
External Access
To access the cluster from an external network, you must connect through the University of Lorraine's VPN using the Cisco Secure Client.
1. Download the Client
Use your UL account to authenticate on the university's VPN portal.
Select the group Universite-de-Lorraine (remember to append @ul to your username). From here, you can download the Cisco Secure Client for your machine.
2. Configure the Connection
Open the Cisco Secure Client and configure a new connection with the following settings:
- Server: vpn.lothaire.net
- Group: Universite-de-Lorraine
- Username: your_username@lpct
- Password: Your UL password
Once configured, you can connect to the VPN and will then have access to the cluster network.
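You can verify the connection by opening an SSH session to a login node as described above, for example:
$ ssh USER@lpct-login1.pct.site.univ-lorraine.fr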