Reply To: Running examples on multiple GPUs

#8842
Anonymous
Inactive

Currently I am requesting an interactive allocation on the cluster via the VNC protocol, which means I am not submitting the job through a SLURM script. Instead, I run the “mpirun” command directly, as if I were on a local PC with two Nvidia cards, which is what the “nvidia-smi” output shows.
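
For illustration, the invocation is roughly of the following form (a sketch assuming OpenMPI and a GPU-enabled example binary; the example name is a placeholder, and each of the two ranks is pinned to one GPU via CUDA_VISIBLE_DEVICES):

    mpirun -np 2 bash -c 'export CUDA_VISIBLE_DEVICES=${OMPI_COMM_WORLD_LOCAL_RANK}; ./cavity3d'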

The cluster runs the Rocky Linux operating system.