Using the computing cluster of the ICM

This is a collaborative space. In order to contribute, send an email to maximilien.chaumon@icm-institute.org

Your desktop station can run most classical analyses of MEG-EEG data, from preprocessing to source reconstruction, connectivity, and so on. However, you will soon hit the limits of your individual desktop if you start running a lot of computations. Luckily, the ICM is equipped with a cluster of computing nodes reserved for this purpose.

You may be able to use several hundred computers simultaneously. Some of these computers also have much more memory than your desktop.

The scheduler used at the ICM is called SLURM. You can ask ChatGPT for help with SLURM syntax.
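As a quick orientation: a SLURM job is just a bash script whose #SBATCH comment lines request resources. Bash ignores them; the scheduler reads them. A minimal sketch (the job name and limits below are illustrative, not ICM requirements):

#!/bin/bash
# Resource requests: read by SLURM, ignored by bash (they are plain comments)
#SBATCH --job-name=hello
#SBATCH --time=0:05:00
#SBATCH --mem=1G

# The body is ordinary bash, so you can test it on your desktop before submitting
echo "Hello from $HOSTNAME"

Because the #SBATCH lines are comments, the same file runs unchanged on your desktop and on the cluster.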

The procedure to use the cluster is explained in this documentation:


Create a script

#!/bin/bash
#SBATCH --mem=8G
#SBATCH --partition=medium
#SBATCH --time=1:00:00
#SBATCH --nodes=8
#SBATCH --job-name=Pom-pidom
#SBATCH --output=/network/iss/cenir/analyse/meeg/CARACAS/Test_Max/ds003690/derivatives/CardiClassif_30comp/logs/log-%a
#SBATCH --error=/network/iss/cenir/analyse/meeg/CARACAS/Test_Max/ds003690/derivatives/CardiClassif_30comp/logs/err/err-%a
#SBATCH --array=1-30

echo started on $HOSTNAME
date

module load matlab

tic=$(date +%s)

cmd=$(printf "script_02_ds003690_precomp(%d)" ${SLURM_ARRAY_TASK_ID})
matlab -nodesktop -nodisplay -nosoftwareopengl -r "cd /network/iss/cenir/analyse/meeg/CARACAS/Test_Max/code; $cmd; exit;"

toc=$(date +%s)
let sec="$toc - $tic"
let heu="$sec / 3600"
let min="($sec - $heu * 3600) / 60"
echo Elapsed time: $heu H : $min min
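In the script above, --array=1-30 makes SLURM run 30 copies of the job, and SLURM_ARRAY_TASK_ID takes a different value (1 to 30) in each copy. A common pattern is to use that ID to index a list of subjects; here is a minimal sketch with hypothetical subject names (outside SLURM, the variable defaults to 2 so you can try it locally):

# Hypothetical subject list; on the cluster this could come from your data folder
subjects=(sub-01 sub-02 sub-03)

# SLURM sets this for each array task; default to 2 here so the sketch runs locally
SLURM_ARRAY_TASK_ID=${SLURM_ARRAY_TASK_ID:-2}

# Bash array indices start at 0, SLURM task IDs at 1
subject=${subjects[$((SLURM_ARRAY_TASK_ID - 1))]}
echo "Processing $subject"

With the default task ID of 2, this prints "Processing sub-02".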

Log on the cluster and launch your script

ssh sphpc-login02
sbatch /your_script_above
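When sbatch accepts a job, it prints the job's numeric ID; with the --parsable flag it prints only the number, which is convenient to store for later monitoring or cancelling. A sketch (my_script.sh is a placeholder; the stub function below only stands in when you run this off the cluster, where sbatch does not exist):

# Stand-in for sbatch when trying this off the cluster;
# on sphpc-login02 the real sbatch is found and the stub is never defined
if ! type sbatch >/dev/null 2>&1; then
  sbatch() { echo 12345; }
fi

# --parsable makes sbatch print just the job ID
jobid=$(sbatch --parsable my_script.sh)
echo "Submitted job $jobid"

You can then pass $jobid to squeue or scancel.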

Check that everything runs…

squeue
squeue -u your_username

The first command lists all jobs on the cluster; the second lists only yours.

To cancel a single job, or all of your jobs at once:

scancel jobid
scancel -u your_username
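You can also combine the two: squeue's -h flag drops the header line and -o '%i %j' prints one "jobid jobname" pair per line, which lets you cancel only the jobs matching a given name (here the Pom-pidom name from the script above). The stub functions are a stand-in so the sketch runs off the cluster; on the login node the real commands are used:

# Stand-ins so the sketch runs off the cluster; on the login node the real
# squeue/scancel are found and these functions are never defined
if ! type squeue >/dev/null 2>&1; then
  squeue() { printf '101 Pom-pidom\n102 OtherJob\n'; }
  scancel() { echo "cancelled $1"; }
fi

# -h drops the header, -o '%i %j' prints "jobid jobname" for each job
squeue -h -u "$USER" -o '%i %j' | while read -r id name; do
  if [ "$name" = "Pom-pidom" ]; then
    scancel "$id"
  fi
done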