EEMCS-HPC Software
The EEMCS-HPC cluster uses Environment Modules to manage software. The module system lets us set up the shell environment so that running and compiling software is easier, and it allows us to offer many software packages and libraries that would otherwise conflict with one another.
When you first log into the cluster, you start in a very barebones user environment with minimal software available. The module system is a script-based system used to manage the user environment and to “activate” software packages. To access software that is installed on the cluster, you must first load the corresponding software module.
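For example, a minimal session with the module system looks like this (the module name R/4.3.2 is taken from the table of optional software further down this page):
module avail             # list all available modules
module load R/4.3.2      # activate a specific package
module list              # show which modules are currently loaded
module unload R/4.3.2    # deactivate the package again
module purge             # remove all loaded modules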
Local Software
The directory /software contains sub-directories where the separate groups can install additional software.
Software installation policy
- If you need an additional standard package, ask a contact person to arrange for the package to be installed.
- If you need a package that can be installed in user space, then install it yourself. If others will be using the package, ask for a directory in /software; otherwise use your home directory.
- Tensorflow v1.6+ and Keras require AVX and/or AVX2 support; use either --constraint=avx or --constraint=avx2, or the main_avx or main_avx2 partitions (see the example after this list).
- If the package is not standard and requires root privileges to install: contact Geert Jan Laanstra.
- Do not contact ICTS without first talking to the contact persons.
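As an illustration of the AVX requirement above, a minimal batch script could request AVX2-capable nodes either via a constraint or via the dedicated partition (the job name and the training script are placeholders for your own):
#!/bin/bash
#SBATCH --job-name=tf-train        # placeholder job name
#SBATCH --constraint=avx2          # request AVX2-capable nodes ...
##SBATCH --partition=main_avx2     # ... or use the dedicated partition instead

python3 train.py                   # placeholder for your own Tensorflow/Keras script
Submit it with sbatch as usual.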
Installed Software
The preinstalled software on the cluster is a growing list, currently including:
- opencv, boost, python 2.7/3.10, etc.
GNU Project C and C++ compiler
The Ubuntu 22.04 default version is 11.4.0. Not all source packages can be compiled with this version; for compatibility, version 7.5.0 is also provided.
To activate this version, load the module gnu/gcc-7. This is required, for example, by the older sample code of CUDA versions 10.x and below.
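For example:
module load gnu/gcc-7
gcc --version      # should now report 7.5.0
g++ --version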
Downloading packages and/or software using a proxy
Due to changes in the infrastructure, downloads are only possible through the proxy server. Define environment variables for this purpose.
- Shell environment variables (some tools use UPPERCASE, others lowercase):
export HTTP_PROXY=http://proxy.utwente.nl:3128
export HTTPS_PROXY=http://proxy.utwente.nl:3128
export http_proxy=http://proxy.utwente.nl:3128
export https_proxy=http://proxy.utwente.nl:3128
- Python variables:
os.environ["HTTP_PROXY"] = "http://proxy.utwente.nl:3128"
os.environ["HTTPS_PROXY"] = "http://proxy.utwente.nl:3128"
os.environ["http_proxy"] = "http://proxy.utwente.nl:3128"
os.environ["https_proxy"] = "http://proxy.utwente.nl:3128"
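To make the shell settings persistent, the export lines can be added to your .bashrc. A quick way to check that the proxy is picked up (assuming curl is available on the login node):
curl -sI https://pypi.org | head -n 1    # should print an HTTP status line fetched via the proxy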
Python packages in your home folder
To install your own set of Python packages in your home folder, run one of the following commands (depending on your Python version):
pip install --user <PackageName>
pip3 install --user <PackageName>
The default Python versions are 2.7.18 and 3.10.12; for other versions you need to load the corresponding module file.
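For example, to install a package against one of the Python versions from the module list below (numpy is just a placeholder package; remember to set the proxy variables from the previous section first):
module load python/3.10.7
pip3 install --user numpy
python3 -c "import numpy; print(numpy.__version__)"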
Optional Software
Additional software that is not directly available can be added to your environment using Environment Modules. To load this software on login, add the following command to your .bashrc file:
module load <ModuleFolder>/<ModuleName>
To list the available modules run the following command:
module avail
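For example, to inspect and load one of the nVidia modules listed below, and to have it loaded automatically on login:
module avail nvidia                                  # list only the nVidia modules
module load nvidia/cuda-11.2                         # load a specific version
echo 'module load nvidia/cuda-11.2' >> ~/.bashrc     # optional: load it on every login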
Below is a list of the available optional software and their additional instructions.
Organization | Name | Version | <ModuleFolder> | <ModuleName> | Examples |
---|---|---|---|---|---|
Anaconda | Python/R Data Science Platform (python 2.7/3.7/3.8/3.11) | 2019.03, 2020.11, 2022.05, 2023.09 | anaconda2/anaconda3 | 2019.03, 2020.11, 2022.05, 2023.09 | Anaconda or Jupyter notebooks. |
miniconda | Miniconda is a free minimal installer for conda. | 3.8 | miniconda3 | 3.8 | module load miniconda3/3.8 |
blender | Blender is the free and open source 3D creation suite. | 3.2.1 | blender | 3.2.1 | module load blender/3.2.1 |
Comsol | Comsol Multiphysics | v5.3a, v5.5, v5.6, v6.0 | comsol | vx.x | Comsol in batch, server or interactive mode. |
GNU project | C and C++ compiler | 5.5.0, 7.5.0 | gnu | gcc-5, gcc-7 | module load gnu/gcc-7 |
Google Go | Go Language | 1.15.6, 1.16.5, 1.17.7 | go | 1.15.6, 1.16.5, 1.17.6 | module load go/1.17.6 |
Gurobi | Mathematical Solver | 9.5.1 | gurobi | 9.5.1 | module load gurobi/9.5.1 |
Intel | Manycore Platform Software Stack | 3.8.4 | intel | mpss-3.8.4 | module load intel/mpss-3.8.4 |
Intel | parallelstudio 2017 | 2017 | intel | parallelstudio2017 | module load intel/parallelstudio2017 |
julia | Julia Programming Language | 1.8.3, 1.10.3 | julia | 1.x.3 | module load julia/1.10.3 |
nVidia | Cuda Toolkit | 8.0, 9.0, 9.2, 10.0, 10.1, 10.2, 11.0, 11.1, 11.2, 11.3, 11.5, 11.6, 11.7, 11.8 (for 12.x see nvhpc) | nvidia | cuda-x.x | module load nvidia/cuda-11.2 |
nVidia | Deep Neural Network library | 5.1, 6.0, 7.0, 7.1, 7.3, 7.4, 7.5, 7.6, 8.1, 8.2, 8.4, 8.6, 8.8 | nvidia | cuda-x.x_cudnn-x.x | module load nvidia/cuda-10.1_cudnn-7.6 |
nVidia | Collective Communications Library | 1.3, 2.0, 2.3, 2.4, 2.8, 2.9, 2.11, 2.12, 2.14, 2.16 | nvidia | cuda-x.x_nccl-x.x | module load nvidia/cuda-9.2_nccl-2.3 |
nVidia | High-performance deep learning inference library | 7.2, 8.0, 8.4, 8.6 | nvidia | cuda-x.x_tensorrt-x.x | module load nvidia/cuda-11.2_tensorrt-7.2 |
nVidia | HPC SDK | 23.3 (cuda 12.0) | nvidia/nvhpc | 23.3 | nVidia HPC SDK |
nVidia | NVIDIA GPU top | x.x | nvidia | nvtop | nvtop-node <nodename> or nvidia-smi-node <nodename> |
Mathworks | Matlab | r2017a, r2018a, r2018b, r2019a, r2020b, r2021a, r2021b, r2022a, r2023b | mathworks | matlab_rxxxxy | Matlab in batch or interactive mode. |
Mathworks | Matlab Compiler Runtime | r2017a, r2018a, r2018b, r2019a, r2020b, r2021a, r2021b, r2022a, r2023b | mathworks | mcr_rxxxxy | Matlab in batch or interactive mode. |
Microsoft | Visual Studio Code | 1.33.1, 1.78.2 | microsoft/vscode | 1.33.1, 1.78.2 | module load microsoft/vscode/1.78.2 |
Microsoft | .NET Core 3.0 SDK | 3.0.100-preview5 | microsoft | dotnet-sdk-3.0.100-preview5 | module load microsoft/dotnet-sdk-3.0.100-preview5 |
Python Software Foundation | Python | 3.7.3, 3.9.9, 3.10.7 | python | 3.7.3, 3.9.9, 3.10.7 | module load python/3.7.3 |
R-project | The R Project for Statistical Computing | 3.6.0, 4.1.2, 4.3.2 | R | 3.6.0, 4.1.2, 4.3.2 | module load R/4.3.2 |
Rust | Rust Forge Toolchain | 1.69 (stable) | rust | 1.69 | module load rust/1.69 |
Singularity | Singularity is a container platform | 3.7.4, 3.8.0, 3.9.5 | singularity | 3.x.x | Singularity |
Slurm | Scheduler Utilities (sinteractive, etc.) | n.a. | slurm | utils | sinteractive / admin guide |
Open MPI | Message Passing Interface | 4.1.5 (default 4.0.5) | openmpi | 4.1.5 | module load openmpi/4.1.5 |
SpinRCP | Spin Model Checker | 3.1.1 | SpinRCP | 3.1.1 | module load SpinRCP/3.1.1 |
Tensorflow/Tensorboard | An end-to-end open source machine learning platform | x.x | | | Tensorflow Tensorboard |
Wolfram | Mathematica | v11.1 | wolfram | mathematica_11.1 | module load wolfram/mathematica_11.1 |
Utwente | monitoring utilities | n/a | monitor | node | module load monitor/node / Monitoring Computenodes |
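For example, the scheduler utilities listed above provide an interactive-session helper (any options accepted by sinteractive are site-specific; see the admin guide linked in the table):
module load slurm/utils
sinteractive        # starts an interactive job on a compute node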
FMT software
The FMT group also uses Environment Modules to manage its software. To use these modules, add the following commands to your .bashrc file:
module use /software/fmt/other/modules
module use /software/fmt/easybuild/modules/all
Other modules
The other hierarchy contains modules for software that needs to be installed by hand. Currently, it contains installations of:
- Java (Various SE versions)
- CADP
Easybuild modules
EasyBuild is a software build system. The provided configurations allow building a long list of HPC software, including various compiler toolchains.
When a package can be built automatically, adding it as a module here has the great advantage that you can share the build configuration with other people. This directory contains modules for:
- mCRL2
- LTSmin
The configs for these are archived on GitHub.
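A minimal sketch of the resulting workflow (the exact mCRL2 module name is an assumption; check module avail for the name that is actually installed):
module use /software/fmt/easybuild/modules/all
module avail mCRL2        # search the EasyBuild hierarchy for the mCRL2 module
module load mCRL2         # hypothetical module name; use the one reported above
mcrl22lps --version       # an mCRL2 tool that should now be on your PATH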