A step-by-step guide to using COMSOL Multiphysics in batch mode on the HPC/Slurm cluster.

Initial setup.

  • Prepare a work folder for your comsol jobs and copy the example submit script.
mkdir ~/comsol
cd ~/comsol
cp /deepstore/software/examples/run-comsol.sbatch .
  • Modify the input/output files, the required comsol version, the resource requirements (cpu/mem/gpu/nodes), etc. in the run-comsol.sbatch file (a minimal sketch is shown below).
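A minimal sketch of what run-comsol.sbatch could look like; the actual template in /deepstore/software/examples may differ, and the module name, .mph file names and resource values below are assumptions:

#!/bin/bash
#SBATCH --job-name=comsol-batch
#SBATCH --cpus-per-task=4
#SBATCH --mem=16G
#SBATCH --time=24:00:00

# Load the requested comsol version (version name is an example).
module load comsol/v5.6

# Run the model in batch mode; the input/output/log file names are placeholders.
comsol batch -np ${SLURM_CPUS_PER_TASK} -inputfile model.mph -outputfile model-solved.mph -batchlog model.log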

Submit your comsol jobs.

  • Submit your job using sbatch.
sbatch run-comsol.sbatch
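You can monitor the job with the standard Slurm tools, for example:

squeue -u $USER     # list your queued and running jobs
sacct -j <job-id>   # show the state of a (finished) job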

A step-by-step guide to using COMSOL Multiphysics Server on the HPC/Slurm cluster.

Initial setup.

  • Create a folder to store your comsol mphserver files and copy the example submit script.
mkdir ~/comsol
cd ~/comsol
cp /deepstore/software/examples/run-comsol-server.sbatch .
  • Modify the comsol server settings, the required comsol version, the resource requirements (cpu/mem/gpu/nodes), etc. in the run-comsol-server.sbatch file (a minimal sketch follows after this list).
  • (Re)set your comsol server password (this temporarily starts the comsol server; close it after setting the password).

This password will be stored in ~/.comsol/<version>/login.properties

comsolpasswd
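A minimal sketch of what run-comsol-server.sbatch could contain; again, the actual template may differ and the module name and resource values are assumptions:

#!/bin/bash
#SBATCH --job-name=comsol-server
#SBATCH --cpus-per-task=4
#SBATCH --mem=16G
#SBATCH --time=24:00:00
#SBATCH --output=comsol-server-%j.log

# Load the requested comsol version (version name is an example).
module load comsol/v5.6

# Start the comsol server; it prints the host name and listening port to the log.
comsol mphserver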

Starting the Comsol Server

  • Submit the job by running the following command:
sbatch run-comsol-server.sbatch

The sbatch command will print a <job-id>; wait until the job is actually running. Then check the contents of the comsol-server-<job-id>.log file: it contains the server name, port and username needed to connect to the comsol server from your local comsol client.
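For example, you can check the job state and follow the log like this:

squeue -j <job-id>                   # wait until the job state (ST) is R (running)
tail -f comsol-server-<job-id>.log   # follow the log until the connection details appear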

Connect to the Comsol Server.

Use your local comsol client to connect to your comsol server, using the server name, port and username listed in the log file. The log content will look similar to this:

Adding Matlab r2019a
Adding Comsol MultiPhysics v5.3a
Server : ctit083.ewi.utwente.nl
Port : 8805
Username : laanstragj
COMSOL Multiphysics server 5.3a (Build: 348) started listening on port 8805
Use the console command 'close' to exit the program

The last two lines in the log file show that the comsol server is running.
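If the compute node running the server is not directly reachable from your workstation (this depends on your network setup), you may need an SSH tunnel via a cluster login node before the client can connect; a sketch, with the login node name as a placeholder:

ssh -L 8805:ctit083.ewi.utwente.nl:8805 <username>@<login-node>

The local comsol client then connects to localhost on port 8805.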

  • Please terminate the comsol server when you are done:
scancel <job-id>

A step-by-step guide to using COMSOL Multiphysics interactively on the HPC/Slurm cluster.

This uses the sinteractive wrapper; starting a session depends on the availability of the requested resources!

Initial setup

Load the slurm utils software module (sinteractive)

module load slurm/utils

Request resources using sinteractive (optionally modify the default resources; the default is 2 CPUs for 1 hour).

sinteractive --cpus-per-task=4 --time=1-0:0:0
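Assuming the wrapper forwards standard Slurm options (this is an assumption), other resources can be requested the same way, for example a GPU:

sinteractive --cpus-per-task=4 --gres=gpu:1 --time=4:00:00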

Once the resources are allocated you will get a response similar to the following:

srun: job 164362 queued and waiting for resources
srun: job 164362 has been allocated resources

You will then get a shell prompt on one of the compute nodes.

Start comsol

module load comsol/v5.6
comsol
exit

Don't forget to release the requested resources when you are finished with comsol (use the exit command)!
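The comsol/v5.6 module above is just an example; the comsol versions installed on the cluster can be listed with the standard module command:

module avail comsol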