Running Ollama on the HPC Cluster

Starting an interactive session for the ollama user service

module load slurm/utils
sinteractive --gres=gpu:1

Wait until resources have been assigned to your session before continuing.
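If it is unclear whether the allocation has been granted yet, the state of the job can be checked from a second login-node shell. This is a minimal sketch assuming the standard Slurm tools (squeue) are available next to the site-specific sinteractive wrapper:

# From a separate login-node shell: list your own jobs and their state.
# Columns: job id, partition, state, elapsed time, reason/nodelist.
squeue --user="$USER" --format="%.10i %.9P %.8T %.10M %R"

Once the job shows the RUNNING state, continue with the steps below inside the interactive session.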

Starting the ollama server and interacting with it

module load ollama/0.1.41
start-ollama
ollama run llama3
stop-ollama
exit
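Besides the interactive ollama run prompt, the running server can also be queried over Ollama's HTTP API from within the same session, for example to script one-shot prompts. The sketch below assumes start-ollama leaves the server on Ollama's default listen address (127.0.0.1:11434); if your site sets OLLAMA_HOST to a different address or port, adjust the URL accordingly.

# Send a single non-streaming prompt to the running Ollama server.
# Assumes the default address 127.0.0.1:11434; check OLLAMA_HOST if start-ollama configures another one.
curl -s http://127.0.0.1:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hello in one sentence.", "stream": false}'

Remember to run stop-ollama and exit afterwards so the GPU allocation is released.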