Nextflow errors running on HPC: /bin/bash: .command.run: No such file or directory


I'm having some trouble running a Nextflow pipeline on an HPC server, which I think comes down to the scratch space.

Some background on the server: users are encouraged not to work in their own /home/user/ directories and to use /home/project-directories instead.

The server runs CentOS Linux 7 and has 3 separate nodes.

When building my pipeline I relied on my own /home/user/ directory for testing, and there the pipeline runs great. It was not until I moved to a /home/project-directory/ that I ran into issues. I did not set scratch anywhere in the config, workflow or modules and relied on the default value.

Most processes within the pipeline rely on Singularity images stored in a central location on the server, /home/opt/singularity_containers/ (permissions set to 777); they are referenced roughly as in the snippet below.
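For reference, each such process points at one of those images along these lines (the process name, image filename, and script are placeholders, not the real pipeline):

process EXAMPLE_TOOL {

    // image lives in the shared container directory; filename is illustrative
    container '/home/opt/singularity_containers/example_tool.sif'

    output:
    path 'version.txt'

    script:
    """
    example_tool --version > version.txt
    """
}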

I have one process that runs immediately, does not rely on a Singularity container, and completes without error under all the conditions below.

However, any process relying on a Singularity container gives the error below.

When running the pipeline in /home/project-directory/ I get the error: /bin/bash: .command.run: No such file or directory.

I tried the following (see the sketch after the list for where the directive was set):

  • Within a given process, setting scratch '/home/scratch' (this directory exists)
    • Does not work; still get /bin/bash: .command.run: No such file or directory
  • Within a given process, setting scratch false
    • Does not work; still get /bin/bash: .command.run: No such file or directory
  • Within a given process, setting scratch true
    • Does not work; still get /bin/bash: .command.run: No such file or directory
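To be explicit about where the directive went, each attempt looked roughly like this inside the process (same placeholder process as above):

process EXAMPLE_TOOL {

    // tried each of these in turn; all gave the same error
    scratch '/home/scratch'
    // scratch false
    // scratch true

    container '/home/opt/singularity_containers/example_tool.sif'

    script:
    """
    example_tool --version
    """
}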

Does anyone have any ideas on how to solve this? Maybe I am going in the wrong direction with the scratch directory and the problem is coming from something else.

EDIT: Solution found! However, if anyone can add insight I would appreciate it - ChatGPT for the "win". The solution was to add autoMounts = true to the singularity scope in nextflow.config, alongside the existing enabled = true:

singularity {
    enabled = true
    autoMounts = true
}
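(For what it's worth, my understanding is that autoMounts tells Nextflow to let Singularity bind-mount the host paths it needs, including the task work directory, into the container, which is presumably why .command.run becomes visible again.)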

My confusion is why this behaves differently when running in a user home directory versus a shared project directory under /home.
