Popen command doesn't run the code on the cluster


I want to run a Python script that submits a Bash script to a cluster (Slurm). For some reason, the Bash script doesn't run, but I get no errors at all (I think Python does submit the job, because I can retrieve the job's ID number).

Inside the script, I define "command" as the command to be executed. It calls another Bash script, "svm_run.sh", which receives two parameters: "runname" and "path/fit".

import subprocess as sp


# runname and path are defined earlier in the script
command = '/home/saarb/SVM/svm/RUNS/svm_run.sh ' + runname + ' ' + path + '/fit'

# run the wrapper script and capture its stdout (sbatch's "Submitted batch job <id>" line)
ID = sp.Popen(command, shell=True, stdout=sp.PIPE, universal_newlines=True).communicate()[0]

# keep only the last token, i.e. the numeric job ID
ID = ID.split()[-1].strip()
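
Since only stdout is captured, anything svm_run.sh or sbatch prints to stderr is silently lost, which would explain seeing no error messages. A small diagnostic variant (only a sketch; runname and path are assumed to come from the surrounding script) that also captures stderr and the return code:

import subprocess as sp

# Capture stderr and the exit status as well, so failures from svm_run.sh or
# sbatch are visible instead of being dropped.
command = '/home/saarb/SVM/svm/RUNS/svm_run.sh ' + runname + ' ' + path + '/fit'
result = sp.run(command, shell=True, stdout=sp.PIPE, stderr=sp.PIPE,
                universal_newlines=True)
print('return code:', result.returncode)
print('stdout:', result.stdout)
print('stderr:', result.stderr)

If sbatch is not found or the output directory does not exist, the message shows up in result.stderr rather than disappearing.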

However, when I run the exact same lines in the Python prompt (not in a script) on the front-end computer, I have no problems at all.
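
One difference between an interactive prompt and a scripted run is the working directory and PATH the child process inherits, and svm_run.sh starts with cd .., which resolves against that working directory. A minimal check, plus a hypothetical cwd= pin (the directory below is an assumption; it should be whatever directory the interactive session was started from):

import os
import subprocess as sp

# "cd .." in svm_run.sh resolves against this working directory, and sbatch must be
# on this PATH; both can differ between the prompt and a scripted run.
print('cwd :', os.getcwd())
print('PATH:', os.environ.get('PATH', ''))

# Pinning the working directory removes that difference (directory is an assumption).
command = '/home/saarb/SVM/svm/RUNS/svm_run.sh ' + runname + ' ' + path + '/fit'
ID = sp.Popen(command, shell=True, stdout=sp.PIPE, universal_newlines=True,
              cwd='/home/saarb/SVM/svm/RUNS').communicate()[0]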

Here is the Bash script "svm_run.sh":

#!/bin/bash
#
cd ..   # relative to the caller's current working directory, not to this script's location

name=$1
path=$2
job_name=${name}


output_dir=/home/saarb/SVM/svm/output/${path}
input_dir=/home/saarb/SVM/svm/data/${path}

# generate a throw-away batch script; $$ (this shell's PID) makes the file name unique
cat > sbatch_script.${job_name}.$$ << EOF
#!/bin/bash
#

#SBATCH --job-name=${job_name}
#SBATCH --output=${output_dir}/${job_name}.sc
#SBATCH --error=${output_dir}/${job_name}.sc
#SBATCH --partition=serial
#SBATCH --ntasks=1
#SBATCH --qos=serial



date 

# mpiexec ./svm.x ${path}/${name} > ${job_name}.out
mpiexec ./svm.x ${path}/${name}

echo "done!"
date
\rm sbatch_script.${job_name}.$$

EOF

# submit the generated script; sbatch prints "Submitted batch job <id>" on stdout
sbatch sbatch_script.${job_name}.$$
exit 0
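
For comparison, the generate-and-submit step that svm_run.sh performs with a here-document can also be sketched directly in Python. This is only an illustration, assuming the directory layout shown above, with "path" standing for whatever the Python code passes as path + '/fit'; it omits the self-cleanup the original batch script does.

import subprocess as sp

def submit_job(runname, path, workdir='/home/saarb/SVM/svm'):
    # Build a batch script equivalent to the one svm_run.sh generates.
    output_dir = workdir + '/output/' + path
    batch_script = """#!/bin/bash
#SBATCH --job-name={name}
#SBATCH --output={out}/{name}.sc
#SBATCH --error={out}/{name}.sc
#SBATCH --partition=serial
#SBATCH --ntasks=1
#SBATCH --qos=serial

date
mpiexec ./svm.x {path}/{name}
echo "done!"
date
""".format(name=runname, out=output_dir, path=path)

    script_name = 'sbatch_script.' + runname
    with open(workdir + '/' + script_name, 'w') as f:
        f.write(batch_script)

    # Run sbatch from workdir so "./svm.x" resolves there when the job starts
    # (Slurm uses the submission directory as the job's working directory by default).
    result = sp.run(['sbatch', script_name], cwd=workdir,
                    stdout=sp.PIPE, stderr=sp.PIPE, universal_newlines=True)
    if result.returncode != 0:
        raise RuntimeError('sbatch failed: ' + result.stderr)
    # sbatch prints "Submitted batch job <id>"; the last token is the job ID.
    return result.stdout.split()[-1]

Usage would be something like: job_id = submit_job(runname, path + '/fit')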
