I am submitting a Python/PySpark application through spark-submit using a shell script. However, even when the Python code fails, the return code I get is 0. I want to either terminate the parent shell script or perform a required action if I get a non-zero status.
I have already tried:
- Catching the exception in the Python code and exiting with a non-zero status, but the Spark job still returns a zero status.
- Killing the parent shell script from Python (using its PID); this gracefully stops Spark, but the status returned is still zero.
This is how I am calling the job:
```bash
./bin/spark-submit test.py param1 > log.txt 2>&1 & disown
```
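Note that `& disown` detaches the job from the shell's job table, so `$?` in the wrapper only reflects whether the launch itself succeeded. This is a sketch of what I want the wrapper to do instead, keeping the job as a child of the shell so its real exit status is recoverable with `wait` (the failure action is a placeholder):

```bash
# Run spark-submit in the background but do NOT disown it,
# so the shell can still wait on it and collect its exit status.
./bin/spark-submit test.py param1 > log.txt 2>&1 &
pid=$!
wait "$pid"          # returns the exit status of spark-submit
status=$?
if [ "$status" -ne 0 ]; then
    # placeholder for the required action on failure
    echo "spark-submit exited with status $status" >&2
    exit "$status"
fi
```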
This is how I am returning the non-zero status:
```python
import sys

try:
    # code block
    ...
except Exception as e:
    sys.exit(2)
```
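Since the reported status is still 0, a fallback I am considering is having the wrapper derive a failure status from the log instead of from `$?`. This is only a sketch; the `Traceback` marker is an assumption about what a failing PySpark job writes to the redirected `log.txt`:

```bash
# Hedged workaround: detect a Python failure from the captured log,
# assuming the driver's traceback ends up in log.txt (see the call above).
if grep -q "Traceback (most recent call last)" log.txt; then
    echo "Python failure detected in log.txt" >&2
    exit 2
fi
```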
This is how I am killing the parent process:
```python
import os
import signal

try:
    # code block
    ...
except Exception as e:
    os.kill(os.getppid(), signal.SIGTERM)
    os.system('pkill -f test.sh')
```
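If killing the parent is the only option, one sketch I could try is trapping SIGTERM in the wrapper itself so it exits non-zero instead of stopping gracefully. This assumes the wrapper is `test.sh` and that spark-submit runs in the foreground (Bash runs the trap once the foreground command returns):

```bash
# Turn the SIGTERM sent from the Python driver into a non-zero wrapper exit.
trap 'echo "terminated by the Spark driver" >&2; exit 2' TERM
./bin/spark-submit test.py param1 > log.txt 2>&1
```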