How to kill a Spark job if the application ID is known?


I am using DSE with Spark. I have submitted a Spark job to the master using dse spark-submit. How can I kill the job if I know its application ID?


1 Answer

phact:

dse spark-class org.apache.spark.deploy.Client kill <spark-master> <driver-id>
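For example, assuming a standalone master listening on 10.10.10.1:7077 and a driver ID copied from the master web UI (both values below are placeholders):

dse spark-class org.apache.spark.deploy.Client kill spark://10.10.10.1:7077 driver-20160321123456-0001

Note that this command expects the driver ID (typically of the form driver-&lt;timestamp&gt;-&lt;sequence&gt;, shown on the master web UI for jobs submitted in cluster mode), not the application ID itself.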

or

kill it directly from the Spark web UI (just ensure that spark.ui.killEnabled is set to true)
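A minimal sketch of enabling the kill links in the UI, assuming you manage Spark settings through spark-defaults.conf (the exact path of that file depends on your DSE installation):

# spark-defaults.conf -- allow killing jobs/stages from the web UI
spark.ui.killEnabled true

Alternatively, the same setting can be passed at submit time with --conf spark.ui.killEnabled=true.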