According to the documentation, a Spark app that was started/submitted with `SparkLauncher` and the `startApplication` method can be killed through the returned `SparkAppHandle` with its `kill()` method, because it runs as a child process. I tried to implement this in combination with a `CountDownLatch` as a timeout, but it doesn't work for me: the Java app with the `SparkLauncher` finishes after 20 minutes, but the Spark app is still running on my YARN cluster afterwards.
I'm using the following code:
```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// launcher config...

// Must be (effectively) final so the anonymous listener can capture it.
final CountDownLatch countDownLatch = new CountDownLatch(1);

SparkAppHandle.Listener handleListener = new SparkAppHandle.Listener() {
    @Override
    public void stateChanged(SparkAppHandle handle) {
        if (handle.getState().isFinal()) {
            countDownLatch.countDown();
        }
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {}
};

SparkAppHandle handle = launcher.startApplication(handleListener);

// Wait up to 20 minutes for the app to reach a final state,
// otherwise kill it via the handle.
boolean regularExit = countDownLatch.await(20, TimeUnit.MINUTES);
if (!regularExit) {
    handle.kill();
}
```

I'm also wondering whether the kill command is even supposed to work once the application is already running on the cluster.
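The timeout logic itself can be checked in isolation with plain `java.util.concurrent`, without Spark: if nobody calls `countDown()`, `await()` returns `false` after the timeout, which is exactly the branch where `handle.kill()` would run. A minimal sketch (the class name `LatchTimeoutDemo` is mine, not from the original code):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class LatchTimeoutDemo {
    public static void main(String[] args) throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(1);
        // Nobody counts down, so await() must time out and return false --
        // in the real code this is where handle.kill() would be invoked.
        boolean regularExit = latch.await(100, TimeUnit.MILLISECONDS);
        System.out.println(regularExit); // prints "false"
    }
}
```

This confirms the latch/timeout pattern behaves as intended, so the remaining question is why `kill()` does not terminate the application on YARN.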