SparkAppHandle gives unknown state forever


I am launching a Spark job from a Java application using SparkLauncher.

SparkAppHandle jobHandle;
try {
    jobHandle = new SparkLauncher()
            .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7")
            .setAppResource("hdfs://server/inputs/test.jar")
            .setMainClass("com.test.TestJob")
            .setMaster("spark://server:6066")
            .setVerbose(true)
            .setDeployMode("cluster")
            .addAppArgs("abc")
            .startApplication();

} catch (IOException e) {
    throw new RuntimeException(e);
}

while(!jobHandle.getState().isFinal());

I can see my job running in the Spark UI, and it finishes without any errors.

However, my Java application never terminates, since jobHandle.getState() always remains in the UNKNOWN state. What am I missing here? My Spark API version is 2.0.0. One more detail that might be relevant: my launcher application is running on Windows.


1 Answer

Answered by Sayat Satybald

You need to block your main thread and await a callback from the driver. I explained the concept in my previous answer.

You can call Thread.sleep inside a try/catch block in a polling loop, or register a SparkAppHandle.Listener together with a CountDownLatch.

while (!jobHandle.getState().isFinal()) {
    // Poll until the job reaches a final state
    try {
        Thread.sleep(1000L);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
}
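The listener-based variant avoids polling entirely: the launcher thread parks on a CountDownLatch and is woken by the stateChanged callback once the driver reports a final state. Below is a minimal sketch of that pattern; the Spark home path, jar location, master URL, and main class are placeholders copied from the question, not values I am asserting as correct.

```java
import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class BlockingLauncher {
    public static void main(String[] args) throws Exception {
        // Released by the listener when the application reaches a final state.
        CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7")   // placeholder from the question
                .setAppResource("hdfs://server/inputs/test.jar") // placeholder from the question
                .setMainClass("com.test.TestJob")                // placeholder from the question
                .setMaster("spark://server:6066")                // placeholder from the question
                .setDeployMode("cluster")
                .addAppArgs("abc")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        if (h.getState().isFinal()) {
                            done.countDown(); // wake the waiting main thread
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        // not needed for this sketch
                    }
                });

        done.await(); // park instead of busy-spinning
        System.out.println("Final state: " + handle.getState());
    }
}
```

The key difference from the polling loop is that the main thread consumes no CPU while waiting, and it reacts to the state change immediately rather than on the next poll tick.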