I tried running with both spark-shell and spark-submit, and I get the following exception:
Initializing SparkContext with MASTER: spark://1.2.3.4:7077
ERROR 2015-06-11 14:08:29 org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
WARN 2015-06-11 14:08:29 org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend: Application ID is not initialized yet.
ERROR 2015-06-11 14:08:30 org.apache.spark.scheduler.TaskSchedulerImpl: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.
Make sure the master URL is correct and that the master process is still running.
You can confirm the correct URL in the Spark web UI: it is shown at the top of the master's page. If the master is running locally, open localhost:8080 in a browser.
More on the web UI here: https://spark.apache.org/docs/1.2.0/monitoring.html
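A quick way to check is to query the master's web UI for the exact master URL it advertises, then pass that string verbatim to spark-shell or spark-submit. This is a sketch assuming the master from your log (spark://1.2.3.4:7077) and the default web UI port 8080; the application class and jar name below are hypothetical placeholders:

```shell
# Check that the master's web UI is reachable and see what URL it advertises.
# The UI page contains the exact string to use, e.g. "spark://1.2.3.4:7077".
curl -s http://1.2.3.4:8080 | grep -o 'spark://[^<"]*'

# Pass that exact URL via --master to the shell:
spark-shell --master spark://1.2.3.4:7077

# ...or to spark-submit (class and jar are placeholders for your app):
spark-submit --master spark://1.2.3.4:7077 \
  --class com.example.MyApp \
  myapp.jar
```

Note that a hostname/IP mismatch is a common cause of "All masters are unresponsive": the master binds to the exact host:port shown in its own log, and the driver must connect with that same string, not an alias that resolves differently.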