Error while submitting a Spark Scala job using spark-submit


I have written a simple app in Scala using Eclipse -> New Scala Project.

I am using Scala 2.10.6 and Spark 2.0.2. The app compiles without errors, and I have also exported the JAR file.

I am using the following command to execute the JAR:

spark-submit  TowerTest.jar --class com.IFTL.EDI.LocateTower MobLocationData Output1

The Scala code snippet is as follows:

package com.IFTL.EDI

import scala.math.pow
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object LocateTower {
  def main(args: Array[String]) {

    // create Spark context with Spark configuration
    val sc = new SparkContext(new SparkConf().setAppName("TowerCount"))

    // helper to add two locations; used when finding the tower centroid
    def addLocations(p1: (Double, Double), p2: (Double, Double)) = {
      (p1._1 + p2._1, p1._2 + p2._2)
    }
  }
}

This is not the full code. When I run it, I get the following error:

[cloudera@quickstart ~]$ spark-submit --class com.IFTL.EDI.LocateTower TowerTest.jar MobLocationData LocationOut1
java.lang.ClassNotFoundException: com.IFTL.EDI.LocateTower
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I am new to Spark and Scala, so I'm not sure what I'm missing.


1 Answer

Answered by Dmitry:

Try this order; if you want to pass parameters, put them after the JAR file. Make sure you specify the path to the JAR, or run the command from the JAR's location:

spark-submit --class com.IFTL.EDI.LocateTower /Users/myJarFilePath/TowerTest.jar
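For reference, spark-submit expects options such as --class before the application JAR, and treats everything after the JAR as arguments for the application itself; the general form is roughly:

spark-submit [options] <application-jar> [application-arguments]
spark-submit --class <main-class> --master <master-url> <path-to-jar> [application-arguments]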

Try it like that first; once you have it working, you can add the command-line arguments.
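As a sketch, the full command with the original arguments appended after the JAR would look like this (the JAR path here is a placeholder; substitute the actual path to your exported JAR):

spark-submit --class com.IFTL.EDI.LocateTower /path/to/TowerTest.jar MobLocationData Output1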