I'm doing a project on Google Cloud Platform, on which I installed Hadoop. I wrote a program in Scala and built an executable JAR using sbt's assembly task (the sbt-assembly plugin).
Now I have to upload and run it on my platform. I tried the command spark-submit --class "Hi" provaciao.jar
but I get an error, even though it works locally on the Spark standalone setup.
I use Spark 1.1.0 and Hadoop 2.4.
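For context, the JAR is built with the sbt-assembly plugin (enabled in project/plugins.sbt). A rough sketch of the build.sbt, with placeholder names and versions:

    // build.sbt (sketch; exact settings may differ from my project)
    name := "provaciao"

    version := "0.1"

    scalaVersion := "2.10.4"  // Spark 1.1.0 is built against Scala 2.10

    // "provided": the cluster supplies Spark at runtime, so it stays out of the fat JAR
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

Running sbt assembly then produces the fat JAR that I pass to spark-submit.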
This is my error log:

    marooned91_gmail_com@hadoop-m-on8g:/home/hadoop/spark-install/bin$ spark-submit --class "Hi" provaciao.jar
    java.lang.ClassNotFoundException: Hi
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:274)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:318)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Try adding the full package path of the Hi class to the --class argument. If you haven't declared a package for the class, do so.
For example:
spark-submit --class "com.mycompany.something.Hi" provaciao.jar
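For instance, with an object declared like this (a minimal sketch; the package name com.mycompany.something is just a placeholder), the value passed to --class is the package name plus the object name:

    package com.mycompany.something

    import org.apache.spark.{SparkConf, SparkContext}

    // The fully qualified name of this object is com.mycompany.something.Hi,
    // which is exactly what --class must receive.
    object Hi {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("Hi")
        val sc = new SparkContext(conf)
        // ... your job logic here ...
        sc.stop()
      }
    }

If Hi really sits in the default (empty) package, then --class "Hi" would be correct, and the error instead suggests the class is missing from the JAR altogether.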
Also, unzip your JAR file and check that the Hi class actually exists in it.
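A quick way to check without fully unpacking the archive is the JDK's jar tool (t lists the archive contents, f names the file):

    jar tf provaciao.jar | grep Hi

If the object is declared in a package, the entry appears under the matching directory path, e.g. com/mycompany/something/Hi.class; that path with dots instead of slashes is what --class needs.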