Spark Scala app not running in eclipse using sbt


Hi, I followed the link below and created a Spark Scala application in Eclipse using the sbt-eclipse plugin.

https://www.nodalpoint.com/development-and-deployment-of-spark-applications-with-scala-eclipse-and-sbt-part-1-installation-configuration/

I followed all the steps and was able to run the SampleApp using sbt. But when I import the app into Eclipse, I cannot run it as an application, although I can run it line by line in the Scala interpreter. Below is the error I get when running the application. Any idea what is going wrong?

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/12 22:27:55 INFO SparkContext: Running Spark version 1.6.0
17/09/12 22:27:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/09/12 22:27:56 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
    at TowerLocator$.main(TowerLocator.scala:11)
    at TowerLocator.main(TowerLocator.scala)
17/09/12 22:27:56 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
    at TowerLocator$.main(TowerLocator.scala:11)
    at TowerLocator.main(TowerLocator.scala)

Thanks


1 Answer

dumitru:

You have to specify the master URL when launching the application from Eclipse, for example:

val conf = new SparkConf().setAppName("Sample Application").setMaster("local[*]")
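
For context, here is a minimal sketch of how the whole app could look (the object name TowerLocator is taken from the stack trace in your question; the actual job logic is omitted):

import org.apache.spark.{SparkConf, SparkContext}

object TowerLocator {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark locally using all available cores,
    // which is what you want when running inside Eclipse.
    val conf = new SparkConf()
      .setAppName("Sample Application")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // ... your job logic goes here ...

    sc.stop()
  }
}

Keep in mind that a master set programmatically on the SparkConf takes precedence over one passed on the command line, so you may want to remove (or guard) the setMaster call before submitting the same jar to a cluster.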

When launching from the shell, you specify it with the --master parameter of spark-submit instead.
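
For example (the jar path below is just a placeholder for your packaged application):

spark-submit --class TowerLocator --master local[*] path/to/your-app.jar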