I'm trying to create an SQLContext object using a JavaSparkContext object as its parameter, like:
SparkConf sparkConf=new SparkConf().setMaster("local").setAppName("Example");
JavaSparkContext sc=new JavaSparkContext(sparkConf);
SQLContext sqlctx=new HiveContext(sc);
Eclipse is throwing an error saying:
The constructor HiveContext(JavaSparkContext) is undefined
But all the examples I have looked up on the internet, including the documentation, use JavaSparkContext as the parameter. Am I missing something?
Maven dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.2.0</version>
    <scope>provided</scope>
</dependency>
Shouldn't you have the Spark 2.2 dependency for spark-hive as well?
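Something like this should align the versions (assuming you stay on Scala 2.11 and Spark 2.2.0, matching your spark-core dependency):
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>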
And if you use Spark 2.2, HiveContext is deprecated, I think; you should just use SparkSession as the entry point for queries and computations:
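A rough sketch along these lines (keeping your master and app name; enableHiveSupport() needs spark-hive on the classpath):
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// SparkSession replaces SQLContext/HiveContext as the single entry point in Spark 2.x
SparkSession spark = SparkSession
    .builder()
    .master("local")
    .appName("Example")
    .enableHiveSupport()   // provides the Hive functionality HiveContext used to give
    .getOrCreate();

Dataset<Row> df = spark.sql("SELECT 1");
If some legacy API still requires an SQLContext, the session also exposes one via spark.sqlContext().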