"The constructor HiveContext(JavaSparkContext) is undefined" error while creating an SQLContext object


I'm trying to create an SQLContext object using a JavaSparkContext object as its parameter, like:

SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("Example");
JavaSparkContext sc = new JavaSparkContext(sparkConf);
SQLContext sqlctx = new HiveContext(sc);

Eclipse is throwing an error saying:

The constructor HiveContext(JavaSparkContext) is undefined

But all the examples I have looked up on the internet, including the documentation, use JavaSparkContext as the parameter. Am I missing something?

Maven dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.2.0</version>
    <scope>provided</scope>
</dependency>

1 Answer

tricky (accepted answer):

Shouldn't you have the Spark 2.2 dependency for spark-hive?

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <!-- Scala suffix aligned to _2.11 to match spark-core -->
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>
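
Note that the Scala binary version suffix should be aligned as well: spark-core_2.11 and spark-hive_2.10 cannot be mixed in one project, because jars built for different Scala versions are binary-incompatible and typically fail at runtime with NoClassDefFoundError or NoSuchMethodError.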

And since you are on Spark 2.2, note that HiveContext is deprecated; you should just use SparkSession as the single entry point for queries and computations:

Upgrading From Spark SQL 1.6 to 2.0

SparkSession is now the new entry point of Spark that replaces the old SQLContext and HiveContext. Note that the old SQLContext and HiveContext are kept for backward compatibility. A new catalog interface is accessible from SparkSession - existing API on databases and tables access such as listTables, createExternalTable, dropTempView, cacheTable are moved here.
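
For completeness, here is a minimal Java sketch of the SparkSession equivalent of the code in the question. The table name my_table is only illustrative, and enableHiveSupport() still requires spark-hive on the classpath:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class Example {
    public static void main(String[] args) {
        // SparkSession replaces both SQLContext and HiveContext in Spark 2.x;
        // enableHiveSupport() provides the Hive features HiveContext used to offer.
        SparkSession spark = SparkSession.builder()
                .master("local")
                .appName("Example")
                .enableHiveSupport()
                .getOrCreate();

        // Run a SQL query; "my_table" is a hypothetical table name.
        Dataset<Row> result = spark.sql("SELECT * FROM my_table");
        result.show();

        spark.stop();
    }
}

If you still need the RDD API, you can wrap the underlying context with JavaSparkContext.fromSparkContext(spark.sparkContext()).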