object HiveContext in package hive cannot be accessed in package


Hi coders, I'm back again. I'm trying to create a Hive table from a DataFrame using HiveContext in my Scala code. I am able to do it with sqlContext, but when it comes to HiveContext it throws this error:

[error] /home/mapr/avroProject/src/main/scala/AvroConsumer.scala:75: object HiveContext in package hive cannot be accessed in package org.apache.spark.sql.hive
[error] HiveContext sqlContext = new org.apache.spark.sql.hive.HiveContext(sc.sc());

I've tried the same with slightly different declarations as well,

val hiveContext = org.apache.spark.sql.hive.HiveContext(sc)

I have added the sbt library dependencies too:

libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.6.1"

I tried with "provided" also.
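For reference, here is roughly how the rest of the build is set up. This is a sketch: only the spark-hive dependency above is taken verbatim, and the spark-core/spark-sql artifacts and Scala version are assumptions, included to show that every Spark artifact should be pinned to the same version with the same Scala (2.10) suffix.

// build.sbt (sketch)
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1",
  "org.apache.spark" %% "spark-sql"  % "1.6.1",
  "org.apache.spark" %% "spark-hive" % "1.6.1"
)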

Here is the relevant piece of my code:

messages.foreachRDD(rdd => {
  import org.apache.spark.sql.hive.HiveContext
  // line 75, the line the compile error points at
  HiveContext sqlContext = new org.apache.spark.sql.hive.HiveContext(sc.sc());
  //import org.apache.spark.sql.hive._
  //val dataframe = sqlContext.read.json(rdd.map(_._2))
  val dataframe = sqlContext.read.json(rdd.map(_._2))
  val df = dataframe.toDF()
  // ...
})

Any fix for this? I've never come across this "not accessible" error before.

I also tried to create a temp table from the code:

val dataframe = sqlContext.read.json(rdd.map(_._2))
val df = dataframe.toDF()
df.registerTempTable("mdl_events")

But where can I find the mdl_events table? Is there a default database in Spark where I can look for it? I cannot find it from the spark-shell, though.
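For what it's worth, this is roughly how I am looking for it from a separate spark-shell session (a sketch; only the table name is taken from the code above):

// in a separate spark-shell session
sqlContext.tableNames().foreach(println)   // mdl_events is not listed
sqlContext.table("mdl_events").show()      // fails with "table not found"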


1 Answer

Answered by jack AKA karthik

Hi, I figured this out. From Spark 1.3 the HiveContext is available as sqlContext by default, so explicitly calling HiveContext is not encouraged. The code below helped me overcome this issue:

messages.foreachRDD(rdd => {
  // build the HiveContext from the SparkContext, Scala style (val + new)
  val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
  import sqlContext.implicits._
  val dataframe = sqlContext.read.json(rdd.map(_._2))
  val df = dataframe.toDF()
})
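A side note on the temp-table part of the question: registerTempTable only registers the DataFrame with the SQLContext/HiveContext that created it, in memory, so mdl_events will not show up in any Hive database or in a separate spark-shell. If the table needs to be visible elsewhere, a sketch along these lines writes it out as a real Hive table instead (assuming a Hive metastore is reachable from the job; the append write mode is just an example):

messages.foreachRDD(rdd => {
  val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
  val df = sqlContext.read.json(rdd.map(_._2))
  // persist as a managed Hive table (default database) so other sessions can see it
  df.write.mode("append").saveAsTable("mdl_events")
})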