Phoenix "org.apache.phoenix.spark.DefaultSource" error


I am new to Phoenix. I am trying to load an HBase table into Spark through Phoenix, but when I run the job I get the error below.

java.lang.ClassNotFoundException: org.apache.phoenix.spark.DefaultSource

My code:

package com.vas.reports

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SaveMode}
import org.apache.phoenix.spark._
import java.sql.DriverManager
import com.google.common.collect.ImmutableMap
import org.apache.hadoop.hbase.filter.{Filter, FilterBase}
import org.apache.phoenix.query.QueryConstants
import org.apache.phoenix.filter.ColumnProjectionFilter
import org.apache.phoenix.hbase.index.util.{ImmutableBytesPtr, VersionUtil}

object PhoenixRead {

  case class Record(NO: Int, NAME: String, DEPT: Int)

  def main(args: Array[String]) {
    val sc = new SparkContext("local", "phoenixsample")
    val sqlcontext = new SQLContext(sc)

    // Count executors other than the driver
    val numWorkers = sc.getExecutorStorageStatus.map(_.blockManagerId.executorId).filter(_ != "driver").length

    import sqlcontext.implicits._

    // Small test DataFrame
    val df1 = sc.parallelize(List(
      (2, "Varun", 58),
      (3, "Alice", 45),
      (4, "kumar", 55))).toDF("NO", "NAME", "DEPT")

    df1.show()
    println(numWorkers)
    println("printing df2")

    // Load the Phoenix table through the phoenix-spark DataSource
    val df = sqlcontext.load("org.apache.phoenix.spark",
      Map("table" -> "udm_main", "zkUrl" -> "phoenix url:2181/hbase-unsecure"))
    df.show()
  }
}

spark-submit command:

spark-submit --class com.vas.reports.PhoenixRead --jars /home/hadoop1/phoenix-core-4.4.0-HBase-1.1.jar /shared/test/ratna-0.0.1-SNAPSHOT.jar

Please look into this and suggest a fix.

1 Answer

Answered by ROOT:

This happens because the phoenix-spark module, which contains org.apache.phoenix.spark.DefaultSource, is not on the classpath. You need to add the following library files to HBASE_HOME/lib and SPARK_HOME/lib.

In HBASE_HOME/lib:

  • phoenix-spark-4.7.0-HBase-1.1.jar
  • phoenix-4.7.0-HBase-1.1-server.jar

In SPARK_HOME/lib:

  • phoenix-spark-4.7.0-HBase-1.1.jar
  • phoenix-4.7.0-HBase-1.1-client.jar
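If you would rather not copy jars into the install directories, shipping them at submit time also puts the DataSource on the classpath. A sketch, assuming the 4.7.0 jars and placeholder paths:

spark-submit --class com.vas.reports.PhoenixRead --jars /path/to/phoenix-spark-4.7.0-HBase-1.1.jar,/path/to/phoenix-4.7.0-HBase-1.1-client.jar /shared/test/ratna-0.0.1-SNAPSHOT.jar

Once phoenix-spark is on the classpath, the load call should resolve. On Spark 1.4+ the DataFrameReader form is the usual way to call the same DataSource; a minimal sketch, reusing the table name and the placeholder zkUrl from the question:

val df = sqlcontext.read
  .format("org.apache.phoenix.spark")                   // same DataSource the error refers to
  .option("table", "udm_main")                          // table name from the question
  .option("zkUrl", "phoenix url:2181/hbase-unsecure")   // placeholder quorum from the question
  .load()
df.show()

Writing back goes through the same DataSource, e.g. df1.write.format("org.apache.phoenix.spark").mode("overwrite").option("table", ...).option("zkUrl", ...).save(); note that the phoenix-spark connector requires overwrite mode for saves.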