hive-warehouse-connector_2.11 + Required field 'client_protocol' is unset


I am using a Hadoop cluster with the Cloudera 6.3.2 distribution. I need to read a Hive ACID table from Spark (a Java client). Native Spark cannot read Hive ACID tables, so I am planning to use the Hive Warehouse Connector (HWC). However, I am getting the exception below, and I cannot read non-ACID tables either. Any thoughts?
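For background, this is the kind of plain Spark read that does not work on the ACID table (a minimal sketch; the behavior on transactional tables is what pushed me to HWC):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// Plain Spark SQL read; with Spark 2.4 this does not return the expected
// rows from a full ACID (transactional) Hive table, hence the move to HWC.
Dataset<Row> df = spark.sql("select * from test");
df.show();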

20/03/27 21:26:46 INFO HiveWarehouseSessionImpl: Created a new HWC session: 9f46ffe3-c863-4fbf-82ef-9f730f0c0cfc
20/03/27 21:26:47 INFO LlapBaseInputFormat: Handle ID 6439738b-a20a-46cc-9ee3-63214b573660: query=select * from test
20/03/27 21:26:47 ERROR HiveConnection: Error opening session
org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default/})
    at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)

Here is my Java code:

import com.hortonworks.hwc.HiveWarehouseSession;

HiveWarehouseSession hive = HiveWarehouseSession.session(spark).build();
hive.executeQuery("select * from test").show();
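For completeness, the SparkSession is configured with the HWC connection settings roughly like this before the HWC session is built (hs2-host, metastore-host, and the ports are placeholders, not my real cluster values):

import org.apache.spark.sql.SparkSession;

// Placeholder endpoints; the real values come from our cluster configuration.
SparkSession spark = SparkSession.builder()
        .appName("hwc-acid-read")
        .enableHiveSupport()
        // HiveServer2 JDBC URL that HWC connects through
        .config("spark.sql.hive.hiveserver2.jdbc.url",
                "jdbc:hive2://hs2-host:10000/default")
        // Hive metastore URI used by HWC
        .config("spark.datasource.hive.warehouse.metastoreUri",
                "thrift://metastore-host:9083")
        .getOrCreate();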

Here is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>fw_analytics</groupId>
    <artifactId>com.analytics</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <packaging>jar</packaging>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.0-cdh6.3.2</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>2.4.0-cdh6.3.2</version>
        </dependency>


        <dependency>
            <groupId>com.hortonworks.hive</groupId>
            <artifactId>hive-warehouse-connector_2.11</artifactId>
            <version>1.0.0.7.0.3.0-79</version>
        </dependency>
    </dependencies>

    <repositories>
        <repository>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
            <id>hortonworks.extrepo</id>
            <name>Hortonworks HDP</name>
            <url>http://repo.hortonworks.com/content/repositories/releases</url>
        </repository>
        <repository>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
            <id>Cloudera.extrepo</id>
            <name>Cloudera HDP</name>
            <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
        </repository>
        <repository>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
            <id>hortonworks.other</id>
            <name>Hortonworks Other Dependencies</name>
            <url>http://repo.hortonworks.com/content/groups/public</url>
        </repository>
    </repositories>

</project>