Apache Phoenix Bulk Data Loading - Can't use a lowercase table name


I have an existing Phoenix table abc that I want to bulk load via MapReduce. I used the following command to load the CSV file:

hadoop jar /root/Phoenix/apache-phoenix-4.8.0-HBase-0.98-bin/phoenix-4.8.0-HBase-0.98-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --t abc --input /example.csv

but it does not seem to find the table abc:

Exception in thread "main" java.lang.IllegalArgumentException: Table ABC not found

I tried changing the table name in the command to --t 'abc' and --t "abc", but neither works. How can I use a lowercase table name?

I also found the same case reported here:

http://apache-phoenix-user-list.1124778.n5.nabble.com/Load-into-Phoenix-table-via-CsvBulkLoadTool-cannot-find-table-and-fails-td2792.html

Thanks.

1 Answer

Answered by Piyush Saxena:

I got the same error. After a lot of debugging I realized that Phoenix converts your input "abc" into all caps ("ABC") and looks for a table with that name, and Phoenix is case sensitive, as mentioned here.
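To illustrate the rule (a minimal sketch, not part of the original answer; the val column is made up): Phoenix folds unquoted identifiers to upper case, while double-quoted identifiers keep their case, so a lowercase name only survives if it is always written in double quotes.

CREATE TABLE abc (pk VARCHAR PRIMARY KEY, val VARCHAR);    -- unquoted name, stored as ABC
CREATE TABLE "abc" (pk VARCHAR PRIMARY KEY, val VARCHAR);  -- quoted name, stored as lowercase abc
SELECT * FROM abc;      -- resolves to ABC
SELECT * FROM "abc";    -- resolves to the lowercase table abc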

Try creating your table in the Phoenix shell with an all-caps name and then run the same command; it should work just fine.

Sample table creation and bulk upload via MapReduce:

CREATE TABLE "CODEFREQUENCY" (pk VARCHAR PRIMARY KEY,"week"."weekNum" VARCHAR,"week"."addition" VARCHAR,"week"."deletion" VARCHAR);

HADOOP_CLASSPATH=$(hbase mapredcp):~/Installs/Hbase/conf/:~/Installs/apache-phoenix-4.10.0-HBase-1.2-bin/ ./hadoop jar ~/Installs/apache-phoenix-4.10.0-HBase-1.2-bin/phoenix-4.10.0-HBase-1.2-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dfs.permissions.umask-mode=000 -d $'\t' -t CODEFREQUENCY --input /hbase/Code_Frequency.csv
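For reference, the input file for that command would be tab-delimited (matching -d $'\t'), with one value per column in the table's declared order: pk, weekNum, addition, deletion. The rows below are made-up placeholders, not the real Code_Frequency.csv contents:

wk-2017-01	1	120	45
wk-2017-02	2	98	30

After the load finishes, you could sanity-check it from sqlline with something like SELECT COUNT(*) FROM CODEFREQUENCY;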