I have an existing Phoenix table abc and I want to bulk-load data into it via MapReduce. I used the following command to load the CSV file:
hadoop jar /root/Phoenix/apache-phoenix-4.8.0-HBase-0.98-bin/phoenix-4.8.0-HBase-0.98-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --t abc --input /example.csv
But it does not seem to find the table abc:
Exception in thread "main" java.lang.IllegalArgumentException: Table ABC not found
I tried changing the table name in the command to --t 'abc' and --t "abc", but neither works. How can I use a lowercase table name?
I also found the same case reported elsewhere.
Thanks.
I got the same error. After a lot of debugging I realized that Phoenix converts your input "abc" to all caps "ABC" and then looks for a table with that name, and Phoenix is case sensitive, as mentioned here.
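For illustration, here is a minimal sketch of Phoenix's identifier handling in the Phoenix shell (the table and column names are made up):

CREATE TABLE abc (id BIGINT PRIMARY KEY);  -- unquoted name is folded to ABC
SELECT * FROM abc;    -- also folded to ABC, so this works
SELECT * FROM "abc";  -- double-quoted names keep their case; this fails unless a lowercase table "abc" actually exists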
Try creating your table in the Phoenix shell with an all-caps name and then run the same command; it should work just fine.
Sample table creation and bulk upload via MapReduce:
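(A minimal sketch; the column list is illustrative, so match it to your CSV layout.)

In the Phoenix shell (sqlline.py), create the table with an unquoted, all-caps name:

CREATE TABLE ABC (
    ID BIGINT NOT NULL PRIMARY KEY,
    NAME VARCHAR,
    CITY VARCHAR
);

Then run the bulk load, passing the upper-case table name:

hadoop jar /root/Phoenix/apache-phoenix-4.8.0-HBase-0.98-bin/phoenix-4.8.0-HBase-0.98-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool --table ABC --input /example.csv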