We have Sqoop installed on one of our machines. When I run the sqoop command from the terminal, it runs fine (with some errors/warnings) and I am able to import data from MySQL to HDFS. Here is the command and log output for reference:

sqoop import -D mapreduce.job.queuename="production.P2" --connect "jdbc:mysql://xxxx:0000/DB" --password "pass" --username "uu" --table temp_pet --target-dir "/home/hadoop/work"

Warning: /opt/ais/cloudera/parcels/CDH-5.13.2-1.cdh5.13.2.p0.3/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
ls: cannot access /opt/ais/cloudera/parcels/CDH-5.13.2-1.cdh5.13.2.p0.3/bin/../lib/sqoop/../hadoop/hadoop-core*.jar: No such file or directory
Error: Could not find or load main class org.apache.hadoop.util.PlatformName
Error: Could not find or load main class org.apache.hadoop.util.PlatformName
19/04/22 12:28:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.2
19/04/22 12:29:17 INFO mapreduce.ImportJobBase: Transferred 41 bytes in 23.5093 seconds (1.744 bytes/sec)
19/04/22 12:29:17 INFO mapreduce.ImportJobBase: Retrieved 1 records.

However, when I wrote a Scala job and deployed the jar (built with Maven), I get the following error:

java -jar GenericDataStoreTransfer-1.0-SNAPSHOT.jar
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/sqoop/Sqoop
at com.apple.Utility.GenericDataStoreTransfer.MySQLToHDFS$.main(MySQLToHDFS.scala:11)
at com.apple.Utility.GenericDataStoreTransfer.MySQLToHDFS.main(MySQLToHDFS.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.sqoop.Sqoop
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
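For context, this is a simplified sketch of my main (the real code builds the same argument list as the terminal command above; line 11 is where it touches org.apache.sqoop.Sqoop):

```scala
import org.apache.sqoop.Sqoop

object MySQLToHDFS {
  def main(args: Array[String]): Unit = {
    // Same arguments as the working terminal invocation
    val sqoopArgs = Array(
      "import",
      "-D", "mapreduce.job.queuename=production.P2",
      "--connect", "jdbc:mysql://xxxx:0000/DB",
      "--username", "uu",
      "--password", "pass",
      "--table", "temp_pet",
      "--target-dir", "/home/hadoop/work")

    // Sqoop.runTool is the programmatic entry point; returns the exit code
    val exitCode = Sqoop.runTool(sqoopArgs)
    sys.exit(exitCode)
  }
}
```

The jar itself builds fine; the NoClassDefFoundError only shows up at runtime.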

This is the relevant snippet from my pom.xml:



Let me know if I am missing anything. Any pointers would be helpful.
