Giraph: Class not Found Exception on custom Job


I am developing an algorithm using Giraph. I am working with version 1.0.0 on Hadoop 1.2.1.

I am pretty new to developing Giraph, so please be gentle ;)

My custom job is split into three packages:

  • io: contains the input and output format classes
  • layout: contains the Vertex Class, the Aggregator Class and the MasterCompute class.
  • run: contains the Tool-implementing class.

I develop it in Eclipse, referencing the built giraph-core jar, and then export it to another jar called "customJob.jar".
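(A quick sanity check, not from the original post: a common cause of this kind of ClassNotFoundException is the class simply not ending up in the exported jar, or ending up under a different package path. Assuming the Eclipse output folder is `bin/`, which is a guess, the export and check might look like this:)

```shell
# Package the compiled classes (assumes Eclipse compiled them into bin/)
jar cf customJob.jar -C bin .

# Verify that the vertex class is really inside the jar,
# under the exact package path Hadoop will look it up by:
jar tf customJob.jar | grep 'layout/customVertex.class'
```

If the grep prints nothing, the problem is in the export step, not in how the job is launched.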

Here is how I launch it in Hadoop:

 hadoop jar /opt/hadoop/lib/customJob.jar layout.customVertex \
  -vif io.JSONLongDoubleFloatDoubleToMapVertexInputFormat -vip /users/hadoop/input/tiny_graph.txt \
  -of io.VertexIdAndPositionOutputFormat -op /users/hadoop/output/customJob -w 1

The job starts, reaches the MapReduce phase, and then fails:

14/12/16 17:39:35 INFO job.GiraphJob: run: Since checkpointing is disabled (default), do not allow any task retries (setting mapred.map.max.attempts = 0, old value = 4)
14/12/16 17:39:37 INFO mapred.JobClient: Running job: job_201412161121_0025
14/12/16 17:39:38 INFO mapred.JobClient:  map 0% reduce 0%
14/12/16 17:39:49 INFO mapred.JobClient: Job complete: job_201412161121_0025
14/12/16 17:39:49 INFO mapred.JobClient: Counters: 4
14/12/16 17:39:49 INFO mapred.JobClient:   Job Counters 
14/12/16 17:39:49 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=9487
14/12/16 17:39:49 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/12/16 17:39:49 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/12/16 17:39:49 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0

Further investigation on the JobTracker showed that the job setup fails with a ClassNotFoundException:

java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: layout.customVertex
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:889)
at org.apache.giraph.conf.ClassConfOption.get(ClassConfOption.java:94)
at org.apache.giraph.conf.GiraphClasses.readFromConf(GiraphClasses.java:152)
at org.apache.giraph.conf.GiraphClasses.<init>(GiraphClasses.java:142)
at org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.<init>(ImmutableClassesGiraphConfiguration.java:93)
at org.apache.giraph.bsp.BspOutputFormat.getOutputCommitter(BspOutputFormat.java:56)
at org.apache.hadoop.mapred.Task.initialize(Task.java:515)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: layout.customVertex
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:857)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:881)
... 12 more
Caused by: java.lang.ClassNotFoundException: layout.customVertex
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:810)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:855)
... 13 more

The Hadoop configuration follows the one suggested on the Giraph Quick Start page.

Any help or suggestions would be appreciated :)

Thanks in advance!


1 Answer

Masoud Sagharichian answered:

First, edit hadoop-env.sh and add your jar file(s) to HADOOP_CLASSPATH. Then, pass a reference to your jar on the command line using -libjars (path-to-your-jar/jar_file.jar).
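Concretely, assuming the jar lives at /opt/hadoop/lib/customJob.jar (the path used in the question), the two steps might look like this. Note that -libjars is a generic Hadoop option parsed by ToolRunner/GenericOptionsParser, so it only works because the run class implements Tool, and it must appear before the job-specific arguments:

```shell
# Step 1: make the jar visible on the client classpath.
# Add this line to $HADOOP_HOME/conf/hadoop-env.sh:
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/opt/hadoop/lib/customJob.jar"

# Step 2: ship the jar to the task nodes with -libjars
# (placed right after the main class, before the other arguments):
hadoop jar /opt/hadoop/lib/customJob.jar layout.customVertex \
  -libjars /opt/hadoop/lib/customJob.jar \
  -vif io.JSONLongDoubleFloatDoubleToMapVertexInputFormat \
  -vip /users/hadoop/input/tiny_graph.txt \
  -of io.VertexIdAndPositionOutputFormat \
  -op /users/hadoop/output/customJob -w 1
```

Step 1 fixes class lookup on the machine submitting the job; step 2 distributes the jar so the map tasks (where the stack trace above originates) can load layout.customVertex as well.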