I am doing a project in Apache Hama to implement breadth-first search (BFS) and am having trouble partitioning the input graph. Can anybody suggest a method to do this? The message combiner I am using is the following:
import java.util.Iterator;

import org.apache.hadoop.io.IntWritable;
import org.apache.hama.graph.Combiner;

// Combines all messages sent to a vertex into a single message carrying
// only the minimum value, so each vertex keeps the shortest distance
// discovered so far.
public static class MinIntCombiner extends Combiner<IntWritable> {

  @Override
  public IntWritable combine(Iterable<IntWritable> messages) {
    int visited = Integer.MAX_VALUE;
    Iterator<IntWritable> it = messages.iterator();
    while (it.hasNext()) {
      int msgValue = it.next().get();
      if (visited > msgValue) {
        visited = msgValue;
      }
    }
    return new IntWritable(visited);
  }
}
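For context, the combiner is registered on the graph job roughly like this (assuming GraphJob exposes a setCombinerClass method, as in Hama's SSSP example; ssspJob is the GraphJob instance used below):

ssspJob.setCombinerClass(MinIntCombiner.class);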
The partitioner used here is:
ssspJob.setPartitioner(HashPartitioner.class);
As HashPartitioner cannot be used for BFS, can anyone suggest an alternative method?
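To make the question concrete, below is a rough sketch of the kind of alternative I have in mind: a range partitioner that assigns contiguous blocks of vertex IDs to the same task, so that neighbouring vertices tend to stay on one peer. This assumes Hama's org.apache.hama.bsp.Partitioner interface with a getPartition(key, value, numTasks) method (which is what HashPartitioner appears to implement) and integer vertex IDs; RangePartitioner and TOTAL_VERTICES are names I made up for illustration.

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Writable;
import org.apache.hama.bsp.Partitioner;

// Assigns contiguous ranges of vertex IDs to the same task, so that
// neighbouring vertices of the input graph tend to land on one peer.
public class RangePartitioner implements Partitioner<IntWritable, Writable> {

  // Assumed upper bound on vertex IDs; in a real job this would come
  // from the configuration or a pre-pass over the input.
  private static final int TOTAL_VERTICES = 1000000;

  @Override
  public int getPartition(IntWritable vertexId, Writable value, int numTasks) {
    int blockSize = (TOTAL_VERTICES + numTasks - 1) / numTasks; // ceiling division
    int partition = vertexId.get() / blockSize;
    return Math.min(partition, numTasks - 1); // guard against IDs above the assumed bound
  }
}

It would then be plugged in the same way as HashPartitioner, e.g. ssspJob.setPartitioner(RangePartitioner.class); If there is a more standard way to partition a graph for BFS in Hama, I would prefer that.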