Errors caused by adding Mahout Dependency to Gradle


I am trying to run a Hadoop job, using Gradle to build my project. When I add the Mahout dependencies

apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    compile group: 'org.apache.hadoop', name: 'hadoop-mapreduce-client-core', version: '3.1.2'

    compile group: 'org.apache.hadoop', name: 'hadoop-common', version: '3.1.2'

    compile group: 'org.apache.mahout', name: 'mahout-hdfs', version: '0.13.0'
    compile group: 'org.apache.mahout', name: 'mahout-mr', version: '0.13.0'
    compile group: 'org.apache.mahout', name: 'mahout-math', version: '0.13.0'
}

jar {
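    // pack all compile-configuration dependencies into the output jar (fat/uber jar)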
    from configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
}

ext.hadoopVersion = "3.1.2"

to my build file I get the following error:

Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3217)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3262)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:227)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:463)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:542)
    at ConvertText.ConvertTextJob.main(ConvertTextJob.java:25)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)

Without the Mahout dependencies, the job works just fine.

Here is the code where I am using the library:

package ConvertText;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;
import org.apache.mahout.math.VectorWritable;

public class ConvertTextJob {
  public static void main(String[] args){
    try {
      //Setup for the first job
      Configuration conf = new Configuration();

      // Set up the job and point Hadoop at the jar containing this class
      Job job = Job.getInstance(conf, "Convert Text");
      job.setJarByClass(ConvertTextJob.class);

      // path to input/output in HDFS
      FileInputFormat.addInputPath(job, new Path(args[0]));
      FileOutputFormat.setOutputPath(job, new Path(args[1]));

      //Set Mapper class
      job.setMapperClass(ConvertTextMapper.class);

      // Outputs from the Mapper
      job.setOutputKeyClass(NullWritable.class);
      job.setOutputValueClass(VectorWritable.class);

      // Write the mapper output as a sequence file
      job.setOutputFormatClass(SequenceFileOutputFormat.class);

      job.setNumReduceTasks(0);

      // Block until the job is completed.
      System.exit(job.waitForCompletion(true) ? 0 : 1);

    } catch (IOException | InterruptedException | ClassNotFoundException e) {
      System.err.println(e.getMessage());
    }
  }

}

Does anyone know what the issue is and how I can fix it so I can use the dependency? I am working on a project that involves Mahout and requires it.


There are 2 answers

Andrew Palumbo answered:

The hdfs module was factored out of mahout-core in mahout-0.13.0.

https://github.com/apache/mahout/tree/mahout-0.13.0

~/mahout/mahout-0.13.0 $ ls
  100-interpreter-spec.yaml       LICENSE.txt           NOTICE.txt
  README.md                       bin/                  buildtools/
  community/                      conf/                 core/
  distribution/                   doap_Mahout.rdf       docs/
  dry_run.sh                      engine/               examples/
  experimental/                   flink/                h2o/
  hdfs/                           integration/          issuse
  lib/                            mahout.iml            math/
  math-scala/                     mr/                   pom.xml
  resource-managers/              runtests.sh           scratch
  spark/                          src/                  target/
  viennacl/                       viennacl-omp/

You'll have to add the mahout-hdfs artifact to build against 0.13.0.

// https://mvnrepository.com/artifact/org.apache.mahout/mahout-hdfs
compile group: 'org.apache.mahout', name: 'mahout-hdfs', version: '0.13.0'

https://mvnrepository.com/artifact/org.apache.mahout/mahout-hdfs/0.13.0

Ben Watson answered:

You need to find and remove/exclude the hadoop-hdfs-2.x.x.jar from the classpath. It is a Hadoop 2.x HDFS jar pulled in transitively by the Mahout artifacts, and it clashes with the Hadoop 3.1.2 classes your build and cluster already provide.
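
For reference, a minimal sketch of what that exclusion could look like in the build file, assuming the Hadoop 2.x jars come in transitively through mahout-hdfs and mahout-mr (the actual resolved versions should be confirmed with the dependency report before and after the change):

dependencies {
    // Hadoop 3.1.2 stays declared explicitly, as in the original build file
    compile group: 'org.apache.hadoop', name: 'hadoop-mapreduce-client-core', version: '3.1.2'
    compile group: 'org.apache.hadoop', name: 'hadoop-common', version: '3.1.2'

    // Keep the Mahout artifacts, but drop the Hadoop 2.x jars they pull in transitively
    compile(group: 'org.apache.mahout', name: 'mahout-hdfs', version: '0.13.0') {
        exclude group: 'org.apache.hadoop'
    }
    compile(group: 'org.apache.mahout', name: 'mahout-mr', version: '0.13.0') {
        exclude group: 'org.apache.hadoop'
    }
    compile group: 'org.apache.mahout', name: 'mahout-math', version: '0.13.0'
}

Running gradle dependencies --configuration compile shows whether the old hadoop-hdfs jar has actually dropped out of the resolved classpath; since the job is submitted with hadoop jar, the cluster's own Hadoop 3.1.2 classes then provide HDFS at runtime.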