Why can't I access a package-private class in another jar (NOT sealed)?


I've encountered some strange behaviour of the Java classloader:

Suppose I submit an Apache Spark jar to a cluster, and that jar contains a subclass of a class nested in HiveServer2:

package org.apache.hive.service.server;

public class MyOP2 extends HiveServer2.ServerOptionsProcessor {
  public MyOP2(String var) {
    super(var);
  }
  ...
}

The class HiveServer2.ServerOptionsProcessor is already pre-loaded on the cluster (as a Spark dependency), but is declared as package-private.

package org.apache.hive.service.server;
public class HiveServer2 extends CompositeService {
...

  static class ServerOptionsProcessor {
  ...
  }
}

This class is loaded into the JVM first, when the cluster is set up. Then my class (in another jar) is loaded by the same JVM when my application is submitted.

At this point I got the following error:

Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hive.service.server.DPServerOptionsProcessor cannot access its superclass org.apache.hive.service.server.HiveServer2$ServerOptionsProcessor
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.sql.hive.thriftserver.DPHiveThriftServer2$.main(DPHiveThriftServer2.scala:26)
    at org.apache.spark.sql.hive.thriftserver.DPHiveThriftServer2.main(DPHiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I was under the impression that a package-private class can be accessed by any other class in the same package. I have also double-checked the manifest files in Spark's jars, and none of them declares org.apache.hive.service.server as a sealed package. So why did the JVM classloader give me this error? What condition does the JVM use to trigger the exception?
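For reference, both conditions can be checked at runtime before the failing subclass is ever touched. Here is a minimal diagnostic sketch using only standard java.lang calls; the helper class LoaderDiagnostics and its package are hypothetical, only the Hive class is the one from above:

package org.apache.spark.sql.hive.thriftserver;

import org.apache.hive.service.server.HiveServer2;

public class LoaderDiagnostics {
  public static void main(String[] args) {
    Package hivePkg = HiveServer2.class.getPackage();
    // A sealed package would legitimately block classes coming from another jar.
    System.out.println("sealed?             " + hivePkg.isSealed());
    // If these two loaders differ, the classes were defined by different class loaders.
    System.out.println("HiveServer2 loader: " + HiveServer2.class.getClassLoader());
    System.out.println("this jar's loader:  " + LoaderDiagnostics.class.getClassLoader());
  }
}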


1 Answer

Answer by jrtapsell (accepted):

Because the two classes are loaded by different ClassLoaders, they end up in two different runtime packages even though they share the same package name: the JVM only grants package-private access between classes that have both the same package name and the same defining class loader. The package-private superclass is therefore not accessible to your subclass, which results in the error message.
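To make that concrete, here is a minimal, self-contained sketch that reproduces the same IllegalAccessError outside Spark and Hive. All names are my own, and it assumes Java 9+ for InputStream.readAllBytes: the package-private superclass is defined by the application class loader, while the subclass is defined from the same class-path bytes by a second loader, putting the two into different runtime packages.

import java.io.InputStream;

public class RuntimePackageDemo {

  // Package-private superclass, analogous to HiveServer2$ServerOptionsProcessor.
  static class Parent { }

  // Public subclass, analogous to the subclass submitted in the other jar.
  public static class Child extends Parent { }

  // A loader that defines exactly one class itself (from bytes already on the
  // class path) instead of delegating that class to its parent loader.
  static class IsolatingLoader extends ClassLoader {
    private final String target;

    IsolatingLoader(ClassLoader parent, String target) {
      super(parent);
      this.target = target;
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
      if (name.equals(target)) {
        try (InputStream in = getParent().getResourceAsStream(name.replace('.', '/') + ".class")) {
          byte[] bytes = in.readAllBytes();
          Class<?> c = defineClass(name, bytes, 0, bytes.length);
          if (resolve) {
            resolveClass(c);
          }
          return c;
        } catch (Exception e) {
          throw new ClassNotFoundException(name, e);
        }
      }
      return super.loadClass(name, resolve);
    }
  }

  public static void main(String[] args) throws Exception {
    ClassLoader app = RuntimePackageDemo.class.getClassLoader();
    // Child is defined by the isolating loader, Parent by the application loader:
    // same package name, different defining loaders -> different runtime packages.
    ClassLoader isolating = new IsolatingLoader(app, "RuntimePackageDemo$Child");
    try {
      Class.forName("RuntimePackageDemo$Child", true, isolating);
    } catch (IllegalAccessError expected) {
      // Same failure mode as in the question: a subclass cannot access its
      // package-private superclass from a different runtime package.
      System.out.println(expected);
    }
  }
}

In the Spark scenario the same thing happens implicitly: HiveServer2$ServerOptionsProcessor is defined by the cluster-side class loader, while the classes of the submitted jar are defined by Spark's own child class loader, so package sealing never comes into play.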
