Attempting to test against a HiveContext when dependencies are Provided throws java.lang.SecurityException


When running unit tests that create a Spark context I get a java.lang.SecurityException. I understand the cause: multiple dependencies provide classes in the javax.servlet package but carry different signer information. What I can't work out is how to track down which dependencies conflict and how to resolve it.

// Dependencies
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-hive_2.10" % "1.6.2" % Provided
)

// Test dependencies
libraryDependencies ++= Seq(
  "junit" % "junit" % "4.10" % Test,
  "org.scalatest" %% "scalatest" % "3.0.4" % Test,
  "org.apache.hadoop" % "hadoop-minicluster" % "2.5.0" % Test
)

I've created a sample project to demonstrate this.
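
For context, here is a minimal sketch of the kind of test that fails for me (the class and test names are illustrative, not taken from the sample project):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.scalatest.FunSuite

// Illustrative only: a minimal test that exercises HiveContext creation.
class HiveContextSpec extends FunSuite {

  test("creating a HiveContext should not throw") {
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("hive-context-test")
    val sc = new SparkContext(conf)
    try {
      // The SecurityException is thrown during context initialisation,
      // when conflicting javax.servlet classes are loaded.
      val hiveContext = new HiveContext(sc)
      assert(hiveContext != null)
    } finally {
      sc.stop()
    }
  }
}
```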

There are many examples of this same problem that suggest exclusion rules for org.mortbay.jetty and javax.servlet, though none seem to work for me. A typical variant is sketched below.
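
This is the shape of exclusion those answers usually suggest (a sketch only; the exact organisations and module names vary between answers):

```scala
// One commonly suggested exclusion; it did not fix the test failure here.
libraryDependencies ++= Seq(
  ("org.apache.spark" % "spark-hive_2.10" % "1.6.2" % Provided)
    .exclude("org.mortbay.jetty", "servlet-api")
    .exclude("javax.servlet", "servlet-api")
)
```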

When I use spark-submit on the built sbt assembly jar it works fine; I just can't write tests for it.

1 Answer

Answered by Brett Ryan:

OK, I figured out the issue. It turns out that the dependency tree causing the problem is in my test dependencies, not the runtime dependencies.

Updating to the following has resolved the issue.

// Test dependencies
libraryDependencies ++= Seq(
  "junit" % "junit" % "4.10" % Test,
  "org.scalatest" %% "scalatest" % "3.0.4" % Test,
  "org.apache.hadoop" % "hadoop-minicluster" % "2.5.0" % Test
).map(
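  // strip all javax.servlet artifacts that the test dependencies
  // (e.g. hadoop-minicluster) pull in transitively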
  _.excludeAll(ExclusionRule(organization = "javax.servlet"))
)
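
If you want to confirm where the conflicting javax.servlet classes come from, you can inspect the test-scope dependency tree. A sketch, assuming the sbt-dependency-graph plugin (newer sbt releases ship a built-in dependencyTree task):

```scala
// project/plugins.sbt -- pick a plugin version matching your sbt release
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
```

Then run `sbt test:dependencyTree` and look for javax.servlet artifacts appearing under more than one parent.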

I've updated the sample on GitHub.