Unable to find the internal logging class: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class


I am trying to build a Spark cluster on the DNAnexus platform.

I tried creating a Spark context from a JupyterLab notebook:

import pyspark
sc = pyspark.SparkContext()
spark = pyspark.sql.SparkSession(sc)

I get the following error and stack trace:

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
    at org.apache.spark.scheduler.DAGScheduler.<init>(DAGScheduler.scala:125)
    at org.apache.spark.scheduler.DAGScheduler.<init>(DAGScheduler.scala:128)
    at org.apache.spark.scheduler.DAGScheduler.<init>(DAGScheduler.scala:137)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:536)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

I checked the jar files, but I could not figure out which jar is supposed to provide the internal logging class. Can someone point out whether my Spark installation is faulty, or which jar file is required for the missing class?
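For reference, this is roughly how I scanned the jars for the class entry named in the error. It is a minimal sketch and assumes the jars live under $SPARK_HOME/jars (that path and the /usr/lib/spark fallback are assumptions about my installation):

import os
import zipfile

# Class entry named in the NoClassDefFoundError above
target = "org/apache/spark/internal/Logging$class.class"

# Assumed jar location; adjust if Spark is installed elsewhere
jars_dir = os.path.join(os.environ.get("SPARK_HOME", "/usr/lib/spark"), "jars")

for name in sorted(os.listdir(jars_dir)):
    if name.endswith(".jar"):
        with zipfile.ZipFile(os.path.join(jars_dir, name)) as jar:
            if target in jar.namelist():
                print(name, "contains", target)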

