Segmentation fault error while running pyspark in Apache Spark 2.4.7


I am getting a segmentation fault error while running /opt/spark2/bin/pyspark --master yarn --conf spark.ui.port=0 on Kali Linux (Apache Spark 2.4.7).

I verified that Python 3.7 is under /usr/bin and that Spark can access it. When I run /opt/spark2/bin/spark-shell --master yarn --conf spark.ui.port=0, spark-shell starts correctly. With pyspark, however, Python starts but immediately prints "Segmentation fault" and exits instead of dropping into the shell. Please help me resolve this issue.
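For reference, these are the commands involved, followed by a minimal diagnostic sketch. The PYSPARK_PYTHON/PYSPARK_DRIVER_PYTHON exports at the end are an assumption on my part (explicitly pinning the interpreter to rule out a mismatched default), not a confirmed fix:

    # Scala shell starts fine
    /opt/spark2/bin/spark-shell --master yarn --conf spark.ui.port=0

    # Python shell prints "Segmentation fault" immediately after starting
    /opt/spark2/bin/pyspark --master yarn --conf spark.ui.port=0

    # Confirm which interpreter is being picked up
    which python3.7        # expected: /usr/bin/python3.7
    python3.7 --version

    # Diagnostic sketch (assumption): pin the interpreter explicitly before launching,
    # since Spark 2.4.x is not compatible with Python 3.8+, and pinning 3.7 rules out
    # a different default interpreter being used by the driver or workers
    export PYSPARK_PYTHON=/usr/bin/python3.7
    export PYSPARK_DRIVER_PYTHON=/usr/bin/python3.7
    /opt/spark2/bin/pyspark --master yarn --conf spark.ui.port=0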

There are 0 answers