I am using Spark 2.3.2.3.1.0.0-78. I tried to use `spark_session.sparkContext._conf.get('spark.executor.memory')`, but I only received `None`. How can I get the value of `spark.executor.memory`?
If you received `None`, it means that you're using the default value of 1g (see the docs). You will only be able to retrieve a non-null value if you explicitly give `spark.executor.memory` a value in your `spark-submit` or `pyspark` command. So you can still programmatically conclude that if the output of your `._conf.get()` is `None`, your executor has 1g of memory.
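In code, that reasoning might look like the sketch below. The hard-coded `'1g'` fallback assumes nothing overrides the default elsewhere (values set in `spark-defaults.conf` would already show up in the conf):

```python
# Fall back to the documented default of 1g when spark.executor.memory
# was never set explicitly and _conf.get() therefore returns None.
executor_memory = spark_session.sparkContext._conf.get('spark.executor.memory') or '1g'
```

Demonstration: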
Starting up a `pyspark` shell without any special configuration in your command line:
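A sketch of the session, assuming a Spark 2.x `pyspark` shell where the `spark` session object is predefined:

```shell
$ pyspark
```

And then executing the command to get the config's value gives you an empty value:

```python
>>> spark.sparkContext._conf.get('spark.executor.memory')
>>> # nothing was printed: the call returned None
```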
Starting up a `pyspark` shell with a different value for `spark.executor.memory`:
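For example (the 2g value here is purely illustrative):

```shell
$ pyspark --conf spark.executor.memory=2g
```

And then executing that command does return a value:

```python
>>> spark.sparkContext._conf.get('spark.executor.memory')
'2g'
```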