How to pass configuration parameters from a file as environment variables to a Spark job?


I'm running a Spark application that uses configuration parameters from a file.

File: Spark.conf

username=ankush
password=ankush
host=https://
port=22
outputDirectory=/home/ankush/data/

How can I use this file at runtime? Instead of restarting the job whenever the configuration file changes, how can I make the job pick up the file dynamically at runtime?

I tried passing it with spark-submit using --conf spark.yarn.appMasterEnv, but that expects individual variables, not a file.
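
The attempt presumably looked something like the sketch below (the environment variable names and my-spark-job.jar are illustrative). spark.yarn.appMasterEnv.[EnvironmentVariableName] sets one environment variable per --conf flag, which is why it wants individual variables rather than a whole file:

spark-submit \
  --master yarn \
  --conf spark.yarn.appMasterEnv.USERNAME=ankush \
  --conf spark.yarn.appMasterEnv.OUTPUT_DIR=/home/ankush/data/ \
  my-spark-job.jar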

Is there any way to implement this?

Thanks in advance for your help.


1 Answer

Answer from jdprasad:

You can keep the variables in the conf/spark-defaults.conf file.
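
A minimal sketch of that approach, assuming the custom keys are namespaced under an illustrative spark.myapp.* prefix (spark-submit only forwards properties whose names start with spark.) and a hypothetical my-spark-job.jar; paths are illustrative as well.

conf/spark-defaults.conf (or any file passed via --properties-file):

spark.myapp.username         ankush
spark.myapp.password         ankush
spark.myapp.host             https://
spark.myapp.port             22
spark.myapp.outputDirectory  /home/ankush/data/

Submitting with an external properties file instead of conf/spark-defaults.conf:

spark-submit --properties-file /home/ankush/Spark.conf my-spark-job.jar

Reading the values inside the job (Scala):

import org.apache.spark.sql.SparkSession

object MyJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("MyJob").getOrCreate()

    // These values are resolved from spark-defaults.conf or the file
    // given with --properties-file when the application is submitted.
    val username  = spark.conf.get("spark.myapp.username")
    val outputDir = spark.conf.get("spark.myapp.outputDirectory")

    // ... rest of the job uses username / outputDir ...

    spark.stop()
  }
}

Note that the properties file is read at submit time, not re-read while the job is running.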

Ref: https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties