Configuring spark-ec2


I noticed that when I start my Spark EC2 cluster from my local machine with spark/ec2/spark-ec2 start mycluster, the setup routine has a nasty habit of destroying everything I put in my cluster's spark/conf/ directory. Short of running a put-my-configs-back.sh script every time I start the cluster, is there a "correct" way to set up persistent configurations that survive a stop/start? Or just a better way?

I'm working off of Spark master locally and Spark 1.2 in my cluster.
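For reference, the "put my configs back" workaround I'm using looks roughly like the sketch below. The cluster name, key pair, identity file, and local my-spark-conf/ backup directory are all placeholders for my setup; it relies on spark-ec2's get-master action and the copy-dir script the EC2 images ship in /root/spark-ec2/.

```shell
#!/usr/bin/env bash
# restore-spark-conf.sh -- re-push saved configs after `spark-ec2 start`.
# CLUSTER, KEY_PAIR, IDENTITY, and my-spark-conf/ are placeholders.
set -euo pipefail

CLUSTER=mycluster
KEY_PAIR=my-keypair
IDENTITY=~/.ssh/my-keypair.pem

# get-master prints the master's public hostname on its last output line.
MASTER=$(spark/ec2/spark-ec2 -k "$KEY_PAIR" -i "$IDENTITY" \
         get-master "$CLUSTER" | tail -n 1)

# Overwrite the freshly regenerated conf on the master with the saved copy...
rsync -av -e "ssh -i $IDENTITY -o StrictHostKeyChecking=no" \
      my-spark-conf/ "root@$MASTER:/root/spark/conf/"

# ...then fan it out to the slaves with the cluster's own sync script.
ssh -i "$IDENTITY" "root@$MASTER" /root/spark-ec2/copy-dir /root/spark/conf
```

It works, but having to remember to run it after every start is exactly what I'd like to avoid.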


There are 0 answers