Getting file not found error because of escape character


I am trying to execute the spark-shell command below from a Linux terminal through Java code.

echo spark.sparkContext.parallelize\(1 to 3,3\).map\(x => \
(x,\"city_\"+x\)\).toDF\(\"num",\"city\"\).write.partitionBy\(\"num\"\).mode\
(SaveMode.Overwrite\).parquet\(\"/tmp/abinash\"\) | /opt/ab/cd/de/spark-shell

But I get a "No such file or directory" error for /tmp/abinash even though the path exists.

I have tried many ways to solve this, but without success. I assume there is an issue with the escape characters.

Can anyone tell me what I am doing wrong here?


There is 1 answer

stack0114106 (accepted answer):

Try this: instead of backslash-escaping every parenthesis, wrap the whole Scala snippet in double quotes and escape only the embedded double quotes with \".

> echo "spark.sparkContext.parallelize(1 to 3,3).map(x => (x,\"city_\"+x)).toDF(\"num\",\"city\").write.partitionBy(\"num\").mode(SaveMode.Overwrite).parquet(\"/tmp/abinash\")"
spark.sparkContext.parallelize(1 to 3,3).map(x => (x,"city_"+x)).toDF("num","city").write.partitionBy("num").mode(SaveMode.Overwrite).parquet("/tmp/abinash")
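A minimal sketch of the quoting pattern, assuming the spark-shell path from the question (`/opt/ab/cd/de/spark-shell`) is valid on your machine. Holding the snippet in a shell variable makes the quoting easy to verify with `echo` before piping it into spark-shell:

```shell
#!/bin/sh
# Build the Scala snippet: outer double quotes, with only the
# embedded double quotes escaped as \". No parentheses need escaping
# inside a double-quoted string.
scala_cmd="spark.sparkContext.parallelize(1 to 3,3).map(x => (x,\"city_\"+x)).toDF(\"num\",\"city\").write.partitionBy(\"num\").mode(SaveMode.Overwrite).parquet(\"/tmp/abinash\")"

# Inspect what spark-shell will actually receive.
echo "$scala_cmd"

# Pipe it into spark-shell (path taken from the question; uncomment to run):
# echo "$scala_cmd" | /opt/ab/cd/de/spark-shell
```

When this command string is built in Java and handed to a shell, the same rule applies: escape only the inner double quotes, and let the outer quoting protect the parentheses.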