I am trying to execute the spark-shell command below in a Linux terminal from Java code.
echo spark.sparkContext.parallelize\(1 to 3,3\).map\(x => \(x,\"city_\"+x\)\).toDF\(\"num\",\"city\"\).write.partitionBy\(\"num\"\).mode\(SaveMode.Overwrite\).parquet\(\"/tmp/abinash\"\) | /opt/ab/cd/de/spark-shell
But getting "No such file or directory" error for /tmp/abinash even if file exist
I have tried many ways to solve this but have not had any success. I assume there is an issue with the escape characters.
Can anyone help me with what I am doing wrong here?
Try this.
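The problem is almost certainly the shell, not Spark: in the unquoted echo, the > in x => is taken as output redirection, so bash tries to create a file named after the rest of the snippet (which is why the error message mentions /tmp/abinash and reports "No such file or directory") and spark-shell never receives the statement. Wrapping the whole Scala expression in single quotes avoids the escaping entirely. A minimal sketch, using the same paths as in the question and .mode("overwrite") in place of SaveMode.Overwrite so the piped snippet needs no extra import:

echo 'spark.sparkContext.parallelize(1 to 3,3).map(x => (x,"city_"+x)).toDF("num","city").write.partitionBy("num").mode("overwrite").parquet("/tmp/abinash")' | /opt/ab/cd/de/spark-shell

Since you are launching this from Java, the same string can be handed to a shell. The question does not say how the command is invoked on the Java side, so the sketch below assumes ProcessBuilder with bash -c; the class name SparkShellRunner is just a placeholder. Inside the Java string literal only the double quotes need escaping, while the single quotes, parentheses and > are passed through to bash untouched.

import java.io.IOException;

public class SparkShellRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Single quotes stop bash from interpreting (, ), > and " inside the
        // Scala snippet; only the pipe after the closing quote stays special.
        String command =
            "echo 'spark.sparkContext.parallelize(1 to 3,3)"
          + ".map(x => (x,\"city_\"+x)).toDF(\"num\",\"city\")"
          + ".write.partitionBy(\"num\").mode(\"overwrite\")"
          + ".parquet(\"/tmp/abinash\")' | /opt/ab/cd/de/spark-shell";
        Process process = new ProcessBuilder("bash", "-c", command)
            .inheritIO()   // show spark-shell output on this process's console
            .start();
        int exitCode = process.waitFor();
        System.out.println("spark-shell exited with code " + exitCode);
    }
}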