Apache Airflow spark-submit


I have Airflow and Spark running on different hosts. I am trying to submit a Spark job from Airflow, but I get the following error:

{standard_task_runner.py:107} ERROR - Failed to execute job 223 for task spark_job (Cannot execute: spark-submit --master spark://spark-host --proxy-user hdfs --name arrow-spark --queue default --deploy-mode client /opt/airflow/dags/plugins/plug_code.py. Error code is: 1.; 27605)

The spark-submit command:

spark-submit --master spark://spark-host --name arrow-spark --queue default --deploy-mode client /opt/airflow/dags/plugins/plug_code.py
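
For context, a task that produces a command like the one above would typically be defined with the SparkSubmitOperator from the apache-spark provider. The sketch below is an assumption of what such a task might look like (the DAG id, schedule, and connection id are illustrative, not my actual code); only the application path, task name, and job name mirror what appears in the generated command:

# Sketch only -- DAG id, schedule, and conn_id are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    spark_job = SparkSubmitOperator(
        task_id="spark_job",
        conn_id="spark_default",  # Spark connection whose host points at spark://spark-host
        application="/opt/airflow/dags/plugins/plug_code.py",
        name="arrow-spark",
        proxy_user="hdfs",        # corresponds to the --proxy-user hdfs flag in the log
        verbose=True,
    )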

I also tried passing a proxy user, but that made no difference.
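
For reference, the --master, --queue and --deploy-mode flags in the generated command usually come from the Airflow Spark connection rather than the operator itself, so the connection would look roughly like this (connection id and extras are assumptions inferred from the command above):

airflow connections add spark_default \
    --conn-type spark \
    --conn-host 'spark://spark-host' \
    --conn-extra '{"queue": "default", "deploy-mode": "client"}'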

