How do I use the --conf option in Airflow?


I am trying to run an Airflow DAG and need to pass some parameters to the tasks.

How do I read, in the Python DAG file, the JSON string passed as the --conf parameter of the trigger_dag CLI command?

ex: airflow trigger_dag 'dag_name' -r 'run_id' --conf '{"key":"value"}'


There are 3 answers

Daniel Huang

Two ways. From inside a template field or file:

{{ dag_run.conf['key'] }}

Or, when the context is available, e.g. within the Python callable of a PythonOperator:

context['dag_run'].conf['key']
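A minimal sketch combining both approaches (assuming Airflow 1.x, where provide_context=True is still required; the dag_id, task_ids, and the conf key 'key' are placeholders):

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from airflow.utils.dates import days_ago

dag = DAG(dag_id="conf_example", start_date=days_ago(1), schedule_interval=None)

# Way 1: templated field -- dag_run is available in the Jinja context
echo_conf = BashOperator(
    task_id="echo_conf",
    bash_command="echo {{ dag_run.conf['key'] }}",
    dag=dag,
)

# Way 2: read conf from the context inside a Python callable
def read_conf(**context):
    value = context["dag_run"].conf["key"]
    print("conf value:", value)

print_conf = PythonOperator(
    task_id="print_conf",
    python_callable=read_conf,
    provide_context=True,  # Airflow 1.x; in 2.x the context is passed automatically
    dag=dag,
)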
Muhammad Ashir Ali

You can use the params argument in the DAG initialization to pass default values to DAG tasks, as sketched below.
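A minimal sketch of that approach, assuming Airflow 1.x-style imports; the dag_id and the default value for 'key' are placeholders. Values in params are available in templated fields as {{ params.key }}:

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

dag = DAG(
    dag_id="params_example",
    default_args={"start_date": days_ago(2), "owner": "airflow"},
    schedule_interval=None,
    params={"key": "default_value"},  # static defaults declared in the DAG file
)

print_param = BashOperator(
    task_id="print_param",
    bash_command='echo "params value: {{ params.key }}"',
    dag=dag,
)

Note that params holds static defaults declared in the DAG file, while dag_run.conf carries the values supplied at trigger time; a template can fall back from one to the other, e.g. {{ (dag_run.conf or {}).get('key', params.key) }}.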

zingsy

In the example provided at https://github.com/apache/airflow/blob/master/airflow/example_dags/example_trigger_target_dag.py#L62, when parsing the 'conf' passed in an Airflow REST API call, use provide_context=True in the PythonOperator.

Also, the key-value pairs passed as JSON in the REST API call can be accessed in a BashOperator or SparkSubmitOperator as '\'{{ dag_run.conf["key"] if dag_run else "" }}\''

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.bash_operator import BashOperator
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator
from airflow.utils.dates import days_ago

dag = DAG(
    dag_id="example_dag",
    default_args={"start_date": days_ago(2), "owner": "airflow"},
    schedule_interval=None,
)

def run_this_func(**context):
    """
    Print the payload passed to the DagRun conf attribute.
    :param context: The execution context
    :type context: dict
    """
    print("context", context)
    print("Remotely received value of {} for conf key 'key'".format(context["dag_run"].conf["key"]))

# PythonOperator usage: provide_context=True passes dag_run (and the rest of
# the context) into the callable as keyword arguments
run_this = PythonOperator(
    task_id="run_this",
    python_callable=run_this_func,
    provide_context=True,
    dag=dag,
)

# BashOperator usage
bash_task = BashOperator(
    task_id="bash_task",
    bash_command='echo "Here is the message: \'{{ dag_run.conf["key"] if dag_run else "" }}\'"',
    dag=dag
)

# SparkSubmitOperator usage: the templated conf value is forwarded to the
# Spark application as a command-line argument
spark_task = SparkSubmitOperator(
    task_id="task_id",
    conn_id=spark_conn_id,  # placeholder: set to your Spark connection ID
    name="task_name",
    application="example.py",
    application_args=[
        '--key', '\'{{ dag_run.conf["key"] if dag_run else "" }}\''
    ],
    num_executors=10,
    executor_cores=5,
    executor_memory='30G',
    # driver_memory='2G',
    conf={'spark.yarn.maxAppAttempts': 1},
    dag=dag)
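With the above DAG deployed, the conf from the original question can be passed on the command line, e.g.:

airflow trigger_dag 'example_dag' --conf '{"key":"value"}'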