BigQueryInsertJobOperator dryRun is returning success instead of failure on composer (airflow)


When using BigQueryInsertJobOperator with `"dryRun": True` in the configuration against a faulty .sql file or a hardcoded query, the task succeeds even though it should fail. The same error is surfaced correctly (the task fails) when running with `dryRun` set to false. Below is the code used for testing in Composer (Airflow):

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    'depends_on_past': False,
}


dag = DAG(dag_id='bq_script_tester',
          default_args=default_args,
          schedule_interval='@once',
          start_date=datetime(2021, 1, 1),
          tags=['bq_script_caller']
          )

with dag:
    job_id = BigQueryInsertJobOperator(
        task_id="bq_validator",
        configuration={
            "query": {
                "query": "INSERT INTO `bigquery-public-data.stackoverflow.stackoverflow_posts` values('this is cool');",
                "useLegacySql": False,
            },
            "dryRun": True,
        },
        location="US",
    )

How can a BigQuery query be validated using the dryRun option in Composer? Is there an alternative approach in Composer that achieves the same functionality? The alternative should be an operator that accepts SQL scripts containing both procedures and plain SQL, with support for templating.
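One workaround, assuming the problem is that the operator does not inspect the dry-run job's status before reporting success, is to run the dry run yourself (e.g. via `BigQueryHook.insert_job` or the `google-cloud-bigquery` client inside a `PythonOperator`) and fail the task explicitly when the returned job resource carries an error. Below is a minimal sketch of just the status check; the `check_dry_run_status` helper and the shape of the sample dicts are illustrative of the BigQuery jobs REST resource, not a confirmed fix for the operator itself:

```python
def check_dry_run_status(job_resource: dict) -> int:
    """Inspect a BigQuery job resource (as returned by the jobs REST API)
    and raise if the dry run reported an error.

    Raising here makes the wrapping Airflow task fail as expected.
    Returns the estimated bytes processed on success.
    """
    status = job_resource.get("status", {})
    if status.get("errorResult"):
        raise RuntimeError(
            f"Dry run failed: {status['errorResult'].get('message')}"
        )
    stats = job_resource.get("statistics", {}).get("query", {})
    return int(stats.get("totalBytesProcessed", 0))
```

A `PythonOperator` callable could then fetch the job resource from the hook, pass it through this check, and let the raised exception mark the task as failed.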

Airflow version: 2.1.4
Composer version: 1.17.7

There are 0 answers