Job Sensors in Databricks Workflows


At the moment we schedule our Databricks notebooks using Airflow. Due to dependencies between projects, there are dependencies between DAGs: some DAGs wait (using sensors) until a task in a previous DAG has finished before starting. We are now looking into Databricks DBX. It is still new for us, but it seems that DBX's main added value is when you use Databricks workflows. It would be possible to run a Python wheel in a job that was created by DBX. My question is: is it possible to add dependencies between Databricks jobs? Can we create two different jobs using DBX and make the second job wait until the first one has completed?
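For context, this is roughly the Airflow pattern we use today: a sensor in the downstream DAG waits for a task in an upstream DAG. The DAG ids, task ids and schedule below are placeholders, not our real project names.

```python
from datetime import datetime

from airflow import DAG
from airflow.sensors.external_task import ExternalTaskSensor

with DAG(
    dag_id="project_b_dag",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Wait until a task in the upstream project's DAG has finished.
    wait_for_project_a = ExternalTaskSensor(
        task_id="wait_for_project_a",
        external_dag_id="project_a_dag",
        external_task_id="load_to_delta",  # hypothetical upstream task
        mode="reschedule",                 # free the worker slot while waiting
    )
    # ... the Databricks tasks of this project would follow and depend on the sensor.
```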

I am aware that I can have dependencies between tasks in one job, but in our case it is not possible to have only one job with all the tasks.

I was thinking about adding a notebook/Python script before the wheel with the ETL logic. This notebook would then check whether the previous job has finished; once it has, the task with the wheel would be executed. Does this make sense, or are there better ways? Is something like Airflow's ExternalTaskSensor available within Databricks workflows? Or is there a good way to use DBX without Databricks workflows?
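To make the "check task" idea concrete, here is a rough sketch of what such a script could do: poll the Databricks Jobs API (2.1) until the latest run of the upstream job has terminated. The host, token and job id are placeholders, and error handling is kept minimal.

```python
import time

import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
UPSTREAM_JOB_ID = 123  # hypothetical id of the job created by the first deployment


def latest_run_state(job_id: int) -> dict:
    """Return the state of the most recent run of the given job (Jobs API 2.1)."""
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"job_id": job_id, "limit": 1},
    )
    resp.raise_for_status()
    runs = resp.json().get("runs", [])
    return runs[0]["state"] if runs else {}


# Block until the upstream job's latest run has terminated successfully.
while True:
    state = latest_run_state(UPSTREAM_JOB_ID)
    if state.get("life_cycle_state") == "TERMINATED":
        if state.get("result_state") != "SUCCESS":
            raise RuntimeError(f"Upstream job did not succeed: {state}")
        break
    time.sleep(60)  # poll once a minute
```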

1 Answer

renardeinside:

author of dbx here.

TL;DR - dbx is not opinionated in terms of the orchestrator choice.

"It is still new for us, but it seems that DBX's main added value is when you use Databricks workflows. It would be possible to run a Python wheel in a job that was created by DBX."

The short answer is yes, but it's done at the task level (read more here on the difference between a workflow and a task).
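As a minimal sketch of what task-level dependencies look like, here is a multi-task workflow expressed as the Jobs API 2.1 payload that a dbx deployment file maps onto. Task keys, paths, package name and cluster id are hypothetical placeholders; with dbx you would describe the same structure in the deployment file and let dbx deploy create the job.

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

workflow = {
    "name": "etl-workflow",
    "tasks": [
        {
            "task_key": "prepare_data",
            "notebook_task": {"notebook_path": "/Repos/etl/prepare_data"},
            "existing_cluster_id": "<cluster-id>",
        },
        {
            "task_key": "run_wheel",
            # This task only starts once prepare_data has completed successfully.
            "depends_on": [{"task_key": "prepare_data"}],
            "python_wheel_task": {
                "package_name": "my_etl_package",
                "entry_point": "main",
            },
            "existing_cluster_id": "<cluster-id>",
        },
    ],
}

# Create the workflow directly via the API (dbx deploy does the equivalent
# from a deployment file).
resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=workflow,
)
resp.raise_for_status()
print(resp.json())  # {"job_id": ...}
```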

Another approach, if you still need (or want) to use Airflow, is the following:

  1. Deploy and update your jobs from your CI/CD pipeline with dbx deploy commands.
  2. In Airflow, use the Databricks operator (e.g. DatabricksRunNowOperator) to launch the job (either by name or by id); see the sketch after this list.
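A hedged sketch of step 2, assuming the apache-airflow-providers-databricks package is installed; the connection id, DAG id and job id below are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="trigger_dbx_job",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Trigger an existing Databricks job that was created/updated by `dbx deploy`.
    run_job = DatabricksRunNowOperator(
        task_id="run_dbx_deployed_job",
        databricks_conn_id="databricks_default",
        job_id=123,  # id of the job deployed by dbx
    )
```

With this setup, the cross-project dependencies can stay on the Airflow side (e.g. via sensors such as ExternalTaskSensor), while dbx is only responsible for deploying the jobs.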