Task failure in DataprocCreateClusterOperator when I add metadata


This is my cluster config, where I have added some connector metadata:

cluster_config = {
    "project_id": project_id,
    "cluster_name": cluster_name,
    "region": region,
    "cluster_config": {
        "master_config": {
            "num_instances": 1,
            "machine_type_uri": "n1-standard-4",
        },
        "worker_config": {
            "num_instances": 0,
            "machine_type_uri": "n1-standard-4",
        },
    },
    "metadata": {
        "gcs-connector-version": "2.1.1",
        "bigquery-connector-version": "1.1.1",
        "spark-bigquery-connector-version": "0.17.2",
    },
}

create_task = DataprocCreateClusterOperator(
    task_id="create_dataproc_cluster",
    dag=dag,
    **cluster_config,
)

So when I run this in Airflow (Cloud Composer), I get a very non-specific error saying that the above task failed:

Traceback (most recent call last):
  grpc._cython.cygrpc._store_c_metadata
ValueError: too many values to unpack (expected 2)
[2024-03-30 07:48:37.855695+00:00] {taskinstance.py:1346} INFO - Marking task as FAILED.
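From the traceback, the failure seems to happen inside gRPC's metadata handling, which (as far as I can tell) expects an iterable of (key, value) pairs rather than a dict. A minimal, standalone reproduction of the same ValueError, based on that assumption:

# Minimal reproduction of the ValueError from the traceback.
# Assumption: gRPC call metadata must be an iterable of (key, value) pairs.
call_metadata = {"gcs-connector-version": "2.1.1"}  # a dict, like in my config

try:
    # Iterating a dict yields its keys (plain strings); unpacking a long
    # string into two names raises the same error as in the traceback.
    for key, value in call_metadata:
        pass
except ValueError as exc:
    print(exc)  # too many values to unpack (expected 2)

# An iterable of (key, value) pairs unpacks cleanly:
for key, value in [("some-header", "some-value")]:
    pass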

When I don't pass metadata it runs fine, but adding the metadata causes the task to fail.
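My current guess is that, because I unpack **cluster_config into the operator, the top-level "metadata" key is consumed as the operator's own metadata argument (which is passed to the underlying API call) rather than as cluster/instance metadata. If that is right, the connector versions would need to live inside cluster_config under gce_cluster_config. An untested sketch of what I think is intended (gce_cluster_config.metadata is my assumption here):

from airflow.providers.google.cloud.operators.dataproc import DataprocCreateClusterOperator

cluster_config = {
    "project_id": project_id,
    "cluster_name": cluster_name,
    "region": region,
    "cluster_config": {
        "master_config": {
            "num_instances": 1,
            "machine_type_uri": "n1-standard-4",
        },
        "worker_config": {
            "num_instances": 0,
            "machine_type_uri": "n1-standard-4",
        },
        # Instance metadata goes inside the cluster's GCE config,
        # not as a top-level operator argument (my assumption).
        "gce_cluster_config": {
            "metadata": {
                "gcs-connector-version": "2.1.1",
                "bigquery-connector-version": "1.1.1",
                "spark-bigquery-connector-version": "0.17.2",
            },
        },
    },
}

create_task = DataprocCreateClusterOperator(
    task_id="create_dataproc_cluster",
    dag=dag,
    **cluster_config,
)

Is this the right place for the connector metadata, or is something else going on?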

