I'm trying to update a cluster's custom_tags using the Databricks clusters/edit API found here. The error message I receive does not match the required parameters shown in the docs (cluster_id and spark_version are the only required params).
Request:
curl --location --request POST 'https://databricks.com/api/2.0/clusters/edit' \
--header 'Authorization: Bearer token' \
--header 'Content-Type: application/json' \
--data-raw '{
  "cluster_id": 1234567,
  "cluster_version": "10.4.x-scala2.12",
  "custom_tags": {
    "TEST": "TEST"
  }
}'
Response (status 400):
{
  "error_code": "INVALID_PARAMETER_VALUE",
  "message": "Exactly 1 of virtual_cluster_size, num_workers or autoscale must be specified.",
  "details": [
    {
      "@type": "type.googleapis.com/google.rpc.ErrorInfo",
      "reason": "CM_API_ERROR_SOURCE_CALLER_ERROR",
      "domain": ""
    }
  ]
}
EDIT 1: Got it working though quite cumbersome.
Thank you @JayashankarGS.
- autoscale min_workers and max_workers are required, though not clearly documented.
- node_type_id is required, though undocumented.
- Be sure to parse and add any existing custom_tags to the /edit request; otherwise they will be lost, as the custom_tags field on /edit overwrites the existing values (a sketch of this merge follows below).
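A minimal sketch of that get-merge-edit flow, assuming jq is installed and that DATABRICKS_HOST, DATABRICKS_TOKEN, and CLUSTER_ID are set in the environment (these variable names are my own, not from the docs). If your cluster uses autoscale, carry over .autoscale instead of .num_workers:

# Fetch the current cluster spec so the existing custom_tags are not lost.
current=$(curl -s "https://${DATABRICKS_HOST}/api/2.0/clusters/get?cluster_id=${CLUSTER_ID}" \
  --header "Authorization: Bearer ${DATABRICKS_TOKEN}")

# Build the /edit payload: carry over the fields the API insists on and
# merge the existing tags with the new TEST tag.
payload=$(echo "${current}" | jq '{
  cluster_id: .cluster_id,
  spark_version: .spark_version,
  node_type_id: .node_type_id,
  num_workers: .num_workers,
  custom_tags: ((.custom_tags // {}) + {"TEST": "TEST"})
}')

curl -s --location --request POST "https://${DATABRICKS_HOST}/api/2.0/clusters/edit" \
  --header "Authorization: Bearer ${DATABRICKS_TOKEN}" \
  --header "Content-Type: application/json" \
  --data "${payload}"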
You need to provide the required fields in the JSON. In the documentation the listed required fields are cluster_id, spark_version, and either autoscale (min_workers, max_workers) or num_workers. Even after adding these fields, it asks for further required fields such as node_type_id. Try the code below.
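The answer's original code block is not reproduced here; what follows is a minimal sketch of such an /edit request, assuming placeholder values for the workspace URL, token, cluster ID, and node type (Standard_DS3_v2 is illustrative, not from the original post):

# Supplying cluster_id, spark_version, node_type_id, and num_workers (or autoscale)
# together is what resolved the successive "required field" errors in practice.
curl --location --request POST 'https://<workspace-url>/api/2.0/clusters/edit' \
--header 'Authorization: Bearer <token>' \
--header 'Content-Type: application/json' \
--data-raw '{
  "cluster_id": "1234-567890-abcdefgh",
  "spark_version": "10.4.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 1,
  "custom_tags": {
    "TEST": "TEST"
  }
}'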