Preload multiple Spark versions in instance pools


I am updating a Terraform module for instance pools, and I wonder whether it's possible to preload multiple Spark versions into a pool, so as to accommodate users who might want to create clusters that run on different Spark versions.

The Terraform registry documentation for the Databricks provider's `databricks_instance_pool` resource has an argument reference precisely on this, but as far as I can tell the pool may only preload a single runtime version, even though the argument is called `preloaded_spark_versions` (plural), which led me to confusion.
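For context, this is roughly the configuration in question, a minimal sketch; the pool name, node type, and DBR version string are illustrative, not taken from the original module:

```hcl
resource "databricks_instance_pool" "shared" {
  instance_pool_name = "shared-pool"   # hypothetical name
  min_idle_instances = 0
  max_capacity       = 10
  node_type_id       = "i3.xlarge"     # assumed node type

  idle_instance_autotermination_minutes = 15

  # Despite the plural attribute name, only a single entry is accepted here.
  preloaded_spark_versions = ["13.3.x-scala2.12"]  # example version string
}
```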

Thank you!

1 answer

Alex Ott (best answer):

This is the documented behavior of the API (also noted in the Terraform provider docs):

A list containing at most one preloaded Spark image version for the pool.

Most probably (I don't know exactly), the idea was to eventually support multiple DBR versions, but that was never implemented. The attribute names in Terraform, however, need to stay the same as in the API, hence the plural name.
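If you need pools for several Spark versions today, one common pattern is to create one pool per version. A minimal sketch using `for_each`; the version strings, pool sizing, and node type are assumptions for illustration:

```hcl
variable "spark_versions" {
  type    = list(string)
  default = ["12.2.x-scala2.12", "13.3.x-scala2.12"]  # example versions
}

# One instance pool per DBR version, each preloading exactly one image.
resource "databricks_instance_pool" "per_version" {
  for_each = toset(var.spark_versions)

  instance_pool_name = "pool-${each.key}"
  min_idle_instances = 0
  max_capacity       = 10
  node_type_id       = "i3.xlarge"  # assumed node type

  preloaded_spark_versions = [each.key]
}
```

Clusters can then reference the pool that matches their target Spark version via the corresponding pool's ID.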