Spark session isolation in Databricks


I am using Databricks and running multiple notebooks on the same cluster. Each notebook represents a task that requires its own specific Spark settings.

Now when I change the Spark settings in one notebook, the change seems to be applied at the cluster level, and the same change shows up in the Spark session of the other notebooks.
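
For example, in one notebook I change a setting roughly like this (the property name here is just an illustration, not the specific setting I need):

    # Databricks notebook (Python): change a setting for what I expect
    # to be this notebook's own Spark session
    spark.conf.set("spark.sql.shuffle.partitions", "64")

    # Reading the same key from another notebook attached to the same cluster
    # returns the new value, so the change is not isolated per notebook
    print(spark.conf.get("spark.sql.shuffle.partitions"))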

But this is not what I want: I want to be able to provide different Spark settings for each notebook on the same cluster.

I have tried setting spark.databricks.session.share to false, but the same issue persists.
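
For reference, this is roughly how I set it, as a key/value pair in the cluster's Spark config (Advanced options > Spark); I am not sure whether the placement matters:

    spark.databricks.session.share false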

Is this possible in Databricks?

