Cannot create secret scope: Azure KeyVault is not available. Terraform


I'm deploying a Key Vault and a Databricks resource with Terraform. The resources are deployed via a Service Principal that gets Contributor rights on everything it deploys.

I also gave myself the Contributor role on the Key Vault and created a user that is assigned to the admins group in Databricks.

I followed the documentation on how to create a secret scope in Databricks, and it worked through the UI with my account. But I want to automate this with Terraform, so I added this to my tf file:

resource "databricks_secret_scope" "kv_db" {
  name = module.keyvault_gb.name

  keyvault_metadata {
    resource_id = module.keyvault_gb.id
    dns_name    = module.keyvault_gb.uri
  }
}

This is the result of my terraform plan:

  # databricks_secret_scope.kv_db will be created
  + resource "databricks_secret_scope" "kv_db" {
      + backend_type = (known after apply)
      + id           = (known after apply)
      + name         = "xxxxxx"

      + keyvault_metadata {
          + dns_name    = "https://xxxxxx.vault.azure.net/"
          + resource_id = "/subscriptions/***/resourceGroups/*********/providers/Microsoft.KeyVault/vaults/xxxxxx"
        }
    }

And the result of my terraform apply:

│ Error: cannot create secret scope: Azure KeyVault is not available
│ 
│   with databricks_secret_scope.kv_db,
│   on databricks.tf line 68, in resource "databricks_secret_scope" "kv_db":
│   68: resource "databricks_secret_scope" "kv_db" {

I don't understand why it does not work in Terraform, since I managed to create one with the UI. (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#create-an-azure-key-vault-backed-secret-scope-using-the-ui)

The only difference is the user: the UI uses my own account, while Terraform uses a Service Principal that has the Contributor role on all resources deployed by it. The SP is also an admin member in Databricks.

I even tried to recreate the Key Vault access policy to be sure the SP has the right permissions, but the policy already exists:

data "azurerm_client_config" "current" {} # Using the deployment Service Principal

resource "azurerm_key_vault_access_policy" "this" {
  key_vault_id       = module.keyvault_gb.id
  tenant_id          = data.azurerm_client_config.current.tenant_id
  object_id          = data.azurerm_client_config.current.object_id
  secret_permissions = ["Delete", "Get", "List", "Set"]
}

│ Error: A resource with the ID "/subscriptions/***/resourceGroups/xxxxx/providers/Microsoft.KeyVault/vaults/xxxxxx/objectId/xxxxxx-xxxxx-xxxxx-xxxx-xxxxx" already exists - to be managed via Terraform this resource needs to be imported into the State. Please see the resource documentation for "azurerm_key_vault_access_policy" for more information.
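As the error says, the access policy already exists in Azure, so it has to be adopted into the Terraform state rather than recreated. On Terraform 1.5+ this can be expressed declaratively with an `import` block; a sketch, where the placeholder ID must be replaced with the exact resource ID shown in the error message:

```hcl
# Sketch: adopt the pre-existing Key Vault access policy into state (Terraform >= 1.5).
# The ID below is a placeholder; copy the real one from the "already exists" error.
import {
  to = azurerm_key_vault_access_policy.this
  id = "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RG_NAME>/providers/Microsoft.KeyVault/vaults/<KV_NAME>/objectId/<SP_OBJECT_ID>"
}
```

On older Terraform versions, `terraform import azurerm_key_vault_access_policy.this "<that ID>"` achieves the same from the CLI.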

Update: Here are the permissions of the role of the SP used to deploy the Key Vault/Databricks and to create the secret scope: (screenshot)

Here is the role of my user: (screenshot)

Update 2: I added more roles to the SP, but it still does not work. (screenshot)

Update 3: Using @Vinay-B's code and changing the Key Vault name, this is what I get from terraform plan and apply:

Terraform will perform the following actions:

  # azurerm_resource_group.example will be created
  + resource "azurerm_resource_group" "example" {
      + id       = (known after apply)
      + location = "eastus"
      + name     = "demorgvk1"
    }

  # azurerm_key_vault.example will be created
  + resource "azurerm_key_vault" "example" {
      + access_policy                 = (known after apply)
      + id                            = (known after apply)
      + location                      = "eastus"
      + name                          = "kv-3-stackoverflow"
      + public_network_access_enabled = true
      + resource_group_name           = "demorgvk1"
      + sku_name                      = "standard"
      + soft_delete_retention_days    = 90
      + tenant_id                     = "***"
      + vault_uri                     = (known after apply)
    }

  # azurerm_key_vault_access_policy.example will be created
  + resource "azurerm_key_vault_access_policy" "example" {
      + id                 = (known after apply)
      + key_vault_id       = (known after apply)
      + object_id          = "4ace89c9-93bf-4041-b6e9-bc73c2fcb144"
      + secret_permissions = [
          + "Get",
          + "List",
          + "Set",
          + "Delete",
        ]
      + tenant_id          = "***"
    }

  # azurerm_resource_group.example will be updated in-place
  ~ resource "azurerm_resource_group" "example" {
        id       = "/subscriptions/***/resourceGroups/demorgvk1"
        name     = "demorgvk1"
      ~ tags     = {
          - "availability"    = "A1" -> null
          - "confidentiality" = "C1" -> null
          - "integrity"       = "I1" -> null
          - "spoke_type"      = "APPI" -> null
          - "traceability"    = "T1" -> null
        }
        # (1 unchanged attribute hidden)
    }


  # databricks_secret_scope.kv_db will be created
  + resource "databricks_secret_scope" "kv_db" {
      + backend_type = (known after apply)
      + id           = (known after apply)
      + name         = "vk-secret-scope"

      + keyvault_metadata {
          + dns_name    = (known after apply)
          + resource_id = (known after apply)
        }
    }


│ Error: cannot create secret scope: Azure KeyVault is not available
│ 
│   with databricks_secret_scope.kv_db,
│   on tests.tf line 25, in resource "databricks_secret_scope" "kv_db":
│   25: resource "databricks_secret_scope" "kv_db" {


Update 4: (Partial workaround)

I found an ugly workaround: automating calls to the Databricks 2.0 API from inside Terraform:

// Create the SP in Databricks
resource "databricks_service_principal" "sp" {
  application_id = <SP_APPLICATION_ID>
  external_id    = <SP_OBJECT_ID>
  force          = false
}

// Get the admins group in Databricks
data "databricks_group" "group_admins" {
  display_name = "admins"
}

// Assign the SP to the Databricks admins group
resource "databricks_group_member" "admin_assign" {
  group_id  = data.databricks_group.group_admins.id
  member_id = databricks_service_principal.sp.id
}

resource "null_resource" "databricks_secret_scope_api" {
  provisioner "local-exec" {
    command = <<-EOT
      atoken=$(curl -X POST \
        -H "Content-Type: application/x-www-form-urlencoded" \
        https://login.microsoftonline.com/<TENANT_ID>/oauth2/v2.0/token \
        -d "client_id=<SP_APPLICATION_ID>" \
        -d "grant_type=client_credentials" \
        -d "scope=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d%2F.default" \
        -d "client_secret=<SP_PASSWORD>" | jq --raw-output .access_token)
      curl -X POST \
        https://<YOUR_DATABRICKS_BASE_URL>/api/2.0/secrets/scopes/create \
        --header "Content-Type: application/json" \
        --header "Authorization: Bearer $atoken" \
        --data '{
          "scope": "<SCOPE_NAME>",
          "scope_backend_type": "AZURE_KEYVAULT",
          "backend_azure_keyvault": {
            "resource_id": "<KEYVAULT_ID>",
            "dns_name": "<KEYVAULT_URI>"
          }
        }'
    EOT
  }
  depends_on = [databricks_group_member.admin_assign]
}

This will run only ONCE. So if someone deletes the secret scope outside of Terraform, the state will never know, and the scope will never be recreated.

You can add a trigger to make it run on every apply: it will destroy the Terraform reference (NOT the secret scope) and create it back, but the API call will fail because the secret scope already exists. An improvement would be to run it every time, list the secret scope names in Databricks, and create the scope only if it is not found.
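That improved version could look something like this. This is an untested sketch that replaces the null_resource above: it assumes the same placeholder values, that $atoken is obtained exactly as in the workaround, that jq is available on the runner, and uses a timestamp() trigger so the provisioner re-runs on every apply:

```hcl
resource "null_resource" "databricks_secret_scope_api" {
  # Force the provisioner to re-run on every apply.
  triggers = {
    always_run = timestamp()
  }

  provisioner "local-exec" {
    command = <<-EOT
      # $atoken is fetched with the same curl-to-login.microsoftonline.com call as above.
      existing=$(curl -s https://<YOUR_DATABRICKS_BASE_URL>/api/2.0/secrets/scopes/list \
        --header "Authorization: Bearer $atoken" | jq -r '.scopes[]?.name')
      # Only create the scope if its name is not in the list (exact match).
      if ! echo "$existing" | grep -qx "<SCOPE_NAME>"; then
        curl -X POST https://<YOUR_DATABRICKS_BASE_URL>/api/2.0/secrets/scopes/create \
          --header "Content-Type: application/json" \
          --header "Authorization: Bearer $atoken" \
          --data '{
            "scope": "<SCOPE_NAME>",
            "scope_backend_type": "AZURE_KEYVAULT",
            "backend_azure_keyvault": {
              "resource_id": "<KEYVAULT_ID>",
              "dns_name": "<KEYVAULT_URI>"
            }
          }'
      fi
    EOT
  }
  depends_on = [databricks_group_member.admin_assign]
}
```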


There are 2 answers

Vinay B On

I tried to create an Azure Key Vault-backed secret scope in Databricks using Terraform and was able to provision it successfully.

The issue you're encountering with Terraform when trying to create a Databricks secret scope backed by Azure KeyVault seems to be related to the permissions of the Service Principal (SP) you are using.

  • Your Service Principal needs specific permissions to manage KeyVault secrets and Databricks. The error message suggests that the SP might not have the necessary permissions to access the KeyVault.
  • Ensure the SP has 'Get', 'List', 'Set', and 'Delete' permissions on the KeyVault secrets. It also needs permissions to manage resources in Databricks.

The error message regarding the azurerm_key_vault_access_policy suggests a resource ID conflict. If a resource was created outside of Terraform and you now want to manage it with Terraform, you need to import it into your Terraform state.

My terraform configuration:

main.tf:

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 2.0"
    }
    databricks = {
      source  = "databricks/databricks"
      version = "~> 0.3"  # Specify the version you need
    }
  }
}

provider "azurerm" {
    features {}
}

provider "databricks" {
  azure_workspace_resource_id = "/subscriptions/[your subscription_id]/resourceGroups/[resource_group_name]/providers/Microsoft.Databricks/workspaces/[databricks_name]"
}

data "azurerm_client_config" "current" {}

resource "azurerm_resource_group" "example" {
  name     = "demorgvk1"
  location = "eastus"
}

resource "azurerm_key_vault" "example" {
  name                        = "examplevk-vault"
  location                    = azurerm_resource_group.example.location
  resource_group_name         = azurerm_resource_group.example.name
  tenant_id                   = data.azurerm_client_config.current.tenant_id
  sku_name                    = "standard"

  # Other necessary configurations...
}

resource "azurerm_key_vault_access_policy" "example" {
  key_vault_id       = azurerm_key_vault.example.id
  tenant_id          = data.azurerm_client_config.current.tenant_id
  object_id          = data.azurerm_client_config.current.object_id
  secret_permissions = ["Get", "List", "Set", "Delete"]
}

resource "databricks_secret_scope" "kv_db" {
  name = "vk-secret-scope"

  keyvault_metadata {
    resource_id = azurerm_key_vault.example.id
    dns_name    = azurerm_key_vault.example.vault_uri
  }
}

Output: (screenshots)

Now open the Databricks UI, create a notebook, and run the following command to list the available scopes:

dbutils.secrets.listScopes()


ref: https://docs.databricks.com/api/azure/workspace/secrets

ref: List databricks secret scope and find referred keyvault in azure databricks

ref: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes

Kojo Saah On

@BeGreen, I solved this by adding a provider block to initialize the Databricks workspace.

provider "databricks" {
  # Get the workspace_url from the output of the databricks module
  alias               = "databricks"
  host                = workspace_url
  azure_client_id     = client_id
  azure_client_secret = sp_clientSecret_value
  azure_tenant_id     = tenant_id
}

resource "databricks_secret_scope" "kv" {
  provider = databricks.databricks
  name     = "secret-kv-managed"

  keyvault_metadata {
    resource_id = data.azurerm_key_vault.main.id
    dns_name    = data.azurerm_key_vault.main.vault_uri
  }
}