Google Cloud Dataflow Workbench instance is created via Terraform, but the notebook is not up


I have provisioned a Google Cloud Dataflow Workbench instance via Terraform; a sample snippet is below. I used a container_image block in the notebook instance resource, which brings up the Apache Beam environment (a user-managed notebook environment).

When I try to open the notebook by launching the link from the console, it throws a 524 error. The error logs on the backend VM show that the notebooks collection agent is unable to reach the Jupyter API server, so it appears the server is not up.

My question is: do I need to pass any specific settings in the Terraform container_image block for the notebook to come up properly?

Note: When I create the same instance manually, everything works and I am able to launch the notebook, so there are no issues with the firewall, network, service account, roles, etc.

resource "google_notebooks_instance" "instance" {
  name = "notebooks-instance"
  location = "us-central1-a"
  machine_type = "e2-medium"

  container_image {
    repository = "gcr.io/deeplearning-platform-release/beam-notebooks"
    tag = "latest"
  }

  service_account = "[email protected]"

  install_gpu_driver = true
  boot_disk_type = "PD_SSD"
  boot_disk_size_gb = 110

  no_public_ip = true
  no_proxy_access = false

  network = data.google_compute_network.my_network.id
  subnet = data.google_compute_subnetwork.my_subnetwork.id

  labels = {
    k = "val"
  }

  metadata = {
    proxy-mode = "service-account"
    terraform = "true"
  }
}
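
For comparison, here is a minimal sketch of the same resource with the optional settings (GPU driver, SSD boot disk, labels) stripped out, which is what I would use to check whether one of them interferes with Jupyter starting up inside the container. Everything is carried over from the snippet above except the resource label beam_minimal and the instance name, which are placeholders.

resource "google_notebooks_instance" "beam_minimal" {
  name = "notebooks-instance-minimal"
  location = "us-central1-a"
  machine_type = "e2-medium"

  # Same Beam notebooks container image as in the full configuration
  container_image {
    repository = "gcr.io/deeplearning-platform-release/beam-notebooks"
    tag = "latest"
  }

  service_account = "[email protected]"

  no_public_ip = true
  no_proxy_access = false

  network = data.google_compute_network.my_network.id
  subnet = data.google_compute_subnetwork.my_subnetwork.id

  metadata = {
    # Values copied from the original snippet; assumed to match the
    # manually created instance
    proxy-mode = "service-account"
    terraform = "true"
  }
}

If this minimal instance comes up, the difference between it and the full configuration should point to whichever setting is breaking Jupyter; if it does not, the issue is more likely with the image or proxy setup than with the extra Terraform arguments.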
