I'm deploying Azure OpenAI Service via Terraform, and I want to set up a private endpoint for it. The docs and this article suggest that, besides a private endpoint, I need a private DNS zone containing an A record for the private endpoint.
It looks like this is not enough because I get the error "Public access is disabled. Please configure private endpoint." in Azure AI Studio when I test my GPT-35-turbo model.
Here's my Terraform code:
main.tf
resource "azurerm_resource_group" "rg" {
location = "westeurope"
name = "test-rg"
}
resource "azurerm_cognitive_account" "openai" {
name = "REDACTED"
location = "westeurope"
resource_group_name = azurerm_resource_group.rg.name
kind = "OpenAI"
sku_name = "S0"
custom_subdomain_name = "REDACTED"
public_network_access_enabled = false
}
resource "azurerm_virtual_network" "vnet" {
name = "test-network"
location = azurerm_resource_group.rg.location
resource_group_name = azurerm_resource_group.rg.name
address_space = ["10.1.0.0/16"]
}
resource "azurerm_subnet" "private_subnet" {
name = "test-private-subnet"
resource_group_name = azurerm_resource_group.rg.name
virtual_network_name = azurerm_virtual_network.vnet.name
address_prefixes = ["10.1.1.0/24"]
private_endpoint_network_policies_enabled = true
}
resource "azurerm_private_endpoint" "private_endpoint" {
name = "test-openai-private-endpoint"
location = azurerm_resource_group.rg.location
resource_group_name = azurerm_resource_group.rg.name
subnet_id = azurerm_subnet.private_subnet.id
private_service_connection {
name = "test-openai-privconn"
private_connection_resource_id = azurerm_cognitive_account.openai.id
subresource_names = ["account"]
is_manual_connection = false
}
}
resource "azurerm_private_dns_zone" "openai" {
name = "privatelink.openai.azure.com"
resource_group_name = azurerm_resource_group.rg.name
}
resource "azurerm_private_dns_a_record" "openai" {
name = "test-openai-private-endpoint"
zone_name = "privatelink.openai.azure.com"
resource_group_name = azurerm_resource_group.rg.name
ttl = 300
records = [azurerm_private_endpoint.private_endpoint.private_service_connection[0].private_ip_address]
}
resource "azurerm_private_dns_zone_virtual_network_link" "link" {
name = "test-vnet-link"
resource_group_name = azurerm_resource_group.rg.name
private_dns_zone_name = azurerm_private_dns_zone.openai.name
virtual_network_id = azurerm_virtual_network.vnet.id
}
resource "azurerm_cognitive_deployment" "model_gpt_35_turbo" {
name = "test-gpt-35-turbo-model"
cognitive_account_id = azurerm_cognitive_account.openai.id
model {
format = "OpenAI"
name = "gpt-35-turbo"
version = "0301"
}
scale {
type = "Standard"
}
}
providers.tf
terraform {
  required_version = ">=0.12"

  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~>3.64.0"
    }
  }
}

provider "azurerm" {
  features {}
  subscription_id = "REDACTED"
}
Additional info: I don't have a custom DNS server (the virtual network uses the default Azure-provided DNS).
I tried the same scenario in my environment and got the same result as you, even though I created the private endpoint in OpenAI Studio, disabled all networks, and enabled the private endpoint. When I try to open the Chat Service in OpenAI Studio from another network, I also receive the same message.
You will encounter the above error even if you disable all networks under the Firewall and Virtual Network settings and create a private endpoint in OpenAI Studio. This confirms that the issue is not related to the configuration. To test the connection, I created a virtual machine within the same VNet and tested the Chat Service; it works as expected.
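The test VM itself isn't part of the Terraform above; if you would rather script that test than click through the portal, a minimal sketch could look like the following (the resource names, VM size, image, and SSH key path are placeholders of my own, not something the docs prescribe):

resource "azurerm_network_interface" "test_vm_nic" {
  name                = "test-vm-nic"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name

  ip_configuration {
    name                          = "internal"
    subnet_id                     = azurerm_subnet.private_subnet.id
    private_ip_address_allocation = "Dynamic"
  }
}

resource "azurerm_linux_virtual_machine" "test_vm" {
  name                  = "test-vm"
  location              = azurerm_resource_group.rg.location
  resource_group_name   = azurerm_resource_group.rg.name
  size                  = "Standard_B1s"
  admin_username        = "azureuser"
  network_interface_ids = [azurerm_network_interface.test_vm_nic.id]

  # Placeholder key path -- substitute your own public key.
  admin_ssh_key {
    username   = "azureuser"
    public_key = file("~/.ssh/id_rsa.pub")
  }

  os_disk {
    caching              = "ReadWrite"
    storage_account_type = "Standard_LRS"
  }

  # Any recent Linux image works; Ubuntu 22.04 is used here as an example.
  source_image_reference {
    publisher = "Canonical"
    offer     = "0001-com-ubuntu-server-jammy"
    sku       = "22_04-lts"
    version   = "latest"
  }

  # No public IP is attached, so reach the VM via Azure Bastion or a jump host,
  # then check that the account's custom subdomain resolves to the private IP.
}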
To create a private endpoint for OpenAI Studio, you need to integrate the private endpoint with a Private DNS Zone, as per Microsoft's suggestion.

Reference: DNS configuration
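In Terraform, that integration can be expressed with the azurerm provider's private_dns_zone_group block on the private endpoint, rather than maintaining the A record by hand. A minimal sketch against the resources above (the zone-group name is just an illustrative value):

resource "azurerm_private_endpoint" "private_endpoint" {
  name                = "test-openai-private-endpoint"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  subnet_id           = azurerm_subnet.private_subnet.id

  private_service_connection {
    name                           = "test-openai-privconn"
    private_connection_resource_id = azurerm_cognitive_account.openai.id
    subresource_names              = ["account"]
    is_manual_connection           = false
  }

  # Attach the private DNS zone so Azure creates and maintains the A record
  # for the account's custom subdomain automatically.
  private_dns_zone_group {
    name                 = "test-openai-dns-zone-group"
    private_dns_zone_ids = [azurerm_private_dns_zone.openai.id]
  }
}

With the zone group in place, the separate azurerm_private_dns_a_record resource should no longer be needed, since the record is created and kept in sync by the platform.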