I deployed the Meta Llama 2 7B offering from the Azure Marketplace to my Azure AI service. After the deployment, I got an endpoint that looks like this:
https://mydemo-llama-serverless.eastus2.inference.ai.azure.com
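For context, this is roughly how I call the deployment today (a minimal sketch: the key is redacted, and the `/v1/chat/completions` path, the `Bearer` auth header, and the payload shape are just what I am using in my setup, they may differ for other deployments):

```python
import requests

# Illustrative call to the serverless (pay-as-you-go) inference endpoint.
# ENDPOINT and API_KEY are placeholders from my own deployment.
ENDPOINT = "https://mydemo-llama-serverless.eastus2.inference.ai.azure.com"
API_KEY = "<redacted>"

response = requests.post(
    f"{ENDPOINT}/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "messages": [{"role": "user", "content": "Hello"}],
        "max_tokens": 64,
    },
    timeout=60,
)
print(response.status_code, response.json())
```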
I have already enabled a private endpoint for my Azure AI resource, but that only provides private DNS entries for the privatelink.api.azureml.ms and privatelink.notebooks.azure.net zones. My pay-as-you-go deployment, however, uses the inference.ai.azure.com domain, which is still public.
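To illustrate the gap, here is a quick check I can run from a VM inside the VNet (a sketch only: the workspace hostname below is a made-up example standing in for my real one; the inference hostname is the endpoint from above):

```python
import socket

# Compare how the two hostnames resolve from inside the VNet:
# the workspace API host goes through the privatelink.api.azureml.ms zone
# and returns a private address, while the serverless inference host
# still resolves to a public IP.
hosts = [
    "mydemo-workspace.eastus2.api.azureml.ms",                  # illustrative workspace host
    "mydemo-llama-serverless.eastus2.inference.ai.azure.com",   # serverless inference host
]

for host in hosts:
    try:
        ip = socket.gethostbyname(host)
        print(f"{host} -> {ip}")
    except socket.gaierror as exc:
        print(f"{host} -> resolution failed: {exc}")
```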
Question: How can I make the inference endpoint for my pay-as-you-go model deployment private?
If this is not yet supported, can someone suggest a workaround or indicate when we can expect this feature?