Azure / terraform-azurerm-openai

Terraform module for deploying Azure OpenAI Service.

private_endpoint/private_service_connection is invalid - reporting "request_message" attribute must not be empty #77

Open chickbr opened 5 months ago

chickbr commented 5 months ago

Is there an existing issue for this?

Greenfield/Brownfield provisioning

brownfield

Terraform Version

1.5.7

Module Version

0.1.3

AzureRM Provider Version

3.109.0

Affected Resource(s)/Data Source(s)

azurerm_private_endpoint

Terraform Configuration Files

module "openai" {
  source              = "Azure/openai/azurerm"
  version             = "0.1.3"
  resource_group_name = azurerm_resource_group.westernim_poc.name
  location            = azurerm_resource_group.westernim_poc.location
  private_endpoint = {
    pe = {
      name                 = "pe"
      vnet_rg_name         = "resource_group"
      vnet_name            = "vnet"
      subnet_name          = "subnet"
      is_manual_connection = true
    }
  }
  public_network_access_enabled = false
}

tfvars variables values

n/a

Debug Output/Panic Output

Error: validating the configuration for Private Endpoint (Subscription: "xxxxxxx"
│ Resource Group Name: "resource_group"
│ Private Endpoint Name: "pe"): "private_service_connection":"privateserviceconnection" is invalid, the "request_message" attribute must not be empty
│ 
│   with module.openai.azurerm_private_endpoint.this["pe"],
│   on .terraform/modules/openai/private_endpoint.tf line 6, in resource "azurerm_private_endpoint" "this":
│    6: resource "azurerm_private_endpoint" "this" {

Expected Behaviour

It should create the private endpoint even without a request_message, since the attribute is documented as optional.

Actual Behaviour

Produces an error indicating that request_message is required.

Steps to Reproduce

terraform apply

Important Factoids

n/a

References

According to the provider documentation, request_message is supposed to be optional: https://registry.terraform.io/providers/hashicorp/azurerm/3.109.0/docs/resources/private_endpoint#request_message

However, I found a somewhat similar issue reported here: https://github.com/hashicorp/terraform-provider-azurerm/issues/23763, and one of the contributors to the azurerm provider indicates that it actually is required: https://github.com/hashicorp/terraform-provider-azurerm/issues/23763#issuecomment-1867254981

This is more a problem with the azurerm documentation than with this module, but I assume this module will have to adapt.
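
For reference, here is a minimal standalone sketch of what the provider validation appears to expect when the connection is manual. The resource names, subnet, and Cognitive Services account references are placeholders, not taken from this module:

resource "azurerm_private_endpoint" "example" {
  name                = "pe"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  subnet_id           = azurerm_subnet.example.id

  private_service_connection {
    name                           = "privateserviceconnection"
    private_connection_resource_id = azurerm_cognitive_account.example.id
    subresource_names              = ["account"]
    is_manual_connection           = true
    # Validation fails with the error above unless this is set:
    request_message                = "Please approve this private endpoint connection"
  }
}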

zioproto commented 5 months ago

Hello @chickbr, this issue has nothing to do with our Azure OpenAI module.

If you believe this is a bug in the provider, you should open an issue at https://github.com/hashicorp/terraform-provider-azurerm/issues

The reason request_message is marked as optional is that it is needed only when is_manual_connection is set to true, so you can safely omit it when is_manual_connection is false.
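
For example, here is a sketch of the configuration from this issue with the manual connection disabled; this assumes an automatically approved connection is acceptable for your scenario:

module "openai" {
  source              = "Azure/openai/azurerm"
  version             = "0.1.3"
  resource_group_name = azurerm_resource_group.westernim_poc.name
  location            = azurerm_resource_group.westernim_poc.location
  private_endpoint = {
    pe = {
      name                 = "pe"
      vnet_rg_name         = "resource_group"
      vnet_name            = "vnet"
      subnet_name          = "subnet"
      # Auto-approved connection: request_message is not needed in this case.
      is_manual_connection = false
    }
  }
  public_network_access_enabled = false
}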

You can also submit a documentation PR to https://github.com/hashicorp/terraform-provider-azurerm to make it explicit that the value is required when using manual connections.

Is there any actionable change you would like to see in this module to improve this situation? Please let me know how I can help. Thanks.