hashicorp / terraform-provider-azurerm

Terraform provider for Azure Resource Manager
https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs
Mozilla Public License 2.0

Error creating azurerm_hdinsight_spark_cluster & azurerm_hdinsight_hbase_cluster for HDInsight on Azure with Terraform #15504

Open brucema-cloud opened 2 years ago

brucema-cloud commented 2 years ago

Community Note

Terraform (and AzureRM Provider) Version

terraform { required_version = "= 0.14.11" }

provider "azurerm" { version = "~> 2.0" }

Affected Resource(s)

https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/hdinsight_spark_cluster https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/hdinsight_hbase_cluster

Terraform Configuration Files

# Copy-paste your Terraform configurations here - for large Terraform configs,
# please use a service like Dropbox and share a link to the ZIP file. For
# security, you can also encrypt the files using our GPG public key: https://keybase.io/hashicorp

Error Output

Error: creating HDInsight Spark Cluster "spark2-stg-stream" (Resource Group "rg-stg-stream"): hdinsight.ClustersClient#Create: Failure sending request: StatusCode=400 -- Original Error: Code="BadRequest" Message="User input validation failed. Errors: The request payload is invalid. Endpoint type 'privatelink' in url wasb://spark-stg-01@ststgstreamspark.privatelink.blob.core.windows.net is neither 'blob' or 'dfs',User input validation failed. Errors: The request payload is invalid. Endpoint type 'privatelink' in url wasb://spark-stg-01@ststgstreamspark.privatelink.blob.core.windows.net is neither 'blob' or 'dfs',User input validation failed. Errors: The request payload is invalid. Endpoint type 'privatelink' in url wasb://spark-stg-01@ststgstreamspark.privatelink.blob.core.windows.net is neither 'blob' or 'dfs'"

Panic Output

Expected Behaviour

Actual Behaviour

Steps to Reproduce

  1. terraform apply

Important Factoids

Plan Output

module.hdi-stg-spark-stream.azurerm_hdinsight_spark_cluster.hid_spark_cluster will be created
  + resource "azurerm_hdinsight_spark_cluster" "hid_spark_cluster" {
      + cluster_version               = "4.0"
      + encryption_in_transit_enabled = (known after apply)
      + https_endpoint                = (known after apply)
      + id                            = (known after apply)
      + location                      = "southeastasia"
      + name                          = "spark02-stg-stream"
      + resource_group_name           = "xxxx"
      + ssh_endpoint                  = (known after apply)

      + tier                          = "Standard"

      + component_version {
          + spark = "2.4"
        }

      + gateway {
          + enabled  = true
          + password = (sensitive value)
          + username = "admin"
        }

      + network {
          + connection_direction = "Outbound"
          + private_link_enabled = false
        }

     ........................

      + storage_account {
          + is_default           = true
          + storage_account_key  = (sensitive value)
          + storage_container_id = "https://ststgstreamspark.privatelink.blob.core.windows.net/spark-stg-01"
          + storage_resource_id = "/subscriptions/xxxxxxx/resourceGroups/rg-stg-stream/providers/Microsoft.Storage/storageAccounts/ststgstreamspark"
        }
    }

References

brucema-cloud commented 2 years ago

Hi team. My storage account is using a private link. When the storage_container_id is formatted as "https://ststgstreamspark.privatelink.blob.core.windows.net/spark-stg-01", creation of the Spark and HBase clusters fails. When the storage_container_id is formatted as "https://ststgstreamspark.blob.core.windows.net/spark-stg-01", it works.

The error output is the same as in the issue description above.
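The workaround described above can be sketched as a storage_account block that references the container resource rather than hard-coding the private-link URL. This is a minimal sketch, not the reporter's actual config; the resource names (azurerm_storage_account.spark, azurerm_storage_container.spark) are hypothetical placeholders:

```hcl
resource "azurerm_hdinsight_spark_cluster" "hid_spark_cluster" {
  # ... other arguments as in the plan output above ...

  storage_account {
    is_default          = true
    storage_account_key = azurerm_storage_account.spark.primary_access_key

    # Referencing the container resource yields the public-endpoint form
    # "https://<account>.blob.core.windows.net/<container>", which the
    # HDInsight API accepts; the "*.privatelink.blob.core.windows.net"
    # form is rejected with the BadRequest error shown above.
    storage_container_id = azurerm_storage_container.spark.id
    storage_resource_id  = azurerm_storage_account.spark.id
  }
}
```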

neil-yechenwei commented 2 years ago

@brucema-cloud, thanks for raising this issue. Per the error message, it seems the Service API doesn't support the format "https://xxxxx.privatelink.blob.core.windows.net/xxx".

brucema-cloud commented 2 years ago

Hi @neil-yechenwei, are you sure that the Service API does not support this format?

neil-yechenwei commented 2 years ago

The error message returned by the service-side property validation indicates as much.

brucema-cloud commented 2 years ago

Hi @neil-yechenwei, is there any plan to fix it?

neil-yechenwei commented 2 years ago

It's a Service API limitation, not a Terraform bug. May I ask whether there is any document indicating that this scenario should be supported? Could you share the link with us? Thanks.

brucema-cloud commented 2 years ago

https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-restrict-public-connectivity As mentioned in this document, when I restrict public connectivity in Azure HDInsight, I can configure Azure Private Link-enabled dependency resources to use with HDInsight clusters. I just want to specify the default storage for the HDInsight cluster by using the storage account's private-link format. @neil-yechenwei
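For reference, the restricted-public-connectivity scenario from the linked document maps onto the provider's network block roughly as follows. This is only a sketch of that configuration, not a fix for the storage URL issue:

```hcl
network {
  # "Outbound" reverses the connection direction so the cluster does not
  # require inbound public connectivity (per the linked HDInsight doc);
  # private_link_enabled requires this direction.
  connection_direction = "Outbound"

  # Enables private link for the cluster itself. Dependency resources
  # such as the default storage account still need their own private
  # endpoints, which is what leads to the privatelink hostname above.
  private_link_enabled = true
}
```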

neil-yechenwei commented 2 years ago

Filed an issue on Azure/azure-rest-api-specs/issues/18020 for tracking.