confluentinc / terraform-provider-confluent

Terraform Provider for Confluent

Terraform provider does not work well when deploying a Flink Model/Statement that uses sql.secrets.* #397

Open novarz opened 5 days ago

novarz commented 5 days ago

Terraform tries to redeploy the resource even when nothing has changed, and the redeployment then fails.

[screenshot of the failure attached]

novarz commented 5 days ago

Here is the TF:

```hcl
# --------------------------------------------------------
# Flink SQL: CREATE Model vector_encoding
# --------------------------------------------------------

resource "confluent_flink_statement" "create_model" {
  depends_on = [
    resource.confluent_environment.environment,
    resource.confluent_schema_registry_cluster.essentials,
    resource.confluent_kafka_cluster.cluster,
    resource.confluent_connector.datagen_products,
    resource.confluent_flink_compute_pool.my_compute_pool,
    resource.confluent_role_binding.app-general-environment-admin
  ]

  organization {
    id = data.confluent_organization.main.id
  }

  environment {
    id = confluent_environment.environment.id
  }

  compute_pool {
    id = confluent_flink_compute_pool.my_compute_pool.id
  }

  principal {
    id = confluent_service_account.app-general.id
  }

  properties = {
    "sql.current-catalog"   : confluent_environment.environment.display_name
    "sql.current-database"  : confluent_kafka_cluster.cluster.display_name
    "sql.secrets.openaikey" : var.openai_key
  }

  statement = "CREATE MODEL vector_encoding INPUT (input STRING) OUTPUT (vector ARRAY) WITH ('TASK' = 'classification', 'PROVIDER' = 'OPENAI', 'OPENAI.ENDPOINT' = 'https://api.openai.com/v1/embeddings', 'OPENAI.API_KEY' = '{{sessionconfig/sql.secrets.openaikey}}');"

  rest_endpoint = data.confluent_flink_region.my_flink_region.rest_endpoint

  credentials {
    key    = confluent_api_key.my_flink_api_key.id
    secret = confluent_api_key.my_flink_api_key.secret
  }

  lifecycle {
    prevent_destroy = false
  }
}
```
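
As a stopgap until this is addressed in the provider, a possible workaround sketch (not verified, and assuming the perpetual diff comes from the `sql.secrets.*` value never being echoed back by the API) is to tell Terraform to ignore changes to that one property key via `ignore_changes`:

```hcl
resource "confluent_flink_statement" "create_model" {
  # ... same arguments as in the configuration above, omitted here ...

  lifecycle {
    prevent_destroy = false

    # Assumption: the provider never reads the secret value back, so Terraform
    # sees a permanent difference on this key. Ignoring it suppresses the
    # unwanted replacement, at the cost of not detecting real changes to the
    # secret through Terraform.
    ignore_changes = [
      properties["sql.secrets.openaikey"],
    ]
  }
}
```

Note that with this in place, rotating `var.openai_key` would no longer trigger an update of the statement by itself; the statement would have to be tainted or re-created manually.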