confluentinc / terraform-provider-confluent

Terraform Provider for Confluent
Apache License 2.0

Missing attribute "confluent_topic_type" of Kafka topic config in Terraform provider #427

Open RishuSinghS opened 2 months ago

RishuSinghS commented 2 months ago

Hello,

I have noticed that a new attribute, confluent.topic.type, is automatically added to Kafka topic configurations when a topic is created manually through the Confluent portal.

[Screenshot: topic configuration in the Confluent portal showing the confluent.topic.type attribute]

However, I cannot find this attribute documented or supported in the Terraform provider's Kafka topic resource (confluent_kafka_topic).

Our workflow fails when importing a manually created topic through Terraform: the plan tries to remove the confluent.topic.type attribute:

0 to add, 1 to change, 0 to destroy, 0 to replace.
- change
    - module.kafka_topic.confluent_kafka_topic.kafka_topic

Change details:

# module.kafka_topic.confluent_kafka_topic.kafka_topic will be updated in-place
@@ -3,7 +3,6 @@
     "cleanup.policy": "delete",
     "confluent.key.schema.validation": "false",
     "confluent.key.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicNameStrategy",
-    "confluent.topic.type": "standard",
     "confluent.value.schema.validation": "true",
     "confluent.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicNameStrategy",
     "delete.retention.ms": "86400000",

Could you please assist with a fix for this issue?
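A possible interim workaround, not confirmed by the maintainers (this is my own suggestion, using Terraform's standard lifecycle meta-argument), is to suppress drift on the topic config until the provider supports confluent.topic.type. Note the caveat: ignore_changes operates on the whole config map, so this also hides intentional changes to every other config key:

```hcl
resource "confluent_kafka_topic" "kafka_topic" {
  # ... existing arguments unchanged ...

  lifecycle {
    # Workaround sketch: ignore drift on the entire config map so the plan
    # no longer tries to remove "confluent.topic.type". Caveat: this also
    # suppresses planned updates to all other config keys on this topic.
    ignore_changes = [config]
  }
}
```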

channingdong commented 1 month ago

Hi Customer,

Which TF provider version were you using when you hit this issue?

And could you share the step-by-step workflow you used to reproduce it?

channingdong commented 1 month ago

Also, could you please share how you import the manually created Kafka topic through TF? You don't need to share the whole code block; the essential snippet is fine.

dedovicnermin commented 4 weeks ago

It is important to create the topic through the UI, so its settings are applied as DYNAMIC_TOPIC_CONFIG.

TF import block: instead of running the terraform import command, include an import block and run terraform plan:

import {
  to = confluent_kafka_topic.example
  id = "lkc-xyz/orders"
}

resource "confluent_kafka_topic" "example" {
  kafka_cluster {
    id = "lkc-xyz"
  }
  topic_name         = "orders"
  partitions_count   = 4
  rest_endpoint      = confluent_kafka_cluster.basic-cluster.rest_endpoint
  config = {
    "cleanup.policy"                      = "delete"
    "delete.retention.ms"                 = "86400000"
    "max.compaction.lag.ms"               = "9223372036854775807"
    "max.message.bytes"                   = "2097164"
    "message.timestamp.after.max.ms"      = "9223372036854775807"
    "message.timestamp.before.max.ms"     = "9223372036854775807"      
    "message.timestamp.difference.max.ms" = "9223372036854775807"
    "message.timestamp.type"              = "CreateTime"
    "min.compaction.lag.ms"               = "0"
    "min.insync.replicas"                 = "2"
    "retention.bytes"                     = "-1"
    "retention.ms"                        = "604800000"
    "segment.bytes"                       = "104857600"
    "segment.ms"                          = "604800000"
  }
  credentials {
    key    = confluent_api_key.app-manager-kafka-api-key.id
    secret = confluent_api_key.app-manager-kafka-api-key.secret
  }
}
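As a variant of the import-block approach above (this relies on standard Terraform 1.5+ config generation, nothing specific to this provider), you can let Terraform generate the resource body from the live topic, which will include whatever config keys the provider reads back, confluent.topic.type among them:

```hcl
# Keep only the import block in your configuration, then run:
#   terraform plan -generate-config-out=generated.tf
# Terraform writes a confluent_kafka_topic resource into generated.tf,
# reflecting the topic's live config as the provider reads it.
import {
  to = confluent_kafka_topic.example
  id = "lkc-xyz/orders"
}
```

Comparing the generated config against your hand-written resource makes it easy to see exactly which attributes (such as confluent.topic.type) the provider refuses to accept.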

davekpatrick commented 3 days ago

Any update on the resolution of this issue? Pull request https://github.com/confluentinc/terraform-provider-confluent/pull/437 has been opened to update the editableTopicSettings code.