Closed: FeLiNa22 closed this issue 11 months ago.
Hello @FeLiNa22, I am unable to reproduce the problem. If you head over to https://github.com/upstash/terraform-provider-upstash/tree/master/examples/examples/kafka_topic and use the following configuration:
resource "upstash_kafka_topic" "exampleKafkaTopic" {
topic_name = var.topic_name
partitions = var.partitions
retention_time = 1099511627776000
retention_size = 1099511627776
max_message_size = var.max_message_size
cleanup_policy = var.cleanup_policy
cluster_id = resource.upstash_kafka_cluster.exampleKafkaCluster.cluster_id
}
you should be able to create topics of whatever size you want. That example also contains a small, commented-out section showing how you can provide the input in GB or TB using locals.
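A rough sketch of that locals approach might look like the following (the local names and conversion factors here are only illustrative, not taken from the provider; confirm the exact units retention_size and retention_time expect against the provider documentation):

locals {
  # Illustrative only: express the retention size in TB and the retention
  # time in days, then convert to the raw numbers the provider takes.
  # Verify the actual units expected by the provider before relying on this.
  retention_size_tb   = 1
  retention_time_days = 7

  retention_size = local.retention_size_tb * 1024 * 1024 * 1024 * 1024
  retention_time = local.retention_time_days * 24 * 60 * 60 * 1000
}

resource "upstash_kafka_topic" "topicWithLocals" {
  topic_name       = var.topic_name
  partitions       = var.partitions
  retention_time   = local.retention_time
  retention_size   = local.retention_size
  max_message_size = var.max_message_size
  cleanup_policy   = var.cleanup_policy
  cluster_id       = resource.upstash_kafka_cluster.exampleKafkaCluster.cluster_id
}

This keeps the configuration readable in human-friendly units while still passing plain numbers to the provider.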
One thing to bear in mind is that, for the topic to be created successfully, the limits you provide must be less than or equal to the limits set for your Upstash account. Meaning, if your account has a retention_size limit of 1 TB, you cannot set 2 TB in Terraform either. Doing so results in an "Invalid Retention Size" error, which comes from the API, not from Terraform itself.
Below I am sharing an example that I configured using only Terraform:
We are currently using Upstash with Terraform. However, as retention_size, retention_time, and max_message_size are set in megabits and milliseconds, the maximum number that can be passed prevents creating a topic with a retention size of 1 TB. I suggest this is changed to accept string inputs.