confluentinc / terraform-provider-confluent

Terraform Provider for Confluent
Apache License 2.0

Error while importing confluent_kafka_topic #157

Closed · kobejn-jb closed this issue 1 year ago

kobejn-jb commented 1 year ago

Hi,

I have two envs managed with Terraform, and I'm using a Service Account (OrganizationAdmin role) API key as credentials for the provider. I was updating my Terraform code to use confluent provider v1.21 (previously it was using the deprecated confluentcloud provider). I had no issues migrating the first env: I removed the topics from the TF state and imported them as resources of the new provider. But I'm having issues on my second env:

IMPORT_KAFKA_API_KEY="<<KEY>>" IMPORT_KAFKA_API_SECRET="<<SECRET>>" IMPORT_KAFKA_REST_ENDPOINT="https://pkc-xxxxx.us-east4.gcp.confluent.cloud:443" terragrunt import confluent_kafka_topic.topics[\"platform.presence.classroom\"] lkc-o32gzo/platform.presence.classroom
...
Error: error importing Kafka Topic "lkc-o32gzo/platform.presence.classroom": 401 Unauthorized: Unauthorized

In the env vars I put the key and secret that were set as credentials when the topics were created, and I've also tried the key and secret from the SA with the OrganizationAdmin role. I'm getting a 401 on every try. Any hints?
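Before retrying the import, it can help to verify the Kafka API key directly against the cluster's REST endpoint, outside of Terraform. A minimal sketch (the cluster ID and endpoint are the ones from the command above; the curl call is commented out so nothing is sent without real credentials):

```shell
# Sanity-check a Kafka API key against the Kafka REST API (v3).
# Assumes IMPORT_KAFKA_API_KEY / IMPORT_KAFKA_API_SECRET are already exported.
REST_ENDPOINT="https://pkc-xxxxx.us-east4.gcp.confluent.cloud:443"
CLUSTER_ID="lkc-o32gzo"

# Kafka REST uses HTTP Basic auth: base64("<API key>:<API secret>")
AUTH=$(printf '%s:%s' "$IMPORT_KAFKA_API_KEY" "$IMPORT_KAFKA_API_SECRET" | base64)

echo "GET ${REST_ENDPOINT}/kafka/v3/clusters/${CLUSTER_ID}/topics"
# curl -s -o /dev/null -w '%{http_code}\n' \
#   -H "Authorization: Basic ${AUTH}" \
#   "${REST_ENDPOINT}/kafka/v3/clusters/${CLUSTER_ID}/topics"
```

A 200 here means the key is valid for this cluster; a 401 reproduces the import error outside of Terraform, which narrows the problem down to the credentials.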

linouk23 commented 1 year ago

πŸ‘‹ @kobejn-jb, thanks for creating the issue!

> I had no issue migrating one env, But I'm having issues on my second env:

Is there any difference between the two environments?

Overall, it seems like you're doing the right thing.

> In env vars I was putting key and secret that were set as credentials when creating topics

That's right.

It'd be helpful if you could share a few more details about your TF configuration, but here's a rough list of steps to follow when importing a Kafka Topic:

  1. Double-check the Kafka API Key (`credentials.key`), Kafka API Secret (`credentials.secret`), Kafka REST endpoint, and Kafka cluster ID.

  2. Your TF configuration should look like:

    terraform {
      required_providers {
        confluent = {
          source  = "confluentinc/confluent"
          version = "1.23.0"
        }
      }
    }

    provider "confluent" {
      cloud_api_key    = var.confluent_cloud_api_key
      cloud_api_secret = var.confluent_cloud_api_secret
    }

    ...

    resource "confluent_kafka_topic" "orders" {
      kafka_cluster {
        id = confluent_kafka_cluster.basic.id
      }
      topic_name    = "orders"
      rest_endpoint = confluent_kafka_cluster.basic.rest_endpoint

      credentials {
        key    = confluent_api_key.app-manager-kafka-api-key.id
        secret = confluent_api_key.app-manager-kafka-api-key.secret
      }
    }


  3. Export the Kafka credentials and REST endpoint, then run the import:

    $ export IMPORT_KAFKA_API_KEY=""       # confluent_api_key.app-manager-kafka-api-key.id
    $ export IMPORT_KAFKA_API_SECRET=""    # confluent_api_key.app-manager-kafka-api-key.secret
    $ export IMPORT_KAFKA_REST_ENDPOINT="" # confluent_kafka_cluster.basic.rest_endpoint
    $ terraform import confluent_kafka_topic.my_topic lkc-abc123/orders-123


It might be also helpful to share debug logs:

➜ $ export TF_LOG=trace
➜ $ export TF_LOG_PATH="./logs.txt"



and then look at `logs.txt` for a more descriptive error message, or feel free to send it to `cflt-tf-access@confluent.io` after removing all sensitive data.
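Once `TF_LOG=trace` is set and the import is re-run, the provider's HTTP exchange with the Kafka REST API ends up in the log file. A quick way to pull out the relevant lines (a sketch; the exact log format varies by Terraform and provider version):

```shell
# Extract the failing request/response from a trace-level Terraform log.
LOG_FILE="./logs.txt"
if [ -f "$LOG_FILE" ]; then
  # Requests the provider made to the Kafka REST API
  grep -i "kafka/v3/clusters" "$LOG_FILE" || true
  # The 401 response, with a couple of lines of context
  grep -i -A 2 "401" "$LOG_FILE" || true
fi
```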
linouk23 commented 1 year ago

I just tested it manually and it seems to work for me:

➜ βœ— terraform init                                                  

Initializing the backend...

Initializing provider plugins...
- Finding confluentinc/confluent versions matching "1.23.0"...
- Installing confluentinc/confluent v1.23.0...
- Installed confluentinc/confluent v1.23.0 (signed by a HashiCorp partner, key ID D4A2B1EDB0EC0C8E)

Partner and community providers are signed by their developers.
If you'd like to know more about provider signing, you can read about it here:
https://www.terraform.io/docs/cli/plugins/signing.html

Terraform has created a lock file .terraform.lock.hcl to record the provider
selections it made above. Include this file in your version control repository
so that Terraform can guarantee to make the same selections by default when
you run "terraform init" in the future.

Terraform has been successfully initialized!

You may now begin working with Terraform. Try running "terraform plan" to see
any changes that are required for your infrastructure. All Terraform commands
should now work.

If you ever set or change modules or backend configuration for Terraform,
rerun this command to reinitialize your working directory. If you forget, other
commands will detect it and remind you to do so if necessary.

➜ βœ— export TF_VAR_confluent_cloud_api_key="NAUGWXY7QYJHT3OJ"                                                   
➜ βœ— export TF_VAR_confluent_cloud_api_secret="..."
➜ βœ— export IMPORT_KAFKA_API_KEY="ARWEQEZAGSFJJXV2"
➜ βœ— export IMPORT_KAFKA_API_SECRET="..."
➜ βœ— export IMPORT_KAFKA_REST_ENDPOINT="https://pkc-....us-west-2.aws.confluent.cloud:443"                                                     

➜ βœ— terraform import confluent_kafka_topic.orders lkc-nwom9d/topic_1
confluent_kafka_topic.orders: Importing from ID "lkc-nwom9d/topic_1"...
confluent_kafka_topic.orders: Import prepared!
  Prepared confluent_kafka_topic for import
confluent_kafka_topic.orders: Refreshing state... [id=lkc-nwom9d/topic_1]

Import successful!

The resources that were imported are shown above. These resources are now in
your Terraform state and will henceforth be managed by Terraform.

➜ βœ— terraform plan                                                                 
confluent_kafka_topic.orders: Refreshing state... [id=lkc-nwom9d/topic_1]

No changes. Your infrastructure matches the configuration.

Terraform has compared your real infrastructure against your configuration and found no differences, so no changes are needed.
linouk23 commented 1 year ago

I can reproduce your 401 error when using a wrong Kafka API Key:

➜ βœ— export IMPORT_KAFKA_API_KEY="ARWEQEZAGSFJJXV21" # updating it from ARWEQEZAGSFJJXV2 to ARWEQEZAGSFJJXV21                                              
➜ βœ— terraform state rm confluent_kafka_topic.orders
Removed confluent_kafka_topic.orders
Successfully removed 1 resource instance(s).
➜ βœ— terraform import confluent_kafka_topic.orders lkc-nwom9d/topic_1
confluent_kafka_topic.orders: Importing from ID "lkc-nwom9d/topic_1"...
β•·
β”‚ Error: error importing Kafka Topic "lkc-nwom9d/topic_1": 401 Unauthorized: Unauthorized

so it would be worth double-checking that IMPORT_KAFKA_API_KEY and IMPORT_KAFKA_API_SECRET are accurate. If they are, try creating a new Kafka API Key for that Kafka cluster and using it to import your topic instead.
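If the key is indeed stale, a fresh cluster-scoped key can be minted with the Confluent CLI and wired into the IMPORT_* variables. A sketch with placeholder IDs (the service account must have access to the cluster; the CLI call is commented out since it requires a logged-in session):

```shell
# Placeholder IDs -- substitute your real cluster and service account.
CLUSTER_ID="lkc-abc123"
SERVICE_ACCOUNT_ID="sa-123456"

# Mint a new Kafka API key scoped to the cluster:
# confluent api-key create --resource "$CLUSTER_ID" --service-account "$SERVICE_ACCOUNT_ID"
echo "confluent api-key create --resource ${CLUSTER_ID} --service-account ${SERVICE_ACCOUNT_ID}"

# Then export the new key/secret and retry the import:
# export IMPORT_KAFKA_API_KEY="<new key>"
# export IMPORT_KAFKA_API_SECRET="<new secret>"
# terraform import confluent_kafka_topic.my_topic "${CLUSTER_ID}/orders-123"
```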

Let me know if that helps @kobejn-jb!

kobejn-jb commented 1 year ago

Thank you @linouk23

It turned out that the cluster on the environment that was giving me issues didn't have any API keys (I think there was a key at some point, but the user account associated with it was deleted), and I was using Cloud API keys without being aware that there is a difference.
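For anyone landing here with the same confusion: Cloud API keys authenticate the provider itself at the organization level, while importing a topic needs a Kafka API key scoped to the specific cluster. A sketch of where each one goes (variable names and IDs below are placeholders):

```hcl
provider "confluent" {
  # Cloud API key: organization-level, authenticates the provider
  cloud_api_key    = var.confluent_cloud_api_key
  cloud_api_secret = var.confluent_cloud_api_secret
}

resource "confluent_kafka_topic" "example" {
  kafka_cluster {
    id = "lkc-abc123" # placeholder cluster ID
  }
  topic_name    = "example"
  rest_endpoint = "https://pkc-xxxxx.us-east4.gcp.confluent.cloud:443"

  credentials {
    # Kafka API key: scoped to this cluster -- a Cloud API key here yields 401
    key    = var.kafka_api_key
    secret = var.kafka_api_secret
  }
}
```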