confluentinc / terraform-provider-confluentcloud

Confluent Cloud Terraform Provider is deprecated in favor of Confluent Terraform Provider
https://registry.terraform.io/providers/confluentinc/confluentcloud/latest/docs

Unable to import cluster from confluent cloud #58

Closed jmborsani closed 2 years ago

jmborsani commented 2 years ago

Terraform version

Terraform v0.14.8
+ provider registry.terraform.io/confluentinc/confluentcloud v0.5.0
+ provider registry.terraform.io/hashicorp/aws v3.75.0
+ provider registry.terraform.io/hashicorp/vault v3.2.1

Configuration

The API key was created following this documentation: https://registry.terraform.io/providers/confluentinc/confluentcloud/latest/docs/guides/sample-project#get-a-confluent-cloud-api-key

resource "confluentcloud_kafka_cluster" "basic" {
  count = var.type == "basic" ? 1 : 0

  display_name = var.cluster_name
  availability = var.availability
  cloud        = var.cloud
  region       = var.region

  basic {}

  environment {
    id = var.environment_id
  }
}

Command

terraform import "confluentcloud_kafka_cluster.basic[0]" "<ENV_ID>/<CLUSTER_ID>"

The environment_id looks like env-1234 and cluster_id looks like lkc-1234.

Output

confluentcloud_kafka_cluster.basic[0]: Importing from ID "<ENV_ID>/<CLUSTER_ID>"...
confluentcloud_kafka_cluster.basic[0]: Import prepared!

Import successful!

The resources that were imported are shown above. These resources are now in
your Terraform state and will henceforth be managed by Terraform.

Trace

2022-03-22T16:31:31.793-0300 [INFO]  plugin.terraform-provider-confluentcloud_0.5.0: 2022/03/22 16:31:31 [ERROR] Kafka cluster get failed for id <CLUSTER_ID>, &{403 Forbidden 403 HTTP/1.1 1 1 map[Access-Control-Allow-Credentials:[true] Access-Control-Allow-Headers:[Authorization,Accept,Origin,DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Content-Range,Range] Access-Control-Allow-Methods:[GET,POST,OPTIONS,PUT,DELETE,PATCH] Connection:[keep-alive] Content-Length:[193] Content-Type:[application/json] Date:[Tue, 22 Mar 2022 19:31:26 GMT] Server:[nginx] Strict-Transport-Security:[max-age=31536000; includeSubDomains; preload] X-Content-Type-Options:[nosniff] X-Frame-Options:[deny] X-Request-Id:[c1f310758e972c4b59130f19618f5ae0] X-Xss-Protection:[1; mode=block]] {{
  "errors": [
    {
      "id": "c1f310758e972c4b59130f19618f5ae0",
      "status": "403",
      "code": "forbidden_access",
      "detail": "Forbidden Access",
      "source": {}
    }
  ]
}} 193 [] false false map[] 0xc0002b0800 0xc000245550}, 403 Forbidden: timestamp=2022-03-22T16:31:31.793-0300
2022-03-22T16:31:31.793-0300 [INFO]  plugin.terraform-provider-confluentcloud_0.5.0: 2022/03/22 16:31:31 [WARN] Kafka cluster with id=<CLUSTER_ID> is not found: timestamp=2022-03-22T16:31:31.793-0300
confluentcloud_kafka_cluster.basic[0]: Import prepared!
2022-03-22T16:31:31.795-0300 [WARN]  plugin.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = transport is closing"
2022-03-22T16:31:31.799-0300 [DEBUG] plugin: plugin process exited: path=.terraform/providers/registry.terraform.io/confluentinc/confluentcloud/0.5.0/linux_amd64/terraform-provider-confluentcloud_0.5.0 pid=342318
2022-03-22T16:31:31.799-0300 [DEBUG] plugin: plugin exited
2022/03/22 16:31:31 [INFO] Writing state output to: 

Import successful!

The resources that were imported are shown above. These resources are now in
your Terraform state and will henceforth be managed by Terraform.
linouk23 commented 2 years ago

Thanks for opening an issue @jmborsani!

That's definitely a bug (the CLI displays Import successful! even though the API actually returned a 403) that we will address in our next release.

As for why you got a 403: you need to set the following environment variables before importing the resource:

$ export CONFLUENT_CLOUD_API_KEY="<cloud_api_key>"
$ export CONFLUENT_CLOUD_API_SECRET="<cloud_api_secret>"
# and then run
$ terraform import "confluentcloud_kafka_cluster.basic" "<ENV_ID>/<CLUSTER_ID>"

Let me know if that helps.
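
As an alternative to exporting environment variables, the same Cloud API credentials can typically be supplied in the provider block itself. This is a sketch, assuming the provider accepts `api_key` and `api_secret` arguments (check the provider's own documentation for the exact attribute names in your version):

```hcl
# Hypothetical sketch: pass Cloud API credentials via the provider block
# instead of CONFLUENT_CLOUD_API_KEY / CONFLUENT_CLOUD_API_SECRET env vars.
variable "confluent_cloud_api_key" {
  type      = string
  sensitive = true
}

variable "confluent_cloud_api_secret" {
  type      = string
  sensitive = true
}

provider "confluentcloud" {
  api_key    = var.confluent_cloud_api_key
  api_secret = var.confluent_cloud_api_secret
}
```

Environment variables are generally the safer option for `terraform import`, since they keep secrets out of the configuration files entirely.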

jmborsani commented 2 years ago

That worked. Thank you! @linouk23

linouk23 commented 2 years ago

@jmborsani we're very excited to let you know we've just published a new version of the TF Provider that includes a fix for this issue, among other improvements: it enables fully automated provisioning of our key Kafka workflows (see the demo) with no manual intervention, making this our biggest and most impactful release yet.

The only gotcha is that we've renamed the provider from confluentinc/confluentcloud to confluentinc/confluent, but we've published a migration guide, so the switch should be fairly straightforward. The existing confluentinc/confluentcloud provider will be deprecated soon, so we recommend switching as soon as possible.

The new confluentinc/confluent provider also includes a number of sample configurations, so you won't need to write them from scratch. You can find them here, and a full list of changes here.
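
For reference, the rename means updating the `required_providers` block in your configuration. A minimal sketch (the version constraint shown is illustrative, not a recommendation; consult the migration guide for the exact steps, including any `terraform state replace-provider` commands it prescribes):

```hcl
# Before: the deprecated source address
# terraform {
#   required_providers {
#     confluentcloud = {
#       source = "confluentinc/confluentcloud"
#     }
#   }
# }

# After: the renamed provider (version constraint is a placeholder)
terraform {
  required_providers {
    confluent = {
      source  = "confluentinc/confluent"
      version = ">= 0.1.0"
    }
  }
}
```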