Closed: linouk23 closed this issue 2 years ago
We're very excited to let you know we've just published a new version of the TF Provider that includes the user data source, among other very exciting improvements: it enables fully automated provisioning of our key Kafka workflows (see the demo) with no more manual intervention, which makes it our biggest and most impactful release.
The only gotcha is that we've renamed it from `confluentinc/confluentcloud` to `confluentinc/confluent`, but we published a migration guide, so the switch should be fairly straightforward. The existing `confluentinc/confluentcloud` provider will be deprecated soon, so we'd recommend switching as soon as possible.
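For reference, here's a rough sketch of what the provider source change looks like in your `required_providers` block (the version is illustrative; please follow the migration guide for the full steps, since resource and data source names were renamed as well):

```hcl
terraform {
  required_providers {
    confluent = {
      # before the rename this was: source = "confluentinc/confluentcloud"
      source  = "confluentinc/confluent"
      version = "0.7.0" # illustrative; pin to the latest release
    }
  }
}
```

After updating the source, existing state can generally be moved over with `terraform state replace-provider confluentinc/confluentcloud confluentinc/confluent`, but again, the migration guide is the source of truth for the exact commands and resource renames.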
The new `confluentinc/confluent` provider also includes a lot of sample configurations, so you won't need to write them from scratch. You can find them here, and a full list of changes here.
Hi @linouk23, I noticed you added API Keys to the official provider. Is this for Cloud, Schema Registry, and cluster API keys? If so, does the cluster API key work on Dedicated Clusters on Private Link?
@adronamraju you might take a look at the `confluent_api_key` resource's docs for all the details, but to answer your specific question: as of now (version `0.7.0`), the `confluent_api_key` resource supports both Cloud and Kafka API Keys (Schema Registry clusters / keys are not supported at the moment).
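To give a sense of the shape, a Kafka API Key for a cluster looks roughly like the sketch below; the resource names `app-manager`, `dedicated`, and `staging` are placeholders, so check the `confluent_api_key` resource docs for the exact schema:

```hcl
resource "confluent_api_key" "app-manager-kafka-api-key" {
  display_name = "app-manager-kafka-api-key"
  description  = "Kafka API Key owned by the 'app-manager' service account"

  # The service account that owns the key (placeholder reference)
  owner {
    id          = confluent_service_account.app-manager.id
    api_version = confluent_service_account.app-manager.api_version
    kind        = confluent_service_account.app-manager.kind
  }

  # The Kafka cluster the key is scoped to (placeholder reference)
  managed_resource {
    id          = confluent_kafka_cluster.dedicated.id
    api_version = confluent_kafka_cluster.dedicated.api_version
    kind        = confluent_kafka_cluster.dedicated.kind

    environment {
      id = confluent_environment.staging.id
    }
  }
}
```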
> If so, does the cluster API key work on Dedicated Clusters on Private Link?
They do! Moreover, you could use one of the prewritten, fully automated end-to-end configurations to simplify your deployment process (so you only need to fill in `terraform.tfvars` without writing any `.tf` configuration files from scratch):
You might also find more instructions on running it (as well as a cool demo) in our updated Sample Project guide.
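For illustration, the `terraform.tfvars` for one of those configurations might look something like this (the variable names below are assumptions; use the ones declared in the configuration you pick):

```hcl
# terraform.tfvars (illustrative values only)
confluent_cloud_api_key    = "<CLOUD_API_KEY>"
confluent_cloud_api_secret = "<CLOUD_API_SECRET>"
```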
Feel free to ping me here or create new issues if you run into any problems.
What
Add `confluentcloud_user` data source (see User on the Confluent Cloud API docs).
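For example, usage might look something like this (the exact argument names, e.g. `email`, are assumptions to be settled in the implementation):

```hcl
data "confluentcloud_user" "example" {
  # Look up a user by email (argument name is an assumption for this proposal)
  email = "user@example.com"
}

output "user_id" {
  value = data.confluentcloud_user.example.id
}
```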
How
We could start by looking at @dwimsey's amazing PR.