Mongey / terraform-provider-kafka

Terraform provider for managing Apache Kafka Topics + ACLs
MIT License

client_cert and client_key loaded as files instead of strings #146

Closed jakubmiarka closed 4 years ago

jakubmiarka commented 4 years ago

The following config works for me just fine:

provider "kafka" {
  bootstrap_servers = module.msk.bootstrap_brokers_tls
  client_cert       = "-----BEGIN CERTIFICATE-----\nMIIE <...> N7W\nH4=\n-----END CERTIFICATE-----"
  client_key        = "-----BEGIN PRIVATE KEY-----\nMIIE <...> XP7TqxA==\n-----END PRIVATE KEY-----"
  tls_enabled       = true
}

However, when I want to use values for the cert and key from somewhere else, such as:

data "aws_ssm_parameter" "kafka_client_cert" {
  name = "/atlantis/${var.name}/kafka_client_cert"
}

data "aws_ssm_parameter" "kafka_client_key" {
  name = "/atlantis/${var.name}/kafka_client_key"
}

provider "kafka" {
  bootstrap_servers = module.msk.bootstrap_brokers_tls
  client_cert       = data.aws_ssm_parameter.kafka_client_cert.value
  client_key        = data.aws_ssm_parameter.kafka_client_key.value
  tls_enabled       = true
}

apply errors out with:

terraform apply
module.acls.kafka_acl.example-topic-acl: Creating...

Error: open -----BEGIN PRIVATE KEY-----\nMIIE <...> TqxA==\n-----END PRIVATE KEY-----: no such file or directory

  on .module/acls/example-topic.tf line 5, in resource "kafka_acl" "example-topic-acl":
   5: resource kafka_acl example-topic-acl {

Running provider version 0.2.10 and Terraform 0.12. My set-up is nearly identical to https://github.com/Mongey/terraform-provider-kafka/issues/123#issuecomment-640918718, so I'm a bit puzzled.

Mongey commented 4 years ago

🤔 This is very strange. Can you share a minimal Terraform reproduction using AWS, so I can see if I can repro?

My instinct is to just remove file-based loading; it has been deprecated for a long time now.
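With the provider's path-based loading gone, users who keep certificates on disk could still read them in Terraform itself via the built-in file() function; a sketch (the filenames are hypothetical):

```hcl
provider "kafka" {
  bootstrap_servers = ["localhost:9094"]

  # Load the PEM contents in Terraform itself rather than relying on
  # the provider's deprecated path-based loading.
  client_cert = file("${path.module}/client.crt")
  client_key  = file("${path.module}/client.key")
  tls_enabled = true
}
```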

jakubmiarka commented 4 years ago

I did some more digging.

This works:

variable "kafka_client_cert" {
  default = "-----BEGIN CERTIFICATE-----\nMIIE <...> niH4=\n-----END CERTIFICATE-----"
}

variable "kafka_client_key" {
  default = "-----BEGIN PRIVATE KEY-----\nMIIE <...> TqxA==\n-----END PRIVATE KEY-----\n"
}

provider "kafka" {
  bootstrap_servers = module.msk.bootstrap_brokers_tls
  client_cert       = var.kafka_client_cert
  client_key        = var.kafka_client_key
  tls_enabled       = true
}

However, even if I pass the SSM value through locals, the provider still treats it as a file path and tries to open it. I also tried wrapping it in quotes, e.g. client_cert = "\"${data.aws_ssm_parameter.kafka_client_cert.value}\"", and using tostring(), but it still errors with Error: open "-----BEGIN CERTI ...

It errors on the plan, so you should be able to reproduce quite easily with:

data "aws_ssm_parameter" "kafka_client_cert" {
  name = "/my-path/kafka_client_cert"
}

provider "kafka" {
  bootstrap_servers = ["localhost:9094"]
  client_cert       = data.aws_ssm_parameter.kafka_client_cert.value
  client_key        = "not-relevant-for-plan"
  tls_enabled       = true
}

Removing file-based loading would probably fix it. Strings are easier to handle (and preferred), especially when secrets are involved.

jakubmiarka commented 4 years ago

I think I've cracked it.

It's the way newlines (\n) are handled depending on the source. When the certificate is written directly in Terraform as a single-line string (e.g. "-----BEGIN CERTIFICATE-----\nMIIE ...), the \n escapes are interpreted as real newlines. When the value is retrieved from SSM, the backslash-n sequences come back as two literal characters instead, so the provider apparently doesn't recognize the value as a PEM block and falls back to treating it as a file path. So a single-line value in Terraform is fine, but when storing it in SSM it must be in standard x509 format with real newlines, e.g.

-----BEGIN CERTIFICATE-----
MIIE ...
....
-----END CERTIFICATE-----
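If re-storing the parameter with real newlines isn't convenient, the literal \n sequences coming back from SSM could also be converted in Terraform with the built-in replace() function; an untested sketch based on the config above:

```hcl
provider "kafka" {
  bootstrap_servers = module.msk.bootstrap_brokers_tls

  # Convert the literal backslash-n pairs stored in SSM into real
  # newlines before handing the PEM data to the provider.
  client_cert = replace(data.aws_ssm_parameter.kafka_client_cert.value, "\\n", "\n")
  client_key  = replace(data.aws_ssm_parameter.kafka_client_key.value, "\\n", "\n")
  tls_enabled = true
}
```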