rymnc opened this issue 3 years ago
On further investigation of the state, it looks like instead of using one source with multiple secrets, the provider is placing each secret in a different source. I assume this is why it wants to update the deployment every time. Example:
{
  "projected": [
    {
      "default_mode": "0644",
      "sources": [
        {
          "config_map": [],
          "downward_api": [],
          "secret": [
            {
              "items": [],
              "name": "service-secrets",
              "optional": false
            }
          ],
          "service_account_token": []
        },
        {
          "config_map": [],
          "downward_api": [],
          "secret": [
            {
              "items": [],
              "name": "zzz-secrets",
              "optional": false
            }
          ],
          "service_account_token": []
        }
      ]
    }
  ]
}
I've modified the volume config to look like this:
dynamic "volume" {
for_each = local.secret_length_gt_0 && var.inject_db_url ? [1] : []
content {
name = "${var.service_name}-secrets"
projected {
sources {
secret {
name = "${var.service_name}-secrets"
optional = false
}
}
sources {
secret {
name = "zzz-secrets"
optional = false
}
}
}
}
}
}
It seems to work fine. There should probably be a check in the provider to disallow multiple secrets in the same sources block.
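If the set of secrets varies per service, the same one-secret-per-source shape can be generated with a nested dynamic block inside projected. This is only a minimal sketch under that assumption; local.secret_names is a hypothetical list of secret names, not part of the original configuration:

# Sketch: emit one "sources" block per secret, so each secret lands in
# its own source entry, matching how the provider stores them in state.
# local.secret_names is hypothetical, e.g. ["service-secrets", "zzz-secrets"].
dynamic "sources" {
  for_each = local.secret_names
  content {
    secret {
      name     = sources.value
      optional = false
    }
  }
}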
We still see the same behavior in a job resource with a projected config_map and secret in the same volume. Terraform v1.2.5, provider registry.terraform.io/hashicorp/kubernetes v2.12.1.
Notably, the perpetual diff is not resolved by updating in place to two separate sources blocks (one per secret/config_map) and adding the optional = false parameter (which the documentation says is optional). Terraform still tries to reorder the sources blocks in every plan, but doesn't seem to persist its preferred order to the state on apply. Perhaps the comparison is ordered during plan and unordered during apply?
Only tainting and recreating the resource in the exact shape and order shown in the old plans has fixed it.
Example snippet with perpetual diff (part of a kubernetes_job_v1 spec.template.spec.volume):
volume {
  name = "v"
  projected {
    sources {
      config_map {
        name = "c"
      }
      secret {
        name = "s"
      }
    }
  }
}
Fixed example, after taint/recreation:
volume {
  name = "v"
  projected {
    sources {
      config_map {
        name     = "c"
        optional = false
      }
    }
    sources {
      secret {
        name     = "s"
        optional = false
      }
    }
  }
}
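For reference, a resource can be forced to be recreated with terraform taint, or on Terraform v0.15.2 and later with the -replace planning option. The resource address below is a hypothetical example; substitute your own:

# Hypothetical resource address; adjust to match your configuration.
terraform taint kubernetes_job_v1.example
terraform apply

# Equivalent single step on Terraform v0.15.2+:
terraform apply -replace='kubernetes_job_v1.example'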
Marking this issue as stale due to inactivity. If this issue receives no comments in the next 30 days it will automatically be closed. If this issue was automatically closed and you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. This helps our maintainers find and focus on the active issues. Maintainers may also remove the stale label at their discretion. Thank you!
Not stale
We just ran into this issue as well. Without any change to the code, Terraform will remove the source and recreate it on every apply. IMHO this is a definite bug in the provider. Without any change to the code or the deployed resource, there should not be any diff in Terraform.
Terraform Version, Provider Version and Kubernetes Version
Affected Resource(s)
Terraform Configuration Files
Assume the following inputs -
Steps to Reproduce
1. terraform apply --> creates the deployment with the volume
2. terraform apply --> the volume must be replaced
Expected Behavior
It should apply the second time without any changes
Actual Behavior
The second secret in volume.projected.sources is always replaced. Example -
The state for the service above ->
Is it intended to store them in different sources?
Important Factoids
The volume setup is made so that the zzz secret is mounted as a projected volume together with the service's existing secrets (if any), but only if the service requires it. Only one of these volumes must be mounted (a sketch of the condition follows below).
The issue only arises with the projected volume.
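A minimal sketch of how the condition driving the dynamic "volume" block shown earlier might be defined; var.service_secrets and local.mount_projected are hypothetical names introduced here for illustration:

# Hypothetical reconstruction of the inputs behind the for_each
# condition in the dynamic "volume" block.
locals {
  # true when the service defines at least one secret of its own
  secret_length_gt_0 = length(var.service_secrets) > 0

  # mount the projected volume only when there are secrets to project
  # and the database URL must be injected
  mount_projected = local.secret_length_gt_0 && var.inject_db_url
}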