hashicorp / terraform

Terraform enables you to safely and predictably create, change, and improve infrastructure. It is a source-available tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned.
https://www.terraform.io/

Module depends_on not followed on terraform destroy #29090

Closed: rmgpinto closed this issue 3 years ago

rmgpinto commented 3 years ago

Terraform Version

1.0.0

Terraform Configuration Files

module "kops" {
  source = "../../modules/kops"
}

module "aws-load-balancer-controller" {
  source                  = "../../modules/ingress/aws-load-balancer-controller"
  dns_domain              = local.dns_domain
  kubernetes_cluster_name = local.kubernetes_cluster_name
  depends_on = [
    module.kops
  ]
}

module "doorman" {
  source                  = "../../modules/doorman"
  aws_region              = var.aws_region
  dns_domain              = local.dns_domain
  kubernetes_cluster_name = local.kubernetes_cluster_name
  ssl_certificate         = module.dns.ssl_certificate
  depends_on = [
    module.kops,
    module.aws-load-balancer-controller
  ]
}

Expected Behavior

We have a kops cluster with multiple apps running on top of it. In particular, we have an aws-load-balancer-controller module that automatically creates ALB load balancers and target groups. When I run terraform destroy -target module.kops, both modules are destroyed simultaneously and Terraform does not honor depends_on in the destroy order. I expected the doorman module to be completely destroyed before the destroy of aws-load-balancer-controller started. This behavior leaves dangling AWS load balancers and target groups, because the aws-load-balancer-controller module is destroyed before it can properly delete the resources it created.

Actual Behavior

Terraform destroys both modules in parallel, and we have to delete the leftover resources in AWS manually.

Steps to Reproduce

terraform destroy -target module.kops

jbardin commented 3 years ago

Hi @rmgpinto,

Thanks for filing the issue. Modules in Terraform are not isolated in the way you are describing here; they are logical groupings of configuration, and Terraform is still free to perform resource actions concurrently when possible. It is always preferable to pass the desired data between modules and let Terraform order all resources individually, rather than using depends_on to force the ordering of modules as a whole.
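For illustration, a minimal sketch of that data-passing approach (the cluster_name output name is an assumption for the example, not taken from the modules above):

# modules/kops/outputs.tf (hypothetical output)
output "cluster_name" {
  value = local.kubernetes_cluster_name
}

# Root module: consume the value instead of depending on the whole module.
module "kops" {
  source = "../../modules/kops"
}

module "aws-load-balancer-controller" {
  source     = "../../modules/ingress/aws-load-balancer-controller"
  dns_domain = local.dns_domain

  # Referencing module.kops.cluster_name creates a dependency only on the
  # resources that produce that value, letting Terraform order individual
  # resources on both create and destroy.
  kubernetes_cluster_name = module.kops.cluster_name
}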

When managed resources are destroyed, the ordering is defined by the dependencies they had when they were created or last updated. This means the depends_on references here should add the desired dependencies for the correct destroy ordering. Since the ability to destroy resources should not be affected by the use of modules or depends_on, this may be failing in some other way.

I would first verify that the configuration can be reliably destroyed without using the -target flag. If that works, then perhaps some transitive dependency is being dropped by the use of targeting.

If the configuration cannot be reliably destroyed in any case, then it is more likely a misconfiguration within the modules themselves; my first guess would be that some resources require create_before_destroy and are missing it in their configuration.
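For reference, a hypothetical example of that lifecycle setting; the resource shown is illustrative only and not taken from the modules above:

resource "aws_lb_target_group" "example" {
  name_prefix = "app-"
  port        = 443
  protocol    = "HTTPS"
  vpc_id      = var.vpc_id

  lifecycle {
    # Create the replacement object before destroying the old one, so
    # dependent resources are never left referencing a deleted target group.
    create_before_destroy = true
  }
}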

In either case, it would be helpful to have a minimal reproducible example to work with here, so that we can see how the configuration is structured, and what happens when the destroy operation fails.

Thanks!

jbardin commented 3 years ago

Since we have not heard back in a while, I'm going to close the issue. If you have any updates about the problem, feel free to open a new issue with the requested information. If you have more questions, you can also use the community forum, where there are more people ready to help.

Thanks!

github-actions[bot] commented 3 years ago

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.