
terraform test expected_failure not recognizing missing data resource as failure #35949

Open mgrubb opened 4 weeks ago

mgrubb commented 4 weeks ago

Terraform Version

Actual code:
Terraform v1.9.4
on darwin_arm64
+ provider registry.terraform.io/hashicorp/aws v5.63.0

Minimal Example:
Terraform v1.9.4
on darwin_arm64
+ provider registry.terraform.io/hashicorp/aws v5.74.0

Terraform Configuration Files

main.tf:

variable "batch_exec_role_name" {
  type = string
 }

data "aws_iam_role" "batch_exec_role" {
    name = var.batch_exec_role_name
}
# ...

main.tftest.hcl:

variables {
  some_other_var = "some_val"
}

run "invalid_role_name" {
  variables {
    batch_exec_role_name = "does-not-exist"
  }
  expect_failures = [
    data.aws_iam_role.batch_exec_role,
  ]
  command = plan
}

Debug Output

https://gist.github.com/mgrubb/ba584b36173af380a9c0c8ea191e7beb

Expected Behavior

The test should pass: the missing data resource causes an error, which the run block lists as an expected failure.

Actual Behavior

The test fails because the error caused by the missing data resource is not recognized as an expected failure.

Steps to Reproduce

  1. terraform init
  2. terraform test

Additional Context

Using expect_failures = [var.some_var] to detect variables that fail their validation rules works as expected. I also tried adding a postcondition to the data resource with the condition self.name == var.batch_exec_role_name, but that didn't seem to have any effect.
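
For reference, that postcondition looked roughly like the following (a sketch reconstructed from the description above; the error_message wording is illustrative):

data "aws_iam_role" "batch_exec_role" {
  name = var.batch_exec_role_name

  lifecycle {
    postcondition {
      condition     = self.name == var.batch_exec_role_name
      # The message text is illustrative.
      error_message = "The requested IAM batch execution role was not found."
    }
  }
}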

References

No response

liamcervante commented 4 weeks ago

Hi @mgrubb, the expect_failures attribute only validates against Custom Conditions within the configuration. Custom conditions are the part of the configuration that authors have control over, and are therefore something they can write tests against. The behaviour you've described is about testing Terraform itself rather than the configuration you've written: you can trust that Terraform will error if the provider for a given data source returns an error when it is retrieved. That's not something you should have to test.
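
As an illustration (a sketch, not taken from this issue's configuration; the validation rule itself is made up), this is the kind of author-written custom condition that expect_failures is designed to catch:

# In main.tf: a validation block is a custom condition the author controls.
variable "batch_exec_role_name" {
  type = string

  validation {
    condition     = length(var.batch_exec_role_name) > 0
    error_message = "The batch_exec_role_name value must not be empty."
  }
}

# In main.tftest.hcl: the run passes because the expected validation failure occurs.
run "empty_role_name" {
  variables {
    batch_exec_role_name = ""
  }
  expect_failures = [
    var.batch_exec_role_name,
  ]
  command = plan
}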

I've updated this to an enhancement request, but I think it's unlikely we'll make this change. This is because, as mentioned, the test command is to enable authors to test the configuration they've written rather than validating the behaviour of Terraform internals. There's a similar feature request in #34871 which effectively has the same response as here, configuration authors should be able to trust that Terraform will behave as expected and we have other tests for that.