hashicorp / terraform

Terraform enables you to safely and predictably create, change, and improve infrastructure. It is a source-available tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned.
https://www.terraform.io

Allow expect_failures to target child module input variables in Terraform test run blocks #34951

Open trajan3 opened 7 months ago

trajan3 commented 7 months ago

Terraform Version

1.7.4

Use Cases

I'm looking to add automated test cases to ensure that our variable validation blocks are working as expected, since some of the involved conditional logic is non-trivial. (So, for avoidance of doubt, I'm looking at input variables only here, rather than resources or other checkable objects - see #34700 for that.)

Whilst most of our validations are at the root module level (as many as possible), a number have been delegated to child modules too. This is primarily due to the limitation Terraform still has in cross-referencing input variables within validation blocks (see #25609), but it also lets us rely on derived variables without defining them multiple times.
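To illustrate the delegated-validation pattern described above, here's a minimal sketch of a child module input variable with a validation block (the module path, variable name, and rule are hypothetical, not from our actual configuration):

```hcl
# modules/s3/variables.tf (hypothetical example)
variable "bucket_name" {
  type = string

  validation {
    # Self-referencing the variable is the only kind of reference
    # validation conditions currently support (see #25609).
    condition     = can(regex("^[a-z0-9.-]{3,63}$", var.bucket_name))
    error_message = "Bucket name must be 3-63 characters of lowercase letters, digits, dots, or hyphens."
  }
}
```

When the root module passes an invalid value through to this child module, the error message surfaces from the child module's validation block rather than from the root.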

This balance has worked pretty well until now, and we've been comfortable just getting the validation error messages regardless of where they appear in the hierarchy. However, in looking to add Terraform tests for the nested validation blocks, I'm not able (as far as I can tell - see here) to target the input variables in child modules, and hence have no way of catching the error messages that are produced there. Omitting the expect_failures attribute produces the underlying error message from the validation block, but fails the test (which I want to be considered a pass).
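To make the gap concrete, here's a sketch of a test file showing what works today versus what doesn't (variable and value names are made up for illustration):

```hcl
# tests/validation.tftest.hcl (hypothetical example)

# This works today: a validation failure on a ROOT module input
# variable can be asserted with expect_failures.
run "invalid_root_input" {
  command = plan

  variables {
    environment = "not-a-real-env"
  }

  expect_failures = [
    var.environment,
  ]
}

# This is the gap: there is no address that targets an input variable
# of a child module, so a validation failure raised inside the child
# module fails the test run instead of being treated as expected.
```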

I understand various points that have been made elsewhere around encapsulation best practices, and I also appreciate that the delegated validation blocks could be tested at the child module level. That being said, treating our child modules as black boxes for a moment, I don't really need to define tests at that level as I can read their outputs just fine from the root module level for other testing. Furthermore, if we can already read the outputs from child modules it makes a certain amount of sense to support reading their inputs as well - this doesn't mean that I care about the way in which the child module resources are defined, and so the module is still fairly encapsulated.

Attempted Solutions

No obvious solutions; I'm not able to target the checkable objects in question.

Proposal

Add the capability to access module input variables from within the expect_failures attribute of Terraform test run blocks - perhaps something like module.<module_name>.var.<input_name>, e.g. module.s3.var.bucket_name. This provides a consistent approach to modules from the outside, in the sense that one can access the module inputs as well as the outputs (as can already be done).
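Using the module.s3.var.bucket_name example above, the proposed syntax might look like this (not valid in current Terraform - this is the requested capability, not existing behaviour):

```hcl
# Proposed (hypothetical) syntax - does not work in Terraform today.
run "invalid_child_input" {
  command = plan

  variables {
    bucket_name = "Invalid_Bucket_Name"
  }

  expect_failures = [
    # Address a child module's input variable, mirroring how
    # module.s3.<output_name> can already be read for outputs.
    module.s3.var.bucket_name,
  ]
}
```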

Failing this, modifying the attribute to expect errors without an object reference would be acceptable, since the error message already gets produced during the test. I'm not sure if this is preferable, however, as unexpected errors could produce false passes (unless we can identify errors based on the error message itself).

References

No response

crw commented 7 months ago

Thanks for this feature request! If you are viewing this issue and would like to indicate your interest, please use the 👍 reaction on the issue description to upvote this issue. We also welcome additional use case descriptions. Thanks again!

trajan3 commented 6 months ago

Hi, just wanted to check if there were any thoughts on this yet? Specifically, I'd like to understand if there are any suggestions on how best to work around this until a good mechanism is available. I understand that testing at the child module level is possible, but this would require crafting more complex variable blocks (including derived variables) and wouldn't necessarily be a good end-to-end check of the workflow.

Also, whilst allowing cross-variable referencing would let me shift our validations to the root module level, where they could then be tested, the validations themselves would become more complex owing to the need to recreate the derived variables they rely upon. So I'm not sure that's necessarily the golden ticket out of my particular predicament.

Any thoughts appreciated! I'm sure this issue must have been encountered by others; it doesn't seem especially niche.

crw commented 6 months ago

Hi @trajan3, I would suggest asking this in the community forum where there are more people ready to help. The GitHub issues here are monitored only by a few core maintainers. Thanks!