rvdouderaa opened 7 months ago
Thanks for this feedback, @rvdouderaa!
Earlier iterations of the `terraform test` experiment did have some JUnit XML support, but we found two challenges along the way:
We also got very, very little feedback on that part of the experiment and so we didn't feel confident enough in our guesses at problem 1 to warrant spending all the time working through problem 2.
However, now that you're here with a request specifically for this, we can potentially make some progress on problem 1!
To help with that, I wonder if you'd be willing to create and share a practical example of a JUnit XML file reporting the results of a realistic `terraform test` run you've done where there were some interesting failures to report, and/or where the success case produces something useful for you in the Azure DevOps UI.
That would help by illustrating one possible mapping from Terraform's concepts to the JUnit concepts that is at least useful for Azure DevOps in particular, and hopefully also useful in some other similar test orchestrators that support JUnit XML. The idea here would be to figure out a mapping that is useful in practice with JUnit XML parser implementations that are in real use, as opposed to a mapping which is theoretically plausible but causes the result to be less useful in real existing test reporting software.
Thanks again!
Thanks for the response @apparentlymart
I found the following page, which describes the format: https://github.com/testmoapp/junitxml / https://windyroad.com.au/dl/Open%20Source/JUnit.xsd
There may be no official documentation; however, the Apache Ant implementation seems to be the de facto standard.
As for use cases: we want to add Terraform Test to our CI/CD pipelines, as we are already using tfsec and tflint. Results should be exported in a format readable by Azure DevOps, so test results can be published and the pipeline then failed. This would need a feature to continue when the step fails, so the error can be handled by the next step.
It doesn't really matter on which test it would fail; the tests are there for a reason.
For example:

```hcl
run "input_validation" {
  command = plan

  variables {
    name                = "asdfaslkdjfkasjdflkasjdf!@$@$"
    resource_group_name = "core-services-rg"
  }

  expect_failures = [
    var.name
  ]
}
```
This name should not be accepted by the module (only a-z and 0-9), so the input should fail. If not, something is wrong with the module's validation check and we should not be able to publish the module. And we would like it to be displayed like the tfsec error in the screenshot.
Hope this makes our use case(s) clear.
Addition:
These are the test formats supported by Azure DevOps: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/reference/publish-test-results-v2?view=azure-pipelines&tabs=trx%2Ctrxattachments%2Cyaml
JUnit schema: https://github.com/windyroad/JUnit-Schema/blob/master/JUnit.xsd
Thanks for that extra context, @rvdouderaa!
If you're able to do so, it would help to see an example of exactly what JUnit XML tfsec and tflint are creating for you, since presumably in your case mimicking how those tools use JUnit XML would give a result that fits well with how you're already using it with those other tools.
Of course if you can't share it then we ought to be able to test with those tools ourselves at some point to find out, but seeing an example from your already-running system will make it easier to quickly see if mimicking how those tools use the format is likely to be a viable strategy for `terraform test`.
If you would like to share those files, I'd suggest creating a GitHub Gist with the two (or more) files inside it and linking to it here, just because GitHub comments don't work very well for sharing longer code examples.
Thanks!
@apparentlymart
I created a gist with a tfsec and a tflint XML output:
https://gist.github.com/rvdouderaa/40821f63aa1407279a3e29292f34ce0c
Thanks for sharing those, @rvdouderaa.
It seems that both of these tools have made the decision that the file containing the problem should be treated as the "test class name" in JUnit terms. However, it also seems from your screenshot of Azure DevOps that it doesn't actually pay attention to that at all; I don't see the filenames appearing anywhere in the UI.
It also seems that tfsec elected to use "tfsec" as the name of the entire suite, while tflint did not name the test suite at all.
For naming the test cases themselves, tflint used some sort of identifier, `terraform_typed_variables` -- I guess that's the machine-readable name for one of its lint rules? -- while tfsec seems to have just chosen a human-readable description of the problem itself (duplicating the text from the failure message inside) rather than of what was being tested.
For `terraform test` we have some additional concepts that would need to be mapped onto the JUnit XML model:

- The test scenario (the `.tftest.hcl` file) that was being evaluated.
- The test step (the `run` block) that was being evaluated.
- The checkable objects whose conditions were evaluated, such as those declared in `check` blocks.

Given that, I suppose one possible way to map it would be:
- The root is a `testsuites` element, so that the report can include multiple suites.
- Each test scenario is a `testsuite` element, named after the basename of the `.tftest.hcl` file it came from.
- Each `run` block is a `testcase`, whose name is the label in the `run` block header. `class` would not be present at all, since Terraform doesn't have anything analogous to "classes" in JUnit, and the tool you are using seems to ignore it anyway.
- Additional output for a test step would be placed in a `system-err` element beneath that test step's `testcase`.
- If at least one checkable object fails any of its checks, the `testcase` contains a `failure` element whose body contains some textual representation of all of the failures.

(It isn't clear to me whether multiple `failure` elements are supported; ideally there'd be a separate `failure` for each checkable object that failed so that each one can be presented separately, but I've assumed here that tools would only accept one since I can't see any example of multiple failures in the docs you linked.)
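To make that mapping concrete, here is a minimal sketch in Python (standard library only) of what a report built along those lines might look like. The scenario filename, run-block names, and failure text are all invented for illustration; this is not Terraform's actual output:

```python
import xml.etree.ElementTree as ET

# Hypothetical results from a terraform test run: one scenario file
# containing two run blocks, the second of which failed a check.
results = {
    "input_validation.tftest.hcl": [
        ("input_validation", None),
        ("resource_group_naming", 'check "name_format" failed: value contains invalid characters'),
    ],
}

suites = ET.Element("testsuites")  # root element, allowing multiple suites
for scenario, runs in results.items():
    # One testsuite per .tftest.hcl file, named after its basename.
    suite = ET.SubElement(suites, "testsuite", name=scenario, tests=str(len(runs)))
    for run_name, failure_text in runs:
        # One testcase per run block; no "class" attribute, per the proposal.
        case = ET.SubElement(suite, "testcase", name=run_name)
        if failure_text is not None:
            # A single failure element describing all failed checks.
            ET.SubElement(case, "failure").text = failure_text

print(ET.tostring(suites, encoding="unicode"))
```

Running this prints a single `testsuites` document with one suite and two test cases, only the second of which carries a `failure` child.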
Does that seem plausible to you as a way to populate the JUnit XML format based on a terraform test
outcome?
@apparentlymart you can see the testsuite name in the Azure DevOps screenshot (first line with a cross, `tfsec (1/1)`). That's the only reference. In the screenshot above, there were no tflint findings.
The proposed solution seems plausible. However, I tried this with multiple findings in tflint, and it creates two separate `testcase` entries... I updated the gist with an example.
Looking forward to this functionality being enabled. When can we expect this to be live?
@mefarazahmad There is no commitment. I would recommend following the PR: https://github.com/hashicorp/terraform/pull/34291
Hi all,
Today's alpha release of Terraform CLI includes an early experimental option to generate JUnit XML output.
For those who are interested in seeing this move forward, it would be much appreciated if you could share feedback in the community forum topic about the experiment.
I must forewarn that I'm currently focused primarily on a different project and so this JUnit XML experiment is a bit of a "background task" for me and so I might be slow to respond, but I do intend to collect up all the feedback and act on it later.
Thanks!
(We use the community forum for this sort of early feedback, rather than discussion here in GitHub, because the Discourse forum does at least have some support for tracking which comments are in reply to which other comments, whereas unstructured GitHub issue discussions tend to just become a confusing pile! We'll transition back to discussing in GitHub once it seems clearer what design we're going to move forward with and we become more concerned with the details of getting it implemented "for real".)
Hi again!
After some discussion in the community forum topic I linked earlier, there were some conclusions I wanted to bring back in here to inform subsequent rounds of experiment:
- The way we’re describing the test scenarios (each separate `.tftest.hcl` file) doesn’t seem to match what these tools are expecting: `testsuite` names didn't appear anywhere in the UI of either Azure DevOps or GitLab CI/CD. It seems like we should try moving the test scenario name into the “classname” attribute instead, to see if that makes it visible to these tools.
- Test time durations are effectively mandatory in this format, because tools assume that if they are absent then the test took zero seconds to run, rather than (what we had hoped for) treating it as “test duration unknown”. This one is trickier because the test harness doesn’t currently measure the total duration of each test step and scenario at all, so we’ll need to add that to the test harness itself before we could include that information in the JUnit XML output.
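To illustrate why durations are effectively mandatory, here is a sketch of roughly how a consuming tool might read the `time` attribute (the read-with-default behavior is an assumption about typical parsers, not a specific tool's documented code):

```python
import xml.etree.ElementTree as ET

with_time = ET.fromstring('<testcase name="input_validation" time="1.42"/>')
without_time = ET.fromstring('<testcase name="input_validation"/>')

# Roughly what a consuming tool does when reading durations:
def duration_seconds(testcase):
    # An absent attribute is indistinguishable from a zero-second test.
    return float(testcase.get("time", "0"))

print(duration_seconds(with_time), duration_seconds(without_time))  # 1.42 0.0
```

Since the missing attribute silently becomes 0.0 rather than "unknown", the producer has to emit a measured value to get meaningful timing in the UI.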
Thanks to everyone for sharing their feedback and screenshots!
I'm going to be away from this issue for at least a little while since my attention is needed elsewhere, but hopefully the above will be useful for either future-me or someone else working on a second round of experiment soon.
Some subsequent discussion in the community forum topic led to an additional idea:
It would be nice if the test harness treated test failures as different from normal errors, and then we placed the failure message text in the single-line "failure message" (the `message` attribute of the `failure` element), which today is always set to one of a small set of hard-coded messages reporting the test status.
Much like the feedback about durations, this seems to require some changes to the test runner itself -- to report test failures as a separate signal to errors -- rather than just a change to the JUnit renderer.
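To illustrate the distinction, here is a sketch of a `failure` element that carries a specific one-line summary in its `message` attribute while keeping the full details in the element body. Both strings are invented placeholders (loosely based on the a-z/0-9 validation example earlier in this thread), not Terraform's real diagnostics:

```python
import xml.etree.ElementTree as ET

case = ET.Element("testcase", name="input_validation")
failure = ET.SubElement(
    case,
    "failure",
    # One-line summary that most UIs surface directly; today this would
    # be one of the hard-coded status strings instead.
    message="Invalid value for var.name: only a-z and 0-9 are allowed",
)
# The full, possibly multi-line details still go in the element body.
failure.text = "Error: Invalid value for variable\n(full diagnostic text would go here)"

print(ET.tostring(case, encoding="unicode"))
```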
Hi all,
(Sorry for accidentally closing this before. I accidentally triggered GitHub's automatic closing of this issue due to the wording I used in a pull request. :man_facepalming: )
The latest alpha release includes some updates responding to the two items of feedback I described two comments above this one. There's more information in the community forum topic requesting feedback, and I'd appreciate any efforts to re-test this (or to test it for the first time!) with your chosen JUnit XML-supporting software.
This doesn't include any changes for the third item I described in the comment directly above this one, because that seems to require more invasive changes. Although I hope to do it eventually, my focus for the moment is on figuring out the best way to map Terraform's testing model onto the JUnit XML format, and it already seems clear where the improved failure messages would be placed (in the `message` attribute of the `failure` element), so there's no strong need to immediately experiment further with that part.
Thanks!
Terraform Version
Use Cases
Tools like tfsec and tflint can output their test results as JUnit XML, which can then e.g. be published and shown in the Tests tab of Azure DevOps pipeline runs.
Attempted Solutions
n/a, it's not in the documentation
Proposal
Add the functionality to export test results (or at least on test failure) as JUnit XML files, so this can be used in e.g. Azure DevOps to show results in the Tests tab, like tfsec / tflint.
References
No response