Closed red8888 closed 1 year ago
It does not. I would merge a PR adding this, but are you sure that format can even work for mutation testing output?
@boxed this is a good question. I rewrote my question because I suppose my real question is much broader. I don't want to throw out my static analysis platform in lieu of something else, and I would like to avoid having a whole different system for processing these reports.
I'm pretty ignorant of how this is generally implemented. Do users integrate this with tools like Sonar, or do they treat it as a whole separate thing in addition to Sonar?
Also, FYI, I'm asking Sonar the same thing; I'm cross-linking my questions on their forum and this GitHub issue for reference: https://community.sonarsource.com/t/is-python-mutmut-supported/79256
It's a question of motive, I think. For me, the point of mutation testing is to have a tool to help me make my tests better. In that case a "report" is just a list of mutants to kill, or a todo list. I've heard of another tool selling mutation testing as a way to prove that the test suite of a product is weak in order to negotiate a lower purchase price in a takeover (I find this use case distasteful and dishonest).
Reporting the number of mutants doesn't seem very useful to me. Either a team cares about having a solid test suite, or they don't. Putting the number of mutants on a dashboard doesn't really help either way, and having a detailed report showing exactly which mutants exist seems even less useful.
That being said, the html report is something a lot of people have asked for. I'm not sure anyone actually used it after they got what they asked for though :P
@boxed I'm realizing how ignorant I really am of this type of testing.
So it seems like there's no reasonable way to try to shove mutmut test results into a regular pytest coverage/test results report, right? Wouldn't it be useful to see mutation testing results in the same place as the unit test results, i.e. a widget on the PR page? This is what SonarCloud does.
For build automation, do you just run mutmut and, if the mutation "score" (killed mutants / total mutants) is not 100%, fail the build?
> I'm realizing how ignorant I really am of this type of testing.
I literally wrote mutmut just to understand it myself! You're in good company :P
> So it seems like there's no reasonable way to try to shove mutmut test results into a regular pytest coverage/test results report, right?
You could just translate any surviving mutant on a line as that line having partial coverage. That would certainly work.
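If someone wanted to try that, here's a rough sketch of the idea. Everything here is hypothetical glue: you'd have to build the `{filename: line numbers}` mapping for surviving mutants from mutmut's output yourself (e.g. by parsing `mutmut results` / `mutmut show`), and this just writes those lines into a Cobertura-style XML file as partially covered:

```python
# Hypothetical glue: mark lines that have surviving mutants as "partially
# covered" in a Cobertura-style XML report. The input mapping
# {filename: set of line numbers} must be extracted from mutmut's
# output by you; nothing here is a built-in mutmut feature.
import xml.etree.ElementTree as ET

def write_partial_coverage(surviving_mutant_lines, out_path="mutmut-coverage.xml"):
    coverage = ET.Element("coverage", version="1.0")
    classes = ET.SubElement(
        ET.SubElement(ET.SubElement(coverage, "packages"), "package", name="mutmut"),
        "classes",
    )
    for filename, line_numbers in sorted(surviving_mutant_lines.items()):
        cls = ET.SubElement(classes, "class", name=filename, filename=filename)
        lines = ET.SubElement(cls, "lines")
        for number in sorted(line_numbers):
            # A condition-coverage of 1/2 is the conventional Cobertura
            # way to say a line is only partially covered.
            ET.SubElement(
                lines, "line",
                number=str(number), hits="1", branch="true",
                **{"condition-coverage": "50% (1/2)"},
            )
    ET.ElementTree(coverage).write(out_path, encoding="utf-8", xml_declaration=True)

# e.g. mutants on lines 12 and 40 of my_module.py survived
write_partial_coverage({"my_module.py": {12, 40}})
```

In principle a report like that could then be fed to SonarQube alongside your normal coverage report, though I haven't tried it.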
> For build automation, do you just run mutmut and, if the mutation "score" (killed mutants / total mutants) is not 100%, fail the build?
Oh no, that's crazy unless you're doing something truly extreme like deep space missions or heart pacemakers (but then you probably shouldn't be using Python!!). I run mutmut once in a while and try to kill some mutants that I think should not have survived, or that look especially worrisome. That's it. I don't run mutmut in CI and I don't see the point honestly.
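That said, if a team did want a numeric gate, the score is only a few lines to compute. A minimal sketch, assuming you export results with `mutmut junitxml` and that surviving mutants are counted as failures in that output (worth double-checking against your mutmut version):

```python
# Sketch: compute a mutation "score" from mutmut's JUnit XML output, e.g.
#   mutmut junitxml > mutmut.xml
# Assumption: surviving mutants are counted in the suite's "failures"
# attribute; verify this against the output of your mutmut version.
import sys
import xml.etree.ElementTree as ET

def mutation_score(path):
    root = ET.parse(path).getroot()
    # JUnit XML may have a <testsuites> wrapper or a bare <testsuite> root.
    suite = root if root.tag == "testsuite" else root.find("testsuite")
    total = int(suite.get("tests", "0"))
    survived = int(suite.get("failures", "0"))
    return (total - survived) / total if total else 1.0

score = mutation_score("mutmut.xml")
print(f"mutation score: {score:.1%}")
sys.exit(0 if score >= 0.8 else 1)  # the 0.8 threshold is arbitrary
```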
I'm new to mutation testing, and it's unclear to me how it works with standard code coverage and static analysis tooling.
I work with several different languages, and for all of them I use SonarQube to process, view, and report on code quality, and to reject code that doesn't meet the coverage and quality rules I set.
How does mutmut play into this? If I want to use it, will I need a whole different system for reading mutmut reports, or can I integrate it into SonarQube/SonarCloud?
About formatting for SonarQube:
https://docs.sonarqube.org/latest/analyzing-source-code/test-coverage/python-test-coverage/#analysis-parameter
SonarQube requires the report to be in "Cobertura XML" format. Does mutmut support that?
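For reference, a minimal Cobertura-style report looks roughly like this (the file name and line numbers are just illustrative):

```xml
<?xml version="1.0" ?>
<coverage version="1.0">
  <packages>
    <package name="my_package">
      <classes>
        <class name="my_module.py" filename="my_module.py">
          <lines>
            <line number="12" hits="1"/>
            <line number="13" hits="0"/>
            <line number="14" hits="1" branch="true" condition-coverage="50% (1/2)"/>
          </lines>
        </class>
      </classes>
    </package>
  </packages>
</coverage>
```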
Can I run this through tox or pytest to generate the correctly formatted report?
Thanks!