jycamus90 opened this issue 8 years ago
Hi J.Y., can you specify what the input will be: a list of test classes or a list of test methods?
I am thinking of having users specify the test case(s) in the config file so that Tacoco can ignore them during execution. For example, we could add another option like -Dtacoco.ignore and pass a value consisting of the fully qualified class name plus the test case name (e.g. org.spideruci.tacoco.TacocoAnalyzer.testCase - this test doesn't exist; I'm just illustrating the format, i.e. test.class.path.testcasemethodname).
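For concreteness, here is a minimal sketch of how such an option could be read on the Tacoco side. The property name tacoco.ignore is only the proposal above, the comma-separated handling is an assumption, and none of this exists in the code yet:

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical sketch: reads the proposed -Dtacoco.ignore system property.
// Neither the property name nor this class exists in Tacoco today; the
// comma-separated format is just one reasonable choice for multiple entries.
public final class IgnoreList {

  public static Set<String> fromSystemProperty() {
    String raw = System.getProperty("tacoco.ignore", "");
    Set<String> ignored = new LinkedHashSet<>();
    for (String entry : raw.split(",")) {
      String name = entry.trim();
      if (!name.isEmpty()) {
        // e.g. "org.spideruci.tacoco.TacocoAnalyzer.testCase"
        ignored.add(name);
      }
    }
    return ignored;
  }
}
```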
@jycamus90 - when you say test case, do you mean a test class or a test method?
Test method. Sorry for the confusion.
@jycamus90 - it might be tricky to actually block the execution of a specific test method, because the Analyzer will run on the entire test class, not on individual test methods. My point is that ignoring test classes will be easier, and totally doable. But I am not sure exactly how you would block the execution of a single test method. Any ideas?
I thought we could maintain a blacklist like in Blinky, and when JUnitRunner.java iterates through the test methods of a test class in its shouldRun() method, we could search the list to check whether it contains that test method's name. Please advise me if I'm missing something or am wrong.
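To make the idea concrete, here is a rough sketch using JUnit 4's generic Filter API, which exposes a shouldRun() hook per test description. This is not Tacoco's actual JUnitRunner code, and whether something like this plugs cleanly into Tacoco's runner is exactly the open question in this thread:

```java
import java.util.Set;

import org.junit.runner.Description;
import org.junit.runner.manipulation.Filter;

// Sketch only: a JUnit 4 Filter that skips test methods whose
// "package.ClassName.methodName" appears in a blacklist.
public class BlacklistFilter extends Filter {

  private final Set<String> blacklist;

  public BlacklistFilter(Set<String> blacklist) {
    this.blacklist = blacklist;
  }

  @Override
  public boolean shouldRun(Description description) {
    // Class/suite descriptions have no method name; let them through so
    // that their children can still be filtered individually.
    if (description.getMethodName() == null) {
      return true;
    }
    String fullName = description.getClassName() + "." + description.getMethodName();
    return !blacklist.contains(fullName);
  }

  @Override
  public String describe() {
    return "skips blacklisted test methods";
  }
}
```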
We can maintain a list of methods (with their full class+package name), but how/where exactly would you prevent the execution of those test methods?
Oh okay, it seems like JUnit only supports running all the test cases in a class. Hmmm, let me re-think this issue.
I guess we could still have an ignore list at class-level granularity, i.e. ignore an entire test class. And if, tomorrow, we are able to figure this out for methods, then we can extend the approach to ignore test methods as well.
I was looking through the code and see that PITHandler, which has the run method for PITAnalyzer, already has some exclusionTest method. Any idea what it is for?
That is for the mutation testing using PIT :)
@reetasingh I say you go ahead with implementing this feature, assuming that the input will be an ignore-list of test classes. We can extend it to specific methods later.
@VijayKrishna Yes. I have looked at the code and have a rough idea of how to go about it. Let's review that tomorrow.
Hi, I have implemented this feature, taking the input as a comma-separated list of ignore-classes. I tested it on the spiderMath_TestNG and spiderMath_Junit projects under the resources folder, against the Tacoco Analyzer. I also want to test whether the code works for the PIT analyzer. Any idea whether I can use the same projects (spiderMath_TestNG and spiderMath_Junit) to test this with the PIT analyzer, or do I need to use another project?
@reetasingh: that is fantastic news! JY will be busy this week, else I would have asked her to provide some pointers on the PIT analyzer. But before we get to PIT, can you think of a scheme for testing the blacklist, and possibly automating that test scheme? For instance, how are you currently verifying that a test class is ignored by the test runner and the analysis?
I am checking the log on the command prompt and the db file. In the generated database file there is a Test Case table which lists all the test cases that were executed. If a particular test class is ignored, it will not be present in this table, and in the statement coverage table there will be no record pointing to it.
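One way that manual check could be automated might look roughly like the following. The SQLite JDBC URL, the test_case table name, the name column, and the ignored class are all hypothetical placeholders; the real schema would have to come from Tacoco's DB writer:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import static org.junit.Assert.assertFalse;

import org.junit.Test;

// Sketch only: checks that an ignored test class never shows up in the
// generated database. Table and column names here are made up.
public class IgnoreListDbCheck {

  @Test
  public void ignoredClassIsAbsentFromTestCaseTable() throws Exception {
    String ignoredClass = "org.spideruci.math.SomeIgnoredTest"; // hypothetical
    try (Connection db = DriverManager.getConnection("jdbc:sqlite:tacoco-output.db");
         PreparedStatement query =
             db.prepareStatement("SELECT 1 FROM test_case WHERE name LIKE ?")) {
      query.setString(1, ignoredClass + "%");
      try (ResultSet rows = query.executeQuery()) {
        assertFalse("ignored test class should not be recorded", rows.next());
      }
    }
  }
}
```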
What does the log file say? And can we inspect it to check whether a test case is ignored?
@reetasingh So, the reason I am asking for details and ideas on automated tests is so that you can open a pull request and we can start the process of merging in your changes. The sooner we get this done, the sooner we can move on to other, more exciting features :)
Yes. So in the project, all the test classes that are executed are printed on screen, and the ignored test classes do not appear in that executed section; this is how I am judging that a test class is ignored. I was planning to test it with the PIT analyzer and then open a pull request, since the code is written in the base class and I have only tested the functionality for one subclass. Once that's done, I can open the pull request.
I also need to do some code refactoring before opening the pull request. I will get back to it once my finals are over (Dec 9).
Currently, there is no way for a user to specify a list of test cases that she wants Tacoco to ignore while running. It would be nice if a user could specify the test case blacklist in the Tacoco config file so that Tacoco can automatically ignore the specified test cases.