hcoles / pitest

State of the art mutation testing system for the JVM
Apache License 2.0

False positive on autowired spring beans when using @import on configuration files. #308

Open peter-janssen opened 7 years ago

peter-janssen commented 7 years ago

Took me a while to figure this out, but when a Spring configuration class imports another one, mutations on the bean return statements survive.

Example:

@Configuration
@Import({Services.class})
public class App 
{
    @Bean
    public HelloWorldService helloWorldService() {
        return new HelloWorldService();
    }
}

@Configuration
public class Services 
{    
    @Bean
    public HellnoWorldService hellnoWorldService() {
        return new HellnoWorldService();
    }
}

When PIT is run against the following test class, it reports this for one of the beans (depending on test order):

  1. mutated return of Object value for nl/test/App::helloWorldService to ( if (x != null) null else throw new RuntimeException ) → SURVIVED
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = AnnotationConfigContextLoader.class, classes = {App.class, Services.class})
//@ContextConfiguration(classes = {App.class, Services.class}) (these configs yield the same result)
//@ContextConfiguration(classes = App.class)
public class AppTest 
{
    @Autowired
    private HelloWorldService helloWorldService;

    @Autowired
    private HellnoWorldService hellnoWorldService;

    @Test
    public void testHelloWorldService()
    {
        assertNotNull(helloWorldService);
        assertEquals("Hello World!", helloWorldService.getGreeting());
    }

    @Test
    public void testHellnoWorldService()
    {
        assertNotNull(hellnoWorldService);
        assertEquals("O hell no!", hellnoWorldService.getGreeting());
    }
}
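For reference, the reported mutation corresponds roughly to the following transformation. This is a plain-Java sketch of what PIT's null-returns mutation does (PIT actually rewrites the bytecode; the second method here is illustrative, not generated code):

```java
public class NullReturnMutationSketch {
    static class HelloWorldService {
        String getGreeting() { return "Hello World!"; }
    }

    // Original factory method, as in App:
    static HelloWorldService helloWorldService() {
        return new HelloWorldService();
    }

    // Mutated form, matching the report:
    // "mutated return of Object value ... to ( if (x != null) null else throw new RuntimeException )"
    static HelloWorldService helloWorldServiceMutated() {
        HelloWorldService x = new HelloWorldService();
        if (x != null) return null; else throw new RuntimeException();
    }

    public static void main(String[] args) {
        System.out.println(helloWorldService() != null);        // prints: true
        System.out.println(helloWorldServiceMutated() == null); // prints: true
        // A test that autowires this bean should fail against the mutant,
        // so the mutant surviving suggests the bean is never re-created
        // while that test runs.
    }
}
```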
StefanPenndorf commented 7 years ago

Sounds strange. Can you publish the complete sample project on GitHub, or at least post your service beans as well? Which version of Spring do you use? Spring creates proxies (subclasses) for @Configuration beans - maybe those are created before the bytecode is mutated, and maybe Spring copies/duplicates some of the bytecode. But this is just speculation. Without a reproducible test case it's hard to investigate.

fenghuadong commented 7 years ago

@KyleRogers Hello Kyle, do you know whether pitest provides progress information during the execution? Mine has been running for a week, and I want to see whether it is close to finishing or not even a bit.... Basically progress information such as how many mutants have been tested and how many are still remaining.

peter-janssen commented 7 years ago

@KyleRogers The services are just returning strings:

public class HelloWorldService {

    public String getGreeting() {
        return "Hello World!";
    }
}

It is all very straightforward, but if you really want I can supply a test project.

StefanPenndorf commented 7 years ago

I just re-read your first post - maybe this has to do with Spring's application context caching. The surviving mutation is not inside your services; it's the configuration class (App.class) that has the surviving mutation. But that's just a guess. Maybe we can try the following two things to confirm or refute that theory?

1) Can you try adding @DirtiesContext to both test methods?
2) Can you try splitting your tests into two test classes with distinct @ContextConfiguration, where one initializes a context with App.class (and Services.class, because that is imported) and the other initializes just Services.class?

Does that change anything?
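The caching effect described above can be sketched without Spring. All names below are illustrative stand-ins, not Spring's API; the point is only that a cached context means a bean factory method runs for the first test that needs the configuration and never again:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Minimal sketch of test-context caching: contexts are keyed by their
// configuration, so the factory behind a key runs at most once per JVM.
public class ContextCacheSketch {
    static final Map<String, Object> CACHE = new HashMap<>();
    static int factoryInvocations = 0;

    static Object getContext(String configKey, Supplier<Object> factory) {
        // Build the context only on a cache miss.
        return CACHE.computeIfAbsent(configKey, k -> factory.get());
    }

    static Object buildAppContext() {
        factoryInvocations++;   // the @Bean method bodies would run here
        return new Object();    // stand-in for the application context
    }

    public static void main(String[] args) {
        // Two tests sharing the same @ContextConfiguration:
        getContext("App+Services", ContextCacheSketch::buildAppContext);
        getContext("App+Services", ContextCacheSketch::buildAppContext);
        System.out.println("factory ran " + factoryInvocations + " time(s)");
        // prints: factory ran 1 time(s) - the second test reuses the cached
        // context, so a mutation inside a factory method is exercised at
        // most once per test JVM.
    }
}
```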

StefanPenndorf commented 7 years ago

@fenghuadong

No, I don't think there is such progress information. There is just a summary report at the end. But maybe progress reporting could be added easily; feel free to open an issue. I haven't had a test suite that took so long yet: hours, maybe one or two days, but not a week. How many tests and production classes or LOC do you have?

If you have questions you can write to the Pitest mailing list: pitusers@googlegroups.com. Community response times are better than mine here at GitHub ;-)

fenghuadong commented 7 years ago

@KyleRogers I'm testing an implementation of the EM algorithm in Weka using combinatorial testing: for 1-way testing I have about 10 test cases, and for 6-way testing probably 200 test cases. The average execution time for these test cases is about 2 minutes each; the shortest takes about a second, the longest about 5 minutes. I set the timeout to 5 minutes to avoid timing out working test cases. And there were 4000 mutants generated from the source code. So guess what? Lol....
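A back-of-envelope estimate from those numbers shows why a week is plausible. Assuming PIT runs at least one covering test per mutant (real runs may execute several tests per mutant, so this is a lower bound):

```java
// Rough lower-bound runtime estimate from the figures in the comment above.
public class MutationRuntimeEstimate {
    public static void main(String[] args) {
        int mutants = 4000;          // mutants generated from the source
        double avgTestMinutes = 2.0; // average test-case runtime
        double totalMinutes = mutants * avgTestMinutes;
        double days = Math.round(totalMinutes / 60.0 / 24.0 * 10) / 10.0;
        System.out.println("~" + days + " days"); // prints: ~5.6 days
        // ...and that is with only ONE 2-minute test per mutant; with
        // several covering tests per mutant, a week is easily reached.
    }
}
```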