tim-group / test-driven-detectors4findbugs

Test-Driven Detectors For FindBugs. Utility project to ease the development of custom plugin detectors for FindBugs.

DetectorAssert.addRegistrar(...) has no effect #20

Open GrimmiMeloni opened 10 years ago

GrimmiMeloni commented 10 years ago

I just tried to register a custom Analysis Engine via DetectorAssert.addRegistrar(....)

This however does not work as expected, i.e. the custom engines are not used during the initial analysis of the application classes.

To my understanding, the cause lies in the static nature of the initialization. addRegistrar() modifies the static field USER_DEFINED_ENGINE_REGISTRARS in DetectorRunner. But this happens effectively too late, because the static inner class SINGLETON is automatically initialized the moment DetectorAssert tries to touch DetectorRunner.

So effectively the whole initialization procedure is already completed once the custom registrar is added to the list.
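To make the ordering problem concrete, here is a minimal, self-contained sketch. The names (`Runner`, `REGISTRARS`, `ENGINES`) are hypothetical stand-ins for DetectorRunner and its static state, not the project's actual code; the point is only that Java runs a class's static initializers before any static method call on it completes, so the snapshot is taken before addRegistrar() can mutate the list.

```java
import java.util.ArrayList;
import java.util.List;

public class InitOrderDemo {
    // Hypothetical stand-in for DetectorRunner's static initialization.
    static class Runner {
        static final List<String> REGISTRARS = new ArrayList<>();
        static final List<String> ENGINES;

        static {
            // Class initialization runs the first time Runner is touched,
            // i.e. before the very first addRegistrar() call completes,
            // so this snapshot can never see user-supplied registrars.
            ENGINES = new ArrayList<>(REGISTRARS);
        }

        static void addRegistrar(String registrar) {
            REGISTRARS.add(registrar); // too late: ENGINES is already frozen
        }
    }

    public static void main(String[] args) {
        // This call forces Runner's class initialization first...
        Runner.addRegistrar("customEngine");
        // ...so the snapshot misses the registrar entirely.
        System.out.println(Runner.ENGINES); // prints []
    }
}
```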

Grundlefleck commented 10 years ago

Hi, sorry for delay in acknowledging your bug report. Taking a gander now.

Grundlefleck commented 10 years ago

Hi @GrimmiMeloni,

I don't believe there's any need to initialise FindBugs quite so aggressively. It should be possible to add and clear registrars, and also retrieve the testing BugReporter, without initialising FindBugs. Ideally this could be done with just an instance of DetectorAssert, which can be constructed and collected as needed, but there's so much static state within FindBugs that it would be very difficult to support this. I think just removing the eager initialisation in a few key places is the answer.

Making the change would be quite easy, but I would prefer to have an automated test. Would you be able to supply a minimal, not-working example? I'm not too familiar with custom analysis engines in FB, so it would really help me out. Gist or pull request, or pasting some text here, whatever is easiest for you.
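One way to "remove the eager initialisation" could look like the following sketch. Again the names are hypothetical stand-ins, not DetectorRunner's real fields: the expensive setup moves out of class-initialization time into a lazily-built snapshot, so registrars added before the first analysis are still picked up.

```java
import java.util.ArrayList;
import java.util.List;

public class LazyInitDemo {
    // Hypothetical stand-in for a DetectorRunner with lazy initialization.
    static class Runner {
        static final List<String> REGISTRARS = new ArrayList<>();
        private static List<String> engines; // built on first use, not in a static block

        static void addRegistrar(String registrar) {
            REGISTRARS.add(registrar);
        }

        static synchronized List<String> engines() {
            if (engines == null) {
                // Snapshot is deferred until the analysis actually needs it,
                // so earlier addRegistrar() calls are visible here.
                engines = new ArrayList<>(REGISTRARS);
            }
            return engines;
        }
    }

    public static void main(String[] args) {
        Runner.addRegistrar("customEngine");
        System.out.println(Runner.engines()); // prints [customEngine]
    }
}
```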

GrimmiMeloni commented 10 years ago

Well, in theory I think a minimal EngineRegistrar that puts some noop-engine into the AnalysisContext could be sufficient already. A testcase would then just have to check whether that engine is available (or, if that's not directly possible, check for some predetermined result caused by that noop-engine).
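The shape of such a test might be sketched as below. Note this is entirely self-contained and hypothetical: `EngineRegistrar`, `AnalysisCache`, and `NoopRegistrar` are stand-ins for the corresponding FindBugs types (IAnalysisEngineRegistrar and friends), since a real test would need the FindBugs classpath and DetectorAssert wiring.

```java
import java.util.HashSet;
import java.util.Set;

public class RegistrarTestSketch {
    // Stand-in for FindBugs' analysis-engine registrar interface.
    interface EngineRegistrar {
        void registerAnalysisEngines(AnalysisCache cache);
    }

    // Stand-in for the analysis cache/context the registrar populates.
    static class AnalysisCache {
        final Set<String> engines = new HashSet<>();
        void registerEngine(String name) { engines.add(name); }
    }

    // Minimal registrar contributing a no-op engine that analyses nothing.
    static class NoopRegistrar implements EngineRegistrar {
        @Override
        public void registerAnalysisEngines(AnalysisCache cache) {
            cache.registerEngine("noop-engine");
        }
    }

    public static void main(String[] args) {
        AnalysisCache cache = new AnalysisCache();
        new NoopRegistrar().registerAnalysisEngines(cache);
        // The real test would make this check through DetectorAssert
        // after calling addRegistrar(), verifying the registrar took effect.
        System.out.println(cache.engines.contains("noop-engine")); // prints true
    }
}
```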

I will try to find some time to have a look at the testcases again. Cannot promise anything at this point though (my desk is loaded).