dgh05t opened this issue 3 years ago
There are actually several reasons why this might happen. Let me explain what it takes for a sample to be "interesting". When Jackalope processes a sample and it triggers new coverage, Jackalope reruns the sample a certain number of times (10 by default). Only samples with stable coverage (some part of coverage that is seen on every run with that sample) are considered interesting. In other words, we are looking for coverage that is specific to the sample in question and not e.g. initialization code. Samples that only produce variable coverage (coverage seen on some iterations but not all) are discarded. Additionally, for a sample to be considered interesting, none of the repeated runs may cause a crash or a hang.
To see which is the case for you, I suggest placing a breakpoint in Fuzzer::RunSample here https://github.com/googleprojectzero/Jackalope/blob/main/fuzzer.cpp#L312 (that's going to be the first run with a sample) and seeing what happens after that.
@ifratric Thank you for replying. It seems to be because the sample produces different coverage on each run.
Note that different coverage for each run is fine, but there needs to be at least one coverage offset that is seen in all the runs with the sample.
Hi ifratric,
On macOS, running litecov can generate a coverage file, and it shows a log such as:
Found 1701 new offsets in ...
but with the same instrument_module and input file, Jackalope shows:
Why is that?