Closed: aladur closed this 2 months ago.
I'm unable to reproduce a "memory effect" resulting in count variations. Can you find a public git repo, or post a tar of files, so we can work with the same inputs?
Also, try running Steps 1 & 2 with the extra switches --found=Found.txt --counted=Count.txt --ignored=Ignored.txt to see exactly what was found, counted, and ignored. Then diff the results from the two runs; perhaps that will shed light.
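For example, the two runs could be captured and compared like this (the target directory, the all-inclusive regex, and the per-run file names are only placeholders, not the exact commands used above):

```sh
# Run 1 (Step 1): with an "all inclusive" file filter
cloc --match-f='.' \
     --found=Found1.txt --counted=Counted1.txt --ignored=Ignored1.txt .

# Run 2 (Step 2): without any file filter
cloc --found=Found2.txt --counted=Counted2.txt --ignored=Ignored2.txt .

# Compare what each run found, counted and ignored
diff Found1.txt Found2.txt
diff Counted1.txt Counted2.txt
diff Ignored1.txt Ignored2.txt
```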
... Sorry, this was my fault. Due to all the testing I forgot that a file $HOME/.config/cloc/options.txt still existed, which applied a default file filter.
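In case someone else runs into the same symptom, checking for a leftover options file is a quick way to rule this out. The filter line below is only an example of what such a file might contain, and pointing --config at an empty file should make cloc ignore the default one:

```sh
# Check whether a persistent options file is silently adding switches
cat "$HOME/.config/cloc/options.txt"
# it might contain a line such as:
#   --match-f='^./[a-z][a-z0-9]*.[a-z]+$'

# Either delete/edit that file, or point cloc at an empty config so that
# only the switches given on the command line apply:
cloc --config /dev/null .
```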
Describe the bug
Sorry for bothering you once again, but this bug affects testing of the other bug #851. It can happen that, after a run using a file filter (with --match-f), a subsequent run without --match-f still has a file filter applied (not 100% reproducible; I couldn't find out where or when this "memory effect" is stored, and trying --sdir did not help).
cloc; OS; OS version
Step 1: apply an "all inclusive" file filter
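For example (the directory and the all-inclusive regex here are placeholders, not the exact command used):

```sh
cloc --match-f='.' .
```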
Output:
OK => All files listed, as requested.
Step 2: apply no file filter
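For example, a plain run on the same directory (placeholder):

```sh
cloc .
```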
Output:
FAIL => Without a file filter all files should be counted (right?), giving the same result as Step 1. This output definitely depends on previous cloc runs. I often used the file filter --match-f='^./[a-z][a-z0-9]*.[a-z]+$', which seems to be the filter applied under the hood for this output.
This raises the question: how can this "memory effect" be cleared, or better, avoided in the first place?
Step 3: apply a different file filter
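For example, with some other basename pattern (the regex is a placeholder):

```sh
cloc --match-f='\.c$' .
```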
Output:
OK => File filter applied as requested.
Step 4: apply no file filter
Output:
FAIL => Exactly the same numbers as in Step 2, although no file filter was requested. The under-the-hood memory is not cleaned up by applying a different file filter.