Closed - billschereriii closed this issue 1 month ago
Yeah...lcov/2.0 (and newer) does a lot of additional error and consistency checking that previous versions did not. We use it during verification/validation - and care deeply that our results are correct/consistent/valid.
The warnings you see are related to unexpected/inconsistent data in your gcov output. This isn't entirely 'unexpected' - as we have seen it before (...which is why the message exists); it is dependent on your toolchain version - and may or may not disappear if you swap to a newer (or older) gcc, or if you swap to a newer (or older) LLVM. (Yeah - I agree that this sucks a bit.)
If you decide to just ignore those messages, then you can add
ignore_errors = gcov,mismatch # <- turn these messages into non-fatal warnings, OR
ignore_errors = gcov,gcov,mismatch,mismatch # <- tell me twice, and I will shut up about it
or you can add
geninfo_unexecuted_blocks = 1
to set the hit count for those blocks to zero (silently).
Or you can pass the --keep-going
flag (issue the error/warning but don't stop) to the tool, or add
stop_on_error = 0
to your lcovrc file.
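Pulled together, a minimal lcovrc applying the options above might look like the following (a sketch - pick only the lines that match your intent, since ignoring errors and zeroing unexecuted blocks address the same symptom in different ways):

```
# Downgrade the gcov/mismatch errors to non-fatal warnings
# (list an error type twice to suppress the message entirely):
ignore_errors = gcov,mismatch

# Silently set the hit count for inconsistent unexecuted blocks to zero:
geninfo_unexecuted_blocks = 1

# Report errors/warnings but keep going rather than stopping:
stop_on_error = 0
```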
(Generally: almost all options are available from either the tool command line or the options file. This is somewhat described in the man pages.)
While Memory::Process is recommended - it isn't required. If not available, the tool will try to figure it out, itself (less portably). You don't need it at all, if you are not using parallelism and/or if you don't need throttling in your compute farm. Similarly, you don't need Devel::Cover unless you are trying to collect coverage data for Perl (..and specifically, for the lcov package itself). If your project is huge and you think you might see lcov performance issues (sadly, not impossible) - then you may want python and the xlsxwriter package.
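As a sketch, the optional pieces described above map to installs roughly like this (the cpanm/pip invocations are assumptions about your environment - substitute your preferred package manager):

```shell
cpanm Memory::Process   # optional: only for parallelism / compute-farm throttling
cpanm Devel::Cover      # optional: only for Perl coverage (e.g., the lcov package itself)
pip install xlsxwriter  # optional: spreadsheet output, useful mainly for huge projects
```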
Thank you very much for the information. This was enough to get us going!
Feel free to close this issue if desired. GitHub CI/CD might make a good addition to your examples if you're so inclined.
Hi - sounds like a good idea. I'm not sure what a CI/CD pipeline example would look like, but I am open to suggestions, and especially to pull requests :-)
In case you are interested, the paper referenced in the README contains a description of some approaches (including two of the ones we happen to use internally, on two different projects). The upshot is that you can use differential categorization or dates/ages (or both) to automate your coverage analysis task such that there is no manual effort and no manual review - as long as each build meets your coverage criteria. For example: no unexercised new or changed code, no losses (without signoff), and so forth. This turns your coverage job into just another regression test/sleeping policeman rather than a separate task that somebody has to do. You can also do a bit of automated analysis such that Jenkins can send hate mail to the most likely culprit(s).
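The "just another regression test" idea can be sketched as a small gate script. Everything here is illustrative: the summary line format is what `lcov --summary` prints in environments I've seen, and the threshold and function name are made up for the example.

```shell
#!/bin/sh
# Sketch of an automated coverage gate: parse an `lcov --summary` line and
# pass/fail the build against a threshold. Format, threshold, and names are
# assumptions for illustration, not lcov conventions.
MIN_LINE_COV=80

check_coverage() {
  # $1: a summary line such as "  lines......: 85.2% (1234 of 1448 lines)"
  pct=$(printf '%s\n' "$1" | sed -n 's/.*lines\.*: \([0-9][0-9]*\)\..*/\1/p')
  if [ "${pct:-0}" -ge "$MIN_LINE_COV" ]; then
    echo "PASS"
  else
    echo "FAIL"
  fi
}

check_coverage "  lines......: 85.2% (1234 of 1448 lines)"  # prints PASS
```

In a CI job, a FAIL result (non-passing threshold) would simply fail the pipeline step, with no human in the loop unless the gate trips.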
If it is all the same to you: I typically prefer the original filer to close the issue - given that I might not have completely understood the request, and so might not have actually resolved it.
Cleaning up the issue list - so going ahead to close this issue now.
If there is still something to do/some un-addressed problem: feel free to reopen this issue or file a new one.
In our SmartRedis project (https://github.com/CrayLabs/SmartRedis), we use LCOV as part of our CI/CD pipeline to track code coverage across a series of test cases. With LCOV version 1.12, this is working perfectly: we install LCOV locally in the GitHub runner and invoke it. With LCOV version 2.0, we've run into several difficulties, and would welcome your insight.
In the runner, we set up the perl environment to version 5.30 and install the dependencies as follows:
(Memory::Process doesn't seem to have a corresponding library to `apt install`, so we had to install it this way.) The actual LCOV installation is done via our Makefile and boils down to a `git clone` and `` make PREFIX=`pwd`/install install ``. When we then run the test cases and go to process the coverage results with LCOV, we see errors such as the following:
(Again, we did not encounter any errors of this form with LCOV 1.12.)
Any insights as to what might be going wrong would be welcome and gratefully received! Please let us know if there is any more information we can provide.