nedbat / coveragepy

The code coverage tool for Python
https://coverage.readthedocs.io
Apache License 2.0

Significant performance regression with coverage==4.4 #579

Closed · nedbat closed this 7 years ago

nedbat commented 7 years ago

Originally reported by Jacopo Notarstefano (Bitbucket: jacquerie, GitHub: jacquerie)


Hi, we're experiencing a significant performance regression in our test suite when using pytest-cov==2.4.0 and coverage==4.4.

In fact, one of our builds times out with this version (https://travis-ci.org/inspirehep/inspire-next/builds/229944296), but works just fine with the previous coverage==4.4b1 (https://travis-ci.org/inspirehep/inspire-next/builds/230095840).

I haven't investigated what could be causing this regression, but I wanted to give you a heads-up that something might be wrong, either here or in the interaction with pytest-cov.


nedbat commented 7 years ago

4.4.1 is on PyPI now.

nedbat commented 7 years ago

Original comment by Akihiro Motoki (Bitbucket: amotoki, GitHub: amotoki)


Could you release a new version, such as 4.4.1, with this fix? Even though it is only a packaging problem, a new version number is recommended: otherwise, users who cache or mirror the Python package have no way to recognize that the files were replaced.

nedbat commented 7 years ago

Original comment by Jacopo Notarstefano (Bitbucket: jacquerie, GitHub: jacquerie)


Ah, yes, this must be it: we probably cached the bad wheel during the abnormally slow last build of the Docker image (https://travis-ci.org/inspirehep/inspire-docker/builds), so we keep getting that one instead of the good one.

Thanks a lot!

nedbat commented 7 years ago

There was a problem earlier with bad Linux wheels, which meant you could have been using the Python tracer instead of the C tracer. That would significantly slow down the test run. I thought I had removed the bad wheels before that linked build ran, but maybe there's caching? Try using 4.4 again, and we can check whether it was installed from a wheel or not.
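
If it helps with that check, here is a rough sketch of how to see whether the C tracer is usable. The `coverage.tracer` module name is an assumption about coverage 4.x's internal layout for the C extension, not a documented API:

```python
# Sketch: check whether coverage's C tracer extension can be imported.
# Without it, coverage falls back to the much slower pure-Python tracer,
# which would account for a large slowdown in the test run.
try:
    from coverage.tracer import CTracer  # C extension (assumed 4.x internal layout)
except ImportError:
    print("C tracer NOT available; the pure-Python tracer will be used")
else:
    print("C tracer available:", CTracer)
```

Running `coverage debug sys` should also report whether the C tracer is available, and reinstalling with `pip install --no-cache-dir --force-reinstall coverage==4.4` would rule out a cached bad wheel.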