coverlet-coverage / coverlet

Cross platform code coverage for .NET
MIT License

[BUG] Upgrading v6.0.1 to v6.0.2 increases instrumentation time #1649

Closed: erichiller closed this 2 months ago

erichiller commented 7 months ago

After upgrading from v6.0.1 to v6.0.2, the maximum time it takes to instrument increases from 26 seconds in v6.0.1 to over 1.5 minutes (measured from when the test command begins to when the first test starts). This occurs when changing only the coverlet.collector package version, in an otherwise identical repo. Normally this would just be annoying, but since I'm using `--blame-hang-timeout 90s` when testing, the test host crashes before any tests begin. The problem is compounded because I'm using GitHub Actions, and for whatever reason the process crash isn't detected, so the test run keeps going for as long as the runner lets it.
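For reference, the invocation is along these lines (the project name is a placeholder; timing is measured from this command starting to the first test executing):

```
dotnet test MyTests.csproj --collect:"XPlat Code Coverage" --blame-hang-timeout 90s
```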

Going through the differences from v6.0.1 to v6.0.2, it looks like the use of System.Text.Json was reverted. I didn't see any specific code that appeared to be the culprit, but I'm not familiar with coverlet's code.

Let me know if you'd like any other information.

Environment: .NET 8.0
Coverlet: v6.0.1 / v6.0.2
OS: Ubuntu 22.04
Arch: x64

daveMueller commented 7 months ago

@erichiller thanks for reporting this. There is another performance issue reported in #1646, for which a PR already exists. Maybe it has the same cause. I'll let you know once it is merged so you can give it a try with the nightly.

daveMueller commented 7 months ago

@erichiller The PR I mentioned is now merged and can be consumed via our nightly; maybe you can give it a try. But I guess this is a separate issue, and as you have already noticed, there weren't many changes between v6.0.1 and v6.0.2. We would really appreciate it if somebody could provide a repro that shows the different instrumentation times; that would speed up the work on this issue.

WyrmUK commented 6 months ago

We've seen the same performance issue, so this should be fairly easy to replicate. It's significant for large solutions. I would hazard a guess that the change back to Newtonsoft.Json from System.Text.Json is the culprit (Newtonsoft is a lot slower for large and numerous objects). You may want to consider providing a specific .NET 8 build of the tool that uses the v8 System.Text.Json, and then in the build target determine which version to run depending on the installed framework, as sketched below.
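As a sketch of that idea (not coverlet's actual code; the type and member names here are illustrative), a multi-targeted build could switch serializers with conditional compilation:

```csharp
// Illustrative only: pick the serializer per target framework.
// A build multi-targeted for net8.0 compiles the System.Text.Json
// path, while older targets keep Newtonsoft.Json.
public static class ReportSerializer
{
    public static string Serialize<T>(T value)
    {
#if NET8_0_OR_GREATER
        return System.Text.Json.JsonSerializer.Serialize(value);
#else
        return Newtonsoft.Json.JsonConvert.SerializeObject(value);
#endif
    }
}
```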

daveMueller commented 6 months ago

The issue says something about increased instrumentation time, but we don't use JSON serialization/deserialization for the instrumentation at all. In fact we only use it in one or two places, for generating reports and reading the runtimeconfig file. Even if I don't want to rule it out yet, I somehow think the problem could be elsewhere. I tried to create a repro today, but it doesn't seem to be easy. What do the affected solutions look like? @WyrmUK, what characterizes a large application: the amount of source code, the number of source files, or the number of assemblies?
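For context, this is the flavor of JSON use in question: a one-off read of a runtimeconfig.json. A minimal sketch, assuming the standard runtimeconfig schema (the file name is a placeholder, and this is not coverlet's actual implementation):

```csharp
using System;
using System.IO;
using System.Text.Json;

// Read the target framework name/version from a runtimeconfig.json.
using JsonDocument doc = JsonDocument.Parse(File.ReadAllText("app.runtimeconfig.json"));
JsonElement framework = doc.RootElement
    .GetProperty("runtimeOptions")
    .GetProperty("framework");
Console.WriteLine(framework.GetProperty("name").GetString());    // e.g. Microsoft.NETCore.App
Console.WriteLine(framework.GetProperty("version").GetString()); // e.g. 8.0.0
```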

piccit commented 6 months ago

> The issue says something about increased instrumentation time, but we don't use JSON serialization/deserialization for the instrumentation at all. In fact we only use it in one or two places, for generating reports and reading the runtimeconfig file. Even if I don't want to rule it out yet, I somehow think the problem could be elsewhere. I tried to create a repro today, but it doesn't seem to be easy. What do the affected solutions look like? @WyrmUK, what characterizes a large application: the amount of source code, the number of source files, or the number of assemblies?

In my own experience, it's files/SLOC. Our pipeline runs tests per project, so I don't think the number of assemblies is a factor. I'll update with specific file and SLOC counts when I get back to my desk, but I was seeing about 3 to 5 minutes added to the overall execution time (per project).

WyrmUK commented 6 months ago

> The issue says something about increased instrumentation time, but we don't use JSON serialization/deserialization for the instrumentation at all. In fact we only use it in one or two places, for generating reports and reading the runtimeconfig file. Even if I don't want to rule it out yet, I somehow think the problem could be elsewhere. I tried to create a repro today, but it doesn't seem to be easy. What do the affected solutions look like? @WyrmUK, what characterizes a large application: the amount of source code, the number of source files, or the number of assemblies?

In our case we have a solution containing 6 projects totaling 12,000 lines, of which 6,000 are coverable/covered, with 1,200 branches. Each project has an associated unit test project. We found the time taken to run the tests increased from 1 min 15 s to 2 min 40 s. We haven't looked any further into whether it is the instrumentation or the report generation/merge.

Thomathan commented 4 months ago

I'm having the same issue. I have multiple solutions and 25+ projects overall, with up to 85k coverable lines, the large majority of which are covered. Similar to the above, we also have a unit test project per project.

When running the complete set of tests with clean solutions on 6.0.2, it takes around 7 minutes. With 6.0.1 it takes only 4 minutes. The tests themselves take only about 3 minutes when not collecting code coverage.

I also attempted to test the nightly build. However, it doesn't seem to collect any coverage at all.

Bertk commented 3 months ago

@daveMueller I used BenchmarkDotNet and analyzed a number of commits. It looks like the performance impact starts with commit "Add regex evaluation timeout (#1630)".

Without the commit: [benchmark results image]
With commit #1630: [benchmark results image]
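For reference, a minimal BenchmarkDotNet sketch of that kind of comparison; the pattern, input, and timeout value are placeholders rather than coverlet's actual filters:

```csharp
using System;
using System.Text.RegularExpressions;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class RegexFilterBenchmark
{
    // Placeholder pattern and input; coverlet's real filter patterns differ.
    private const string Pattern = @"^System\..*";
    private const string Input = "System.Text.Json.JsonSerializer";

    [Benchmark(Baseline = true)]
    public bool WithoutTimeout() => Regex.IsMatch(Input, Pattern);

    [Benchmark]
    public bool WithTimeoutAndIgnoreCase() =>
        Regex.IsMatch(Input, Pattern, RegexOptions.IgnoreCase, TimeSpan.FromSeconds(10));
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<RegexFilterBenchmark>();
}
```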

drdamour commented 3 months ago

hm, the docs on that (https://learn.microsoft.com/en-us/dotnet/api/system.text.regularexpressions.regex.-ctor?view=net-8.0#system-text-regularexpressions-regex-ctor(system-string-system-text-regularexpressions-regexoptions-system-timespan)) say

> We recommend that you set the matchTimeout parameter to an appropriate value, such as two seconds.

and

> If you disable time-outs by specifying InfiniteMatchTimeout, the regular expression engine offers slightly better performance.

guess we've been banking on that slightly better performance?
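In code, the two variants being contrasted look like this; the pattern is an illustrative stand-in, not coverlet's actual filter:

```csharp
using System;
using System.Text.RegularExpressions;

// Bounded matching, as the docs recommend: a match running longer than
// two seconds throws RegexMatchTimeoutException.
var bounded = new Regex(@"^<.*>.*__.*$", RegexOptions.None, TimeSpan.FromSeconds(2));

// Unbounded matching (the pre-#1630 behavior): no timeout bookkeeping,
// which is the "slightly better performance" the docs mention.
var unbounded = new Regex(@"^<.*>.*__.*$", RegexOptions.None, Regex.InfiniteMatchTimeout);
```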

Bertk commented 3 months ago

I reverted the updates in IsLocalMethod and IsTypeFilterMatch, and the performance is back to normal.

[benchmark results image]

Using Regex.InfiniteMatchTimeout did not improve the performance.

A regex with a timeout should not be used in the IsLocalMethod and IsTypeFilterMatch methods. The RegexOptions.IgnoreCase configuration should be removed as well.
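A sketch of what that conclusion implies for the two helpers; the names and patterns here are illustrative stand-ins, not coverlet's actual code:

```csharp
using System.Text.RegularExpressions;

internal static class InstrumentationFilters
{
    // Compiler-generated local function names look like "<Outer>g__Local|0_0";
    // match them with a plain, unbounded, case-sensitive regex.
    public static bool IsLocalMethod(string method) =>
        Regex.IsMatch(method, "^<.*>.*__.*$");

    // Type filters: the caller-supplied filter is already a regex pattern;
    // again, no timeout and no RegexOptions.IgnoreCase.
    public static bool IsTypeFilterMatch(string typeName, string filterPattern) =>
        Regex.IsMatch(typeName, filterPattern);
}
```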