Open · octatone opened 7 years ago
Hi there, are there any updates? I'm facing this problem as well; it takes an unacceptably long time to publish code coverage results.
Seeing 1700+ files take over 3 minutes to publish. I'd really like to use the code coverage feature of pipelines but as it stands it's taking longer to publish the coverage than it takes to build and test our project.
We're able to upload far larger artifacts with the `publish:` YAML step in far less time (e.g. 140M in ~3s).
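For comparison, the fast upload path mentioned above is the `publish:` shorthand step; a minimal sketch (the path and artifact name here are placeholders, not from the original comment):

```yaml
steps:
- publish: '$(Build.ArtifactStagingDirectory)/coverage'  # placeholder path
  artifact: 'coverage-report'                            # placeholder artifact name
```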
This is completely unusable. We have a simple .NET Framework solution, and this took 20 minutes before I gave up and cancelled it. It had written 70MB+ of HTML files (one for every single class) and would have taken hours.
What is wrong with this? It seems very broken.
It would be nice to just publish coverage results like normal artifacts, but with a way to mark on the publish step that they should be used as coverage results. Normal publish steps are more than fast enough to upload.
Agree this is unusable. Even though it's using `Task.WhenAll` and possibly working in parallel, that's a lot of individual HTTP requests that have to be processed.
Couldn't this be changed to use the PipelineArtifactServer like the publish pipeline artifact task does?
This is becoming more and more of a problem. It's hard to justify when the upload time exceeds the combined build and test time. It's even worse on-premises, with the added latency.
Task performance is indeed disappointing. The task takes over three minutes for a project with ≈2K files. That’s about a third of how long the tests take to run 😞 Not much joy for PR authors and reviewers.
Would like to echo what everyone else has said. Adding code coverage to our pipeline (.NET Core 2.1, coverlet.collector, cobertura) added 2 minutes to what was a 3 minute pipeline (1463 files).
I don't suppose there is a way to publish a coverage file from one build to another..? Could create an async workflow:
For now, I think we will disable report generation as suggested by @danielpalme https://github.com/microsoft/azure-pipelines-tasks/issues/10354#issuecomment-491582757:
`disable.coverage.autogenerate: 'true'`
We have this problem too: publishing the code coverage report increased the build time by 50%. Disabling the autogenerate setting didn't decrease the time.
Currently, to get around this, we zip and upload the whole coverage report as an artifact of the build, which lets engineers access it. All of this takes less than 1 minute, while publishing the coverage report (8500+ files) takes 10 minutes.
Currently to get around this we zip and upload the whole coverage report as an artifact of the build.
Does this mean that you don’t have the Code Coverage tab in Azure UI and your devs need to download the archive?
Yes, the devs have to download the report if they want more detail. We do have the Code Coverage tab in the Azure UI; it's just empty. We fail our builds on code coverage, and the devs only care about the report when the build fails.
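The zip-and-upload workaround described above can be sketched with the standard archive and artifact tasks; a rough sketch where the folder and artifact names are illustrative assumptions:

```yaml
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'coveragereport'   # assumed report output folder
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/coverage.zip'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/coverage.zip'
    ArtifactName: 'coverage-report'      # placeholder artifact name
```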
Adding my voice to this: code coverage is taking around 6 minutes to upload 649 files, and we're going to remove this step because it's really slowing down our pipeline.
You can use the following approach to speed things up:

1. Generate the coverage report with this task: https://marketplace.visualstudio.com/items?itemName=Palmmedia.reportgenerator
2. Use the `PublishCodeCoverageResults` task only to publish the report, and disable report regeneration.
Relevant YAML looks like this:

```yaml
variables:
  disable.coverage.autogenerate: 'true'

- task: reportgenerator@4
  inputs:
    reports: 'coverage.cobertura.xml'
    targetdir: 'coveragereport'

- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage results'
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: 'coveragereport/Cobertura.xml'
    reportDirectory: 'coveragereport'
```
@danielpalme We generate the report with `nyc` and use `PublishCodeCoverageResults` only to upload files. Because this task sends the HTML files individually rather than as a single zip archive, the upload takes minutes. Each file is a separate HTTP request, so it's thousands of requests.
This seems to be a limitation of the API endpoint, so it can't be fixed without a change in the Azure backend. Once the endpoint can accept a zipped bundle of HTML files and still show the result in the Code Coverage tab, the problem is solved.
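A rough back-of-envelope illustrates why per-file requests dominate the upload time; the overhead and parallelism numbers below are assumptions for illustration, not measurements:

```python
# Per-request overhead (latency, TLS handshake, auth) is paid once per file
# when uploading thousands of tiny files, but only once for a single zip.
files = 1700                    # roughly the report size mentioned above
per_request_overhead_s = 0.1    # assumed fixed cost per HTTP request
parallelism = 8                 # assumed number of concurrent requests

serial_s = files * per_request_overhead_s   # total overhead if fully serial
best_case_s = serial_s / parallelism        # ideal speedup with parallel uploads

print(f"serial overhead: {serial_s:.0f}s, parallel best case: {best_case_s:.1f}s")
```

Even under ideal parallelism the fixed per-request cost adds tens of seconds, and real pipelines see minutes once throttling and contention kick in; a single zipped upload pays that cost once.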
Huge +1 to this issue. If this only required task changes I'd happily put up a PR, but it seems like it might not be fixable without server-side changes.
In my case, although I only have about 1600 source files, it takes 1m30s to upload the Cobertura XML file and generate the coverage HTML; if I generate the HTML files myself with nyc and disable autogeneration as per the suggestion above, it takes longer (2m30s) due to uploading all those generated HTML files individually.
I would really, really like my developers to be able to "click into" HTML coverage reports from builds, but not at that cost. It seems like the workaround at this point might be uploading the files myself (maybe to an S3 bucket via `s3 sync`) and just linking to it from the Azure coverage report.
Another +1. We just enabled the coverage report: 3400 files take 5 minutes, for a build that itself takes 5 minutes, which is not acceptable. Please make this a zip upload or similar.
FWIW, for my company's monorepo (where we were having this problem), to get around this issue, at the end of our PR pipeline we are doing the following:

1. Merge (with `nyc`) all the HTML coverage output (and the combined, single-file Cobertura XML).
2. Upload the HTML output to Azure Blob Storage.
3. Generate a tiny HTML file that links to the report's `index.html` in Azure Blob Storage.
4. Run the `PublishCodeCoverageResults@1` Azure task, but it only uploads the single XML file and the tiny HTML file we generated in Step 3.

At around ~2000 covered source files (and thus 2000 HTML files), these steps take under 10 seconds in total, for what was taking over 3 minutes.
(I'm throwing this out there mostly to show the lengths people need to go to, when really the task itself could be fixed to use `azcopy` under the hood. It's also a possible workaround for other folks out there, although correctly configuring the Azure Blob Storage, setting up authentication so your developers can reach it and view the coverage reports, and getting the Azure credentials into your pipeline is all a real hassle.)
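The blob-upload step from the workaround above could use the built-in copy task; a sketch in which the subscription, storage account, and container names are all placeholders:

```yaml
- task: AzureFileCopy@4
  inputs:
    SourcePath: 'coveragereport'          # placeholder: merged HTML output folder
    azureSubscription: 'my-subscription'  # placeholder service connection
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'           # placeholder storage account
    ContainerName: 'coverage'             # placeholder container
```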
This issue was opened in 2017 and not done yet?! 😱
Has there been any updates around zipping these files yet?
Another +1 from our side; we're having the exact same problem. Uploading the results takes up about 50% of our total build time (around 12 minutes), so cutting this down would make a huge difference and does not seem that hard. There really should be a way to upvote these issues, by the way, to give some feedback on how many community members need something fixed (though I understand that wouldn't necessarily match the product team's priorities).
@nadesu @ganesp could you please take a look?
Any updates??
I also find uploading the code coverage results to be extremely slow, to the point that I'm considering removing it.
I would like to add my voice here as well.
The task takes minutes to complete. Is there any update on this issue? Simply throwing the files into a package, uploading it as a single file, and unpacking it on the other end should fix this. The package doesn't even need to be compressed (though that would be nice as well).
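The pack-and-unpack idea could look roughly like this on the client side; a sketch in which `pack_report`/`unpack_report` are made-up names, and the server would need backend support to accept such a bundle:

```python
import io
import zipfile
from pathlib import Path

def pack_report(report_dir: Path) -> bytes:
    """Bundle every file under report_dir into one (uncompressed) zip blob."""
    buf = io.BytesIO()
    # ZIP_STORED skips compression, matching the "doesn't even need to be
    # compressed" suggestion; one upload instead of thousands.
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as zf:
        for path in sorted(report_dir.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(report_dir))
    return buf.getvalue()

def unpack_report(blob: bytes, dest: Path) -> None:
    """Hypothetical server side: expand the bundle back into individual files."""
    with zipfile.ZipFile(io.BytesIO(blob)) as zf:
        zf.extractall(dest)
```

The single blob travels in one HTTP request; the server restores the per-file layout the Code Coverage tab expects.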
As things stand, I am also considering disabling this feature, because it is unusably slow.
We have moved code coverage results to a dedicated build because it takes so long, hoping it will be fixed eventually
How is this thread over 5 years old and this is still an issue?? hello?
Nothing yet?
It's still very slow as of today
It's 6 years now...
+1 too slow )
Really slow, did any of you get an alternative running ?
ignore game is strong
PublishCodeCoverageResults randomly skips the last few files after a few minutes, and as a result no HTML report is available on the Azure DevOps tab.
For tasks with fewer than 500 files it works fine.
Adding this parameter made it work for me:
`DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.ExcludeByAttribute=GeneratedCodeAttribute`
Without it, the pipeline agent's hard disk even ran out of space.
+1, please improve!
Looks like no contributor cares about this.
I have this problem too, uploading 1754 files takes 2:36 minutes.
Resolved with task v2: `PublishCodeCoverageResults@2`
OK, good to hear. Unfortunately, PublishCodeCoverageResults@2 does not support detailed code coverage reports; it only shows a short summary.
There seems to be a 2-minute timeout in the underlying CoveragePublisher.Console; see https://github.com/microsoft/azure-pipelines-coveragepublisher/blob/master/src/CoveragePublisher/ArgumentsProcessor.cs
Unfortunately, the Azure DevOps task wrapping CoveragePublisher.Console does not seem to pass this timeout parameter; see https://github.com/microsoft/azure-pipelines-tasks/blob/master/Tasks/Common/coveragepublisher/coveragepublisher.ts
related: https://github.com/Microsoft/vsts-tasks/issues/1271
Hi there, we have a project whose coverage report (like the one in the closed issue above) generates over 900 files as its HTML coverage summary output. The PublishCodeCoverageResults task takes about as much time as generating the coverage report itself, effectively doubling the time to complete our CI tasks. It takes over 2 minutes to publish our code coverage output (which totals around 4MB).
Is it possible that the task is uploading each file one at a time? Can this be changed to zip the files (which are all text files), then upload the zip, and finally expand on the VSTS side?