abatishchev closed this issue 10 months ago
I am seeing a similar problem. It appears to be that --collect "Code coverage" sometimes gets parsed incorrectly and "coverage" ends up as a separate parameter that is interpreted as a second project. Below are the msbuild commands output with detailed verbosity enabled (irrelevant parameters removed):
Successful build:
C:\__t\dotnet\sdk\6.0.412\MSBuild.dll
...
-property:VSTestCollect="Code coverage"
C:\__w\1\s\MySolution.sln
Failed build:
c:\__t\dotnet\sdk\6.0.412\MSBuild.dll
...
-property:VSTestCollect="Code"
C:\__w\1\s\MySolution.sln
coverage
Failures happen seemingly at random with no code changes, so I'm a bit perplexed as to what could be causing "Code coverage" to be parsed incorrectly.
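For anyone trying to reproduce: a minimal task configuration along these lines exercises the same path (a sketch only; the solution name is a placeholder, not our actual pipeline):
- task: DotNetCoreCLI@2
  inputs:
    command: test
    projects: 'MySolution.sln'
    arguments: '--collect "Code coverage"'
On the failing runs, the quoted value comes out of the transformation split exactly as in the excerpt above.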
@abatishchev Can you please add the variable "system.debug" and set the value to "True" in your pipeline and share the success and failure logs?
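For reference, the variable can be set either in the pipeline UI (Variables > system.debug = true) or in YAML; a minimal sketch:
variables:
  system.debug: 'true'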
The latest failure: https://dev.azure.com/msazure/One/_build/results?buildId=77731414&view=logs&j=15dfcb1a-0989-5cf6-3160-3e181e44de87&s=6884a131-87da-5381-61f3-d7acc3b91d76&t=45dce97e-2b8e-5d98-6d10-a1ed23901c37&l=41 This is in a different repo now, which means the issue persists and is even spreading.
@v-mohithgc I'll try to produce both. The problem is that the failure occurs sporadically.
Since last week this has been occurring consistently in some of our builds (in the Office ADO tenant).
Relevant versions: dotnet ADO task 2.221.0; Microsoft (R) Test Execution Command Line Tool Version 17.6.3 (x64). Repros with both .NET 7.0.306 and 7.0.400.
The excerpt below from the ADO build log shows the issue. I'm wondering if there is some encoding applied to the "Code coverage" value which leads to it being split, as the displayed command works fine locally.
C:\hostedtoolcache\windows\dotnet\dotnet.exe test D:\a_work\1\s\Omex.sln --logger trx --results-directory D:\a_work_temp -c Release "-p:Platform=Any CPU" --no-build --no-restore --collect "Code coverage" --filter FullyQualifiedName!~CloudTests
MSBUILD : error MSB1008: Only one project can be specified.
Full command line: 'c:\hostedtoolcache\windows\dotnet\sdk\7.0.306\MSBuild.dll -maxcpucount -verbosity:m -target:VSTest -nodereuse:false -nologo -property:VSTestTestCaseFilter="FullyQualifiedName!~CloudTests" -property:VSTestLogger="trx" -property:VSTestNoBuild=true -property:VSTestResultsDirectory="D:\a_work_temp" -property:VSTestCollect="Code" -property:Configuration=Release D:\a_work\1\s\Omex.sln -p:Platform=Any CPU coverage -p:UseSharedCompilation=false -p:EmitCompilerGeneratedFiles=true -property:VSTestArtifactsProcessingMode=collect -property:VSTestSessionCorrelationId=6036_971960d5-00bc-428a-ac56-b17c8925fee4 -distributedlogger:Microsoft.DotNet.Tools.MSBuild.MSBuildLogger,c:\hostedtoolcache\windows\dotnet\sdk\7.0.306\dotnet.dll*Microsoft.DotNet.Tools.MSBuild.MSBuildForwardingLogger,c:\hostedtoolcache\windows\dotnet\sdk\7.0.306\dotnet.dll'
Switches appended by response files:
Switch: CPU
Can share the full build internally, if necessary.
@K-Cully are you able to produce a build with debug tracing turned on?
I'm not able to: I ran 20+ builds manually and all of them succeeded. Meanwhile, automatically triggered builds continue to fail from time to time (across repos, without any related changes).
Yes, it reproduces in one of our pipelines on every run. I'm not going to share the logs here, though; the Office org is very sensitive about that sort of thing. Diagnostic logs are available here, if you have access: https://dev.azure.com/office/OC/_build/results?buildId=22098168&view=logs
I encountered the same problem. Any updates?
We hit the same error, but in the AzureCLI@2 task. It seems the bug lies in a fundamental component of Azure Pipelines and is not limited to a specific task.
We are also encountering the same build error. Please prioritize it. Thanks!
https://onebranch.visualstudio.com/Support/_workitems/edit/172772
It seems this is also a known issue in OneBranch, and disabling CodeQL might be a workaround.
We are also seeing the same issue sporadically on our side. It does seem like the coverage option is being treated as a project. I'd also ask that this be prioritized.
We have found another workaround: encoding the space in "Code coverage" as %20, so that the option becomes --collect:Code%20coverage
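Applied to a task configuration, the workaround looks roughly like this (a sketch; the solution name is a placeholder):
- task: DotNetCoreCLI@2
  inputs:
    command: test
    projects: 'MySolution.sln'
    arguments: '--collect:Code%20coverage'
Presumably, because the value no longer contains a space, there is nothing for the faulty re-split to break apart.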
@v-mohithgc may I ask what it's waiting for?
Awaiting a response from the OneBranch team.
@v-mohithgc they're offering a workaround of replacing the space with `%20` and that's about it. Please escalate.
We are facing the same intermittent issue, the %20 workaround worked for us.
My team was also broken by this for a number of internal repos.
@v-mohithgc please contact the team OneBranch pointed to as the source of the issue - CodeQL.
We are seeing this happen in many pipelines in the dev.azure.com/microsoft tenant as well. The logging from the task seems to indicate that the issue is actually in the dotnet test code: the command line that is logged for the dotnet test command appears correct, but the subsequent (transformed) command line that runs msbuild is where the parsing error has occurred.
We have tried multiple different options for passing in the --collect argument, none of which have been consistently successful (--collect:"Code coverage", --collect "Code coverage", etc.).
IMO the bug is in the parsing/transformation when the value is quoted and includes a space, but that is only a hypothesis.
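To illustrate that hypothesis, here is a sketch of the suspected failure mode (TypeScript, illustration only; this is not the actual task or SDK code):
// Illustration only: how a quote-unaware re-split of an already-composed
// argument string produces the exact split seen in the logs above.
const args = '--collect "Code coverage" --no-build';

// Quote-unaware split: the quoted value breaks into two tokens, and the
// stray "coverage" token is later treated by MSBuild as a second project.
const naive = args.split(/\s+/);
// -> [ '--collect', '"Code', 'coverage"', '--no-build' ]

// A quote-aware tokenizer keeps the quoted span intact.
const quoteAware = args.match(/"[^"]*"|\S+/g) ?? [];
// -> [ '--collect', '"Code coverage"', '--no-build' ]

console.log(naive);
console.log(quoteAware);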
Can everyone confirm it stopped failing? It did for me; the CodeQL team has reportedly fixed the issue.
The CodeQL team has reported that this issue has been fixed, so I will be closing this issue for now. Please feel free to reopen if it occurs again. Thanks!
Task name
DotNetCoreCLI
Task version
2.221.0
Environment type (Please select at least one environment where you face this issue)
Azure DevOps Server type
dev.azure.com (formerly visualstudio.com)
Operating system
OS: Windows Container image: onebranch.azurecr.io/windows/ltsc2019/vse2022:latest
Task log
When works:
Url: https://dev.azure.com/msazure/One/_build/results?buildId=77041080&view=logs&j=fa1208be-7322-5e86-1637-d1f5bee81ceb&t=6b44f5bc-7ea1-5a94-5862-d9bc9577db38&l=117
When fails:
Url: https://dev.azure.com/msazure/One/_build/results?buildId=77263266&view=logs&j=15dfcb1a-0989-5cf6-3160-3e181e44de87&t=d80a394b-d4ad-5cf4-e035-192d3dcb341d&l=41
Relevant log output
Additional info
Account: msazure/One Pool: OneBranchPipelines on 1ES Hosted Pool
Task definition: