microsoft / testfx

MSTest framework and adapter
MIT License

Incorrect test result view in Azure DevOps after upgrading MSTest to 2.2.4 and higher #1026

Closed mabadev closed 1 year ago

mabadev commented 2 years ago

Description

After I updated MSTest.Framework and MSTest.Adapter to 2.2.4 or higher, the test results in Azure DevOps are displayed incorrectly: each test iteration is displayed on its own line. Before, the iterations were shown grouped under their test case.

For example, a test with 2 iterations:

Before: TestCase12

Currently:

This is a screenshot of the test results on Azure DevOps. For example: Test Case 1129 image


The TRX file also looks different: after the upgrade, the trx file no longer contains an <InnerResults> element.

Here is a screenshot from the trx file: image


I use VS 2019 16.11.6. I'm not sure if this is a bug or a new configuration. This is my test configuration: Configure unit tests by using a .runsettings file

Legoldos commented 2 years ago

Hey, I think this issue I recently posted is related/the same: #1024

mabadev commented 2 years ago

Hey, I think this issue I recently posted is related/the same: #1024

I think the main problem is the same. thanks for that

nixdagbibts commented 2 years ago

Any progress on this topic? We cannot upgrade from 2.2.3 to 2.2.8 until this is fixed :(

nohwnd commented 2 years ago

This is not a bug; MSTest now reports data-driven tests as individual tests. This was done to bring parity with how xUnit and NUnit do it.

@Haplois There are a few duplicates of this issue. Could you link them all to a single one, and post an update on when the work on AzDO will be finished, please?

Also correct my statement above if I am wrong :)

TroyWalshProf commented 2 years ago

@nohwnd
The ONLY reason I and many other engineers use MSTest is the seamless integration with Azure DevOps Test Hub. If you cannot maintain this integration, you are just undermining the project.

Here are my concerns:

  1. The team broke this integration a year ago and it has yet to be addressed
  2. The interesting (wrong) assertion that this is "not a bug"
  3. The trx report format has fundamentally changed, yet that apparently didn't justify even a minor or major version bump
  4. I know this project and the Azure DevOps Visual Studio Test task are separate projects, but this kind of thing has the stink of the bad old days of Microsoft - AKA Microsoft cannot seem to work even with itself

As far as I can tell here are the related/duplicate issues:

nohwnd commented 2 years ago

Duly noted. Believe me, I am not happy about this either. We are looking into conditionally reverting this change until it is fixed in azdo.

TroyWalshProf commented 2 years ago

@nohwnd - If I may make a suggestion: I would love to see splitting or consolidating tests become optional, much like parallelization, perhaps via an attribute such as "DataDrivenToSingleTest". This way users could choose how such tests are grouped. *This would follow the same basic pattern as the "DoNotParallelize" attribute.

Example:

        [DataTestMethod]
        [DataDrivenToSingleTest] // Allow users to set this on the test, test class, and in the run settings file like you can for parallel
        [DataRow("ConfigJsonEnvRunOverride", "OVERRIDE")]
        [DataRow("OverrideOnly", "OVERRIDE")]
        public void OverrideTest(string key, string expected) // hypothetical signature, added for illustration
        {
            // ...
        }
Legoldos commented 2 years ago

This is not a bug; MSTest now reports data-driven tests as individual tests. This was done to bring parity with how xUnit and NUnit do it.

@Haplois There are a few duplicates of this issue. Could you link them all to a single one, and post an update on when the work on AzDO will be finished, please?

Also correct my statement above if I am wrong :)

Hello @nohwnd, can you please provide a link to some kind of documentation where this new behavior is explained? I couldn't find anything in the MSTest release notes. I would like something to grasp on to when I explain this new behavior to my team.

mabadev commented 2 years ago

Duly noted. Believe me, I am not happy about this either. We are looking into conditionally reverting this change until it is fixed in azdo.

Hoping to resolve this obstacle or find an alternative solution for it. Our project has suffered because of it.

ChristoWolf commented 2 years ago

Hello @nohwnd, can you please provide a link to some kind of documentation where this new behavior is explained? I couldn't find anything in the MSTest release notes. I would like something to grasp on to when I explain this new behavior to my team.

I would also like to know, as I did not find the related changes by glancing over 2.2.4's release notes and commits.

MonzT commented 2 years ago

Are there any updates on this yet? We are also getting very frustrated, and using workarounds is less than ideal.

nohwnd commented 2 years ago

@Haplois @Evangelink did you get any update from AzDo about the progress of the fix? I know we talked with them a while ago.

Haplois commented 2 years ago

@nohwnd We didn't get an update - I pinged them.

Haplois commented 2 years ago

They have a fix and expect to ship it next sprint. I asked about the exact date.

Nidolai commented 2 years ago

@Haplois Thanks for the update.

charlesdwmorrison commented 2 years ago

Getting the next sprint date would be great. As far as I can see, this change significantly breaks all test harnesses that are heavily data-driven, i.e. test harnesses with many rows in Excel, Access, or a database. Teams following that methodology may have 100 test cases in one spreadsheet, with different data rows of the spreadsheet testing different features, all going through one test method. While this is not ideal, it is how a lot of teams/organizations perform testing. If one of these rows fails, the entire test fails? That really skews the results.

Related well-reasoned posts on this issue include: https://developercommunity.visualstudio.com/t/vstest-test-publication-miscounts-test-cases-and-c/909375 https://developercommunity.visualstudio.com/t/tests-tab-on-build-results-page-shows-wrong-test-c/1385811

Looking forward to seeing what the fix is. If it is necessary to bring MSTest.exe in line with NUnit and xUnit, there should be a command-line flag to allow for backward compatibility with the old approach.

Haplois commented 2 years ago

I spoke with the product team again; the fix is in the release pipeline and will be deployed within this sprint (which ends on the 12th of September). I couldn't get an exact date.

/cc @dogirala from the product team.

charlesdwmorrison commented 1 year ago

So when will milestone 2.3.0 be pushed out?

Evangelink commented 1 year ago

Hi @charlesdwmorrison! My mistake, I should not have put any milestone on this ticket. As far as we know, the sprint ended on the 12th of September and the rollout is supposed to have happened (@dogirala could you confirm?).

If you have tested and are still experiencing the issue, please let us know and we will ping the Azure DevOps team.

dogirala commented 1 year ago

@Evangelink, @charlesdwmorrison I see the fix has been deployed, but somehow it isn't reflected in the microsoft org, which is strange. For the microsoft org I still see the old PTR task version, 2.203.0; ideally it should be 2.210.0. I am checking with my team and the concerned deployments team. I will let you know as soon as I find out something.

image

Another example pipeline: image

dogirala commented 1 year ago

I got to know that this is expected, as the deployments are not completely done yet and are still in progress. The ETA is the end of next week.

dogirala commented 1 year ago

The fix is deployed now. All deployments are completed.

Evangelink commented 1 year ago

I will move forward and close this ticket. If anyone is still experiencing the issue, please either comment below or create a new ticket and we will handle it.

MattBussing commented 1 year ago

I'm confused. I thought there would be a new version of MSTest released to fix this issue. I'm not seeing a new version on NuGet.org.

https://www.nuget.org/packages/MSTest.TestFramework/2.3.0-preview-20220810-02#versions-body-tab

charlesdwmorrison commented 1 year ago

I am still seeing the same test case count as I have for the past two years. In the test shown in the screenshot below, there are 126 data-driven tests (in the form of an Excel spreadsheet), but the test results show only 1 test. Where would the data-driven counts show up?

image

ChristoWolf commented 1 year ago

I'm confused. I thought there would be a new version of MSTest released to fix this issue. I'm not seeing a new version on NuGet.org.

https://www.nuget.org/packages/MSTest.TestFramework/2.3.0-preview-20220810-02#versions-body-tab

I'm also a bit confused. My understanding now is that the discussed fix from above is not a fix to MSTest but to the test results view in the build results, i.e. the new test results format will now be visualized similarly to before the MSTest format change, right? How can I make use of this when using Azure DevOps Server (i.e. on-prem)?

dogirala commented 1 year ago

For on-prem I think you'll need to download the latest agent and configure it for the new changes to be included

Evangelink commented 1 year ago

Hey there! Sorry the messages weren't super clear.

The issue raised in this ticket wasn't linked to a problem in MSTest per se; it was linked to a change of behavior in MSTest (which should not have been done in a patch version) that wasn't reflected on the AzDo side. The AzDo team updated the behavior on their side and we were waiting for it to be deployed (according to @dogirala it was done 2 days ago).

Regarding on-prem instances, @dogirala could you please let us know what's the process?

@charlesdwmorrison Could you confirm that you are using non on-prem AzDo?

I will run some more manual tests to confirm the issue is correctly fixed and will post my results here. In the meantime, I will reopen the ticket.

charlesdwmorrison commented 1 year ago

I believe that is correct. Our AzDo instance is hosted by Microsoft, but we have our own agent machines. So that is "not on-prem."

dogirala commented 1 year ago

For on-prem instances, as far as I know you need to configure a new agent or update the existing agent in the Agent Pools section of the organization for the latest changes to be picked up:

new agent: image

image

Update agent:

image

charlesdwmorrison commented 1 year ago

OK. I upgraded to agent version 2.210.1. I do see some changes, but the donut is still showing 1 test case for 126 data driven test cases. We can even see in the log that there are 126 cases. Is there another place I could see the full test case count? Is there more info I can give you?

image

dogirala commented 1 year ago

Ah, it looks like you need to update your MSTest adapter version as well. What MSTest adapter version are you using?

charlesdwmorrison commented 1 year ago

Hi, sorry I've been gone for a while; I was in the hospital. We are using Visual Studio 2017. Typically we do not use a NuGet package for MSTest.exe; we use the MSFramework.dll that is installed on our local hard drives, and we are typically not using any TestAdapter. We are typically on .NET Framework 4.5.2. I have tried several versions of MSTest.TestFramework in combination with various versions of MSTest.TestAdapter (for instance 2.2.10), and when I run it in the pipeline with the recommended test agent I get an error about "test adapter not found." So I have several questions:

  1. If we stick with Visual Studio 2017, is there any NuGet package combination that will allow us to get a correct data-driven count?
  2. What's the lowest version of Visual Studio and .NET Framework on which you have tested the data-driven test functionality?

I did upgrade one of our test solutions to .Net 4.6 and I got the following result:

image

dogirala commented 1 year ago

Hi, I feel the TestPlatform team can answer these questions better. @nohwnd can you or someone from your team please help clarify the above queries? For context, we recently made a bug fix to resolve the duplicate test case names observed for unit tests with MSTest adapter version 2.2.4+, wherein the hierarchical display for data-driven tests is replaced with a flat structure. @charlesdwmorrison is still seeing the hierarchical display and I think he is trying to move to the flat structure.

charlesdwmorrison commented 1 year ago

Just to be clear, what I am looking for is some method for Azure DevOps to give an accurate count of the tests executed. Since our tests are data-driven from an Excel spreadsheet, we are looking, in effect, for Azure DevOps to count the number of data rows we pass through a test method. In the above screenshot, the number next to the doughnut should, in our opinion, NOT say "1"; it should say "3", because we passed three different rows of Excel data through the test method. Those rows are three different tests.

charlesdwmorrison commented 1 year ago

Any update on this issue?

Evangelink commented 1 year ago

Thanks for the ping @charlesdwmorrison, I haven't managed to work on this issue yet. I will try at the end of the week; worst case it will be early next week.

charlesdwmorrison commented 1 year ago

Any update on this issue? It seems to me that this was the topic of this particular bug fix. Did the fix resolve the issue?

Evangelink commented 1 year ago

@dogirala Could we set up a meeting/call to discuss this? As far as I can see, the main difference is that up to 2.2.3 we (MSTest) were using <InnerResults/> for storing the instances of a parameterized test (e.g. DataRow), which was leading to an incorrect count. For example, assuming you have a test method with 3 datarows, you will see 4 items (UnitTestResult) in the .trx file: 1 for the "test method" and 3 as children of the "test method".

Real example with MSTest 2.2.3:

<Results>
    <UnitTestResult executionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod" computerName="LAPTOP" duration="00:00:00.0034511" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="7c368e52-7c43-4b1a-8696-e12d5003d061" resultType="DataDrivenTest">
        <InnerResults>
            <UnitTestResult executionId="6deb9d3f-02f8-4592-a7ad-da2c307b48d8" parentExecutionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod (0)" computerName="LAPTOP" duration="00:00:00.0001003" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="6deb9d3f-02f8-4592-a7ad-da2c307b48d8" dataRowInfo="1" resultType="DataDrivenDataRow" />
            <UnitTestResult executionId="1b8b1f06-a581-40ad-b8fd-af918c42b9b8" parentExecutionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod (1)" computerName="LAPTOP" duration="00:00:00.0000113" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="1b8b1f06-a581-40ad-b8fd-af918c42b9b8" dataRowInfo="2" resultType="DataDrivenDataRow" />
            <UnitTestResult executionId="c9490162-9341-4c9c-9be8-b34528d4b793" parentExecutionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod (2)" computerName="LAPTOP" duration="00:00:00.0000042" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="c9490162-9341-4c9c-9be8-b34528d4b793" dataRowInfo="3" resultType="DataDrivenDataRow" />
        </InnerResults>
    </UnitTestResult>
</Results>

Real example with MSTest 2.2.4+:

<Results>
    <UnitTestResult executionId="61cba81d-ea31-49ba-90e2-392492ded38a" testId="fbcb88c4-070c-4e3c-a6fc-fac91aafc616" testName="DataRowTestMethod (1)" computerName="LAPTOP" duration="00:00:00.0000037" startTime="2022-11-08T15:22:22.6686938+01:00" endTime="2022-11-08T15:22:22.6688093+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="61cba81d-ea31-49ba-90e2-392492ded38a" />
    <UnitTestResult executionId="c953f320-0876-425b-9cef-1fa6a3fbfd2f" testId="face297a-edd7-48ce-9aaa-65a72fe2c332" testName="DataRowTestMethod (0)" computerName="LAPTOP" duration="00:00:00.0000577" startTime="2022-11-08T15:22:22.6647551+01:00" endTime="2022-11-08T15:22:22.6686331+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="c953f320-0876-425b-9cef-1fa6a3fbfd2f" />
    <UnitTestResult executionId="79730bea-94e1-42e5-81ee-544788341d60" testId="22390aeb-07a5-4cbe-bdcd-2fe6c90b72f7" testName="DataRowTestMethod (2)" computerName="LAPTOP" duration="00:00:00.0000034" startTime="2022-11-08T15:22:22.6688280+01:00" endTime="2022-11-08T15:22:22.6690112+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="79730bea-94e1-42e5-81ee-544788341d60" />
</Results>
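To make the structural difference concrete, here is a small Python sketch. The XML strings are simplified stand-ins for the two `<Results>` sections above (most attributes trimmed, no namespace); this is illustrative, not the full trx schema:

```python
import xml.etree.ElementTree as ET

# Simplified 2.2.3-style shape: one parent entry, data rows nested as children.
TRX_2_2_3 = """
<Results>
  <UnitTestResult testId="p" testName="DataRowTestMethod" resultType="DataDrivenTest">
    <InnerResults>
      <UnitTestResult testId="p" testName="DataRowTestMethod (0)" resultType="DataDrivenDataRow" />
      <UnitTestResult testId="p" testName="DataRowTestMethod (1)" resultType="DataDrivenDataRow" />
      <UnitTestResult testId="p" testName="DataRowTestMethod (2)" resultType="DataDrivenDataRow" />
    </InnerResults>
  </UnitTestResult>
</Results>
"""

# Simplified 2.2.4+-style shape: each data row is a flat, top-level entry.
TRX_2_2_4 = """
<Results>
  <UnitTestResult testId="a" testName="DataRowTestMethod (0)" />
  <UnitTestResult testId="b" testName="DataRowTestMethod (1)" />
  <UnitTestResult testId="c" testName="DataRowTestMethod (2)" />
</Results>
"""

def count_results(trx_results: str) -> tuple[int, int]:
    """Return (top-level, nested-in-InnerResults) UnitTestResult counts."""
    root = ET.fromstring(trx_results)
    top = root.findall("UnitTestResult")  # direct children only
    inner = root.findall(".//InnerResults/UnitTestResult")
    return len(top), len(inner)

print(count_results(TRX_2_2_3))  # (1, 3): one parent, three children sharing its testId
print(count_results(TRX_2_2_4))  # (3, 0): three independent entries, each with its own testId
```

A consumer that only counts top-level entries reports 1 test for the old shape and 3 for the new one, which is exactly the count change discussed in this thread.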

What would be interesting is to understand how the grouping and counting is done. I would expect that the className and name fields of the TestDefinitions/UnitTest/TestMethod nodes are used.

I am not sure how things work on the AzDo side, so I cannot really tell whether something unexpected is happening on the MSTest side or whether the issue is on the AzDo side.

charlesdwmorrison commented 1 year ago

I am good anytime for a meeting. What meeting platform would we use? Teams?

dogirala commented 1 year ago

Hi @Evangelink, yes I am available to connect tomorrow around 3-4 IST if that works for everyone. To answer your question:

For data-driven tests with MSTest up to 2.2.3, from what I know, each entry in the TestDefinitions tag is treated as one individual test case, and the unique test name for a test case is className + name, as you expected. Once the test cases are defined, the InnerResults tags for each test case are parsed and stored as subresults of that particular test case; they are not treated as individual test cases. If you look, even the testId attribute is the same for all entries in the InnerResults tags, matching the id of the parent test case they belong to.

image

This was the existing behaviour in MSTest 2.2.3, and we had discussions with the TestPlatform team about modifying this behaviour to send the test cases with individual test ids and to remove the InnerResults tag, but we considered it an extensive change that might break a lot of existing scenarios. So the count is coming out as expected; it is the current behaviour, as there is no way to treat the InnerResults as individual test cases at the moment.

On the other hand, with the changes done in MSTest 2.2.4+, for data-driven tests each data row is considered an individual test and they all have individual run ids, so if you wish to count each data row of a data-driven test as an individual test case, you'd need to update the MSTest adapter to the latest version.

image
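The className + name key mentioned above is also enough for a results consumer to regroup the flat 2.2.4+ definitions back into logical tests. A hedged Python sketch (the XML is a simplified stand-in for the TestDefinitions section of the 2.2.4+ trx, not the full schema):

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Simplified 2.2.4+-style <TestDefinitions>: each data row is a separate
# <UnitTest> with its own id, but all rows share the same className + name.
DEFS = """
<TestDefinitions>
  <UnitTest name="TestMethod1" id="id-1">
    <TestMethod className="MSTestProject.UnitTest1" name="TestMethod1" />
  </UnitTest>
  <UnitTest name="DataRowTestMethod (0)" id="id-2">
    <TestMethod className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
  </UnitTest>
  <UnitTest name="DataRowTestMethod (1)" id="id-3">
    <TestMethod className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
  </UnitTest>
  <UnitTest name="DataRowTestMethod (2)" id="id-4">
    <TestMethod className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
  </UnitTest>
</TestDefinitions>
"""

def group_by_method(defs_xml: str) -> dict[str, int]:
    """Group UnitTest definitions by 'className.name'; value = entries per group."""
    groups: defaultdict[str, int] = defaultdict(int)
    for unit in ET.fromstring(defs_xml).findall("UnitTest"):
        method = unit.find("TestMethod")
        key = f'{method.get("className")}.{method.get("name")}'
        groups[key] += 1
    return dict(groups)

print(group_by_method(DEFS))
# {'MSTestProject.UnitTest1.TestMethod1': 1,
#  'MSTestProject.UnitTest1.DataRowTestMethod': 3}
```

Grouping this way would give back a hierarchical display (one logical test with 3 rows) while still counting 4 executed tests, which is roughly the UX improvement discussed later in the thread.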

Evangelink commented 1 year ago

@charlesdwmorrison I will start by having a first call with @dogirala to sync on how things work and what can (or cannot) be changed. If there are open points, I will set up another meeting between the three of us; if things look OK, I will post the results of our discussion here.

Evangelink commented 1 year ago

Hey there. Thanks for bearing with us. We did more investigation with @dogirala and here are the results.

Code example

When running the tests for the following code:

namespace MSTestProject
{
    [TestClass]
    public class UnitTest1
    {
        [TestMethod]
        public void TestMethod1()
        {
        }

        [DataRow(0)]
        [DataRow(1)]
        [DataRow(2)]
        [TestMethod]
        public void DataRowTestMethod(int i)
        {
        }
    }
}

MSTest up to 2.2.3 (included)

Result file (.trx)

The test result file (.trx) would look like this:

<?xml version="1.0" encoding="utf-8"?>
<TestRun id="89196f25-1585-46cb-9429-10e74cc201f4" name="currentUser@MyMachine 2022-11-08 15:25:41" runUser="currentUser" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Times creation="2022-11-08T15:25:41.2734026+01:00" queuing="2022-11-08T15:25:41.2734030+01:00" start="2022-11-08T15:25:39.4674650+01:00" finish="2022-11-08T15:25:41.2866232+01:00" />
  <TestSettings name="default" id="a8af139d-1d5f-4b27-a369-5c0ee9dcc696">
    <Deployment runDeploymentRoot="currentUser_MyMachine_2022-11-08_15_25_41" />
  </TestSettings>
  <Results>
    <UnitTestResult executionId="6c135de3-73eb-4cbe-8f39-337340a6d650" testId="12713494-ea68-505c-1ae3-a995253433bd" testName="TestMethod1" computerName="MyMachine" duration="00:00:00.0077905" startTime="2022-11-08T15:25:40.8935789+01:00" endTime="2022-11-08T15:25:40.9730048+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="6c135de3-73eb-4cbe-8f39-337340a6d650" />
    <UnitTestResult executionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod" computerName="MyMachine" duration="00:00:00.0034511" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="7c368e52-7c43-4b1a-8696-e12d5003d061" resultType="DataDrivenTest">
      <InnerResults>
        <UnitTestResult executionId="6deb9d3f-02f8-4592-a7ad-da2c307b48d8" parentExecutionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod (0)" computerName="MyMachine" duration="00:00:00.0001003" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="6deb9d3f-02f8-4592-a7ad-da2c307b48d8" dataRowInfo="1" resultType="DataDrivenDataRow" />
        <UnitTestResult executionId="1b8b1f06-a581-40ad-b8fd-af918c42b9b8" parentExecutionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod (1)" computerName="MyMachine" duration="00:00:00.0000113" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="1b8b1f06-a581-40ad-b8fd-af918c42b9b8" dataRowInfo="2" resultType="DataDrivenDataRow" />
        <UnitTestResult executionId="c9490162-9341-4c9c-9be8-b34528d4b793" parentExecutionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" testName="DataRowTestMethod (2)" computerName="MyMachine" duration="00:00:00.0000042" startTime="2022-11-08T15:25:40.9841452+01:00" endTime="2022-11-08T15:25:40.9899959+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="c9490162-9341-4c9c-9be8-b34528d4b793" dataRowInfo="3" resultType="DataDrivenDataRow" />
      </InnerResults>
    </UnitTestResult>
  </Results>
  <TestDefinitions>
    <UnitTest name="TestMethod1" storage="c:\src\mstest-playground\mstestproject\mstestproject\bin\debug\net6.0\mstestproject.dll" id="12713494-ea68-505c-1ae3-a995253433bd">
      <Execution id="6c135de3-73eb-4cbe-8f39-337340a6d650" />
      <TestMethod codeBase="C:\src\mstest-playground\MSTestProject\MSTestProject\bin\Debug\net6.0\MSTestProject.dll" adapterTypeName="executor://mstestadapter/v2" className="MSTestProject.UnitTest1" name="TestMethod1" />
    </UnitTest>
    <UnitTest name="DataRowTestMethod" storage="c:\src\mstest-playground\mstestproject\mstestproject\bin\debug\net6.0\mstestproject.dll" id="cffe872d-a94b-a72a-efd2-1fb9ec28c26b">
      <Execution id="7c368e52-7c43-4b1a-8696-e12d5003d061" />
      <TestMethod codeBase="C:\src\mstest-playground\MSTestProject\MSTestProject\bin\Debug\net6.0\MSTestProject.dll" adapterTypeName="executor://mstestadapter/v2" className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
    </UnitTest>
  </TestDefinitions>
  <TestEntries>
    <TestEntry testId="12713494-ea68-505c-1ae3-a995253433bd" executionId="6c135de3-73eb-4cbe-8f39-337340a6d650" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="cffe872d-a94b-a72a-efd2-1fb9ec28c26b" executionId="7c368e52-7c43-4b1a-8696-e12d5003d061" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
  </TestEntries>
  <TestLists>
    <TestList name="Results Not in a List" id="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestList name="All Loaded Results" id="19431567-8539-422a-85d7-44ee4e166bda" />
  </TestLists>
  <ResultSummary outcome="Completed">
    <Counters total="5" executed="5" passed="5" failed="0" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
  </ResultSummary>
</TestRun>
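The counting subtlety in this file can be checked mechanically: the ResultSummary total includes both the parent entry and its InnerResults children, while only two entries exist at the top level. A Python sketch against a minimal namespaced skeleton of the file above (attributes trimmed, not the full schema):

```python
import xml.etree.ElementTree as ET

NS = {"t": "http://microsoft.com/schemas/VisualStudio/TeamTest/2010"}

# Minimal skeleton mirroring the 2.2.3 trx above: 2 top-level results
# (one of which has 3 InnerResults children) and Counters total="5".
TRX = """
<TestRun xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Results>
    <UnitTestResult testName="TestMethod1" />
    <UnitTestResult testName="DataRowTestMethod">
      <InnerResults>
        <UnitTestResult testName="DataRowTestMethod (0)" />
        <UnitTestResult testName="DataRowTestMethod (1)" />
        <UnitTestResult testName="DataRowTestMethod (2)" />
      </InnerResults>
    </UnitTestResult>
  </Results>
  <ResultSummary outcome="Completed">
    <Counters total="5" executed="5" passed="5" />
  </ResultSummary>
</TestRun>
"""

root = ET.fromstring(TRX)
top_level = root.findall("t:Results/t:UnitTestResult", NS)   # direct children of <Results>
all_results = root.findall(".//t:UnitTestResult", NS)        # parents and children alike
total = int(root.find("t:ResultSummary/t:Counters", NS).get("total"))

print(len(top_level))   # 2 -> what a consumer sees reading only top-level entries
print(len(all_results)) # 5 -> parent + children, matching Counters total="5"
print(total)            # 5
```

So in the 2.2.3 format a consumer must decide whether "total" means logical tests (2) or executed results (5), which is the ambiguity the 2.2.4 flat format removes.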

What's important to note is:

- There is a single UnitTest definition (and a single testId) for DataRowTestMethod; the three data rows appear only as <InnerResults> children sharing the parent's testId.
- The ResultSummary counts both the parent entry and its children: total="5" for 1 simple test + 1 parent + 3 data rows.

Azure DevOps

Azure DevOps parses the trx file and uses the results to populate its UI. For the previous trx file, the display looks like this:

image

MSTest from 2.2.4 (included)

Result file (.trx)

The test result file (.trx) would look like this:

<?xml version="1.0" encoding="utf-8"?>
<TestRun id="f3a5ab2c-96f6-4c7f-a5c7-6f239db27664" name="currentUser@MyMachine 2022-11-08 15:22:22" runUser="currentUser" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Times creation="2022-11-08T15:22:22.8553679+01:00" queuing="2022-11-08T15:22:22.8553683+01:00" start="2022-11-08T15:22:21.9392535+01:00" finish="2022-11-08T15:22:22.8715965+01:00" />
  <TestSettings name="default" id="d505dfd8-9081-4ad5-b030-af6556d675c2">
    <Deployment runDeploymentRoot="currentUser_MyMachine_2022-11-08_15_22_22" />
  </TestSettings>
  <Results>
    <UnitTestResult executionId="61cba81d-ea31-49ba-90e2-392492ded38a" testId="fbcb88c4-070c-4e3c-a6fc-fac91aafc616" testName="DataRowTestMethod (1)" computerName="MyMachine" duration="00:00:00.0000037" startTime="2022-11-08T15:22:22.6686938+01:00" endTime="2022-11-08T15:22:22.6688093+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="61cba81d-ea31-49ba-90e2-392492ded38a" />
    <UnitTestResult executionId="c953f320-0876-425b-9cef-1fa6a3fbfd2f" testId="face297a-edd7-48ce-9aaa-65a72fe2c332" testName="DataRowTestMethod (0)" computerName="MyMachine" duration="00:00:00.0000577" startTime="2022-11-08T15:22:22.6647551+01:00" endTime="2022-11-08T15:22:22.6686331+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="c953f320-0876-425b-9cef-1fa6a3fbfd2f" />
    <UnitTestResult executionId="79730bea-94e1-42e5-81ee-544788341d60" testId="22390aeb-07a5-4cbe-bdcd-2fe6c90b72f7" testName="DataRowTestMethod (2)" computerName="MyMachine" duration="00:00:00.0000034" startTime="2022-11-08T15:22:22.6688280+01:00" endTime="2022-11-08T15:22:22.6690112+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="79730bea-94e1-42e5-81ee-544788341d60" />
    <UnitTestResult executionId="d6e08bfa-b602-497a-8477-3d0711620467" testId="6349b831-af4e-4046-87c2-ec8e3ecffc93" testName="TestMethod1" computerName="MyMachine" duration="00:00:00.0031628" startTime="2022-11-08T15:22:22.6412291+01:00" endTime="2022-11-08T15:22:22.6584147+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="d6e08bfa-b602-497a-8477-3d0711620467" />
  </Results>
  <TestDefinitions>
    <UnitTest name="TestMethod1" storage="c:\src\mstest-playground\mstestproject\mstestproject\bin\debug\net6.0\mstestproject.dll" id="6349b831-af4e-4046-87c2-ec8e3ecffc93">
      <Execution id="d6e08bfa-b602-497a-8477-3d0711620467" />
      <TestMethod codeBase="C:\src\mstest-playground\MSTestProject\MSTestProject\bin\Debug\net6.0\MSTestProject.dll" adapterTypeName="executor://mstestadapter/v2" className="MSTestProject.UnitTest1" name="TestMethod1" />
    </UnitTest>
    <UnitTest name="DataRowTestMethod (1)" storage="c:\src\mstest-playground\mstestproject\mstestproject\bin\debug\net6.0\mstestproject.dll" id="fbcb88c4-070c-4e3c-a6fc-fac91aafc616">
      <Execution id="61cba81d-ea31-49ba-90e2-392492ded38a" />
      <TestMethod codeBase="C:\src\mstest-playground\MSTestProject\MSTestProject\bin\Debug\net6.0\MSTestProject.dll" adapterTypeName="executor://mstestadapter/v2" className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
    </UnitTest>
    <UnitTest name="DataRowTestMethod (0)" storage="c:\src\mstest-playground\mstestproject\mstestproject\bin\debug\net6.0\mstestproject.dll" id="face297a-edd7-48ce-9aaa-65a72fe2c332">
      <Execution id="c953f320-0876-425b-9cef-1fa6a3fbfd2f" />
      <TestMethod codeBase="C:\src\mstest-playground\MSTestProject\MSTestProject\bin\Debug\net6.0\MSTestProject.dll" adapterTypeName="executor://mstestadapter/v2" className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
    </UnitTest>
    <UnitTest name="DataRowTestMethod (2)" storage="c:\src\mstest-playground\mstestproject\mstestproject\bin\debug\net6.0\mstestproject.dll" id="22390aeb-07a5-4cbe-bdcd-2fe6c90b72f7">
      <Execution id="79730bea-94e1-42e5-81ee-544788341d60" />
      <TestMethod codeBase="C:\src\mstest-playground\MSTestProject\MSTestProject\bin\Debug\net6.0\MSTestProject.dll" adapterTypeName="executor://mstestadapter/v2" className="MSTestProject.UnitTest1" name="DataRowTestMethod" />
    </UnitTest>
  </TestDefinitions>
  <TestEntries>
    <TestEntry testId="fbcb88c4-070c-4e3c-a6fc-fac91aafc616" executionId="61cba81d-ea31-49ba-90e2-392492ded38a" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="face297a-edd7-48ce-9aaa-65a72fe2c332" executionId="c953f320-0876-425b-9cef-1fa6a3fbfd2f" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="22390aeb-07a5-4cbe-bdcd-2fe6c90b72f7" executionId="79730bea-94e1-42e5-81ee-544788341d60" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="6349b831-af4e-4046-87c2-ec8e3ecffc93" executionId="d6e08bfa-b602-497a-8477-3d0711620467" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
  </TestEntries>
  <TestLists>
    <TestList name="Results Not in a List" id="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestList name="All Loaded Results" id="19431567-8539-422a-85d7-44ee4e166bda" />
  </TestLists>
  <ResultSummary outcome="Completed">
    <Counters total="4" executed="4" passed="4" failed="0" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
  </ResultSummary>
</TestRun>

What's important to note is:

- Each data row has its own UnitTest definition with a unique testId; there is no <InnerResults> element and no parent entry.
- The ResultSummary counts each data row once: total="4" for 1 simple test + 3 data rows.

Azure DevOps

Azure DevOps parses the trx file and uses the results to populate its UI. For the previous trx file, the display looks like this:

image

Reasoning for the change in 2.2.4

This change was made to overcome some issues in MSTest (e.g., when data rows are displayed as children of a single test they cannot have different test IDs, so they are not expanded in Visual Studio, which makes it impossible to run or debug a single case) and to achieve better parity with the results produced by the NUnit and xUnit test frameworks.

What's next

I am going to start a discussion with the AzDO team about improving the UX by allowing some grouping by test while keeping the test count, test filtering, and other functionality working properly.

Evangelink commented 1 year ago

Hi, sorry I've been gone for a while; I was in the hospital. We are using Visual Studio 2017. Typically we do not use a NuGet package; we run MSTest.exe with the MSFramework.dll that is installed on our local hard drives, and we typically do not use any test adapter. We are typically on .NET Framework 4.5.2. I have tried several versions of MSTest.TestFramework in combination with various versions of MSTest.TestAdapter (for instance 2.2.10), and when I run it in the pipeline with the recommended test agent I get an error about "test adapter not found." So I have several questions: 1. If we stick with Visual Studio 2017, is there any NuGet package combination that will give us a correct data-driven count? 2. What is the lowest version of Visual Studio and .NET Framework on which you have tested the data-driven test functionality?

I did upgrade one of our test solutions to .NET 4.6 and got the following result:

image

@charlesdwmorrison We have run some tests and we wonder if there might be a configuration issue on your end, because you say you are using agent 2.210 and MSTest 2.2.10, but your screenshot shows some grouping, which should not happen with this combination. Could you double-check the versions, and ensure the MSTest adapter and framework are on the same version? Could you also confirm that you are not using [assembly: TestDataSourceDiscovery(TestDataSourceDiscoveryOption.DuringExecution)]?
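For reference, keeping the adapter and framework aligned in a PackageReference-style project looks like the fragment below. The version numbers are examples only; the point is that both MSTest packages carry the same number.

```xml
<!-- Both MSTest packages pinned to the same version (2.2.10 is an example). -->
<ItemGroup>
  <PackageReference Include="MSTest.TestAdapter" Version="2.2.10" />
  <PackageReference Include="MSTest.TestFramework" Version="2.2.10" />
</ItemGroup>
```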

charlesdwmorrison commented 1 year ago

Just want to mention I am working on making sure I am giving you the correct information. We have some very legacy elements in our system, including the fact that our ADO is hooked up to a Team Foundation server as the source repository. I am making some changes to ensure that I am giving you the right info.

Evangelink commented 1 year ago

I don't know what's possible on your end, but creating a simple MSTest project outside of any existing solution, together with a pipeline that just builds, tests, and publishes the results of that project, would help confirm whether you see the same behavior while eliminating the influence of other factors.

charlesdwmorrison commented 1 year ago

Following @Evangelink's suggestion, I got some good results:

image

So what I did to get the above results was:

  1. Upgraded my Visual Studio to the latest VS 2022.
  2. Uploaded one of my VS solutions to a private GitHub repository.
  3. Ensured that my solution was using the latest MSTest.TestAdapter and MSTest.TestFramework (installed to the solution via NuGet; see screenshot below).
  4. Ran the tests in a new pipeline, pulling the source code from the new GitHub repository.

image

So this proves to me that ADO CAN correctly count data-driven test cases stored in Excel, and this is great news!

Doing the above steps was much quicker and less painful than the other fiddling around I had been doing (trying to get the correct test adapters, etc.). It was much easier to upgrade to the latest versions of everything.

I think that my problem has been (and your eyes will roll when I tell you) that my team has been stuck on some old software, which includes:

  1. Visual Studio 2017
  2. .NET 4.6
  3. Team Foundation Server as our source code repository (I don't know what version; I have no access to this server)
  4. ADO hooked up to that Team Foundation Server

There is something about this combination that was not letting an ADO pipeline build my solution. I think what might have been happening is that TFS was providing an incorrect MSBuild.exe. When I tried to use the latest TestAdapter and TestFramework, I was getting a lot of errors about other assemblies in my test solution not being found.

My team is overdue for upgrading all of this. I think they have kept an on-premises TFS for security reasons, but it really is time to get away from it. In the meantime, maybe some weird, very old combinations similar to my team's current setup need to be documented as "not supported." I understand we are using some very old stuff and will try to help my team upgrade.

charlesdwmorrison commented 1 year ago

P.S. It also could be that our TFS server is just not configured correctly. I would guess that its configuration has not been revisited since ~2015. It seems like it just doesn't deliver the source code developed in VS 2022 to the Azure DevOps agent correctly. The easy solution is to move to GitHub. Will try to help the team do it.

Update 11/12/22: I was able to create a completely new folder (not a new repo, but a new folder) in our TFS repo, upload the 2022 version of my test solution with the latest adapters and the latest test agent as shown above, and I got good results, with the correct number of test cases displayed in Azure DevOps. This means the older TFS will work as a source repository; I just had to use a fresh folder. Something must have been retained in the older folder, which contained the VS 2015 version of the test solution. For others still getting errors, I would recommend the same approach: try your old source repository but use a fresh folder, and make sure you convert your solution to the latest Visual Studio and the latest adapters.

Evangelink commented 1 year ago

@charlesdwmorrison Awesome news! Thanks for keeping us posted with your results.

I will keep this ticket open as I do think there is some UX improvement that needs to be done on AzDO, and I will discuss with their team how we can proceed.

charlesdwmorrison commented 1 year ago

Regarding the needed UX improvements, here are some observations. Things are not working when executing from a Test Plan. What I am seeing is that with the latest MSTest.TestAdapter and the latest MSTest.TestFramework, data-driven tests ARE reported correctly when executed from a build pipeline or a release pipeline. However, the counting is NOT correct when the same tests are executed from a Test Plan. The Test Plan is another layer on top of the build and release pipelines. My guess is that some event is not being passed from the Test Plan back to the release pipeline, or vice versa.

Steps to Repro

  1. Create build and release pipelines according to this article: https://dev.azure.com/JHA-5/EPSQA/_releaseDefinition?definitionId=62&_a=environments-editor-preview
  2. Create a new test plan.
  3. Create a new test case in the test plan. Grab the test case ID.
  4. From the Visual Studio test solution, associate a VS test method with the new test case using the test case ID. In ADO, if you now open the test case and move to the "Associated Automation" tab, you will see your test assembly.
  5. Run the test case from the test plan. Choose "Run with Options". The release pipeline you created in step 1 should show up. Note that (per documentation) the release pipeline property "Choose tests using" needs to be set to "Test Run" to get tests to execute from the test plan.
  6. When the test finishes, look at the run summary. We can also go back to the release log.

Bug ==> Both the summary and the release log should enumerate the data-driven results, but they do not; they only enumerate the number of test methods executed. In addition, there are two more problems: a. The release summary doughnut does not show any color (it is still gray), and the result count appears under "other" rather than passed or failed. b. If we go back to the test plan from which we executed the test, the test we executed still shows "In Progress" even though both the test agent and the pipeline indicate that the test is complete.

Additional Information/Tests

It seems like an event or bit of data indicating the number of data-driven tests is not passed from the test plan to the release pipeline. As a further test, we can reuse the same pipeline created above by changing the "Select Tests Using" property from "Test Run" to "Test Assemblies" and filtering the tests with something like "TestCategory=QA_Smoke" as the test filter. In other words, we are now bypassing the Test Plan. Now manually press "Create Release" on the release pipeline. Result ==> We DO get the correct results for a data-driven test.
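As a sanity check on a run like this, the trx file itself can be inspected to see whether the summary counters agree with the individual results. The following is a hypothetical diagnostic script, not anything MSTest or Azure DevOps ships; the element names come from the trx excerpt earlier in this thread.

```python
# Hypothetical diagnostic (not part of MSTest or Azure DevOps): check whether
# the <ResultSummary> counters in a trx document agree with the number of
# individual <UnitTestResult> elements, i.e. whether every data row was counted.
import xml.etree.ElementTree as ET

def _local(tag):
    # Strip an XML namespace prefix such as "{...}Counters".
    return tag.rsplit("}", 1)[-1]

def counters_match(trx_xml):
    root = ET.fromstring(trx_xml)
    counters = next(e for e in root.iter() if _local(e.tag) == "Counters")
    total = int(counters.get("total", "0"))
    results = sum(1 for e in root.iter() if _local(e.tag) == "UnitTestResult")
    return total == results
```

If the trx itself is consistent but the Test Plan summary still shows only the number of test methods, that would point at the reporting layer rather than the adapter.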

Here are some screenshots illustrating what I am seeing when executing from a test plan.

image

image

Here is the same pipeline, but with "Select Tests Using" changed to "Test Assemblies," the filter "TestCategory=QASmoke" applied, and the run executed by pressing "Create Release." This change only bypasses the test plan, and it correctly counts data-driven tests.

image

It would be good for my team if this could be investigated, as we rely heavily on Test Plans.