teemtee / tmt

Test Management Tool

Implement import and export with Polarion #888

Closed · thrix closed this 2 years ago

thrix commented 3 years ago

It seems that for Automotive it would be a big selling point for tmt if we could integrate it with Polarion.

For a start, it would be great if we could export a test to Polarion and maybe import it back?

There is a nice Python library to do this reasonably well: https://github.com/RedHatQE/pylero
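
As a very rough illustration, creating a test case through pylero might look like the sketch below. The class and argument names follow pylero's README examples, but the project id and all field values are placeholder assumptions, and the required fields depend on the Polarion project configuration:

```python
# Rough sketch: export a tmt test as a Polarion test case via pylero.
# Assumes credentials are configured (pylero reads a ~/.pylero config).
# "MYPROJECT" and all field values below are placeholder assumptions.
from pylero.work_item import TestCase

tc = TestCase.create(
    project_id="MYPROJECT",
    title="/tests/core/smoke",
    desc="Basic smoke test, exported from tmt metadata",
    caseimportance="high",
    caseautomation="automated",
    caseposneg="positive",
    caselevel="component",
    testtype="functional",
)
print(f"Created {tc.work_item_id}")
```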

Not sure if this should be an internal-only feature or not ...

psss commented 3 years ago

If the pylero library could be used, the implementation should not be that hard. I don't see it packaged for Fedora, though. It seems it is available as a GitHub project only?

thrix commented 3 years ago

@psss I will need to ask :(

thrix commented 3 years ago

We are getting pings for this, so proposing it for 2.0.

rasibley commented 3 years ago

Test plan import/export would be a nice feature, but an even more important use case is importing tmt test runs into Polarion.
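
For illustration, recording a single result in a Polarion test run with pylero might look roughly like this (method and parameter names follow pylero's README; all ids below are placeholder assumptions):

```python
# Sketch: record one tmt test result in a Polarion test run via pylero.
# "MYPROJECT", "tmt_run_42", and "example_template" are placeholders.
from datetime import datetime

from pylero.test_run import TestRun

tr = TestRun.create("MYPROJECT", "tmt_run_42", "example_template")
tr.add_test_record_by_fields(
    test_case_id="MYPROJECT-1234",   # Polarion id of the exported test case
    test_result="passed",            # tmt result mapped to a Polarion verdict
    test_comment="imported from a tmt run",
    executed_by="automation_user",
    executed=datetime.now(),
    duration=10.5,                   # seconds
)
```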

kkaarreell commented 2 years ago

@rasibley Are you doing the actual test failure analysis in Polarion? I am asking because Report Portal is being presented as a smart tool for reviewing test results. If the review happens in Report Portal, there is no point in importing test runs into Polarion from tmt, as they would be missing data from the actual review (waivers, comments, linked bugs, etc.). From this perspective it would make more sense to import data into Report Portal and then import the reviewed runs from Report Portal into Polarion.

KwisatzHaderach commented 2 years ago

IMO the export should be done to both services: Polarion for the real result (in progress by me) and Report Portal for further investigation (not part of this issue). After investigation and fixes you can always rerun the test case and export to Polarion again, so your last run is nice and green.
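
A purely hypothetical sketch of that fan-out; both exporter names below are invented placeholders, standing in for whatever plugins this issue ends up producing:

```python
# Hypothetical sketch only: fan one tmt result out to both services.
# Both exporters are placeholder stubs, not existing tmt APIs.
def export_to_polarion(result: dict) -> None:
    ...  # e.g. pylero calls as sketched above

def export_to_report_portal(result: dict) -> None:
    ...  # e.g. Report Portal REST calls

def report_result(result: dict) -> None:
    # The "official" record lives in Polarion; rerunning the test case
    # and exporting again replaces the verdict with a green run.
    export_to_polarion(result)
    # The same result also goes to Report Portal for failure analysis.
    export_to_report_portal(result)
```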

rasibley commented 2 years ago

@kkaarreell Yes, we would like to use the TFA classifier and RP for this; I have been promoting this for my team, and we'd like to follow what RHEL is doing. I don't believe RP has the capability to provide a single dashboard to show our executed test runs against requirements for a given release. We need the traceability, which means we will most likely need to have it in both places, as @KwisatzHaderach mentioned. I tagged you both in a document Jenny started to capture the requirements; maybe we can discuss the needs further there.

jgalipea commented 2 years ago

@rasibley @kkaarreell Correct, Rachel ... Report Portal (RP) is for test failure analysis, not for product quality metrics, so the ability to track per product requirement does not exist in RP. Two different tools, two different purposes, and really two different target audiences.

kkaarreell commented 2 years ago

What we are doing today: we have a pipeline of tools in our automation (e.g. EWA), and within these tools and artifacts (e.g. Beaker jobs, TCMS runs, etc.) we are able to store additional relevant metadata (e.g. the errata number and build NVR in TCMS runs). This allows each step in the pipeline to continue directly from the previous one and avoids repeating the whole process from the beginning. For tmt, Report Portal, and Polarion this would mean that requirement-related metadata originating in tmt would be stored along with other data in the respective Report Portal runs, so that we could later import things from Report Portal directly into Polarion. The pipeline would be tmt -> Report Portal -> Polarion for tests and test review results.
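
To illustrate the idea, tmt could attach that metadata as launch attributes when starting a Report Portal launch. A rough sketch against Report Portal's v5 REST API, where the endpoint, project, token, and attribute keys are all assumptions:

```python
# Sketch: start a Report Portal launch carrying tmt-originated metadata
# (errata number, build NVR, requirement id) as launch attributes, so a
# later Report Portal -> Polarion import step can pick them up.
# URL, project, token, and attribute keys are placeholder assumptions.
import time

import requests

RP_URL = "https://reportportal.example.com"   # assumption
PROJECT = "my_project"                        # assumption
TOKEN = "..."                                 # API token, elided

response = requests.post(
    f"{RP_URL}/api/v1/{PROJECT}/launch",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": "tmt run",
        "startTime": int(time.time() * 1000),  # RP expects milliseconds
        "attributes": [
            {"key": "errata", "value": "RHBA-2023:1234"},
            {"key": "nvr", "value": "mypackage-1.0-1.el9"},
            {"key": "requirement", "value": "MYPROJECT-5678"},
        ],
    },
)
response.raise_for_status()
print(response.json()["id"])  # launch uuid, used by the next pipeline step
```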

kkaarreell commented 2 years ago

> I don't believe RP has the capability to provide a single dashboard to show our executed test runs against requirements for a given release.

Just to clarify, I wasn't suggesting presenting traceability in Report Portal. I just meant that, from my perspective, it makes sense to import data into Polarion from Report Portal, where all the waivers, comments, linked issues, etc. are, rather than from tmt, which could provide only unreviewed test results at best.

jgalipea commented 2 years ago

ack ... exactly!