Thanks for the notes @ptoscano!
For the configuration of `testimony`
- I can definitely make some of the fields `choice` type, that's not a problem (and it would actually be better), and I can definitely add a `reference` field with string type (as that will be changing from test to test). For the other fields that will always be the same - yes, they will. I made them global for easier readability of the code (so they don't repeat in every test case), but (from what I understood) we need those fields for traceability in Polarion. When the test cases are imported into Polarion they need to have those fields (we know a test is automated and upstream, but we need to see that in Polarion as well; not all test cases added will be from this repo, so we need to see which ones are/aren't automated there). Maybe @Lorquas will have some additional answer to that.
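As an illustration of those "global" fields versus the per-test ones, here is a minimal sketch of a test module (the test name, docstring wording, and reference URL are hypothetical; only the field names and values come from this discussion, and the layout assumes the module-level docstring is picked up for every test case in the file, as described above):

```python
"""Hypothetical test module illustrating the shared testimony fields.

:CaseAutomation: Automated
:CaseComponent: insights-client
:SubSystemTeam: sst_csi_client_tools
:Upstream: Yes
"""


def test_example_case():
    """Hypothetical test case; only the fields that change per test live here.

    :Tier: Tier 1
    :Reference: https://example.com/hypothetical-issue-123
    """
    assert True  # placeholder body
```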
For the `testimony validate` job
- I personally would stick to the 3rd option - do not merge the job until all the tests have valid docstring comments. This means running testimony manually in PRs with changes related to this to check the results, but that could also take a month, so we should consider whether we want to have a PR open for that long; still, IMHO it seems like the best option so far.
> I can definitely make some of the fields `choice` type, that's not a problem (and it would actually be better) [...] For the other fields that will always be the same - yes, they will.
Thanks!
> I can definitely add a `reference` field with string type (as that will be changing from test to test).

Thanks! I'd not make it required though, as there may not be references for a test, and that's OK (not ideal, still OK).
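A rough sketch of how that could look in the testimony token configuration, assuming a YAML token config with `type`, `choices`, and `required` keys (the exact schema keys are an assumption and may need adjusting to what testimony actually accepts):

```yaml
# Hypothetical token configuration; key names may differ from testimony's schema.
SubSystemTeam:
  type: choice
  choices:
    - sst_csi_client_tools   # more team names to be added as needed
Tier:
  type: choice
  choices:
    - Tier 0
    - Tier 1
    - Tier 2
Reference:
  type: string
  required: false            # a test without references is OK (not ideal, still OK)
```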
> For the `testimony validate` job
> - I personally would stick to the 3rd option - do not merge the job until all the tests have valid docstring comments. This means running testimony manually in PRs with changes related to this to check the results, but that could also take a month, so we should consider whether we want to have a PR open for that long; still, IMHO it seems like the best option so far.
OK, makes sense. In this case, what do you think about splitting the betelgeuse job (and its config) into its own PR? That one seems to work fine already, and we can run it in new PRs to validate the result/output.
> I'd not make it required though, as there may not be references for a test, and that's OK (not ideal, still OK).
Yeah, sure :)
> OK, makes sense. In this case, what do you think about splitting the betelgeuse job (and its config) into its own PR? That one seems to work fine already, and we can run it in new PRs to validate the result/output.
Yeah, that makes sense. I will leave this PR for betelgeuse only, then, and I will create a separate PR for testimony that I will leave as a draft for now.
Sounds good.
One thing I'd add here is the `README.md` as currently added in #270, as IMHO it fits this PR as the more general "Betelgeuse enablement". Don't forget to update it according to the changes that were done here to run betelgeuse properly.
Also, please explain the changes a bit more in the commit message, so the content of the commit in this PR is a bit less cryptic, including what it is for.
Lastly: please rebase this branch on top of `master`, while you are pushing new changes.
Thanks!
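For reference, a typical way to do that rebase locally, assuming the upstream remote is called `origin` (adjust the remote/branch names as needed):

```bash
# Rebase the PR branch on top of the latest master and update the PR.
git fetch origin
git rebase origin/master
# The rebase rewrites the branch history, so the push has to be forced.
git push --force-with-lease
```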
Rebased, commit message added, and README.md added as well :)
/packit retest-failed
/packit retest-failed
Thanks for the changes so far, it runs the jobs properly now :)
More notes from my side:
- the `testimony validate` job right now fails because no test has the testimony docstring comments; sadly github does not really have a way to mark a job as "expected to fail", so we need to think about what to do in the meanwhile:
  - […]
  - run `testimony` ignoring its return status (something like `foo || true`); ugly as well, and will need manual checking to see whether there were new problems or the tests were fixed (see the workflow sketch after this list)
  - run `testimony` manually in PRs with changes related to this to check the results
- for the configuration of `testimony`, IMHO some of the fields would be better as `choice` type, setting the valid values:
  - `SubSystemTeam` would have only `sst_csi_client_tools` as value for now, and more team names would be added as needed
  - `Tier` seems like it would accept `Tier 0`, `Tier 1`, and so on
  - what is the `Upstream` field for? in the currently open PRs, it seems always set to `Yes`
  - what is the `CaseAutomation` field for? in the currently open PRs, it seems always set to `Automated`
  - what is the `CaseComponent` field for? in the currently open PRs, it seems always set to `insights-client`
  - what about the `id` field?
  - what about adding a `References` field (or whichever name is better) to list any references for a test? for example bug tracker IDs (e.g. bugzilla, Jira, etc), a generic URL to documentation, and so on; this way, it would be easy to "link" resources to tests
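For the "ignore the return status" option above, a minimal sketch of what the workflow could look like (the job name, action versions, and tests path are assumptions, not the actual workflow in this PR; only the `|| true` trick is taken from the note above):

```yaml
# Hypothetical GitHub Actions excerpt for running testimony while ignoring its return status.
jobs:
  testimony-validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - run: pip install testimony
      # "|| true" keeps the job green until all tests have valid docstring comments;
      # the output still needs to be checked manually for new problems.
      - run: testimony validate tests/ || true
```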