Closed: sagudev closed this issue 2 months ago
Hi there! Happy to see others using this tool. What version/commit are you reproducing this from?
I am using https://github.com/sagudev/moz-webgpu-cts/tree/servo (based off main) on servo.
I think it's happening because it's not expecting `FAIL` as an expectation for tests, only for subtests. Relevant code: https://github.com/ErichDonGubler/moz-webgpu-cts/blob/a82d347bb86f3dfb801b1d56fb22422f476715f4/moz-webgpu-cts/src/metadata.rs#L978-L984

So we need to add `FAIL` to `TestOutcome`.
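For illustration, a minimal sketch of what adding the variant might look like. These are hypothetical names, not the actual `moz-webgpu-cts` types or parser:

```rust
// Hypothetical sketch only; the real TestOutcome lives in
// moz-webgpu-cts/src/metadata.rs and has a different shape.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TestOutcome {
    Ok,
    Pass,
    Fail, // newly added: FAIL can appear as a test-level expectation too
    Crash,
    Error,
    Timeout,
    Skip,
}

impl TestOutcome {
    /// Parse an outcome keyword as it appears in WPT metadata.
    fn from_keyword(s: &str) -> Option<Self> {
        Some(match s {
            "OK" => Self::Ok,
            "PASS" => Self::Pass,
            "FAIL" => Self::Fail,
            "CRASH" => Self::Crash,
            "ERROR" => Self::Error,
            "TIMEOUT" => Self::Timeout,
            "SKIP" => Self::Skip,
            _ => return None,
        })
    }
}

fn main() {
    // Before the fix, "FAIL" would not parse at the test level.
    assert_eq!(TestOutcome::from_keyword("FAIL"), Some(TestOutcome::Fail));
    assert_eq!(TestOutcome::from_keyword("NOTRUN"), None);
}
```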
https://github.com/ErichDonGubler/moz-webgpu-cts/pull/80/commits/e90940f21efb0748346c2e87c9d3954417392711 now fixes the metadata parser. The only thing missing is the analyzer support for `tests_with_fails`.
Still fails on:

```
[canvas_composite_alpha_bgra8unorm_opaque_draw.https.html]
  expected: [PASS, FAIL, CRASH]
```
so we will probably need to handle such cases as subtest altogether.
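A multi-outcome `expected` value like the one above could be split out roughly like this. This is a hedged sketch with an illustrative function name, not the project's actual parser:

```rust
// Hypothetical sketch of parsing a bracketed multi-outcome `expected`
// value such as `[PASS, FAIL, CRASH]`; not the real metadata parser.
fn parse_expected_list(value: &str) -> Option<Vec<String>> {
    let inner = value.trim().strip_prefix('[')?.strip_suffix(']')?;
    Some(
        inner
            .split(',')
            .map(|s| s.trim().to_string())
            .filter(|s| !s.is_empty())
            .collect(),
    )
}

fn main() {
    let outcomes = parse_expected_list("[PASS, FAIL, CRASH]").unwrap();
    assert_eq!(outcomes, vec!["PASS", "FAIL", "CRASH"]);
    // A single bare keyword is not bracketed, so this sketch rejects it;
    // a full implementation would also accept `expected: FAIL`.
    assert_eq!(parse_expected_list("FAIL"), None);
}
```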
Also a failure on `disabled` (because it's valid to have any value):

```
[cts.https.html?q=webgpu:api,operation,adapter,requestDevice:features,known:*]
  disabled: reasons for being disabled
```
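Since the value of a `disabled:` key is a free-form reason string, a parser should accept it verbatim rather than trying to interpret it as an outcome keyword. A minimal sketch, with hypothetical names:

```rust
// Hypothetical sketch: split a metadata key/value line, keeping the
// value of keys like `disabled:` as an arbitrary reason string.
fn parse_metadata_line(line: &str) -> Option<(String, String)> {
    // split_once stops at the first ':', so the reason text may itself
    // contain colons without confusing the parse.
    let (key, value) = line.trim().split_once(':')?;
    Some((key.trim().to_string(), value.trim().to_string()))
}

fn main() {
    let (key, value) =
        parse_metadata_line("  disabled: reasons for being disabled").unwrap();
    assert_eq!(key, "disabled");
    assert_eq!(value, "reasons for being disabled");
}
```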
@sagudev Just FYI, Erich's on vacation until next Wednesday, so you probably won't get a response here until then.
Thanks for letting me know.
Thanks for the assist, @jimblandy! 🙂
Sigh, the lack of a good diagnostic is definitely https://github.com/ErichDonGubler/moz-webgpu-cts/issues/30. Sorry about that!
@sagudev: RE: analyzer stuff: Does Servo actually use that tool? I had not expected it to be used by anybody but Mozilla. 😅
@sagudev: RE: adding `FAIL` to the test outcome data model: This is an interesting weirdness in WPT. My understanding is that expected outcomes in tests are a strict superset of the ones that can be discovered in subtests (CC @jgraham for clarity). I think it's clear that we need to add it. PRs welcome! I don't think I'm going to be able to prioritize it for a while, but I should have bandwidth for review. The code you linked to (https://github.com/ErichDonGubler/moz-webgpu-cts/commit/e90940f21efb0748346c2e87c9d3954417392711) LGTM (minus a nit or two I'd present in review), so I bet we can start with that.
@sagudev: Can you please clarify what you mean by the following (original link)?
> so we will probably need to handle such cases as subtest altogether.
> Does Servo actually use that tool? I had not expected it to be used by anybody but Mozilla
Currently no, but I am very pleased with the experiments I did with it (it's so much faster than the `./mach update-wpt` we have), although I needed wptreports, which we don't normally produce in CI. I am currently investigating some rare CRASHes before I can return to my PR here.
What I discovered is that all valid values for subtests are also valid for tests that have no subtests. You can take a look at the PR to see how I hacked around this.
I think that's not quite true; you can't ever have `NOTRUN` as a top-level test status (for tests without subtests, the test is always at least started); see https://searchfox.org/mozilla-central/source/testing/mozbase/mozlog/mozlog/logtypes.py#192-222 for details.
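The distinction described above could be encoded roughly like this. A hedged sketch with hypothetical names, not the actual mozlog or moz-webgpu-cts types:

```rust
// Hypothetical sketch: subtest statuses, of which NOTRUN is the one
// that never makes sense at the test level (a test without subtests
// is always at least started, per the mozlog log types).
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum SubtestOutcome {
    Pass,
    Fail,
    Timeout,
    NotRun,
}

fn valid_at_test_level(outcome: SubtestOutcome) -> bool {
    // NOTRUN only applies to a subtest whose parent test never reached it.
    outcome != SubtestOutcome::NotRun
}

fn main() {
    assert!(valid_at_test_level(SubtestOutcome::Fail));
    assert!(valid_at_test_level(SubtestOutcome::Timeout));
    assert!(!valid_at_test_level(SubtestOutcome::NotRun));
}
```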