
Rework expected results config files in ONNX ops test suite #25

Open · ScottTodd opened this issue 5 days ago

ScottTodd commented 5 days ago

See the previous issue tracking this at https://github.com/nod-ai/SHARK-TestSuite/issues/253.

Config files for the ONNX operator tests use this schema: https://github.com/iree-org/iree-test-suites/blob/03f10e99d5f80696107038f3e8da8525aa31d50a/onnx_ops/conftest.py#L26-L53

(Aside: that schema could be encoded in a standalone file for validation and reference, rather than only living in a code comment.)
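For example, here is a minimal sketch of what file-based validation could look like, using the third-party `jsonschema` package. The field names below are guesses based on the lists this issue mentions, not the actual schema from `conftest.py`:

```python
# Rough sketch, assuming hypothetical field names ("skip_compile_tests",
# "expected_compile_failures", etc.); the authoritative schema is the one
# documented in the conftest.py comment linked above.
import json

import jsonschema  # third-party package: pip install jsonschema

STRING_LIST = {"type": "array", "items": {"type": "string"}}

CONFIG_SCHEMA = {
    "type": "object",
    "properties": {
        "config_name": {"type": "string"},
        "iree_compile_flags": STRING_LIST,
        "skip_compile_tests": STRING_LIST,
        "skip_run_tests": STRING_LIST,
        "expected_compile_failures": STRING_LIST,
        "expected_run_failures": STRING_LIST,
    },
}

with open("config.json") as f:
    jsonschema.validate(instance=json.load(f), schema=CONFIG_SCHEMA)
```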

Right now, each test case either appears in one of these lists or is not mentioned at all.

While this lets us add new tests without needing to update existing files, it doesn't make it clear how many tests are included or which of them are passing.

Now that test results can be automatically reflected back into config files using https://github.com/iree-org/iree-test-suites/blob/main/onnx_ops/update_config_xfails.py, we could, for example:

A) also list passing tests:

```json
"passing_tests": [
  "test_abs"
]
```

B) list test statuses directly:

```json
"tests": {
  "test_abs": "pass",
  "test_add": "skip_compile",
  "test_div": "skip_run",
  "test_mul": "fail_compile",
  "test_sub": "fail_run"
}
```
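As a rough sketch of how option B could be consumed, each status string could map to a set of pytest marks. The mark choices below are a simplification of what the suite might actually do (e.g. `skip_run` here skips the whole test, while the real suite might still compile):

```python
# Hypothetical sketch: turning option B status strings into pytest marks.
# The status names mirror the example above; nothing here is the current
# implementation.
import pytest

STATUS_TO_MARKS = {
    "pass": [],
    "skip_compile": [pytest.mark.skip(reason="config: skip compile")],
    "skip_run": [pytest.mark.skip(reason="config: skip run")],
    "fail_compile": [pytest.mark.xfail(reason="config: expected compile failure")],
    "fail_run": [pytest.mark.xfail(reason="config: expected run failure")],
}

def params_from_config(tests: dict) -> list:
    """Build pytest.param entries from a config's "tests" mapping."""
    return [
        pytest.param(name, marks=STATUS_TO_MARKS[status], id=name)
        for name, status in tests.items()
    ]

# Usage in a test module, given a loaded config dict:
#   @pytest.mark.parametrize("test_name", params_from_config(config["tests"]))
#   def test_onnx_op(test_name): ...
```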

I like option B, and I've started in a similar direction with https://github.com/iree-org/iree-test-suites/pull/23. That PR has a single test function per model that runs all stages (import, compile, run), and tests set their expected result using, for example, @pytest.mark.xfail(raises=IreeRunException) or @pytest.mark.xfail(raises=IreeCompileException).
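For reference, here is a minimal self-contained sketch of that pattern. The exception class names are the ones mentioned above; the compile/run helpers and test body are placeholders, not the code from that PR:

```python
# Hypothetical sketch of one test function per model, with the expected
# result expressed by which exception type the xfail mark names.
import pytest

class IreeCompileException(Exception):
    """Raised when compilation fails."""

class IreeRunException(Exception):
    """Raised when running the compiled module fails."""

def compile_model(mlir_path: str) -> str:
    # Placeholder: the real suite shells out to iree-compile here.
    raise IreeCompileException(f"compile failed for {mlir_path}")

def run_model(vmfb_path: str) -> None:
    # Placeholder: the real suite shells out to iree-run-module here.
    pass

@pytest.mark.xfail(raises=IreeCompileException)
def test_example_model():
    vmfb = compile_model("example_model.mlir")
    run_model(vmfb)
```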