nod-ai / SHARK-TestSuite

Temporary home of a test suite we are evaluating
Apache License 2.0

Refactor pytest collection to anchor on test case files, not .mlir files. #282

Closed ScottTodd closed 4 months ago

ScottTodd commented 4 months ago

This allows us to detect test cases that store their .mlir or .mlirbc file remotely, since the test_data_flags.txt or test_cases.json file will always exist locally. This also changes test case names in summaries, e.g. from

PASSED onnx/node/generated/test_sub_uint8/model.mlir::cpu_llvm_sync_test
PASSED onnx/node/generated/test_sub_example/model.mlir::cpu_llvm_sync_test
PASSED onnx/node/generated/test_sub_bcast/model.mlir::cpu_llvm_sync_test

XFAIL pytorch/models/opt-125M/opt-125M.mlirbc::cpu_llvm_task_splats - Expected compilation to fail (included in 'expected_compile_failures')

to e.g.

PASSED onnx/node/generated/test_sub_example/test_data_flags.txt::model.mlir::cpu_llvm_sync
PASSED onnx/node/generated/test_sub_uint8/test_data_flags.txt::model.mlir::cpu_llvm_sync
PASSED onnx/node/generated/test_sub_bcast/test_data_flags.txt::model.mlir::cpu_llvm_sync

XFAIL pytorch/models/opt-125M/test_cases.json::opt-125M.mlirbc::cpu_llvm_task::splats - Expected compilation to fail (included in 'expected_compile_failures')

(Since names are now anchored on the .txt or .json file, the .mlir filename was added back into the id; the "test" suffix was also dropped.)

This is a bit hacky either way. I took a look at https://docs.pytest.org/en/stable/example/customdirectory.html (custom directory collection) as an alternative to https://docs.pytest.org/en/stable/example/nonpython.html (non-Python test collection), and that approach could help.
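The core of the refactor, anchoring discovery on the case files that always exist locally rather than on the MLIR assets, can be sketched as a plain helper. This is a hypothetical illustration of the idea, not the suite's actual collection code; the function name and the `model_file` JSON key are assumptions:

```python
import json
from pathlib import Path


def collect_test_cases(root: Path) -> list[tuple[Path, str]]:
    """Yield (case_file, model_name) pairs anchored on locally present
    test case files, so cases whose .mlir/.mlirbc asset is stored
    remotely are still discovered."""
    cases = []
    for case_file in sorted(root.rglob("*")):
        if case_file.name == "test_data_flags.txt":
            # Flagfile-style case: the model.mlir may live remotely,
            # so do not require it to exist on disk.
            cases.append((case_file, "model.mlir"))
        elif case_file.name == "test_cases.json":
            # JSON-style case: read the referenced model file name
            # ('model_file' is an assumed key for this sketch).
            data = json.loads(case_file.read_text())
            cases.append((case_file, data.get("model_file", "")))
    return cases
```

A real pytest plugin would do this inside `pytest_collect_file` in a conftest.py, returning custom `pytest.File` subclasses for the two case-file formats.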