nod-ai / SHARK-TestSuite

Temporary home of a test suite we are evaluating
Apache License 2.0

Add test for open-llama-3b-v2-f16 model through sharktank. #272

Closed ScottTodd closed 5 months ago

ScottTodd commented 5 months ago

Progress on https://github.com/nod-ai/sharktank/issues/22

This adds one test for a llama model running through https://github.com/nod-ai/sharktank. That project is still getting set up, so new docs for this particular workflow are coming in at https://github.com/nod-ai/sharktank/pull/69 and tests in that repo are in https://github.com/nod-ai/sharktank/pull/70.

Specifically, this exercises:

Ideas for future work:

saienduri commented 5 months ago

Nice, is there a reason that sharktank needs its own config file? PyTorch models and sharktank have the same starting point of MLIR, so maybe just having one overarching models configuration could work.

ScottTodd commented 5 months ago

> Nice, is there a reason that sharktank needs its own config file? PyTorch models and sharktank have the same starting point of MLIR, so maybe just having one overarching models configuration could work.

I'm going back and forth on that; thanks for noticing.

With multiple files we keep the test lists separate. The lists are short, but the test names are unqualified right now. I think I want them to be fully qualified before merging, so this: https://github.com/nod-ai/SHARK-TestSuite/blob/4751fab06fdc08818ee30530e0cdfc38fc6ce4b7/iree_tests/configs/config_pytorch_models_cpu_llvm_task.json#L10-L14

would be

    "skip_compile_tests": [
      "pytorch/models/sdxl-scheduled-unet-3-tank",
      "pytorch/models/sdxl-vae-decode-tank",
      "pytorch/models/sdxl-prompt-encoder-tank"
    ],

then we'd also have new models:

    "skip_compile_tests": [
      "pytorch/models/sdxl-scheduled-unet-3-tank",
      "pytorch/models/sdxl-vae-decode-tank",
      "pytorch/models/sdxl-prompt-encoder-tank",
      "sharktank/llama/open-llama-3b-v2-f16"
    ],

Let me check if that works... the test names are still a bit awkwardly passed through pytest / conftest.py.
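A minimal sketch of what merging could look like once names are fully qualified (the `merge_skip_lists` helper and its signature are hypothetical, not the actual conftest.py; it assumes configs are plain JSON like the snippets above):

```python
import json
from pathlib import Path


def merge_skip_lists(config_paths, key="skip_compile_tests"):
    """Union one skip list across several JSON config files.

    Assumes entries are fully qualified, repo-relative test names,
    e.g. "sharktank/llama/open-llama-3b-v2-f16", so entries from
    different test suites can coexist in one list without colliding.
    """
    merged = set()
    for path in config_paths:
        config = json.loads(Path(path).read_text())
        merged.update(config.get(key, []))
    return sorted(merged)
```

With unqualified names, two suites could both contain a test named `open-llama-3b-v2-f16` and the union would silently conflate them; qualifying by path avoids that.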

ScottTodd commented 5 months ago

I'm also debating naming/grouping

PR currently:

| test suite | exists in iree_tests? | config name |
| --- | --- | --- |
| onnx ops | yes | onnx |
| pytorch ops | no | pytorch |
| pytorch models | yes | pytorch_models |
| sharktank models | yes (this PR) | sharktank |

One alternative (grouping ops across frameworks and models across frameworks):

| test suite | exists in iree_tests? | config name |
| --- | --- | --- |
| onnx ops | yes | ops |
| pytorch ops | no | ops |
| pytorch models | yes | models |
| sharktank models | yes (this PR) | models |

ScottTodd commented 5 months ago

... and deciding how to run the tests:

By directory, reusing configs:

pytest iree_tests/pytorch/models --config-files=models.json
pytest iree_tests/sharktank --config-files=models.json

By directory, separate configs:

pytest iree_tests/pytorch/models --config-files=pytorch_models.json
pytest iree_tests/sharktank --config-files=sharktank.json

If we ran pytest iree_tests/ --config-files=models.json, that would go down into onnx/node/.

Ah! Never mind, we can run pytest dir1 dir2. Maybe not super convenient for local use though, since you'd need to know which configs map to which test suite subfolders.

saienduri commented 5 months ago

> Let me check if that works... the test names are still a bit awkwardly passed through pytest / conftest.py.

Yup, would be nice to have them qualified and merged.


I think you would just have to change it to check that the test_directory path relative to repo_root is in the config file, rather than just the test_directory name as it is now.
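That check could be sketched like this (a hypothetical illustration, not the actual conftest.py; the `is_skipped` name, argument shapes, and the `skip_compile_tests` key are assumptions based on the config snippets above):

```python
from pathlib import Path


def is_skipped(test_directory: Path, repo_root: Path, config: dict) -> bool:
    """Match a test by its repo-relative path instead of its bare name.

    e.g. <repo_root>/sharktank/llama/open-llama-3b-v2-f16 is looked up
    as "sharktank/llama/open-llama-3b-v2-f16", so tests with the same
    leaf name in different suites no longer collide.
    """
    relative = test_directory.relative_to(repo_root).as_posix()
    return relative in config.get("skip_compile_tests", [])
```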

saienduri commented 5 months ago

> ... and deciding how to run the tests:
>
> By directory, reusing configs:
>
> pytest iree_tests/pytorch/models --config-files=models.json
> pytest iree_tests/sharktank --config-files=models.json

I think of the options, by directory reusing configs looks like the cleanest

ScottTodd commented 5 months ago

Sorry, building up a stack of merge conflicts with https://github.com/nod-ai/SHARK-TestSuite/pull/271 here x_x

ScottTodd commented 5 months ago

Splitting off a few smaller PRs from this: