Accelergy-Project / timeloop-accelergy-exercises

Exercises for exploring the Fibertree, Timeloop and Accelergy tools

Issue with running MM #47

Closed: sunwooyoo closed this issue 4 months ago

sunwooyoo commented 4 months ago

I want to run the MM problems in layer_shapes using run_example_designs.py, but an error occurs because the instance counts in the arch.yaml under example_designs/ do not match the problem. When I corrected the instances and reran, another error occurred in top.yaml.jinja2. Is there code for running the MM examples that I may have missed?

tanner-andrulis commented 4 months ago

I'm not sure what you are asking.

What specific commands are you running? Have you modified any files?

sunwooyoo commented 4 months ago

To be exact, I wanted to run the files in the timeloop-accelergy-exercises/workspace/example_designs/layer_shapes/MM/ResNet50_repr/ folder, so I ran timeloop-accelergy-exercises/workspace/example_designs/run_example_designs.py (for example, python3 run_example_designs.py --problem MM/ResNet50_repr --architecture simple_weight_stationary). However, an error occurred because the instance counts in the architecture file loaded by run_example_designs.py did not match the problem file. So I modified the instances in timeloop-accelergy-exercises/workspace/example_designs/example_designs/simple_weight_stationary/arch.yaml and the associated permutations, but then another error occurred in top.yaml.jinja2. So I'm wondering whether there is an arch.yaml and a run command that can run the MM examples.
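
Concretely, the steps were roughly the following (assuming the commands are run from the repository root):

    cd timeloop-accelergy-exercises/workspace/example_designs
    python3 run_example_designs.py --problem MM/ResNet50_repr --architecture simple_weight_stationary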

tanner-andrulis commented 4 months ago

Those specific layers only work with the sparse tensor core.

Try the following (a full command sequence is sketched after the list):

  1. Update the sparse tensor core K=32 to K=16 (see the latest commit in this repository)
  2. python3 run_example_designs.py --architecture sparse_tensor_core_like --problem MM/ResNet50_repr/M1024-K512-N256.yaml
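
Put together, the sequence looks roughly like this. The git pull step assumes the K=32 to K=16 update is already in the latest commit; otherwise edit the sparse tensor core arch.yaml by hand before running:

    # From an existing checkout of timeloop-accelergy-exercises:
    git pull                      # pick up the latest commit with the K=16 update (assumption)
    cd workspace/example_designs
    python3 run_example_designs.py \
        --architecture sparse_tensor_core_like \
        --problem MM/ResNet50_repr/M1024-K512-N256.yaml
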
sunwooyoo commented 4 months ago

Thank you for your reply : )

tanner-andrulis commented 4 months ago

No problem! Feel free to close this issue if all is resolved. Happy to answer any other questions you may have.