tensil-ai / tensil

Open source machine learning accelerators
https://www.tensil.ai

Support MatMul in ONNX frontend #51

Closed: petrohi closed this 2 years ago

petrohi commented 2 years ago

This is a quick change adding support for MatMul. A follow-up change will enable the ONNX frontend to work with models that have a variable batch size: the frontend will freeze the input dimensions to the batch size specified in the compiler arguments. With that change we will add compiler tests for ResNet20 with variable batch size. Such a model contains a MatMul/Add combination instead of Gemm, ensuring that MatMul is exercised by the tests.
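For context, the reason a variable-batch-size model produces MatMul/Add instead of Gemm is that ONNX Gemm fuses the bias addition into one node (computing alpha * A @ B + beta * C), while exporters often emit the unfused pair when shapes are symbolic. A minimal NumPy sketch of the equivalence the frontend must handle (shapes here are hypothetical, loosely modeled on a final dense layer):

```python
import numpy as np

# Hypothetical shapes: a dense layer mapping 64 features to 10 classes.
batch = 4
x = np.random.rand(batch, 64).astype(np.float32)  # input activations
w = np.random.rand(64, 10).astype(np.float32)     # weights
b = np.random.rand(10).astype(np.float32)         # bias

# ONNX Gemm semantics with alpha = beta = 1: one fused node.
gemm_out = x @ w + b

# The same computation as the MatMul/Add pair the frontend now supports.
matmul_out = np.matmul(x, w)   # MatMul node
add_out = matmul_out + b       # Add node (bias broadcast over the batch)

# Both paths produce identical results.
assert np.allclose(gemm_out, add_out)
```

Supporting MatMul (with Add handled by the existing elementwise path) therefore covers the same computation as Gemm without requiring the exporter to fuse the two nodes.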

shortcut-integration[bot] commented 2 years ago

This pull request has been linked to Shortcut Story #448: Support MatMul in ONNX frontend.