Hi @edwardnguyen1705 thanks for telling us about this new model and your issue.
We were able to reproduce your issue and are working on a fix with the nanodet-plus-m_320.onnx pre-exported from this release: https://github.com/RangiLyu/nanodet/releases/tag/v1.0.0-alpha-1
We'll let you know when we have a release ready for you to test so you can be unblocked. Thanks!
Hi @mgoin, thanks for letting me know. It is great to hear that.
Hi @edwardnguyen1705 we just released DeepSparse 0.11.1, which contains a fix for the assertion that you ran into. Hopefully this fixes your issue, and if you try it, please let us know how it goes. Thank you!
Hi @tlrmchlsmth, thanks for the fix. After installing the new versions:
deepsparse 0.11.2
sparseml 0.11.1
sparsezoo 0.11.0
sparsify 0.11.0
It seems that the old error is solved, but I got another error; please see the attached image.
Hi @tlrmchlsmth,
I was able to fix the error. However, it would be nice if sparsify supported pytorch models (.pth). It is really hard to find the mapping between the onnx and pytorch models. Thanks for your time.
Hi @edwardnguyen1705, I've opened up https://github.com/neuralmagic/sparseml/issues/670 to continue the discussion on this. These issues occur outside of DeepSparse, so we'll continue to provide support through that new thread.
Thanks!
Dear @jeanniefinks ,
Firstly, thanks for sharing your work. We are trying to apply sparseml to NanoDet-Plus-m, which is considered the most suitable model for edge devices so far.
Here are the steps I have been trying:
1. Train a pytorch (.pth) model, then convert it to an onnx model. I even tried sparseml.onnx_export and was able to convert to model.onnx, but it still failed in the next step (a sketch of this export step follows the list).
2. Load the onnx model into deepsparse. This is similar to issue #218.
I already tried on varying environments:
Code to produce error:
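A minimal sketch of such a reproduction, assuming the exported nanodet-plus-m_320.onnx and the `deepsparse.compile_model` API from the 0.x releases; the file path and input shape are assumptions, not the original snippet:

```python
# Minimal sketch of loading the exported ONNX model into the DeepSparse engine,
# which is the step where the failure was observed.
import numpy as np
from deepsparse import compile_model

onnx_path = "nanodet-plus-m_320.onnx"  # assumed path to the exported model
batch_size = 1

# Compiling the model is the step that triggers the reported error
engine = compile_model(onnx_path, batch_size=batch_size)

# Run a single random input through the engine (input shape is an assumption)
inputs = [np.random.rand(batch_size, 3, 320, 320).astype(np.float32)]
outputs = engine.run(inputs)
print([o.shape for o in outputs])
```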
Error:
It seems that you have your own ONNX runtime? Could you examine the NanoDet-Plus-m model? I really appreciate your time.