andro-galexy opened this issue 3 years ago
In this codebase, the .mnn weight files are provided directly; they are parsed with a custom C++ module and then run through the MNN inference engine. However, in one of your comments you mentioned first converting the .pt file to .onnx format and then converting that to .mnn weights. The conversion scripts from .pt to .onnx would be extremely helpful for converting the same model to other inference engine frameworks.
I don't plan to publish ONNX files at the moment. In fact, you can obtain the ONNX file yourself by referring to my project.
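For reference, below is a minimal sketch of a typical .pt → .onnx export with `torch.onnx.export`. The checkpoint name, input shape, and opset version are illustrative assumptions, not values taken from this repository.

```python
# Minimal sketch of exporting a PyTorch model to ONNX.
# "model.pt" and the 1x3x224x224 input shape are assumptions for illustration.
import torch

# Load the trained model (assumes the checkpoint stores the full nn.Module;
# if it only stores a state_dict, build the model first and load_state_dict).
model = torch.load("model.pt", map_location="cpu")
model.eval()

# Dummy input matching the network's expected input shape (assumed here).
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX; input/output names and opset are common defaults.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)
```

The resulting .onnx file can then be converted to .mnn with MNN's own converter tool (something along the lines of `MNNConvert -f ONNX --modelFile model.onnx --MNNModel model.mnn --bizCode MNN`; check the MNN converter documentation for the exact flags for your version).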