Closed. interesaaat closed this issue 4 years ago.
@interesaaat Will this include save/load of the models using the ONNX format?
@mdabros That may require some work. Apparently ONNX support in libTorch is limited at the moment. My hope is that in a future release they will also provide the torch.onnx.export functionality in libTorch. If they don't, we will have to write our own to-onnx exporter.
I see, that makes it more complicated. I was hoping the functionality was already there in libTorch. I hope they will add it soon. ONNX support in TorchSharp would really help make it a perfect choice for developing deep learning models for production. But I guess it is a separate issue from this one then.
I totally agree. Maybe we can open an issue on the main PyTorch repo? Actually, let me ask on the PyTorch Slack channel.
This is covered by #146
@dsyme does #146 include onnx support (save as)?
It's not mentioned there; I'll add it, as I presume it's a prerequisite. If you'd like to contribute a PR for this, that would be great.
This will probably require custom save logic or hooks into JIT save.