lombardm opened 3 years ago
Hi @chrischoy,
is there any plan to provide ONNX support for this library? I would like to use it in a deployment setting without running a Python interpreter, so ONNX seems to be the only viable option :)
I am also happy to contribute here, let me know what you think.
Best, Magnus
+1 on this, any updates? Happy to contribute
Any updates? I get: `RuntimeError: ONNX export failed: Couldn't export Python operator MinkowskiConvolutionFunction`. Happy to contribute here.
Hi @lombardm @NUS-WWS @Magnusgaertner @chrischoy @mariusud
Any updates on ONNX Export?
RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: SparseTensor
Happy to collaborate and contribute here.
Sorry for the "Bug Report" label. This is not actually a bug in this fantastic work, just a request for help.
I would like to export a model based on ME (specifically FCGF) using the PyTorch `torch.onnx` functionality. The idea is then to pass it to TensorRT to try to boost the performance of this architecture. I wrote a simple piece of code, reported in the snippet below. Here `net_input` is a SparseTensor, created with the Minkowski library from a regular array of coordinates and features. Unfortunately, `torch.onnx.export` throws the RuntimeError: `Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: SparseTensor`
How can we deal with this custom input, then? I know this would be better asked on a PyTorch forum, but I have already checked there and only found a couple of unanswered questions similar to mine. So I was wondering whether anyone using this library has already run into this need and knows how to solve the problem.
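One common workaround for the input-type error is to keep custom objects out of the traced interface entirely: wrap the network in a module whose `forward()` accepts only plain tensors (coordinates and features) and builds the sparse input internally, returning plain tensors as well. A minimal sketch of that pattern follows; `ExportWrapper` is a hypothetical name, and a toy `nn.Linear` stands in for the real FCGF/MinkowskiEngine network, so this only illustrates the interface shape, not an actual sparse convolution export.

```python
import torch
import torch.nn as nn


class ExportWrapper(nn.Module):
    """Hypothetical wrapper: the JIT tracer only ever sees plain tensors."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, coordinates, features):
        # In a real MinkowskiEngine setting, the SparseTensor would be built
        # here from the raw tensors, e.g.
        #   x = ME.SparseTensor(features, coordinates=coordinates)
        # and the wrapper would return plain tensors (e.g. the output's .F),
        # never the SparseTensor itself.
        x = torch.cat([coordinates.float(), features], dim=1)
        return self.model(x)


# Toy stand-in network; a real FCGF model would go here.
toy = nn.Linear(4, 2)
wrapper = ExportWrapper(toy)

coords = torch.randint(0, 10, (8, 3))  # N x 3 integer coordinates
feats = torch.ones(8, 1)               # N x 1 features
out = wrapper(coords, feats)
print(tuple(out.shape))
```

Note that wrapping only fixes the unsupported-input-type error quoted above. As reported earlier in this thread, the export would then still fail with `Couldn't export Python operator MinkowskiConvolutionFunction`, because the sparse convolution is a custom autograd `Function` with no ONNX symbolic; exporting it would additionally require registering a custom symbolic (e.g. via `torch.onnx.register_custom_op_symbolic`) plus a matching TensorRT plugin implementing the op.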
Many thanks in advance and have a nice day, Marco