mit-han-lab / torchsparse

[MICRO'23, MLSys'22] TorchSparse: Efficient Training and Inference Framework for Sparse Convolution on GPUs.
https://torchsparse.mit.edu
MIT License

Does torchsparse support conversion to ONNX? #240

Open. FengYuQ opened this issue 10 months ago

FengYuQ commented 10 months ago

Great work! I would like to ask whether torchsparse supports conversion to the ONNX format. I recently tried to export a model to ONNX, but the export failed because the model uses MinkowskiEngine. If torchsparse supports ONNX conversion, I plan to port the model from MinkowskiEngine to torchsparse and export it from there.
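
To make the request concrete, here is a rough sketch of what I would like to be able to do after porting to torchsparse. The toy model, layer sizes, coordinate shapes, and file name are just placeholders (not my actual network), and the export call is expected to fail with the current release since sparse convolution runs through custom kernels with no ONNX symbolic registered.

```python
# Minimal sketch (not my real network): a toy torchsparse model and a naive
# torch.onnx.export attempt. Assumes torchsparse >= 2.x and a CUDA device,
# since the sparse conv kernels are custom CUDA extensions.
import torch
import torchsparse.nn as spnn
from torchsparse import SparseTensor


class ToySparseNet(torch.nn.Module):
    """Hypothetical example model, for illustration only."""

    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            spnn.Conv3d(4, 32, kernel_size=3),
            spnn.BatchNorm(32),
            spnn.ReLU(True),
        )

    def forward(self, x: SparseTensor) -> SparseTensor:
        return self.net(x)


# Random sparse input: N voxels with 4-channel features and integer
# coordinates (the exact coordinate layout depends on the torchsparse version).
coords = torch.randint(0, 64, (1000, 4), dtype=torch.int32).cuda()
feats = torch.rand(1000, 4).cuda()
x = SparseTensor(feats, coords)

model = ToySparseNet().cuda().eval()

# Expected to fail today: ONNX export traces plain tensor ops, while the
# sparse convolutions use custom kernels with no ONNX symbolic registered
# (and SparseTensor is not a plain torch.Tensor input).
try:
    torch.onnx.export(model, (x,), "toy_sparse_net.onnx")
except Exception as e:
    print(f"ONNX export failed as expected: {e}")
```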

zhijian-liu commented 9 months ago

It does not yet. It's on our roadmap to support easier model deployment. Stay tuned!

Byte247 commented 6 months ago

Any update on this?