aiqm / torchani

Accurate Neural Network Potential on PyTorch
https://aiqm.github.io/torchani/
MIT License

.ONNX model format? #568

Closed: JosePereiraUA closed this issue 3 years ago

JosePereiraUA commented 3 years ago

Not sure if this is the correct place to post such a question, but what would we need to do to export the model (of type torchani.models.BuiltinEnsemble) to the ONNX format?

My first attempt was to call torch.onnx.export(model, (coordinates, species), "test.onnx", verbose=True), which of course didn't work. I have no idea where to start. Has anyone considered or attempted this?
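
For reference, here is a minimal, self-contained sketch of that attempt. The choice of the ANI-2x pretrained ensemble and the methane geometry are purely illustrative, and, as the reply below explains, the export is expected to fail because some of the model's operations have no ONNX equivalents yet.

```python
import torch
import torchani

# Pretrained built-in ensemble (ANI-2x chosen for illustration; any BuiltinEnsemble is analogous).
# periodic_table_index=True lets species be given directly as atomic numbers.
model = torchani.models.ANI2x(periodic_table_index=True)

# One methane molecule: species as atomic numbers, coordinates in Angstrom.
species = torch.tensor([[6, 1, 1, 1, 1]], dtype=torch.long)
coordinates = torch.tensor([[[ 0.0000,  0.0000,  0.0000],
                             [ 0.6291,  0.6291,  0.6291],
                             [-0.6291, -0.6291,  0.6291],
                             [ 0.6291, -0.6291, -0.6291],
                             [-0.6291,  0.6291, -0.6291]]], dtype=torch.float32)

# The model's forward pass takes a single (species, coordinates) tuple, so the export
# arguments are wrapped in an extra tuple. At the time of writing this raises an error
# from the ONNX exporter about unsupported operators (see the reply below).
try:
    torch.onnx.export(model, ((species, coordinates),), "test.onnx", verbose=True)
except Exception as exc:
    print(f"ONNX export failed: {exc}")
```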

IgnacioJPickering commented 3 years ago

It is currently not possible to export the model to ONNX. Support for this is planned, but it depends on both PyTorch and ONNX implementing all of the necessary operations, which is not yet the case.

JosePereiraUA commented 3 years ago

Thank you for your answer. My objective is to implement this model in the Julia language. Do you have any suggestions?

IgnacioJPickering commented 3 years ago

@JosePereiraUA I'm not very familiar with the Julia language, so I can't offer many recommendations here. If your goal is to integrate the model with existing Julia code, then porting it is probably a good idea. If you are looking for speed improvements, however, keep in mind that PyTorch's backend is already C++/CUDA, so I don't think porting the code to Julia will gain much in that regard, but I may be wrong.