NVIDIA / mxnet_to_onnx

MxNet to ONNX Exporter
Apache License 2.0

Porting this functionality into ONNX-MXNet repo #1

Closed lupesko closed 6 years ago

lupesko commented 6 years ago

Hey Nvidia team,

Thanks for this work adding functionality to MXNet! Not sure whether you know, but there is already a repo for MXNet/ONNX compatibility: ONNX-MXNet. It would be best for MXNet and ONNX users to have this functionality there, rather than having two separate repos.

I highly recommend contributing a PR to merge this functionality there.

Thanks!

mkolod commented 6 years ago

@lupesko Yes, we are aware of that. However, we have other requests from the AWS team at the moment, and no bandwidth to do the plumbing to integrate the two projects at this time. This may need to wait a few weeks. If Amazon has someone on their team who wouldn't mind taking the time to do the integration, then please feel free to do so. We will come back to this as soon as possible if no one else volunteers, but right now we're working on delivering other features to MxNet that Amazon product management itself determined to be of a higher priority.

mkolod commented 6 years ago

@lupesko Also, please note that the current project converts a serialized MxNet model to ONNX, and given our recent discovery of the instability of the serialized MxNet representation (frequent field renaming, say "param" to "params" and so on), maintaining the compatibility layer would be painful (this converter works with MxNet 1.0, but not 0.8, for example, due to serialized format breakages in MxNet). We only discovered this while working on the implementation. However, since NNVM provides that compatibility layer even as serialized-format breakages occur, it may make sense to provide an NNVM-to-ONNX exporter, the way ONNX-MxNet is based on importing from ONNX directly into NNVM (as far as I can tell). This will provide forward and backward compatibility. Since this is something that NVIDIA may contribute to anyway, that would probably be the version to be ultimately integrated into ONNX-MxNet. However, if you feel that integrating the current exporter into ONNX-MxNet as an intermediate solution is acceptable (with a replacement via NNVM in the works), please feel free to pick it up and integrate. Otherwise, stay tuned for an NNVM-to-ONNX exporter for ONNX-MxNet.
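For illustration, here is a minimal sketch of why a converter tied to the serialized format is fragile: it walks the symbol's JSON graph, where per-node attribute keys have changed across MxNet versions. The checkpoint prefix and the exact attribute key names below are assumptions for the example, not an authoritative list.

```python
import json
import mxnet as mx

# Load a checkpointed model (model-symbol.json / model-0000.params),
# e.g. one previously saved with Module.save_checkpoint("model", 0).
sym, arg_params, aux_params = mx.model.load_checkpoint("model", 0)

# The converter traverses the graph via the symbol's JSON representation.
graph = json.loads(sym.tojson())

for node in graph["nodes"]:
    # Depending on which MxNet version serialized the model, per-node
    # operator attributes may live under different keys, so a converter
    # built on the serialized format must special-case each variant.
    attrs = node.get("attrs") or node.get("attr") or node.get("param") or {}
    print(node["op"], node["name"], attrs)
```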

Roshrini commented 6 years ago

@mkolod I agree that NNVM to ONNX makes more sense. This avoids requiring users to checkpoint the model after training. An easier user experience would be to simply export the model to ONNX format after training with the MXNet Module API. But the functionality you have implemented (converting from the model/params files) is also useful for those who have pretrained models. I think we can provide both ways for users; a rough sketch of the two flows is below.
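A minimal sketch of the two flows, using a tiny Module only as a stand-in for a trained model. The converter entry points named in the comments (convert_mxnet_to_onnx, export_module_to_onnx) are hypothetical placeholders, not this repo's actual API.

```python
import mxnet as mx

# Build and initialize a tiny network with the Module API
# (stand-in for a model that has already been trained).
data = mx.sym.Variable("data")
net = mx.sym.FullyConnected(data, num_hidden=10, name="fc1")
net = mx.sym.SoftmaxOutput(net, name="softmax")

mod = mx.mod.Module(symbol=net, data_names=["data"], label_names=["softmax_label"])
mod.bind(data_shapes=[("data", (1, 100))], label_shapes=[("softmax_label", (1,))])
mod.init_params()

# Flow 1 (this repo today): checkpoint to model-symbol.json / model-0000.params,
# then run the converter on those serialized files.
mod.save_checkpoint(prefix="model", epoch=0)
# convert_mxnet_to_onnx("model-symbol.json", "model-0000.params", "model.onnx")  # hypothetical name

# Flow 2 (the proposed NNVM/Module-based exporter): export directly from the
# trained module in memory, skipping the intermediate checkpoint files.
# export_module_to_onnx(mod, "model.onnx")  # hypothetical name
```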

mkolod commented 6 years ago

@Roshrini This makes perfect sense. I'll be happy if both converters end up being useful! :slightly_smiling_face: It's just that I won't have time to deal with merging the current one with ONNX-MxNet (refactoring for API consistency, etc.) until the NNVM-ONNX one is done. If anyone at Amazon or the community would like to attempt that, please feel free to do so. Otherwise, I will come back to this task later on, after the NNVM-ONNX work is complete.