onnx / tutorials

Tutorials for creating and using ONNX models
Apache License 2.0

Multiple model #133

Open shbnm21 opened 5 years ago

shbnm21 commented 5 years ago

How do I convert multiple models to ONNX format?

I have a trained project which includes a GAN model as well as encoder-decoder models.

How do I convert all of the models to ONNX format?

askhade commented 5 years ago

Which framework did you use to create these models? There are converters and exporters that can convert models from TensorFlow, Keras, PyTorch, and other frameworks to ONNX. You can start from here: https://github.com/onnx

shbnm21 commented 5 years ago

Thanks for the response. I was training a project from GitHub; it is a PyTorch-based model with 4 .pth files. I have successfully converted 3 of them to ONNX and then to Caffe2, but I am facing an issue with one of the .pth files: I am not able to convert it to Caffe2. I asked on the Caffe2 forum, but nobody seems to have noticed.

prasanthpul commented 5 years ago

Are you able to convert the 4th model to ONNX? You should consider using ONNX Runtime which runs ONNX models directly and supports all of ONNX. I do not believe Caffe2 supports all of ONNX.

shbnm21 commented 5 years ago

Thank you for the reply. Yes, I am able to convert the 4th model to ONNX. I will look into ONNX Runtime. Because of the 4th model I am stuck and not able to deploy the model on Android.

kalyangvs commented 4 years ago

Hi @askhade / @prasanthpul, sorry, I do not know whether this is the right place to ask. Model: I am using a BERT model with 2 different classifiers on top of it, trained using NeMo.

Usually for text classification, which has BERT and a single classifier, each part is jit-traced separately, converted to an individual ONNX file, and the two ONNX files are then attached by manipulating node names, etc. When I use this approach to export my model to ONNX and check it, I get: `Graph must be in single static assignment (SSA) form, however 'hidden_states' has been used as graph input names multiple times.`

Suppose I change the input name of one classifier to hidden_states_1. How does the IR know that the input variables of both classifiers are the same output of the BERT model? (The same error applies to the output variable logits.)