pytorch / pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration
https://pytorch.org

Import ONNX model to Pytorch #21683

Closed hossein1387 closed 3 years ago

hossein1387 commented 5 years ago

🚀 Feature

Importing ONNX models into Pytorch.

Motivation

Almost all other frameworks already support this. Importing ONNX models into Pytorch makes Pytorch much more flexible.

Pitch

In torch.onnx, a function should be created that takes an ONNX model and outputs a PyTorch model.

cc @BowenBao @neginraoof

davidtranno1 commented 5 years ago

Hi everyone. As far as I know, Caffe2 fully supports ONNX import/export, and Caffe2 recently became part of PyTorch. However, I cannot import an ONNX model file into PyTorch and run inference properly. How can I resolve this issue?

olegmikul commented 5 years ago

Agreed. I was somewhat shocked when I learned that PyTorch doesn't have this.

rahul99 commented 4 years ago

@soumith, any plans to roll out this functionality anytime soon?

gordinmitya commented 4 years ago

+1 to the motivation. I need to convert an ONNX model to TorchScript (.pt) so I can run it on PyTorch Mobile. How can I do this now? Any workarounds?

jcowles commented 4 years ago

+1, not supporting ONNX import feels like half of the ONNX implementation is missing

kooyunmo commented 4 years ago

Is there any ongoing PR for this issue?

borhanMorphy commented 4 years ago

As a heavy torch user, this feature sounds nice.

IguanaAzul commented 4 years ago

This feature is relevant; it should be implemented.

LyricZhao commented 4 years ago

Looking forward to having this in torch.

JVGD commented 4 years ago

This feature is very relevant, my motivation:

solarflarefx commented 4 years ago

@gordinmitya Have you found a solution to your problem? I am trying to do something similar.

gordinmitya commented 4 years ago

@solarflarefx no :( PyTorch Mobile suddenly stopped working without any changes on my side, so I decided to skip it until the mobile version becomes stable.

touchwolf commented 4 years ago

onnxruntime may help

Franzis39 commented 4 years ago

Has anyone figured out a simple way to do this (without onnxruntime)?

gordinmitya commented 4 years ago

@Franzis39 how does onnxruntime help?

ay27 commented 4 years ago

I think it will be a huge challenge to restore the runtime graph in PyTorch. But if we focus only on the model parameters, that is, loading just the parameters stored in the ONNX file rather than the full model, it may be much easier.

Maybe this snippet will help:

import onnx
from onnx import numpy_helper
import torch
from torchvision import models

# Export a reference model so there is an ONNX file to read back.
model = models.resnet18()
torch.onnx.export(model, torch.randn(1, 3, 224, 224), 'resnet18.onnx')

onnx_model = onnx.load('resnet18.onnx')

# Collect the weights stored in the ONNX graph, keyed by parameter name.
graph = onnx_model.graph
initializers = dict()
for init in graph.initializer:
    initializers[init.name] = numpy_helper.to_array(init)

# Copy each ONNX initializer into the matching PyTorch parameter.
with torch.no_grad():
    for name, p in model.named_parameters():
        p.copy_(torch.from_numpy(initializers[name].copy()))

ghost commented 4 years ago

Found an imperfect but working solution: https://gist.github.com/qinjian623/6aa777037534c1c1dccbb66f832e93b8

zczjx commented 3 years ago

+1

WilliamTambellini commented 3 years ago

Now 151 likes but no sign of progress. I don't know how the PyTorch project works, but can we expect at least a reply from the project maintainers?

matfeb commented 3 years ago

A temporary, imperfect, and time-consuming idea I'm going to try is to convert onnx -> keras (onnx2keras) and then keras -> pytorch (MMdnn).

ShairozS commented 3 years ago

Honestly, it is absolutely ridiculous that this issue is still open. I exported a very time-consuming model to ONNX for use with RT, and now I need to make some modifications to it in PyTorch, only to realize that for some reason this capability is missing. I fully expected that if I can export to the ONNX format from PyTorch, I should also be able to import it. Rarely have I encountered a framework that lets you export to a format without also being able to read that format back; frankly, that is bad practice.

joshuachough commented 3 years ago

@gchanan any updates on if/when this will be implemented in the future?

gineshidalgo99 commented 3 years ago

+1 to this request, any ETA?

madhavajay commented 3 years ago

This looks like a pretty new and solid effort to achieve onnx to pytorch conversion: https://github.com/ToriML/onnx2pytorch

ThatAIGeek commented 3 years ago

Any updates? almost 200 votes already.

fumihwh commented 3 years ago

Hi folks, I created a repo for generating PyTorch code from ONNX: https://github.com/fumihwh/onnx-pytorch Unlike ToriML/onnx2pytorch, my repo GENERATES PyTorch code rather than constructing a Module instance at runtime.

calvinmccarter-at-lightmatter commented 3 years ago

Just a heads-up that v0.4 of onnx2pytorch (https://github.com/ToriML/onnx2pytorch) was released today. In addition to now supporting Loop, LSTM, and a lot more, this release has improved memory management by removing activations that are no longer required by any following operations. We're nearing full ONNX operator coverage, and now all the MLPerf Datacenter Inference ONNX models can be successfully imported into PyTorch.

garymm commented 3 years ago

There are no plans to add this directly to PyTorch, but seems like there are at least two solutions for this already linked to in this thread.

mike-burl commented 2 years ago

Another +1 for this feature. I've gotten pretty far into a project on the assumption that if I could export to ONNX, I'd be able to go the other direction. I don't think it's unreasonable to expect this functionality to be included in PyTorch itself.

h-grieve commented 2 years ago

Another +1 from me, really surprised it isn't a feature.

allemanenti commented 2 years ago

I'm very surprised this can't be done 😮 +1

Thunfischpirat commented 2 years ago

Just ran into the issue, and I'm surprised that this hasn't been addressed in more than three years...

ZeroAda commented 2 years ago

There is another lib: onnx2torch. It works for me, and I can use it to continue training my ONNX model.

natwille1 commented 1 year ago

Would love this feature!

ngoanpv commented 1 year ago

+1

chr4ss12 commented 1 year ago

+1

akshaytrikha commented 1 year ago

Why was this issue closed? It seems like there is a lot of support from the community for it.

Sandjan commented 1 year ago

What's the current status of this issue? Why was it closed?

kolabit commented 11 months ago

This feature is obviously non-trivial, but would be great to have

santoshcoder23 commented 10 months ago

any reason why this issue is closed ?

2proveit commented 6 months ago

I just ran into trouble getting the network structure from an ONNX file. Should I visualize the structure and rewrite it in PyTorch by hand?

vdoom commented 6 months ago

Why is this issue closed?! It MUST be resolved!

ashvardanian commented 6 months ago

+1

vladimirrotariu commented 6 months ago

+1

tnoe1 commented 4 months ago

+1

benoitboidin commented 4 months ago

I would like to use this feature too, does someone have any news about this issue?

garrettbyrd commented 2 months ago

@BowenBao @neginraoof Bumping this yet again, and echoing that I am not sure why this was closed.