xiaodaoyoumin opened this issue 5 years ago
Extract the weights from PyTorch as numpy arrays and use them to initialize a corresponding model in CNTK. That's how I converted BERT from TensorFlow to CNTK.
You can check out how I did it for BERT.
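A minimal sketch of that approach, using a single linear layer as a stand-in (the layer sizes, names, and the transpose are illustrative assumptions; a real model needs a mapping for every parameter):

```python
import numpy as np
import torch
import torch.nn as nn
import cntk as C

# PyTorch side: one linear layer standing in for the pretrained model
torch_layer = nn.Linear(3, 5)
weight = torch_layer.weight.detach().numpy()  # shape (out_features, in_features)
bias = torch_layer.bias.detach().numpy()      # shape (out_features,)

# CNTK side: build the corresponding layer, then overwrite its parameters
x = C.input_variable(3)
cntk_layer = C.layers.Dense(5)(x)
# CNTK's Dense stores W as (input_dim, output_dim), so the PyTorch weight is transposed
cntk_layer.W.value = weight.T
cntk_layer.b.value = bias

# Sanity check: both layers should now produce the same output
sample = np.random.randn(1, 3).astype(np.float32)
print(torch_layer(torch.from_numpy(sample)).detach().numpy())
print(cntk_layer.eval({x: sample}))
```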
@delzac thank you, have you tried the C.Function.load() function before? Do you know why it failed in my case?
RuntimeError: ONNX (Shape) is not supported in CNTK
The ONNX op `Shape` is not supported in CNTK, so it fails.
You can check out here for the list of ONNX ops that are supported.
@delzac I can export it to ONNX, but it fails when loading into CNTK with C.Function.load(). Do you know why this happens? Thank you
Because CNTK doesn't support the ONNX op `Shape`, and when you export from PyTorch, the `Shape` ONNX op is used.
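As a quick check, you can list the op types in the exported graph with the onnx package; this is a sketch assuming the model was exported to torch_model-lstm.onnx as in the snippet further down:

```python
import onnx

# Load the exported model and print every op type in its graph;
# if Shape appears here, CNTK's ONNX importer will reject the file.
model = onnx.load("torch_model-lstm.onnx")
ops = sorted({node.op_type for node in model.graph.node})
print(ops)
```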
@delzac thank you so much
What is the scenario that you are trying to achieve? For inference, please use ONNX Runtime (https://github.com/microsoft/onnxruntime)
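For example, a minimal sketch of running the exported LSTM with ONNX Runtime (the file name and input shape are taken from the snippet below; the input name is looked up from the session rather than assumed):

```python
import numpy as np
import onnxruntime as ort

# Run the exported model with ONNX Runtime instead of CNTK
sess = ort.InferenceSession("torch_model-lstm.onnx")
input_name = sess.get_inputs()[0].name
dummy = np.random.randn(3, 5, 3).astype(np.float32)
outputs = sess.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```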
```python
import torch
import torch.nn as nn
from torch.autograd import Variable
import torch.onnx as torch_onnx

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.rnn = torch.nn.LSTM(3, 5)

    def forward(self, x):
        # run the input sequence through the LSTM
        return self.rnn(x)

input_shape = (3, 5, 3)
model_onnx_path = "torch_model-lstm.onnx"
model = Model()
model.train(False)

dummy_input = Variable(torch.randn(*input_shape))
print(model(dummy_input))
output = torch_onnx.export(model, dummy_input, model_onnx_path, verbose=False)

import cntk as C
z = C.Function.load(model_onnx_path, device=C.device.cpu(), format=C.ModelFormat.ONNX)
```
RuntimeError: ONNX (Shape) is not supported in CNTK
It seems that CNTK doesn't support the PyTorch LSTM export, so how can I load a pretrained PyTorch model?