Open bigsheep2012 opened 5 years ago
How to export the decoder to ONNX? I get:
RuntimeError: Expected 4-dimensional input for 4-dimensional weight [128, 512, 3, 3], but got 3-dimensional input of size [512, 63, 63] instead
Have you solved this problem? Also, there are two models, an encoder and a decoder; how can the two be connected when exporting?
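In case it is useful, here is a minimal sketch of one way to handle both points: put the encoder and decoder behind a single nn.Module so torch.onnx.export sees one graph, and make sure the dummy input keeps its batch dimension (the 3-dimensional [512, 63, 63] input in the error above is missing it). The `return_feature_maps` and `segSize` arguments are assumptions based on this repo's models.py and may need adjusting to your version:

```python
import torch
import torch.nn as nn

class EncoderDecoderWrapper(nn.Module):
    # Chains encoder and decoder so a single ONNX graph can be exported.
    def __init__(self, encoder, decoder):
        super(EncoderDecoderWrapper, self).__init__()
        self.encoder = encoder
        self.decoder = decoder

    def forward(self, x):
        # x must be 4-D (N, C, H, W); unsqueeze(0) a 3-D tensor first.
        feats = self.encoder(x, return_feature_maps=True)  # assumption: encoder signature from models.py
        return self.decoder(feats, segSize=(640, 640))      # assumption: decoder signature; a fixed size keeps the trace static

wrapper = EncoderDecoderWrapper(net_encoder, net_decoder).cuda().eval()
dummy = torch.randn(1, 3, 640, 640, device='cuda')          # note the leading batch dimension
torch.onnx.export(wrapper, dummy, "seg_model.onnx",
                  input_names=["input"], output_names=["output"])
```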
Has anyone solved it? Please help; my QQ is 531232693.
I also want to export it to ONNX, but I have no idea how. Any help?
How can I convert it to ONNX? I am getting the error "ONNX export of operator adaptive_avg_pool2d, output size that are not factor of input size". Please help.
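If it helps, that adaptive_avg_pool2d message usually means the exporter can only turn nn.AdaptiveAvgPool2d into a regular AvgPool2d when the requested output size divides the feature-map size evenly. Assuming the input resolution is fixed at export time, one workaround is to replace each adaptive pool with an explicit AvgPool2d; the 60x60 feature-map size below is purely illustrative, not taken from the repo:

```python
import torch.nn as nn

def fixed_avg_pool(in_hw, out_hw):
    # Hypothetical helper: mimics AdaptiveAvgPool2d(out_hw) for a known,
    # evenly divisible input size in_hw.
    stride = in_hw // out_hw
    kernel = in_hw - (out_hw - 1) * stride
    return nn.AvgPool2d(kernel_size=kernel, stride=stride)

# e.g. a 60x60 feature map pooled to the PPM scales 1, 2, 3 and 6
pools = [fixed_avg_pool(60, s) for s in (1, 2, 3, 6)]
```

Alternatively, pick an input resolution whose encoder output size is divisible by all of the pooling scales.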
Hello CSAILVision team, thanks for sharing this work.
I trained a model with 'resnet18dilated' and 'ppm_deepsup'. When I try to export this model to ONNX I get errors (I can export a plain ResNet50 to ONNX without problems).
My PSPNet package contains your source files (the lib folder, models.py and resnet.py), and I can train the net successfully.
Please check the code that produces the error:

```python
import torch
import torch.nn as nn
from PSPNet import ModelBuilder, SegmentationModule

builder = ModelBuilder()
net_encoder = builder.build_encoder(arch='resnet18dilated', fc_dim=512, weights='')
net_decoder = builder.build_decoder(arch='ppm_deepsup', fc_dim=512, num_class=150,
                                    weights='', use_softmax=False)

crit = nn.NLLLoss(ignore_index=-1)
model = SegmentationModule(net_encoder, net_decoder, crit)

dummy_input = torch.randn(1, 3, 640, 640, device='cuda')
model = model.cuda()

state_dict = torch.load('./weights/PSP_scratch.pth')
model.load_state_dict(state_dict)

input_names = ["input"]
output_names = ["output"]
torch.onnx.export(model, dummy_input, "model.onnx", verbose=True,
                  input_names=input_names, output_names=output_names)
```
Many thanks
The errors:

```
Traceback (most recent call last):
  File "transform_to_onnx.py", line 53, in <module>
    torch.onnx.export(model, dummy_input, "model.onnx", verbose=True, input_names=input_names, output_names=output_names)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/__init__.py", line 27, in export
    return utils.export(*args, **kwargs)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/utils.py", line 104, in export
    operator_export_type=operator_export_type)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/utils.py", line 281, in _export
    example_outputs, propagate)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/utils.py", line 227, in _model_to_graph
    graph = _optimize_graph(graph, operator_export_type)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/utils.py", line 155, in _optimize_graph
    graph = torch._C._jit_pass_onnx(graph, operator_export_type)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/__init__.py", line 52, in _run_symbolic_function
    return utils._run_symbolic_function(*args, **kwargs)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/utils.py", line 504, in _run_symbolic_function
    return fn(g, *inputs, **attrs)
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/symbolic.py", line 88, in wrapper
    args = [_parse_arg(arg, arg_desc) for arg, arg_desc in zip(args, arg_descriptors)]
  File "/home/yanglin/anaconda3/envs/py27/lib/python2.7/site-packages/torch/onnx/symbolic.py", line 45, in _parse_arg
    raise RuntimeError("ONNX symbolic expected a constant value in the trace")
RuntimeError: ONNX symbolic expected a constant value in the trace
```
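For what it is worth, in older PyTorch versions "ONNX symbolic expected a constant value in the trace" is usually raised when an operator such as upsample/interpolate or an average pool receives a size argument that the tracer sees as a runtime value instead of a constant. Two things that may be worth checking, both assumptions rather than confirmed fixes for this repo: call model.eval() before torch.onnx.export, and export at a fixed resolution so that any interpolate target size can be hard-coded. A minimal sketch of the second idea, with an illustrative module name and output size:

```python
import torch.nn as nn
import torch.nn.functional as F

class FixedUpsample(nn.Module):
    # Hypothetical drop-in for an interpolate whose target size was computed
    # from another tensor at runtime; a hard-coded size gives the ONNX
    # tracer the constant it expects.
    def __init__(self, out_hw=(640, 640)):
        super(FixedUpsample, self).__init__()
        self.out_hw = out_hw

    def forward(self, x):
        return F.interpolate(x, size=self.out_hw, mode='bilinear', align_corners=False)
```

Note also that SegmentationModule in this repo may expect a feed dict rather than a raw tensor in its forward, so exporting the wrapper module sketched earlier in the thread (or the encoder and decoder separately) may be easier than exporting SegmentationModule directly.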