openvinotoolkit / openvino

OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
https://docs.openvino.ai
Apache License 2.0

Convert DeeplabV3 to IR #751

Closed AlexanderSlav closed 2 years ago

AlexanderSlav commented 4 years ago

Hi, I'm trying to convert DeepLabV3 with a ResNet18 encoder (the implementation is taken from https://github.com/qubvel/segmentation_models.pytorch) to IR format. I've already converted the PyTorch model to ONNX (opset version 11) using the following code:

batch_size = 1
x = torch.randn(batch_size, 3, 480, 832, requires_grad=True)
model.to(device)
torch.onnx.export(model,
                  x,
                  "deeplab_resnet18.onnx",
                  export_params=True,
                  opset_version=11,
                  do_constant_folding=True,
                  input_names=['0_encoder.Conv2d_conv1'],
                  output_names=['83_segmentation_head.2.Sigmoid_activation'])

My OpenVINO version is 2020.2. An error occurs when I try to run mo.py with the following command:

python mo.py --input_model path_to_model/deeplab_resnet18.onnx --input_shape [1,3,480,832] --output_dir output_dir/deeplab_IR

Error: Cannot infer shapes or values for node "266". I've already checked the official documentation:

  1. What does the message "Stopped shape/value propagation at node" mean? Model Optimizer cannot infer shapes or values for the specified node. It can happen because of a bug in the custom shape infer function, because the node inputs have incorrect values/shapes, or because the input shapes are incorrect.

But I still can't pinpoint where I made a mistake.

The MO logs:

[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:128 ]  --------------------
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:129 ]  Partial infer for 266
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:130 ]  Op: Upsample
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:131 ]  Inputs:
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:31 ]  input[0]: shape = [  1 256   1   1], value = <UNKNOWN>
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:31 ]  input[1]: shape = [0], value = []
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:31 ]  input[2]: shape = [0], value = []
[ 2020-06-03 18:10:44,381 ] [ DEBUG ] [ infer:31 ]  input[3]: shape = [4], value = [  1 256  60 104]
[ ERROR ]  Cannot infer shapes or values for node "266".
[ ERROR ]  operands could not be broadcast together with shapes (4,) (0,) 
[ ERROR ]  
[ ERROR ]  It can happen due to bug in custom shape infer function <function UpsampleOp.upsample_infer at 0x7f8e889b7170>.
[ ERROR ]  Or because the node inputs have incorrect values/shapes.
[ ERROR ]  Or because input shapes are incorrect (embedded to the model or passed via --input_shape).
[ 2020-06-03 18:10:44,382 ] [ DEBUG ] [ infer:196 ]  Node "266" attributes: {'pb': input: "254"
input: "258"
input: "258"
input: "265"
output: "266"
op_type: "Resize"
attribute {
  name: "coordinate_transformation_mode"
  s: "pytorch_half_pixel"
  type: STRING
}
attribute {
  name: "cubic_coeff_a"
  f: -0.75
  type: FLOAT
}
attribute {
  name: "mode"
  s: "linear"
  type: STRING
}
attribute {
  name: "nearest_mode"
  s: "floor"
  type: STRING
}
, 'kind': 'op', '_in_ports': {0: {'control_flow': False}, 1: {'control_flow': False}, 2: {'control_flow': False}, 3: {'control_flow': False}}, '_out_ports': {0: {'control_flow': False}}, 'name': '266', 'op': 'Upsample', 'in_ports_count': 2, 'out_ports_count': 1, 'infer': <function UpsampleOp.upsample_infer at 0x7f8e889b7170>, 'mode': 'linear', 'dim_attrs': ['spatial_dims', 'axis', 'channel_dims', 'batch_dims'], 'shape_attrs': ['window', 'stride', 'pad', 'output_shape', 'shape'], 'IE': [('layer', [('id', <function Op.substitute_ie_attrs.<locals>.<lambda> at 0x7f8eb7a08440>), 'name', 'type', 'version'], [('data', ['height_scale', 'width_scale', 'mode'], []), '@ports', '@consts'])], 'is_output_reachable': True, 'is_undead': False, 'is_const_producer': False, 'is_partial_inferred': False}
[ ERROR ]  Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "266" node. 
 For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #38. 
[ 2020-06-03 18:10:44,382 ] [ DEBUG ] [ main:317 ]  Traceback (most recent call last):
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/middle/passes/infer.py", line 134, in partial_infer
    node.infer(node)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/extensions/ops/upsample.py", line 66, in upsample_infer
    node.out_node().shape = input_shape * node.in_node(1).value
ValueError: operands could not be broadcast together with shapes (4,) (0,) 

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 288, in apply_transform
    for_graph_and_each_sub_graph_recursively(graph, replacer.find_and_replace_pattern)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/middle/pattern_match.py", line 58, in for_graph_and_each_sub_graph_recursively
    func(graph)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/extensions/middle/PartialInfer.py", line 32, in find_and_replace_pattern
    partial_infer(graph)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/middle/passes/infer.py", line 198, in partial_infer
    refer_to_faq_msg(38)) from err
mo.utils.error.Error: Stopped shape/value propagation at "266" node. 
 For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #38. 

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/main.py", line 307, in main
    return driver(argv)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/main.py", line 272, in driver
    ret_res = emit_ir(prepare_ir(argv), argv)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/main.py", line 237, in prepare_ir
    graph = unified_pipeline(argv)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/pipeline/unified.py", line 29, in unified_pipeline
    class_registration.ClassType.BACK_REPLACER
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 334, in apply_replacements
    apply_replacements_list(graph, replacers_order)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 324, in apply_replacements_list
    num_transforms=len(replacers_order))
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/utils/logger.py", line 124, in wrapper
    function(*args, **kwargs)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo/utils/class_registration.py", line 304, in apply_transform
    )) from err
mo.utils.error.Error: Exception occurred during running replacer "REPLACEMENT_ID" (<class 'extensions.middle.PartialInfer.PartialInfer'>): Stopped shape/value propagation at "266" node. 
 For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #38. 

Any chance of getting help with this? I'd be glad of any help or advice.

ilya-lavrenov commented 4 years ago

@lazarevevgeny could you please help?

avitial commented 4 years ago

@AlexanderSlav it looks like this model uses the ONNX Resize opset-11 operation, which is not supported by the Model Optimizer. As stated in the documentation, only the Resize opset-10 version is supported. This would explain the error on that node (266). If you can share the .onnx model file, that would be great.

AlexanderSlav commented 4 years ago

@avitial Hi, my first thought was to downgrade the opset version to 10, but I got the following error during the ONNX export:

ONNX export failed: Couldn't export operator aten::upsample_bilinear2d

And the suggested solution is to upgrade the opset version to 11:
https://github.com/pytorch/pytorch/issues/29980#issuecomment-554725530

Should I use your email to share the .onnx model?

lazarevevgeny commented 4 years ago

@AlexanderSlav currently OpenVINO does not support the Resize-11 operation, which the ONNX exporter produces when you convert the PyTorch model. As a workaround, you can try replacing the upsampling with another PyTorch resize-like operation and converting the model with opset 10.

AlexanderSlav commented 4 years ago

@lazarevevgeny Thanks, I'll try!