openvinotoolkit / openvino

OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
https://docs.openvino.ai
Apache License 2.0

Encountering an exception in "LoadNetwork()" with openvino_2020.1.033 #4033

Closed higher127 closed 2 years ago

higher127 commented 3 years ago

Hi there, a classification model that has an "InnerProduct" layer has been optimized, and loading its network with "LoadNetwork()" hits an exception. If I remove "InnerProduct" from the model, "LoadNetwork()" works fine. As the documentation reads, OpenVINO supports the "FullyConnected" layer. Where does the exception come from? Thanks a lot!

ilya-lavrenov commented 3 years ago

Hi @higher127 Could you please provide more logs? Since IR v10 we have opsets which don't have a FullyConnected operation, but do have MatMul, which is more generic.
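
For reference, here is a minimal sketch of how such a log can be captured, assuming the C++ Inference Engine API shipped with openvino_2020.1 and placeholder file names (model.xml/model.bin): wrapping ReadNetwork()/LoadNetwork() in a try/catch and printing the exception text yields the message the plugin reports.

```cpp
// Minimal sketch: capture the LoadNetwork() error message (openvino_2020.1 C++ API assumed).
// "model.xml"/"model.bin" are placeholder paths, not the reporter's actual files.
#include <inference_engine.hpp>
#include <iostream>

int main() {
    try {
        InferenceEngine::Core core;
        // Read the IR produced by Model Optimizer.
        InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml", "model.bin");
        // Compile the network for a device; this is where the exception would be thrown.
        InferenceEngine::ExecutableNetwork exec = core.LoadNetwork(network, "CPU");
        std::cout << "LoadNetwork() succeeded" << std::endl;
    } catch (const std::exception& e) {
        // The exception text usually names the unsupported layer or the shape problem.
        std::cerr << "LoadNetwork() failed: " << e.what() << std::endl;
        return 1;
    }
    return 0;
}
```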

higher127 commented 3 years ago

> Hi @higher127 Could you please provide more logs? Since IR v10 we have opsets which don't have a FullyConnected operation, but do have MatMul, which is more generic.

Hi @ilya-lavrenov, thanks for your reply. I do not know how to save logs; I just hit an exception at run time. Besides, I think Model Optimizer has converted "FullyConnected" to "MatMul", and a new "Reshape" has been generated (dynamic batch is not supported). The relationship between the Caffe model and the IR xml is below:

1. Caffe model:

```
layer {
  bottom: "res5b"
  top: "pool5"
  name: "pool5"
  type: "Pooling"
  pooling_param {
    kernel_size: 4
    stride: 1
    pool: AVE
  }
}
layer {
  bottom: "pool5"
  top: "fc1000"
  name: "fc1000"
  type: "InnerProduct"
  inner_product_param {
    num_output: 4
  }
}
```

2. IR xml:

```xml
<layer id="113" name="fc1000/flatten_fc_input" type="Reshape" version="opset1">
    <data special_zero="True"/>
    <input>
        <port id="0">
            <dim>1</dim>
            <dim>512</dim>
            <dim>1</dim>
            <dim>1</dim>
        </port>
        <port id="1">
            <dim>2</dim>
        </port>
    </input>
    <output>
        <port id="2" precision="FP32">
            <dim>1</dim>
            <dim>512</dim>
        </port>
    </output>
</layer>
<layer id="114" name="fc1000/WithoutBiases/1_port_transpose5915_const" type="Const" version="opset1">
    <data element_type="f32" offset="44672784" shape="4,512" size="8192"/>
    <output>
        <port id="1" precision="FP32">
            <dim>4</dim>
            <dim>512</dim>
        </port>
    </output>
</layer>
<layer id="115" name="fc1000/WithoutBiases" type="MatMul" version="opset1">
```

Is the model optimization right? Thanks!
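
To sanity-check the conversion, here is a minimal sketch (same 2020.1 C++ API assumed, hypothetical file names) that reads the converted IR and prints its input and output shapes. Given the snippet above, the Reshape flattens 1x512x1x1 to 1x512 and the MatMul against the 4x512 constant should produce a 1x4 output, matching num_output: 4 of the original InnerProduct.

```cpp
// Minimal sketch: print the input/output shapes of the converted IR
// (openvino_2020.1 C++ API assumed; "model.xml"/"model.bin" are placeholders).
#include <inference_engine.hpp>
#include <iostream>

int main() {
    InferenceEngine::Core core;
    auto network = core.ReadNetwork("model.xml", "model.bin");

    // Inputs as seen by the Inference Engine.
    for (const auto& in : network.getInputsInfo()) {
        std::cout << "input  " << in.first << ":";
        for (auto d : in.second->getTensorDesc().getDims()) std::cout << " " << d;
        std::cout << std::endl;
    }
    // Outputs; the fc1000-related output is expected to be 1x4 for this model.
    for (const auto& out : network.getOutputsInfo()) {
        std::cout << "output " << out.first << ":";
        for (auto d : out.second->getTensorDesc().getDims()) std::cout << " " << d;
        std::cout << std::endl;
    }
    return 0;
}
```
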
ilya-lavrenov commented 3 years ago

OK, have you been able to execute the model with MatMul on an IE device? The IR snippet looks good.
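
If it helps, a minimal sketch of what "executing on an IE device" looks like with the 2020.1 C++ API (file names are placeholders): compile the network, create an infer request, and run one inference, catching any exception so its message is visible.

```cpp
// Minimal sketch: run one inference on the CPU plugin (openvino_2020.1 C++ API assumed).
#include <inference_engine.hpp>
#include <iostream>

int main() {
    try {
        InferenceEngine::Core core;
        auto network = core.ReadNetwork("model.xml", "model.bin");  // placeholder paths
        auto exec = core.LoadNetwork(network, "CPU");
        InferenceEngine::InferRequest request = exec.CreateInferRequest();
        // Input blobs are left at their default contents here;
        // a real test would fill them with preprocessed image data.
        request.Infer();
        std::cout << "Inference finished without exceptions" << std::endl;
    } catch (const std::exception& e) {
        std::cerr << "Execution failed: " << e.what() << std::endl;
        return 1;
    }
    return 0;
}
```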

higher127 commented 3 years ago

> OK, have you been able to execute the model with MatMul on an IE device? The IR snippet looks good.

Executing the IR model still throws the exception. As the IR snippet shows, a "Reshape" layer is generated while converting "InnerProduct" to "MatMul", and OpenVINO's dynamic batch cannot support it. I think the exception comes from the "Reshape" in the IR model. Am I right? Thanks.
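
Under that assumption, a minimal sketch of the usual workaround (2020.1 C++ API, hypothetical file names, an assumed static batch of 1): fix the batch dimension with CNNNetwork::reshape() before LoadNetwork() and do not enable the dynamic-batch plugin config, so the Reshape layer never sees a variable batch.

```cpp
// Minimal sketch: load the network with a fixed (static) batch size instead of
// dynamic batching (openvino_2020.1 C++ API assumed; paths and batch=1 are assumptions).
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    auto network = core.ReadNetwork("model.xml", "model.bin");

    // Set the first dimension (batch) of every input to a fixed value.
    auto shapes = network.getInputShapes();
    for (auto& item : shapes)
        item.second[0] = 1;  // assumed static batch of 1
    network.reshape(shapes);

    // Load without any dynamic-batch configuration keys.
    auto exec = core.LoadNetwork(network, "CPU");
    return 0;
}
```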

ilya-lavrenov commented 3 years ago

> I think the exception comes from the "Reshape" in the IR model. Am I right? Thanks.

We need logs from the Inference Engine with the error message of the exception; it's hard to say without them. Could you please upload the IR (xml and bin files) to a file server and provide a link?

higher127 commented 3 years ago

>> I think the exception comes from the "Reshape" in the IR model. Am I right? Thanks.

> We need logs from the Inference Engine with the error message of the exception; it's hard to say without them. Could you please upload the IR (xml and bin files) to a file server and provide a link?

Hi, you can get the IR model here: https://pan.baidu.com/s/1AeOjbCStrYxHJk8cG9rtcw (password: 9bs5). Thanks.

jgespino commented 2 years ago

Hi @higher127

Did you manage to solve the issue? If you still need help, could you provide the Caffe model for me to take a look?

Regards, Jesus