openvinotoolkit / openvino

OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
https://docs.openvino.ai
Apache License 2.0

Conversion from .pb to IR succeeds, but loading the IR model (.bin & .xml) gives wrong inference results. What did I miss? #1139

Closed · turboLIU closed this issue 4 years ago

turboLIU commented 4 years ago

I have a frozen .pb model. Testing it with "sess.graph.get_tensor_by_name('pfld_inference/fc/BiasAdd:0')" gives the correct result. I then used mo_tf.py to convert the .pb to .bin/.xml, and the conversion also succeeded:

[ SUCCESS ] Generated IR version 10 model.
[ SUCCESS ] XML file: E:\OpenVINO\openvino_toolkit\install\openvino_2020.3.194\deployment_tools\model_optimizer.\inference_graph.xml
[ SUCCESS ] BIN file: E:\OpenVINO\openvino_toolkit\install\openvino_2020.3.194\deployment_tools\model_optimizer.\inference_graph.bin
[ SUCCESS ] Total execution time: 11.13 seconds.
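For reference, a minimal sketch of this kind of frozen-graph check with the TF 1.x compat API (the tensor names are taken from the commands later in this thread; the dummy input and absence of real preprocessing are assumptions):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Load the frozen graph that was later fed to the Model Optimizer.
with tf.gfile.GFile("inference_graph.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

# Input/output tensor names follow the MO commands quoted later in the thread.
inp = graph.get_tensor_by_name("org_batch:0")
out = graph.get_tensor_by_name("pfld_inference/fc/BiasAdd:0")

# Dummy NHWC input; the real preprocessing for this model is not shown here.
image = np.random.rand(1, 112, 112, 3).astype(np.float32)

with tf.Session(graph=graph) as sess:
    result = sess.run(out, feed_dict={inp: image})
    print(result.shape, result[0][:5])
```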

But when I load the .bin/.xml for inference, the engine gives me the wrong result (different from the result produced by the .pb model). I tested the same image with both models (IR model and .pb model), and they give different results. So where is the problem?
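For context, a minimal sketch of how such an IR is typically loaded and run with the 2020.x Inference Engine Python API (the device, the NCHW transpose, and the dummy input are assumptions; the reporter's actual inference code is not shown in this thread):

```python
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
# Read the IR produced by the Model Optimizer.
net = ie.read_network(model="inference_graph.xml", weights="inference_graph.bin")

input_name = next(iter(net.inputs))
output_name = next(iter(net.outputs))

exec_net = ie.load_network(network=net, device_name="CPU")

# The IR generated from a TF model expects NCHW by default,
# so transpose the NHWC image accordingly.
image = np.random.rand(1, 112, 112, 3).astype(np.float32)
blob = image.transpose(0, 3, 1, 2)

res = exec_net.infer(inputs={input_name: blob})
print(res[output_name][0][:5])
```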

Conversion command (reported SUCCESS):
python3 mo_tf.py --input_model .\inference_graph.pb --batch 1 --mean_values [256,256,256] --data_type FP32

ilya-lavrenov commented 4 years ago

@turboLIU could you please attach the model? What device do you use to perform the inference?

turboLIU commented 4 years ago

inference_graph (model graph attached). The model runs on a Windows 10 PC. I compared the results (IR model result vs. .pb model result; see the two attached screenshots). It seems the mean and variance in the BatchNorm layers get wrong values, but I'm not sure.

turboLIU commented 4 years ago

Finally, after some trial and error, I found where the bug is.

Wrong:
python .\mo_tf.py --input_model .\inference_graph.pb --model_name inference_graph_false_pb --input_shape [1,112,112,3] --input "org_batch" --output "pfld_inference/fc/BiasAdd" --mean_values [256,256,256] --data_type FP32

Right:
python .\mo_tf.py --input_model .\inference_graph_false.pb --model_name inference_graph_false_pb --input_shape [1,112,112,3] --input "org_batch{f32}" --output "pfld_inference/fc/BiasAdd" --data_type FP32

So I think something may be wrong in the preprocessing of the input node in OpenVINO.
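A possible explanation (my reading, not confirmed in this thread): --mean_values [256,256,256] bakes a per-channel mean subtraction into the IR, so if the application also normalizes the image before calling infer, the subtraction is applied twice and the outputs drift. A tiny numpy sketch of that double-subtraction effect, with illustrative pixel values:

```python
import numpy as np

# Illustrative raw 8-bit pixel values for one channel.
raw = np.array([10.0, 128.0, 250.0])

mean = 256.0  # the value baked into the IR via --mean_values [256,256,256]

# Case 1: the app feeds raw pixels and the IR subtracts the mean once.
once = raw - mean            # [-246., -128.,   -6.]

# Case 2: the app already subtracted the mean, and the IR subtracts it again.
twice = (raw - mean) - mean  # [-502., -384., -262.]

print(once)
print(twice)  # shifted by another -256, so downstream outputs no longer match
```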

avitial commented 4 years ago

Hi @turboLIU, were you able to resolve the issue? Please attach the .pb model file. If your model needs pre-processing, make sure to use the input image mean/scale parameters (--scale and --mean_values) with the Model Optimizer. These parameters let the MO tool bake the pre-processing into the IR so it gets accelerated by the Inference Engine.
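For example, a command along these lines bakes mean subtraction and scaling into the IR (the mean/scale values here are illustrative only, not taken from this model; the correct ones depend on how the network was trained):

python3 mo_tf.py --input_model .\inference_graph.pb --input_shape [1,112,112,3] --mean_values [127.5,127.5,127.5] --scale 127.5 --data_type FP32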

Which plugin are you using for inference? Are you running OpenVINO (v2020.3) on Windows 10?

avitial commented 4 years ago

Closing this since there hasn't been any activity. I hope the previous responses were sufficient to help you proceed or resolve the issue. Feel free to reopen and ask additional questions related to this topic.