PINTO0309 / MobileNetV2-PoseEstimation

TensorFlow-based fast pose estimation. OpenVINO, TensorFlow Lite, NCS, NCS2 + Python.
https://qiita.com/PINTO
MIT License

Unsupported primitive of type: Interp name: base/layer_14/output/upsample with the CPU models. #1

Closed rakidedigama closed 2 years ago

rakidedigama commented 5 years ago

Hi. Great work. Still, I am getting an error with the CPU models on Windows with Openvino. Can you help me with this please? Thanks.

flerkenvn commented 5 years ago

I got the same issue.

PINTO0309 commented 5 years ago

Please tell me everything below.

  1. what kind of model file was used
  2. what kind of program was run
  3. openvino version 1.0 or 1.0.1 or other
flerkenvn commented 5 years ago

> Please tell me everything below.
>
>   1. what kind of model file was used
>   2. what kind of program was run
>   3. openvino version 1.0 or 1.0.1 or other

I used the model file `models/train/test/openvino/mobilenet_v2_1.4_224/FP32/frozen-model.xml` with the command `python3 openvino-usbcamera-cpu-ncs2-sync.py -d CPU` on OpenVINO 2019 R1.133.

rakidedigama commented 5 years ago

I used almost the same configuration. However, I was testing with the OpenVINO Windows Release 1.0.1, and it seems that this version does not support certain layers.

From the release notes (https://software.intel.com/en-us/articles/OpenVINO-RelNotes), known issue #21:

> Shape inference for Interp layer works for almost all cases, except for Caffe models with fixed width and height parameters (for example, semantic-segmentation-adas-0001).

Here's a screenshot of the error: [image attachment]

PINTO0309 commented 5 years ago

You must use the CPU extension (`lib/libcpu_extension.so`). Also, OpenVINO for Windows seems to be unstable; I strongly recommend using Ubuntu.

```python
if "CPU" == args.device:
    if platform.processor() == "x86_64":
        plugin.add_cpu_extension("lib/libcpu_extension.so")
    if args.boost == False:
        model_xml = "models/train/test/openvino/mobilenet_v2_1.4_224/FP32/frozen-model.xml"
    else:
        model_xml = "models/train/test/openvino/mobilenet_v2_0.5_224/FP32/frozen-model.xml"

elif "GPU" == args.device or "MYRIAD" == args.device:
    if args.boost == False:
        model_xml = "models/train/test/openvino/mobilenet_v2_1.4_224/FP16/frozen-model.xml"
    else:
        model_xml = "models/train/test/openvino/mobilenet_v2_0.5_224/FP16/frozen-model.xml"

else:
    print("Specify the target device to infer on; CPU, GPU, MYRIAD is acceptable.")
    sys.exit(0)
```
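As a side note, both the `platform.processor() == "x86_64"` guard and the `.so` path above are Linux-specific; Windows builds of the old Inference Engine ship the CPU extension as a DLL instead. A minimal sketch of picking the library name per OS (the Windows file name here is an assumption and varies by OpenVINO release, so check your own install):

```python
import platform

def cpu_extension_library():
    """Pick the CPU extension library for the old IEPlugin API.

    The Windows name below is a guess; look under
    deployment_tools/inference_engine/bin in your OpenVINO
    install for the file actually shipped with your release.
    """
    if platform.system() == "Windows":
        return "cpu_extension_avx2.dll"  # hypothetical name, release-dependent
    return "lib/libcpu_extension.so"     # path used by this repo on Linux
```

With something like this, `plugin.add_cpu_extension(cpu_extension_library())` could be called unconditionally for the CPU device instead of being gated on the processor string.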
PINTO0309 commented 5 years ago

For Windows, please see the issue below. https://github.com/PINTO0309/OpenVINO-DeeplabV3/issues/1

Ealuthwala commented 4 years ago

I am getting the exact same error. I tried removing the `if platform.processor() == "x86_64":` check to make it use the extension, but that didn't help; now I am getting a different error (screenshot attached). Anyway, I don't think I was supposed to do that, so how can I fix this?

I am using Windows 10 with OpenVINO 2019.1.133. I tried the TFLite model too; that just produces random dots.

Ealuthwala commented 4 years ago

And `platform.processor()` returns `'Intel64 Family 6 Model 60 Stepping 3, GenuineIntel'`.
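That return value is likely the root cause: on Windows, `platform.processor()` gives a free-form CPU description rather than `"x86_64"`, so the `add_cpu_extension` branch in the script is silently skipped. A hedged sketch of a more portable check using `platform.machine()`, which reports the architecture name (`x86_64` on Linux, `AMD64` on Windows) for the same hardware:

```python
import platform

def is_x86_64() -> bool:
    # platform.machine() is architecture-oriented ("x86_64" on Linux,
    # "AMD64" on Windows), whereas platform.processor() is free-form
    # text that differs between OSes, which is what broke the
    # original "x86_64" comparison on Windows.
    return platform.machine().lower() in ("x86_64", "amd64")

print(is_x86_64())
```

Swapping the script's `platform.processor() == "x86_64"` condition for a check like this would let the CPU extension load on both OSes, though the extension file name itself still differs per platform.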