Closed: letdivedeep closed this issue 3 years ago
Yes it does, it's under /opt/intel/openvino_2021
@PINTO0309 @kuang-wei thanks for the reply. I was able to create an OpenVINO model.
But while running this command:
openvino2tensorflow \
--model_path openvino/mbv2_opt.xml \
--model_output_path saved_model \
--output_saved_model \
--output_integer_quant_tflite \
--output_full_integer_quant_tflite \
--output_integer_quant_type uint8 \
--output_tftrt \
--output_edgetpu \
--output_float16_quant_tflite
I am getting an error while saving the saved_model:
ERROR: Message tensorflow.SavedModel exceeds maximum protobuf size of 2GB: 2763310261
I have attached a snapshot below.
Unfortunately, models that are larger than 2 GB after conversion cannot be converted. This is not a limitation of openvino2tensorflow, but of Protocol Buffers, the serialization format developed by Google. Try outputting to a .pb file instead with --output_pb:
$ openvino2tensorflow \
--model_path xxxx.xml \
--output_saved_model \
--output_pb
If you still get an error, the model size is too large and cannot be converted.
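To make the limit concrete, here is a small illustrative sketch (not part of openvino2tensorflow; the helper name is mine): Protocol Buffers caps a serialized message at 2**31 - 1 bytes, and the error message above reports the model's serialized size, so you can see how far over the cap the model is.

```python
# Hedged sketch: check a serialized size against the hard protobuf cap.
PROTOBUF_MAX_BYTES = 2**31 - 1   # 2147483647 bytes, the Protocol Buffers limit
reported_size = 2763310261       # size reported by the error message above


def exceeds_protobuf_limit(num_bytes: int) -> bool:
    """Return True if a message of this size cannot be serialized."""
    return num_bytes > PROTOBUF_MAX_BYTES


print(exceeds_protobuf_limit(reported_size))  # True
overshoot_gb = (reported_size - PROTOBUF_MAX_BYTES) / 1e9
print(f"over the limit by ~{overshoot_gb:.2f} GB")  # ~0.62 GB
```

This is why shrinking the model (pruning, fewer channels, a smaller input resolution) is the only real fix once the serialized graph crosses that line.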
Hi @khursani8 @PINTO0309,
I was following the blog and used the Docker setup provided in the repo.
When I want to convert the ONNX model to OpenVINO, I am not able to find the OpenVINO installation directory to use for the {INTEL_OPENVINO_DIR} path.
Does the Docker image come with OpenVINO preinstalled? If so, what path should be entered?
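Based on the earlier reply in this thread that the install lives under /opt/intel/openvino_2021, a minimal shell sketch to verify the path inside the container and load the environment (the default path here is an assumption; adjust it to whatever `ls /opt/intel` shows in your image):

```shell
# Hedged sketch: locate and source the OpenVINO environment in the container.
# /opt/intel/openvino_2021 is an assumed default taken from the reply above.
OPENVINO_DIR="${INTEL_OPENVINO_DIR:-/opt/intel/openvino_2021}"

if [ -d "$OPENVINO_DIR" ]; then
  # setupvars.sh exports INTEL_OPENVINO_DIR and puts the tools on PATH
  . "$OPENVINO_DIR/bin/setupvars.sh"
  echo "Using OpenVINO at: $OPENVINO_DIR"
else
  echo "OpenVINO not found at $OPENVINO_DIR" >&2
fi
```

After sourcing, `echo $INTEL_OPENVINO_DIR` should print the install directory you can substitute into the conversion command.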