google-coral / tflite

Examples using TensorFlow Lite API to run inference on Coral devices
https://coral.withgoogle.com
Apache License 2.0
181 stars 67 forks

ValueError: Didn't find op for builtin opcode 'STRIDED_SLICE' version '4' #30

Closed moritz157 closed 2 years ago

moritz157 commented 4 years ago

So I converted a tensorflow model into a tflite model and tried loading it using tflite_runtime:

#! /usr/bin/env python
# coding=utf-8
#================================================================
#   Copyright (C) 2019 * Ltd. All rights reserved.
#
#   Editor      : VIM
#   File name   : test.py
#   Author      : YunYang1994
#   Created date: 2019-07-19 10:29:34
#   Description :
#
#================================================================

import cv2
import os
import time
import shutil
import numpy as np
import tensorflow as tf
import core.utils as utils
from core.config import cfg
from core.yolov3 import YOLOv3, decode

INPUT_SIZE   = 416
NUM_CLASS    = len(utils.read_class_names(cfg.YOLO.CLASSES))
CLASSES      = utils.read_class_names(cfg.YOLO.CLASSES)

# Build Model
input_layer  = tf.keras.layers.Input([INPUT_SIZE, INPUT_SIZE, 3])
feature_maps = YOLOv3(input_layer)

bbox_tensors = []
for i, fm in enumerate(feature_maps):
    bbox_tensor = decode(fm, i)
    bbox_tensors.append(bbox_tensor)

model = tf.keras.Model(input_layer, bbox_tensors)
model.load_weights("./yolov3")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# This enables quantization
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_model = converter.convert()

with open('./model.tflite', 'wb') as f:
    f.write(tflite_model)

import tflite_runtime.interpreter as tflite
tflite.Interpreter('./model.tflite')

but it gave me the following error:

ValueError: Didn't find op for builtin opcode 'STRIDED_SLICE' version '4'      
Registration failed

System information:

The model I used was from here: https://github.com/YunYang1994/TensorFlow2.0-Examples/tree/master/4-Object_Detection/YOLOV3

I also tried downgrading TensorFlow to 2.2.0 or 2.1.0, but then the tflite conversion didn't even work. Note that I tried loading the tflite model on my Windows machine as well as on a Coral Dev Board, both with exactly the same result.

I couldn't find anyone with the same problem yet, but I noticed people getting a similar error in #2.

I'm pretty much a beginner with TensorFlow, so any suggestion on which direction to look in would be appreciated.
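For anyone hitting this: the error means the model file requests an op at a version newer than the installed runtime has registered. A minimal sketch of that mismatch, using a hypothetical version table (the numbers are illustrative, not the real TFLite op registry):

```python
# Hypothetical max op versions registered by an (older) runtime.
# Illustrative only -- not the actual TFLite registration table.
RUNTIME_SUPPORTED = {"STRIDED_SLICE": 3, "CONV_2D": 5}

def unsupported_ops(model_ops):
    """Return (op, version) pairs the runtime cannot register."""
    return [
        (name, version)
        for name, version in model_ops
        if version > RUNTIME_SUPPORTED.get(name, 0)
    ]

# A model converted with a newer TF may emit STRIDED_SLICE version 4,
# which the older runtime rejects with "Registration failed":
print(unsupported_ops([("STRIDED_SLICE", 4), ("CONV_2D", 3)]))
# -> [('STRIDED_SLICE', 4)]
```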

Namburger commented 4 years ago

@moritz157 Hi! The gist of the issue boils down to the fact that our released runtime package is not up to date with the newer versions of TensorFlow, so even if you are able to convert the model, there is no guarantee that it will work until our next release. I suggest sticking with 2.2 or below for now. For yolov3, check out this repo: https://github.com/guichristmann/edge-tpu-tiny-yolo — I was able to reproduce the model with an older version of TensorFlow!
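Following that advice, one could guard a conversion script with a simple version check before calling the converter. A sketch, taking "2.2 or below" from the comment above as the ceiling (the helper name and cutoff are assumptions, not part of any TF API):

```python
def tf_version_ok(version_str, ceiling=(2, 2)):
    """True if the major.minor of version_str is at or below ceiling.

    Hypothetical helper for guarding a conversion script; the (2, 2)
    ceiling follows the suggestion to stick with TF 2.2 or below.
    """
    major, minor = (int(part) for part in version_str.split(".")[:2])
    return (major, minor) <= ceiling

# In a conversion script one might write:
#   import tensorflow as tf
#   assert tf_version_ok(tf.__version__), "use TF 2.2 or below for this runtime"
print(tf_version_ok("2.3.0"))  # False
print(tf_version_ok("2.2.0"))  # True
```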

hjonnala commented 3 years ago

@moritz157 are you still facing the issue?

hjonnala commented 2 years ago

closing due to inactivity. Feel free to reopen if the issue still persists.