Closed charleslparker closed 3 years ago
@unmonoqueteclea / @sdesimone - In the dict of settings you pass when creating the model, you can now pass `output_unfiltered_boxes: True`, which will give you a model that outputs all candidate boxes with their associated max scores and classes. I'm doing an end-to-end test of inference with TFLite in Python and everything seems to work. See the test here:
https://github.com/charleslparker/sensenet/blob/master/tests/test_pretrained.py#L123
Also, if non-max suppression is eliminated, I'm able to convert the model to TFLite without enabling the extended TF operator set, so maybe we can save @sdesimone the hassle of having to recompile TFLite.
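For reference, a minimal sketch of the convert-and-run flow described above, using only the builtin TFLite op set (no `SELECT_TF_OPS`). The model here is a tiny stand-in, not the sensenet detector, and the `output_unfiltered_boxes` setting is not shown; this just demonstrates the conversion and Python-interpreter round trip:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; any Keras model built from builtin-convertible
# ops works the same way.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(8, activation="relu")(inputs)
model = tf.keras.Model(inputs, outputs)

# Convert to a TFLite FlatBuffer. Note that we do NOT set
# tf.lite.OpsSet.SELECT_TF_OPS: with non-max suppression removed,
# the builtin op set should be sufficient.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
flatbuffer = converter.convert()

# Reload and run the FlatBuffer in the Python TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=flatbuffer)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
boxes = interpreter.get_tensor(out["index"])
print(boxes.shape)
```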
Let me know if you get output with this.
@unmonoqueteclea - Apparently I hadn't yet merged this, but it should fix your problem: you were passing the wrapped model into that function (I think), but it expects the actual `tf.keras.Model` object. I've created a wrapper function in the same namespace (`to_tflite`) that will accept anything: a Keras model, the wrapper class, a raw settings dict, or a path to a model file. Please use that instead (I've also removed the instance method on `ObjectDetector`). Let me know if it works so I can merge this.
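A hypothetical illustration of the dispatch pattern described above: a single `to_tflite` entry point that normalizes every accepted input type down to the bare model before exporting. All class and field names below are stand-ins for illustration, not sensenet's actual implementation:

```python
import pathlib

class FakeModel:
    """Stand-in for a tf.keras.Model built from a settings dict."""
    def __init__(self, settings):
        self.settings = settings

class FakeDetector:
    """Stand-in for an ObjectDetector-style wrapper holding a model."""
    def __init__(self, model):
        self._model = model

def to_tflite(model, output_path):
    # Chain of normalizations: each accepted input type is reduced
    # one step until we hold the bare model.
    if isinstance(model, (str, pathlib.Path)):
        model = {"model_file": str(model)}   # path -> settings dict
    if isinstance(model, dict):
        model = FakeModel(model)             # settings dict -> model
    if isinstance(model, FakeDetector):
        model = model._model                 # wrapper -> model
    # The real code would run the TFLite converter here; we just
    # report what we resolved to.
    return f"exported {type(model).__name__} to {output_path}"

print(to_tflite("weights.json", "out.tflite"))
print(to_tflite(FakeDetector(FakeModel({})), "out.tflite"))
```

Because the checks are chained `if`s rather than `elif`s, a path flows through all three normalization steps, while a wrapper only needs the last one.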
Ready to be merged, thank you
@sdesimone / @unmonoqueteclea - I've changed the definition of the tinyYOLOv4 topology to be (hopefully) compatible with TFLite. At the very least, I can save the model as a TFLite FlatBuffer and reload it in the Python TFLite interpreter. If the iOS interpreter behaves the same way, everything should work out.
See wrappers.tflite_export for the implementation. I've also put a convenience function in the model wrapper itself.
Two caveats:
- I had to do some fiddling to get non-max suppression to work. It seems I'm pulling in a "restricted" op, but the Python TFLite interpreter handles it fine, so I'm hoping it's okay.
- This only works with tensor inputs, not with string paths to image files. As far as I can tell, TFLite cannot read files itself, so that's not going to happen.
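On the second caveat: since TFLite can't open image files, the caller decodes and preprocesses the image in Python and passes the tensor in. A minimal sketch of that preprocessing step; the 416x416 input size and [0, 1] scaling are assumptions for a tinyYOLOv4-style network, not taken from sensenet, and the decode step (e.g. via PIL) is assumed to have already happened:

```python
import numpy as np

def image_to_input_tensor(pixels):
    """Turn a decoded HxWx3 uint8 image into an interpreter-ready tensor.

    pixels: numpy array already decoded from the image file and resized
    to the network's input size (assumed 416x416 here).
    """
    img = pixels.astype(np.float32) / 255.0  # scale to [0, 1] (assumed)
    return img[np.newaxis, ...]              # add the batch dimension

# Dummy stand-in for a decoded image; a real caller would decode a
# file with PIL or similar, then resize/letterbox to 416x416.
tensor = image_to_input_tensor(np.zeros((416, 416, 3), dtype=np.uint8))
print(tensor.shape, tensor.dtype)
```

The resulting array is what you would hand to `interpreter.set_tensor(...)` in place of a file path.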