Closed matthieucoquet closed 3 years ago
The NNEF is correct if I use ONNX (using keras-onnx) or tf (using this) instead of tflite. So I found a way to export my model.
Should I close the issue? From tf.keras, tflite is the easiest way to export, since tf's saved_model.pb doesn't work with the nnef_tools.convert script.
I believe this is an issue with the TFLite conversion; the correct shape should be [1, 32], as you say. I'll investigate, so no need to close. It's understandable that it works through ONNX, since that's a separate converter.
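For context, a quick NumPy sketch (illustrative only, not NNEF-Tools code) of why the bias wants that rank-2 shape: the bias of a linear op broadcasts over the batch dimension of the [batch, channels] output, so a layer with 32 output channels needs a [1, 32] bias.

```python
import numpy as np

# Illustrative only: mimics how a [1, channels] bias broadcasts
# against a [batch, channels] linear output.
batch, in_feats, out_feats = 4, 64, 32

x = np.ones((batch, in_feats))
w = np.ones((out_feats, in_feats))
bias = np.zeros((1, out_feats))   # NNEF-style bias shape: [1, 32]

y = x @ w.T + bias                # broadcasts over the batch dimension
assert y.shape == (batch, out_feats)
```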
Indeed, the conversion was wrong; I have pushed a fix. Can you check again with TFLite?
It works well now. Thank you!
I converted a simple graph from Keras (tf 2.4) -> tflite -> NNEF:
If I use the C++ parser on it and call infer_shapes, it produces an exception (out of range access to vector) during:
So I think the biases generated by the conversion are wrong (they should be [1, 32] and [1, 128]). Is that correct?
Or is it a bug in infer_shapes?
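A hypothetical Python sketch of the failure mode (this is not the actual NNEF C++ shape-inference code, and `linear_output_shape` is an invented name): if shape inference assumes a rank-2 bias of shape [1, channels], then a rank-1 bias like [32] makes the dimension lookup go out of range, which would surface in C++ exactly as an out-of-range vector access.

```python
def linear_output_shape(input_shape, bias_shape):
    # Assumes bias_shape == [1, channels]; indexing dimension 1 is the
    # Python analogue of the C++ vector access that threw the exception.
    channels = bias_shape[1]
    return [input_shape[0], channels]

print(linear_output_shape([4, 64], [1, 32]))   # [4, 32]

try:
    linear_output_shape([4, 64], [32])         # rank-1 bias from the bad conversion
except IndexError as e:
    print("shape inference failed:", e)
```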
Here is how I generated the tflite: