Closed: farazBhatti closed this issue 3 years ago.
Hello. As mentioned in the FAQ, running inference with the Python TF Lite interpreter requires the signature runner functionality, which is available starting with TensorFlow 2.5. You'll have to install the release candidate by specifying the version manually (`pip install tensorflow==2.5.0rc2`).
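For reference, signature-runner inference looks roughly like the sketch below. The toy model, its `img` input name, and the paths are stand-ins of my own, not the repo's; with the real model you would pass `model_path="tflite/vox/kp_detector.tflite"` to the interpreter and use its actual input names.

```python
# Minimal sketch of signature-runner inference (requires TF >= 2.5).
# A tiny SavedModel is built inline so the snippet is self-contained.
import tempfile

import numpy as np
import tensorflow as tf


class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32, name="img")])
    def __call__(self, img):
        return {"out": img * 2.0}


module = Doubler()
saved_dir = tempfile.mkdtemp()
tf.saved_model.save(
    module, saved_dir,
    signatures={"serving_default": module.__call__.get_concrete_function()})

# Convert to TF Lite; the SavedModel's signature carries over.
tflite_bytes = tf.lite.TFLiteConverter.from_saved_model(saved_dir).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
runner = interpreter.get_signature_runner("serving_default")

out = runner(img=np.ones((1, 4), np.float32))["out"]
print(out)  # -> [[2. 2. 2. 2.]]
```

On TF < 2.5 the `get_signature_runner` call is exactly what raises the `AttributeError` discussed below.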
Alternatively, if you really need to stick to 2.4.1, you can use this tag, which points to the state of the repository at which everything, including inference on TF Lite, was compatible with 2.4.1.
@lshug I've installed the required packages, specifically tensorflow==2.5.0rc2, but I'm still getting a TensorFlow error:
ValueError: SignatureDef method_name is None and model has 0 Signatures. None is only allowed when the model has 1 SignatureDef
I also tried your 2.4.1 version suggestion, but I'm still getting an error:
AttributeError: 'Interpreter' object has no attribute 'get_signature_runner'
I've made a Python env and installed all the modules specified in the requirements.txt file. I am using Python 3.6.
That's weird. I'm reopening the issue and will try to replicate it in a bit. Meanwhile, can you try with 2.5rc0?
Concerning the "model has 0 Signatures" error: TF Lite files generated by build.py should by default have 1 signature, named "serving_default" (inherited from the SavedModel's default signature). Can you open the generated kp_detector tflite (e.g. tflite/vox/kp_detector.tflite) in Netron and tell me the name of the input tensor? If the signature is "serving_default" as it's supposed to be, then the input tensor name will be `serving_default_img:0`.
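As an alternative to Netron, the interpreter itself can list a model's signatures via `get_signature_list()` (TF >= 2.5). A sketch, with a toy model of my own standing in for `tflite/vox/kp_detector.tflite`:

```python
# Inspect TF Lite signatures from Python instead of Netron (TF >= 2.5).
# The Identity module here is a hypothetical stand-in for kp_detector.
import tempfile

import tensorflow as tf


class Identity(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 3], tf.float32, name="img")])
    def __call__(self, img):
        return {"kp": img}


module = Identity()
saved_dir = tempfile.mkdtemp()
tf.saved_model.save(
    module, saved_dir,
    signatures={"serving_default": module.__call__.get_concrete_function()})

tflite_bytes = tf.lite.TFLiteConverter.from_saved_model(saved_dir).convert()
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)

# A healthy conversion shows one 'serving_default' entry with an 'img' input;
# an empty dict here would explain the "model has 0 Signatures" ValueError.
sigs = interpreter.get_signature_list()
print(sigs)
```

For an existing file, construct the interpreter with `model_path="tflite/vox/kp_detector.tflite"` instead of `model_content`.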
Concerning the "no attribute 'get_signature_runner'" error: that's definitely not supposed to be happening in the linked tag, as the TF Lite inference method in utils.py for the 2.4.1 tag does not call the get_signature_runner method. You're probably on the main branch. Please make sure to fetch the tags (`git fetch --all --tags`) and check out the tag (`git checkout tags/tf2.4.1`).
That's really weird. Not only is the converted model missing a signature, it also seems to be missing one of the outputs, which is not supposed to happen unless you're building with estimate_jacobian=False. Are you by any chance modifying the code and/or doing the SavedModel conversion manually instead of running build.py? If not, can you run the following from the main directory after running build.py and tell me the output? (Assuming the model you're building is vox; otherwise, change the model variable.)
```python
import tensorflow as tf

model = 'vox'
kp_detector_loader = tf.saved_model.load("saved_models/" + model + "/kp_detector")
print(kp_detector_loader.signatures)
```
If the SavedModel itself is alright and the problem is with the TF Lite conversion, then kp_detector_loader.signatures should be a _SignatureMap mapping 'serving_default' to the concrete function of the kp_detector tf.Module's `__call__`.
Closing issue due to inactivity. If you do manage to find out what was going on here though, please inform me.
I am trying to run inference with the tflite model but am getting the following error. Any idea?