google-ai-edge / LiteRT

LiteRT is the new name for TensorFlow Lite (TFLite). While the name is new, it's still the same trusted, high-performance runtime for on-device AI, now with an expanded vision.
https://ai.google.dev/edge/litert
Apache License 2.0

The Whisper encoder model works with hybrid (dynamic-range) quantization, but fails with full-int8 post-training quantization. #136


pkgoogle commented 3 days ago

Original issue: https://github.com/tensorflow/tensorflow/issues/59716. Opening on behalf of @nyadla-sys.

1. System information

2. Code

Refer to the colab below to reproduce the issue: https://colab.research.google.com/drive/1S_3bVlwRZkMaYvvKwtPfWlyXQLS0Bvxa?usp=sharing
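For context, a minimal sketch of the two conversion paths being compared, assuming a Whisper-tiny-style encoder exported as a SavedModel. This is not the colab's exact code: the `saved_model_dir` path, the output file names, and the `representative_dataset` generator (random log-mel spectrograms of shape `(1, 80, 3000)`, Whisper-tiny's expected encoder input) are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

saved_model_dir = "whisper_encoder_saved_model"  # assumed path to the exported encoder

# Path 1: hybrid / dynamic-range quantization -- reported to work.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
dynamic_tflite = converter.convert()

# Hypothetical calibration data: random log-mel spectrograms with the
# encoder's assumed input shape (batch, mel_bins, frames).
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 80, 3000).astype(np.float32)]

# Path 2: full-integer (int8) post-training quantization -- reported to fail.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
int8_tflite = converter.convert()

open("whisper_encoder_dynamic.tflite", "wb").write(dynamic_tflite)
open("whisper_encoder_int8.tflite", "wb").write(int8_tflite)
```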

Option A: Reference colab notebooks

(You can paste links or attach files by dragging & dropping them below)
- Provide links to your updated versions of the above two colab notebooks.
- Provide links to your TensorFlow model and (optionally) TensorFlow Lite Model.

Option B: Paste your code here or provide a link to a custom end-to-end colab

(You can paste links or attach files by dragging & dropping them below)
- Include code to invoke the TFLite Converter Python API and the errors.
- Provide links to your TensorFlow model and (optionally) TensorFlow Lite Model.

3. Failure after conversion

If the conversion is successful, but the generated model is wrong, then state what is wrong:
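One common way to show what is wrong is to run the quantized and reference models on the same input and report the divergence. Below is a sketch using `tf.lite.Interpreter`; the `.tflite` file names and the `(1, 80, 3000)` input shape carry over from the assumptions in the conversion sketch above.

```python
import numpy as np
import tensorflow as tf

def run_tflite(model_path, x):
    # Load a .tflite model and run a single inference on input x.
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    if inp["dtype"] == np.int8:
        # Quantize the float input to the model's int8 input scale.
        scale, zero_point = inp["quantization"]
        x = np.round(x / scale + zero_point).astype(np.int8)
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()
    y = interpreter.get_tensor(out["index"])
    if out["dtype"] == np.int8:
        # Dequantize the int8 output back to float for comparison.
        scale, zero_point = out["quantization"]
        y = (y.astype(np.float32) - zero_point) * scale
    return y

x = np.random.rand(1, 80, 3000).astype(np.float32)  # assumed encoder input shape
y_dynamic = run_tflite("whisper_encoder_dynamic.tflite", x)
y_int8 = run_tflite("whisper_encoder_int8.tflite", x)
print("max abs diff:", np.max(np.abs(y_dynamic - y_int8)))
```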

4. (optional) RNN conversion support

If converting TF RNN to TFLite fused RNN ops, please prefix the title with [RNN].

5. (optional) Any other info / logs

Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
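If the full-int8 model converts but its outputs are wrong, the TFLite quantization debugger can produce per-layer error statistics that make useful logs to attach. A sketch, reusing the assumed SavedModel path and calibration generator from the conversion sketch under section 2:

```python
import numpy as np
import tensorflow as tf

# Assumed: same SavedModel path and calibration generator as in the
# conversion sketch above.
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 80, 3000).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("whisper_encoder_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]

# The debugger converts the model and records per-layer quantization
# error metrics on the calibration data.
debugger = tf.lite.experimental.QuantizationDebugger(
    converter=converter, debug_dataset=representative_dataset)
debugger.run()

# Dump per-layer statistics to a CSV that can be attached to the issue.
with open("layer_stats.csv", "w") as f:
    debugger.layer_statistics_dump(f)
```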

gaikwadrahul8 commented 2 days ago

This issue, originally reported by @nyadla-sys, has been moved to this dedicated LiteRT repository to improve issue tracking and prioritization. To ensure continuity, we have created this new issue on your behalf.

We appreciate your understanding and look forward to your continued involvement.