Open gaikwadrahul8 opened 2 days ago
This issue, originally reported by @SantiagoMoreno-UdeA, has been moved to this dedicated ai-edge-torch repository to improve issue tracking and prioritization. To ensure continuity, we have created this new issue on your behalf.
We appreciate your understanding and look forward to your continued involvement.
System information:
- OS: Linux 20.04
- Installed via: pip
- TensorFlow: 2.12.0
- Model: WhisperForConditionalGeneration from transformers
I'm trying to convert the Whisper model from transformers (WhisperForConditionalGeneration) from TF to TFLite and quantize it to int8. At some point the conversion crashes. Here is the Colab with more details: https://colab.research.google.com/drive/1oAVoUxRFZLkS1uqqFN8HdgRVk0IWAlsN?usp=sharing
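For reference, below is a minimal sketch of the conversion path I'm attempting. The checkpoint name (openai/whisper-tiny), the GenerateModel wrapper, and the dynamic-range int8 settings are illustrative assumptions; the exact code is in the Colab linked above.

```python
import tensorflow as tf
from transformers import TFWhisperForConditionalGeneration

model_name = "openai/whisper-tiny"  # assumption: illustrative checkpoint
model = TFWhisperForConditionalGeneration.from_pretrained(model_name)


class GenerateModel(tf.Module):
    # Wraps generate() so the autoregressive decoding loop is captured
    # in the exported graph instead of only a single forward pass.
    def __init__(self, model):
        super().__init__()
        self.model = model

    @tf.function(
        input_signature=[
            tf.TensorSpec(shape=(1, 80, 3000), dtype=tf.float32, name="input_features")
        ]
    )
    def serving(self, input_features):
        sequences = self.model.generate(input_features, max_new_tokens=128)
        return {"sequences": sequences}


saved_model_dir = "whisper_saved_model"
generate_model = GenerateModel(model)
tf.saved_model.save(
    generate_model,
    saved_model_dir,
    signatures={"serving_default": generate_model.serving},
)

# Convert with dynamic-range int8 quantization; SELECT_TF_OPS is needed
# because generate() uses ops outside the TFLite builtin set.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("whisper_int8.tflite", "wb") as f:
    f.write(tflite_model)
```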
I also attach the error traces from my server, one run on CPU and one on GPU (TITAN RTX 24GB).
CPU: TraceTflite.txt
GPU: TraceTflite_GPU.txt