huggingface / distil-whisper

Distilled variant of Whisper for speech recognition. 6x faster, 50% smaller, within 1% word error rate.
MIT License

NotImplementedError #60

Open wntg opened 6 months ago

wntg commented 6 months ago

When I fine-tune following https://github.com/huggingface/distil-whisper/tree/main/training, I get: `NotImplementedError: The model type whisper is not yet supported to be used with BetterTransformer.`

sanchit-gandhi commented 6 months ago

Could you make sure you have the latest version of Transformers and Optimum installed?

```
pip install --upgrade transformers optimum
```
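If the error persists after upgrading, it may help to confirm which versions are actually installed in the environment the training script runs in (stale or duplicate installs are a common cause). A minimal check using only the standard library — this is a generic sketch, not part of the distil-whisper training code:

```python
import importlib.metadata

# Print the installed version of each package, or flag it as missing.
# BetterTransformer support for Whisper requires recent transformers/optimum releases.
for pkg in ("transformers", "optimum"):
    try:
        print(pkg, importlib.metadata.version(pkg))
    except importlib.metadata.PackageNotFoundError:
        print(pkg, "not installed")
```

Run this with the same Python interpreter that launches the training script, since a different virtual environment may hold older copies of the packages.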