huggingface / distil-whisper

Distilled variant of Whisper for speech recognition. 6x faster, 50% smaller, within 1% word error rate.
MIT License

distil-whisper-turbo #154

Open simpthy opened 2 weeks ago

simpthy commented 2 weeks ago

Pretty please :)

https://github.com/openai/whisper/discussions/2363

I would do it, but I'm a bit short on knowledge ...and H100s.

mitchelldehaven commented 1 week ago

The speed improvement from 4 decoder layers to 2 decoder layers would probably be negligible.
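A rough Amdahl's-law sketch of why that is plausible: large-v3-turbo pairs a 32-layer encoder with only 4 decoder layers, so the decoder is already a small slice of total compute, and halving it shrinks only that slice. The encoder/decoder time splits below are illustrative assumptions, not measurements, and "distil-turbo" is hypothetical.

```python
# Back-of-envelope estimate (Amdahl's law) of end-to-end speedup when
# only the decoder gets faster. The time-split fractions are assumed,
# illustrative figures, not benchmarks.

def end_to_end_speedup(decoder_fraction: float, layer_ratio: float) -> float:
    """Overall speedup when only the decoder portion is accelerated.

    decoder_fraction: share of total runtime spent in the decoder.
    layer_ratio: old_layers / new_layers (assumes decode time scales
    roughly linearly with decoder layer count).
    """
    new_time = (1 - decoder_fraction) + decoder_fraction / layer_ratio
    return 1 / new_time

# whisper-large-v3 -> distil-large-v3: 32 decoder layers -> 2 (ratio 16).
# If decoding dominates the original model's runtime, the payoff is large.
print(round(end_to_end_speedup(decoder_fraction=0.8, layer_ratio=16), 2))  # → 4.0

# whisper-large-v3-turbo -> hypothetical "distil-turbo": 4 layers -> 2
# (ratio 2). With the 32-layer encoder now dominating, the gain is small.
print(round(end_to_end_speedup(decoder_fraction=0.2, layer_ratio=2), 2))   # → 1.11
```

Under these assumptions the 32→2 distillation buys ~4x end to end, while 4→2 buys barely 11%, which matches the intuition that turbo has already captured most of the decoder-side win.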