huggingface / distil-whisper

Distilled variant of Whisper for speech recognition. 6x faster, 50% smaller, within 1% word error rate.
MIT License

Evaluation gets stuck after unwrap_model during distillation training #69

Closed · HackGiter closed this 5 months ago

HackGiter commented 6 months ago

[two screenshots of the training code and logs, showing the run hanging at evaluation] Should I add accelerator.prepare after unwrap_model?
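
For context, a minimal sketch of the evaluation pattern in question, assuming the standard 🤗 Accelerate flow; the tiny linear model and random data here are stand-ins for the Whisper student and the speech eval set. In the standard API, `unwrap_model()` just returns the underlying module and leaves the prepared model intact, so a second `prepare()` call should not be needed:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()

# Stand-in model and data; in the real script these would be the
# Whisper student model and the speech evaluation set.
model = torch.nn.Linear(16, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
eval_dataloader = DataLoader(TensorDataset(torch.randn(32, 16)), batch_size=8)

# prepare() wraps the model (e.g. in DDP) and shards the dataloader;
# it is called once, up front.
model, optimizer, eval_dataloader = accelerator.prepare(
    model, optimizer, eval_dataloader
)

# Evaluation: unwrap_model() exposes the underlying module, e.g. so
# that methods like .generate() (absent on the DDP wrapper) can be
# called. The prepared model is untouched, so training can resume
# afterwards without re-preparing.
model.eval()
unwrapped = accelerator.unwrap_model(model)

for (batch,) in eval_dataloader:
    with torch.no_grad():
        preds = unwrapped(batch)
    # gather() is a collective op: every process must reach it,
    # or the run deadlocks (the kind of hang reported here).
    preds = accelerator.gather(preds)
```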

sanchit-gandhi commented 5 months ago

Hey @HackGiter! Sorry for the delay here - resolved in #74! Let me know if you encounter any other issues - best of luck with your distillation efforts!
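
For anyone who hits a similar hang before picking up the fix: in multi-process runs, a freeze at this point usually means a collective operation was reached by only some ranks. A hedged sketch of that failure mode and the general remedy, not necessarily the exact change in #74:

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()

# Stand-ins; in practice these come from the training script above.
model = accelerator.prepare(torch.nn.Linear(16, 4))
unwrapped = accelerator.unwrap_model(model)
batch = torch.randn(8, 16, device=accelerator.device)

# Deadlock pattern: a collective op (gather) guarded so that only
# some ranks reach it. With more than one process, the others block
# forever:
#
#   if accelerator.is_main_process:
#       preds = accelerator.gather(unwrapped(batch))   # hangs
#
# Remedy: every rank runs the forward pass and the gather; only the
# metric computation / logging is restricted to the main process.
with torch.no_grad():
    preds = unwrapped(batch)
preds = accelerator.gather(preds)

if accelerator.is_main_process:
    print("gathered predictions:", preds.shape)

# Synchronize all ranks before returning to the training loop.
accelerator.wait_for_everyone()
```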