Closed o-alexandre-felipe closed 7 months ago
Hey @o-alexandre-felipe - I've added instructions to the model card: https://huggingface.co/distil-whisper/distil-large-v2#running-whisper-in-openai-whisper
Let me know if that fixes your issue!
@sanchit-gandhi It works for me.
Great! Thanks for confirming @madroidmaq. Going to close this one for now - feel free to open a new issue or re-open this if the problem persists @o-alexandre-felipe.
The section was renamed: https://huggingface.co/distil-whisper/distil-large-v2#running-distil-whisper-in-openai-whisper
If one of the goals of distil-whisper is to be a drop-in replacement for the Whisper models, it would be interesting to be able to cast it to an object of type `whisper.Whisper`, so that it could be used with any custom decoder implemented for Whisper.
Practical issue
I faced a few problems when trying to use the model with stable_whisper.
The first issue is that it doesn't have the `dims` and `is_multilingual` properties. That gives:

`AttributeError: 'BaseModelOutput' object has no attribute 'dtype'`
Next I tried to load the state_dict into a `whisper` model, but that doesn't work either.
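One plausible reason the plain `load_state_dict` fails is that the Hugging Face checkpoint uses Transformers parameter names (e.g. `model.encoder.layers.0.self_attn.q_proj.weight`), while `whisper.Whisper` expects OpenAI's naming (e.g. `encoder.blocks.0.attn.query.weight`), so every key comes back as missing/unexpected. A minimal sketch of the kind of key renaming that would be needed (the mapping below is illustrative and incomplete, not an official conversion script):

```python
# Hypothetical, partial mapping from Hugging Face Whisper parameter
# names to OpenAI whisper.Whisper parameter names. A real conversion
# would also need to handle layer norms, embeddings, and cross-attention.
HF_TO_OPENAI = {
    "model.encoder.": "encoder.",
    "model.decoder.": "decoder.",
    "layers": "blocks",
    "self_attn.q_proj": "attn.query",
    "self_attn.k_proj": "attn.key",
    "self_attn.v_proj": "attn.value",
    "self_attn.out_proj": "attn.out",
}


def rename_key(hf_key: str) -> str:
    """Rewrite one Hugging Face state_dict key into OpenAI naming."""
    for old, new in HF_TO_OPENAI.items():
        hf_key = hf_key.replace(old, new)
    return hf_key


print(rename_key("model.encoder.layers.0.self_attn.q_proj.weight"))
# → encoder.blocks.0.attn.query.weight
```

Even with the names fixed, the architectures must match exactly (distil-large-v2 has fewer decoder layers than large-v2), so the target `whisper.Whisper` would need `ModelDimensions` built to match the distilled config.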
In summary
What would it take to cast the distil model to a `whisper.Whisper` so that it can be a drop-in alternative for a broader set of applications?