danielgross / teleprompter

MIT License

Would Whisper require M1 GPU support? #2

Open scottleibrand opened 1 year ago

scottleibrand commented 1 year ago

To use Whisper for an app like this, I think we'd first want M1 GPU support added to Whisper itself, since running even the tiny model on CPU is barely above 1x speed for transcription.
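
For context, a rough way to sanity-check that realtime factor with the reference openai/whisper Python package on CPU would be something like the sketch below (the audio file name is just a placeholder):

```python
import time
import whisper  # reference openai/whisper package

AUDIO_FILE = "sample.wav"  # placeholder: any local speech clip

# "tiny" is the smallest model; without GPU support it runs on CPU on an M1.
model = whisper.load_model("tiny")

start = time.time()
result = model.transcribe(AUDIO_FILE)
elapsed = time.time() - start

# Approximate the audio length from the last segment's end timestamp.
audio_seconds = result["segments"][-1]["end"] if result["segments"] else 0.0

# Realtime factor: >1x means transcription keeps up with the audio.
print(f"{audio_seconds:.1f}s of audio in {elapsed:.1f}s "
      f"(~{audio_seconds / max(elapsed, 1e-9):.2f}x realtime)")
```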

https://github.com/openai/whisper/pull/382 isn’t yet merged, and it’s unclear to me what is required to get it working.

Thoughts?

scottleibrand commented 1 year ago

Maybe not, in light of https://github.com/danielgross/teleprompter/pull/1 ?

pavanagrawal123 commented 1 year ago

@scottleibrand Yeah, I believe the whisper.cpp implementation runs at real-time speed with the smaller models. I ran it CPU-only and got pretty good performance, so I think this is very doable without GPU support.
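
For anyone who wants to reproduce that CPU-only check, here's a minimal sketch of driving a locally built whisper.cpp binary from Python; the binary and model paths are assumptions and depend on where whisper.cpp was cloned and which ggml model was downloaded:

```python
import subprocess

# Assumed paths: adjust to your local whisper.cpp checkout and downloaded model.
WHISPER_CPP_BIN = "./whisper.cpp/main"
MODEL_PATH = "./whisper.cpp/models/ggml-tiny.en.bin"
AUDIO_FILE = "sample.wav"  # whisper.cpp expects 16 kHz mono WAV input

# -m: model file, -f: input audio, -t: number of CPU threads.
subprocess.run(
    [WHISPER_CPP_BIN, "-m", MODEL_PATH, "-f", AUDIO_FILE, "-t", "4"],
    check=True,
)
```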