niedev / RTranslator

Open source real-time translation app for Android that runs locally
Apache License 2.0

Consider switching to Faster Whisper from Whisper #11

Closed · prathamdby closed this 1 week ago

prathamdby commented 1 week ago

The title is rather self-explanatory: the recommendation is to switch to Faster Whisper because it delivers significant speed advantages over OpenAI's standard Whisper implementation.

Faster Whisper repository: https://github.com/SYSTRAN/faster-whisper
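
For reference, a minimal sketch of how faster-whisper is typically invoked (the model size, device, compute type, and audio path below are placeholders, not choices made for RTranslator):

```python
# Illustrative sketch of the faster-whisper API (https://github.com/SYSTRAN/faster-whisper).
# "small", "cpu", "int8", and "audio.wav" are placeholder values for this example.
from faster_whisper import WhisperModel

model = WhisperModel("small", device="cpu", compute_type="int8")

# transcribe() returns a generator of segments plus detection info.
segments, info = model.transcribe("audio.wav", beam_size=5)
print(f"Detected language: {info.language} (p={info.language_probability:.2f})")
for segment in segments:
    print(f"[{segment.start:.2f}s -> {segment.end:.2f}s] {segment.text}")
```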

niedev commented 1 week ago

Hi! Thanks for the suggestion. OnnxRuntime (the technology I use) also speeds up the execution of Whisper. Some time ago I evaluated Faster Whisper, but it isn't clear which of the two is actually faster: the benchmarks I found for Faster Whisper only cover a batch size of 5, whereas I use a batch size of 1 for Conversation mode and 2 for WalkieTalkie mode. In any case, I doubt there are big differences between the two technologies, and OnnxRuntime has much more support and documentation than CTranslate2, which makes it easier to implement and maintain. For these reasons I excluded Faster Whisper at the time.
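
For context, here is a rough sketch of what batch-size-1 inference through OnnxRuntime looks like; the model path and the encoder-only setup are illustrative assumptions for this example, not RTranslator's actual pipeline:

```python
# Hypothetical sketch: running an ONNX-exported Whisper encoder with batch size 1,
# roughly mirroring the Conversation-mode setup described above.
# "whisper_encoder.onnx" is a placeholder path, not a file shipped with RTranslator.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "whisper_encoder.onnx",
    providers=["CPUExecutionProvider"],
)

# Whisper encoders expect an 80-channel log-mel spectrogram with 3000 frames (30 s of audio).
mel = np.zeros((1, 80, 3000), dtype=np.float32)  # leading dimension 1 = batch size 1

input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: mel})
print(outputs[0].shape)
```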

prathamdby commented 1 week ago


I see, well alright. Thanks for the reply!