lewdon opened this issue 8 months ago
Any update?
I actually have the same issue. I would prefer to use the "medium" model, but when I use either of the following commands, the web server won't even open for me once the container starts.
However, running this command does work, and the web server starts immediately:
What I'd love to see is a model switcher, so I can save whichever model(s) I want to use and then either select one in the UI or specify the model to use in the endpoint.
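In the meantime, a crude way to approximate this (just a sketch, untested, and each model costs a full container's worth of memory) would be to run one container per model, each on its own host port, using whichever per-model command works for you, e.g.:
docker run -d --gpus all -p 9000:9000 -e ASR_MODEL=base -e ASR_ENGINE=openai_whisper onerahmet/openai-whisper-asr-webservice:latest-gpu
docker run -d --gpus all -p 9001:9000 -e ASR_MODEL=small -e ASR_ENGINE=openai_whisper onerahmet/openai-whisper-asr-webservice:latest-gpu
Switching models is then just a matter of sending the request to port 9000 or 9001.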
Same here, it doesn't work with ASR_MODEL=large-v3.
docker command: docker run -d --gpus all -p 9000:9000 -e ASR_MODEL=large-v3 -e ASR_ENGINE=openai_whisper onerahmet/openai-whisper-asr-webservice:latest-gpu
docker log output: RuntimeError: Model large-v3 not found; available models = ['tiny.en', 'tiny', 'base.en', 'base', 'small.en', 'small', 'medium.en', 'medium', 'large-v1', 'large-v2', 'large']
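A couple of things that might be worth trying (not verified; it depends on what the image tag actually bundles). That error suggests the openai-whisper version inside the image predates large-v3, so one option is to switch to the faster_whisper engine, which may already know large-v3:
docker run -d --gpus all -p 9000:9000 -e ASR_MODEL=large-v3 -e ASR_ENGINE=faster_whisper onerahmet/openai-whisper-asr-webservice:latest-gpu
Otherwise, falling back to large-v2, which is in the list from the error, should at least let the service start:
docker run -d --gpus all -p 9000:9000 -e ASR_MODEL=large-v2 -e ASR_ENGINE=openai_whisper onerahmet/openai-whisper-asr-webservice:latest-gpu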
Any update?
@lewdon I will try to take a look at this; I just chanced upon this repo.
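If I'm reading that RuntimeError right, the list of available models comes from the openai-whisper package baked into the image, and large-v3 was only added to that package in its more recent releases, so the first thing to check is which version the image ships. Assuming pip is on the PATH inside the container, something like this should show it:
docker run --rm --entrypoint pip onerahmet/openai-whisper-asr-webservice:latest-gpu show openai-whisper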
I want to add a new model, the same way you would set ASR_MODEL=base, but when I try to download Belle-distilwhisper-large-v2 to the model path, it doesn't run correctly. Any idea why?