Recently, I've been working on a repo that provides OpenAI-compatible APIs for an LLM and a Whisper model, and I want to use this awesome use-whisper project in my app with my own deployed Whisper service.
The `openai` package supports setting an API base with `openai.api_base = "http://localhost:8000/api/v1"`, so it can target other OpenAI-compatible services such as Azure OpenAI.
It might be better to support an `apiBase` configuration to enhance this awesome project.
With this PR, we can use this repo with a private Whisper service.
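
As a rough sketch of the intended usage (the `apiBase` option name, its default, and the exact hook return values are assumptions for illustration, not the final API):

```typescript
import { useWhisper } from '@chengsokdara/use-whisper'

const App = () => {
  const { transcript, startRecording, stopRecording } = useWhisper({
    apiKey: process.env.OPENAI_API_KEY,
    // Hypothetical option proposed by this PR: point the hook at a
    // self-hosted, OpenAI-compatible Whisper endpoint instead of the
    // default https://api.openai.com/v1
    apiBase: 'http://localhost:8000/api/v1',
  })

  return (
    <div>
      <button onClick={() => startRecording()}>Start</button>
      <button onClick={() => stopRecording()}>Stop</button>
      <p>{transcript.text}</p>
    </div>
  )
}
```

The idea mirrors how the `openai` Python package handles `api_base`: everything else stays the same, only the request URL prefix changes.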