Open p0n1 opened 1 year ago
Hi there, I'm just curious if there is any update on getting this to work? I would personally love to be able to use local TTS with this project. Thanks.
My situation is somewhat different from yours, and I'm not sure if it will help you. Since you mentioned
"However, @mudler from LocalAI said it may be compatible with the OpenAI client."
I'll assume that you only need to change base_url to point at your LocalAI address.
Here's my example: I use Windows and didn't want to set up system environment variables, so I opened audiobook_generator/tts_providers/openai_tts_provider.py and replaced the part where the OpenAI client is constructed. I modified it as follows:
self.client = OpenAI(
    api_key="mykey",
    base_url="https://xx.xxx/v1/",  # my OpenAI proxy site
)
This way, I could use a third-party OpenAI proxy. One more thing: these proxy sites redirect traffic to OpenAI for regions that OpenAI doesn't support. Of course, you can use the same approach to point at a local TTS service that is compatible with the OpenAI API.
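For anyone who doesn't mind environment variables, here is a rough sketch of the same idea without hardcoding anything in the source. The class shape below is only an assumption for illustration (the real provider class in audiobook_generator/tts_providers/openai_tts_provider.py is structured differently), and as far as I know recent versions of the openai Python package already read OPENAI_API_KEY and OPENAI_BASE_URL on their own, so passing them explicitly is mostly for clarity:

import os

from openai import OpenAI


class OpenAITTSProvider:  # hypothetical shape, for illustration only
    def __init__(self):
        # Read the key and base URL from the environment instead of hardcoding them,
        # so the file does not need to be edited for each proxy or local endpoint.
        self.client = OpenAI(
            api_key=os.environ.get("OPENAI_API_KEY", "mykey"),
            base_url=os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1"),
        )

With something like that in place, switching to a proxy or an OpenAI-compatible local server would just be a matter of setting OPENAI_BASE_URL (e.g. in the Windows environment variables dialog) instead of editing the file.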
Split from the issue https://github.com/p0n1/epub_to_audiobook/issues/9#issuecomment-1808120642.
The LocalAI TTS API (https://localai.io/features/text-to-audio/) was defined even before OpenAI released its own TTS API. I think it's not fully compatible with the OpenAI TTS API (https://platform.openai.com/docs/guides/text-to-speech) because they use different voices and models.
So simply changing the base URL of the OpenAI SDK to a LocalAI instance will not work for the TTS feature.
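To make the mismatch concrete, here is a rough comparison of the two request shapes. This is just my reading of the two docs pages, not tested code: the LocalAI route, field names, and model file name below are assumptions that depend on the LocalAI version and which TTS backend is installed.

import requests

TEXT = "Hello from the audiobook."

# OpenAI TTS: POST /v1/audio/speech with an OpenAI model name and one of a fixed set of voices.
openai_resp = requests.post(
    "https://api.openai.com/v1/audio/speech",
    headers={"Authorization": "Bearer mykey"},
    json={"model": "tts-1", "voice": "alloy", "input": TEXT},
)
with open("openai.mp3", "wb") as f:
    f.write(openai_resp.content)

# LocalAI native TTS (per https://localai.io/features/text-to-audio/): a different route,
# no OpenAI voice names, and "model" refers to a locally installed TTS model/backend.
# The model name here is a made-up example.
localai_resp = requests.post(
    "http://localhost:8080/tts",
    json={"model": "en-us-kathleen-low.onnx", "input": TEXT},
)
with open("localai.wav", "wb") as f:
    f.write(localai_resp.content)

So even with base_url swapped, the client would still be sending OpenAI voice and model names that a stock LocalAI install doesn't necessarily understand.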
If we can support LocalAI, we can support many good local TTS engines at once.