Open wuhongsheng opened 1 week ago
You're such a goat! Thank you 🙏 Which base_url were you testing this with, so I can reproduce it?
I tested it with the DeepSeek base URL:
```python
from openai import OpenAI

client = OpenAI(
    api_key="<your DeepSeek API key>",  # placeholder
    base_url="https://api.deepseek.com",
)
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)
print(response.choices[0].message.content)
```
One tip: this commit doesn't implement streaming.
Yes, for this to be useful it should stream the text from the API and create TTS for each chunked text stream. There's too much lag if you have to wait for the entire message to be generated.
There is an example of how streaming can be implemented in the OpenAI Python library: https://github.com/openai/openai-python?tab=readme-ov-file#streaming-responses
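Roughly, the idea could look like the sketch below: pass `stream=True`, buffer the streamed deltas into sentence-sized chunks, and hand each chunk to TTS as soon as it completes. `chunk_deltas` and `speak_stream` are hypothetical helper names for illustration, not part of this PR, and the wiring to an actual TTS engine is left as a callback.

```python
# Sketch only: chunk_deltas / speak_stream are made-up names, not from this PR.

def chunk_deltas(deltas, delimiters=".!?\n"):
    """Buffer streamed text deltas and yield sentence-sized chunks,
    so TTS can start on each sentence instead of the full reply."""
    buf = ""
    for delta in deltas:
        buf += delta
        while True:
            # find the first sentence delimiter in the buffer
            cut = next((i for i, ch in enumerate(buf) if ch in delimiters), None)
            if cut is None:
                break
            chunk, buf = buf[:cut + 1].strip(), buf[cut + 1:]
            if chunk:
                yield chunk
    tail = buf.strip()
    if tail:
        yield tail  # flush whatever is left at end of stream

def speak_stream(client, tts):
    """Stream the completion and synthesize each sentence as it arrives.
    `tts` is any callable that speaks a string (assumption)."""
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,  # chunks arrive as they are generated
    )
    deltas = (c.choices[0].delta.content or "" for c in response)
    for sentence in chunk_deltas(deltas):
        tts(sentence)
```

This keeps the lag down to roughly one sentence of generation instead of the whole message.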