Closed: lunaticsm closed this issue 3 months ago
It works when I use it as a CLI or like this:
import asyncio

from duck_chat import DuckChat, ModelType


async def main():
    async with DuckChat(model=ModelType.GPT4o) as chat:
        print(await chat.ask_question("What is the capital of France?"))
        await chat.ask_question("Tell me a long story...", stream=True)


asyncio.run(main())
But when I tried using it as a library in my bot, the bot always sends me the full message instead of streaming it.
Hello, @lunaticsm
I refactored your code:
In the CLI there are now two commands:
> But when I tried using it as a library in my bot, the bot always sends me the full message instead of streaming it.
You used a plain return and not a generator in ask_question (see the example below).
You can test with this:
import asyncio

from duck_chat import DuckChat


async def main():
    async with DuckChat() as chat:
        async for message in chat.ask_question_stream(
            "Tell me a long story (60 sentences)"
        ):
            print(message, end="", flush=True)
        await asyncio.sleep(1)
        async for message in chat.ask_question_stream(
            "Summarize the story in one sentence"
        ):
            print(message, end="", flush=True)
        await asyncio.sleep(1)
        async for message in chat.reask_question_stream(2):
            print(message, end="", flush=True)


asyncio.run(main())
If you have any questions or suggestions, feel free to tell me ^^
Have a good day =)
I also created separate streaming functions because they are async generators (they use yield), while the other functions use return.
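To make that return-versus-yield difference concrete, here is a minimal, self-contained sketch; _fetch_chunks is a made-up stand-in for the network layer, and none of this is duck_chat's actual internals.

import asyncio
from typing import AsyncGenerator


async def _fetch_chunks(query: str) -> AsyncGenerator[str, None]:
    # Hypothetical stand-in for the network layer that delivers chunks.
    for chunk in ("This ", "answer ", "arrives ", "in ", "pieces."):
        await asyncio.sleep(0.1)
        yield chunk


async def ask_question(query: str) -> str:
    # return-style: the whole answer is assembled before the caller sees anything.
    return "".join([chunk async for chunk in _fetch_chunks(query)])


async def ask_question_stream(query: str) -> AsyncGenerator[str, None]:
    # yield-style: each chunk is handed to the caller as soon as it arrives.
    async for chunk in _fetch_chunks(query):
        yield chunk


async def main():
    print(await ask_question("hi"))  # one full string at the end
    async for part in ask_question_stream("hi"):
        print(part, end="", flush=True)  # incremental output
    print()


asyncio.run(main())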
Also, write if it's okay, and I will merge then.
Yup, that's fine. Since I'm not on my computer right now I'll try it later; you can merge it. ^^
Description:
This pull request introduces a new stream option to the DuckChat class, allowing responses to be streamed as they are received. The key changes include:

Streaming Option in ask_question:
- A stream parameter has been added to the ask_question method, enabling real-time streaming of responses.
- If stream=True, the response is printed in chunks as they arrive. If stream=False, the response is handled as before, being fully assembled before returning.

CLI Support for Streaming:
- The CLI now supports a --stream flag with user input. When this flag is used, the response is streamed in real time (see the sketch after this list).
Why This Change is Necessary:
The new streaming functionality improves the responsiveness and interactivity of the bot, especially for long or complex queries that take time to process. Users can now see parts of the response immediately, making the bot feel more dynamic and engaging.
Testing:
The new streaming functionality has been tested with various queries to ensure that it behaves as expected, both in normal and streaming modes.
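As an illustration only (not the PR's test suite), a check of the streaming behavior could look like the following, assuming pytest-asyncio and a FakeChat stub in place of DuckChat so that no network call is made.

import pytest


class FakeChat:
    # Stub that mimics the async-generator shape of ask_question_stream.
    async def ask_question_stream(self, query: str):
        for chunk in ("Hello, ", "world!"):
            yield chunk


@pytest.mark.asyncio
async def test_stream_yields_chunks_in_order():
    chunks = [c async for c in FakeChat().ask_question_stream("hi")]
    assert chunks == ["Hello, ", "world!"]
    assert "".join(chunks) == "Hello, world!"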
Documentation: