santiOcampo01 opened 1 month ago
Hey @santiOcampo01, did you manage to get it to stream? Did you try the QuestionAnswering class?
Since the package waits for the complete response from the model, for now there is no way to stream the response (like Ollama does on the CLI).
I want to create a chatbot that answers in real time, streaming responses the way ChatGPT does. However, I'm having trouble getting the response stream to work correctly. I'm using embeddings, and my only issue is with streaming the responses. So far I've tried it in the console, but it only prints the complete answer instead of streaming it.
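For reference, Ollama's HTTP API does support streaming: with `"stream": true`, the `/api/generate` endpoint sends one newline-delimited JSON object per token, each with a `response` fragment and a `done` flag. Below is a minimal sketch of consuming that stream format so output appears incrementally; the input is a simulated list of NDJSON lines (an assumption standing in for a live server, which would normally be read via an HTTP client with streaming enabled):

```python
import json

def stream_tokens(ndjson_lines):
    """Yield response fragments from Ollama-style NDJSON stream chunks.

    Each line is a JSON object like {"response": "...", "done": false};
    with stream=true the server emits one such line per token.
    """
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        yield chunk.get("response", "")
        if chunk.get("done"):
            break

# Simulated stream: what a live request would produce line by line.
sample = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": false}',
    '{"response": "", "done": true}',
]

answer = ""
for token in stream_tokens(sample):
    print(token, end="", flush=True)  # prints incrementally, not all at once
    answer += token
print()
```

Wrapping this around a real streamed HTTP response (instead of the `sample` list) is what gives the ChatGPT-style typing effect, as opposed to waiting for the full answer.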
Console:
```
Enter your prompt: hi
Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?
```
Code: