theodo-group / LLPhant

LLPhant - A comprehensive PHP Generative AI Framework using OpenAI GPT 4. Inspired by Langchain

I don't receive the answer as a stream when using Ollama. #176

Open · santiOcampo01 opened 1 month ago

santiOcampo01 commented 1 month ago

I want to create a chatbot that answers in real time, streaming responses the way ChatGPT does. However, I can't get the response stream to work correctly. I'm using embeddings, and my only problem is with streaming the responses. So far I've only tried it from the console, where it prints the complete answer at once instead of streaming it.

Console:

Enter your prompt: hi
Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?

Code:


<?php
require 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;

// Configure Ollama with the llama2 model and request streamed output.
$config = new OllamaConfig();
$config->model = 'llama2';
$config->stream = true;

$chat = new OllamaChat($config);
$prompt = readline("Enter your prompt: ");

// I expected chunks to arrive incrementally here, but the complete
// answer is printed in one go instead.
$responseStream = $chat->generateStreamOfText($prompt);

foreach ($responseStream as $response) {
    echo $response . PHP_EOL;
}
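
For reference: in the LLPhant versions I've seen, generateStreamOfText() returns a PSR-7 \Psr\Http\Message\StreamInterface, not an iterable, so a foreach over it has nothing to consume. A minimal sketch of reading it chunk by chunk, assuming that return type:

<?php
require 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;

$config = new OllamaConfig();
$config->model = 'llama2';

$chat = new OllamaChat($config);
$prompt = readline("Enter your prompt: ");

// generateStreamOfText() hands back a PSR-7 stream, so consume it
// with read()/eof() rather than foreach.
$stream = $chat->generateStreamOfText($prompt);

while (!$stream->eof()) {
    echo $stream->read(1024); // print each chunk as soon as it arrives
    flush();                  // force the chunk out to the terminal
}
echo PHP_EOL;

Reading in a loop with read()/eof() and flushing after each chunk is what makes the output appear piece by piece in the terminal.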
MaximeThoonsen commented 1 month ago

Hey @santiOcampo01, did you manage to make it stream? Did you try the QuestionAnswering class? A sketch of what that might look like is below.
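
For anyone trying that route, here is a sketch of streaming through QuestionAnswering. The answerQuestionStream() method and the constructor argument order are assumptions based on the README pattern and may differ in your installed version; $vectorStore and $embeddingGenerator stand in for whatever embeddings setup you already have:

<?php
require 'vendor/autoload.php';

use LLPhant\OllamaConfig;
use LLPhant\Chat\OllamaChat;
use LLPhant\Query\SemanticSearch\QuestionAnswering;

$config = new OllamaConfig();
$config->model = 'llama2';
$chat = new OllamaChat($config);

// $vectorStore and $embeddingGenerator are placeholders for your existing
// embeddings pipeline; only the streaming call is the point here.
$qa = new QuestionAnswering($vectorStore, $embeddingGenerator, $chat);

// Assumption: answerQuestionStream() returns the same PSR-7 stream
// as generateStreamOfText(); check your installed version.
$stream = $qa->answerQuestionStream(readline("Enter your prompt: "));

while (!$stream->eof()) {
    echo $stream->read(1024);
    flush();
}
echo PHP_EOL;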

messi89 commented 3 weeks ago

Since the package waits for the complete response from the model, there is currently no way to stream the response (the way ollama does on the CLI).
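
If that is the case, one workaround is to bypass the library and read Ollama's NDJSON stream directly over HTTP. A sketch using PHP's curl extension against the default local /api/generate endpoint (the endpoint and payload follow Ollama's documented API; a production version would buffer partial lines, since a network chunk can end mid-JSON):

<?php
// Stream tokens straight from Ollama's HTTP API (default local endpoint).
$payload = json_encode([
    'model'  => 'llama2',
    'prompt' => readline("Enter your prompt: "),
    'stream' => true,
]);

$ch = curl_init('http://localhost:11434/api/generate');
curl_setopt_array($ch, [
    CURLOPT_POST       => true,
    CURLOPT_POSTFIELDS => $payload,
    CURLOPT_HTTPHEADER => ['Content-Type: application/json'],
    // Ollama sends one JSON object per line; print each token as it arrives.
    CURLOPT_WRITEFUNCTION => function ($ch, $chunk) {
        foreach (explode("\n", trim($chunk)) as $line) {
            if ($line === '') {
                continue;
            }
            $data = json_decode($line, true);
            echo $data['response'] ?? '';
            flush();
        }
        return strlen($chunk); // tell curl the whole chunk was consumed
    },
]);
curl_exec($ch);
curl_close($ch);
echo PHP_EOL;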