theodo-group / LLPhant

LLPhant - A comprehensive PHP Generative AI Framework using OpenAI GPT-4. Inspired by LangChain
MIT License
857 stars 92 forks

Feed called function results back into OpenAI #219

Open prykris opened 1 month ago

prykris commented 1 month ago

Since the conversation history is built using Message instances, it doesn't allow me to return the called function's result back into the conversation.

The error I receive: Invalid parameter: messages with role 'tool' must be a response to a preceding message with 'tool_calls'.
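For context, this error reflects an ordering rule in OpenAI's Chat Completions API: a message with role "tool" is only valid if the immediately preceding assistant message contains a matching tool_calls entry. A minimal sketch of the raw message sequence the API expects (the call id and result value here are placeholders, and this is the wire format, not LLPhant's Message class):

```php
<?php
// Sketch of a valid Chat Completions message sequence containing a tool result.
// The "tool" message must follow an assistant message whose tool_calls entry
// has the same id that the tool message references via tool_call_id.
$messages = [
    ['role' => 'user', 'content' => 'What is my username?'],
    [
        'role' => 'assistant',
        'content' => null,
        'tool_calls' => [[
            'id' => 'call_abc123',           // placeholder; assigned by the API
            'type' => 'function',
            'function' => [
                'name' => 'getUserName',
                'arguments' => '{}',
            ],
        ]],
    ],
    [
        'role' => 'tool',
        'tool_call_id' => 'call_abc123',     // must match the id above
        'content' => 'prykris',              // placeholder tool result
    ],
];
```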

Route::get('/tools-test', function () {
    $config = new OpenAIConfig;
    $config->apiKey = config('services.openai.api_key');

    $chatSession = new ChatSession(
        new OpenAIChat($config),
        []
    );

    $chatSession->getChat()->addTool(FunctionBuilder::buildFunctionInfo(new class
    {
        /**
         * Returns current user's username
         */
        public function getUserName(): string
        {
            return auth()->user()->name;
        }
    }, 'getUserName'));

    dd($chatSession('What is my username?'));
});

Here is how I handle the tool call and attempt to feed the result back:

public function generateResult(string $prompt): Result
{
    $isFirstMessage = empty($this->results);
    $history = $this->buildChatHistory($prompt);

    $start = microtime(true);
    $responseOrFunction = $this->chat->generateChatOrReturnFunctionCalled($history);
    $end = microtime(true);

    $responseObject = $this->chat->getLastResponse();

    if ($responseOrFunction instanceof FunctionInfo) {
        $history[] = Message::toolResult(
            FunctionRunner::run($responseOrFunction)
        );

        $responseText = $this->chat->generateChat($history);
    } else {
        $responseText = $responseOrFunction;
    }

    $llmResult = new Result(
        new Generation($prompt, $responseText),
        new Metadata(
            [
                'prompt_tokens' => $responseObject->usage->promptTokens,
                'completion_tokens' => $responseObject->usage->completionTokens,
                'total_tokens' => $responseObject->usage->totalTokens,
            ],
            $responseObject->choices[0]->finishReason,
            $this->chat->model,
            $start,
            $end,
        ),
        $isFirstMessage
    );

    $this->results[] = $llmResult;

    return $llmResult;
}

And if I attempt to use functionCall instead, I get the following error: Missing parameter 'name': messages with role 'function' must have a 'name'.

I can't seem to make it "work" without hacking the implementation of OpenAIChat, which I DO NOT want to do. But I might have to extend and overwrite the method in order to get it working.

It seems functions are deprecated and replaced by tools, but there does not yet appear to be a proper way to return tool call data.
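At the raw payload level, the gap in the handler above is that the tool result is appended without first replaying the assistant message that carried the tool_calls. A hedged, self-contained sketch of that missing step, using the OpenAI wire format rather than LLPhant's Message class (the helper name is invented for illustration):

```php
<?php
// Hypothetical helper: given the decoded assistant message that requested a
// tool call and the computed result, build the pair of messages that must be
// appended to the history, in order, before asking for the next completion.
function buildToolExchange(array $assistantMessage, string $toolResult): array
{
    $callId = $assistantMessage['tool_calls'][0]['id'];

    return [
        // 1. Replay the assistant turn that requested the tool call.
        $assistantMessage,
        // 2. Attach the result, linked back via tool_call_id.
        [
            'role' => 'tool',
            'tool_call_id' => $callId,
            'content' => $toolResult,
        ],
    ];
}
```

Appending both entries to the history before calling the model again satisfies the "must be a response to a preceding message with 'tool_calls'" rule.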

prykris commented 1 month ago

https://github.com/theodo-group/LLPhant/pull/194/files

prykris commented 1 month ago

221

f-lombardo commented 1 month ago

What do you think about using the same approach that has been implemented for Anthropic? https://github.com/theodo-group/LLPhant/blob/9ac8a5325605da1f6beca6fcb02929a756f8a3a9/src/Chat/AnthropicChat.php#L70 @MaximeThoonsen what is your opinion?

prykris commented 1 month ago

What do you think about using the same approach that has been implemented for Anthropic?

https://github.com/theodo-group/LLPhant/blob/9ac8a5325605da1f6beca6fcb02929a756f8a3a9/src/Chat/AnthropicChat.php#L70

@MaximeThoonsen what is your opinion?

While it does work, it comes with the great drawback of being stateless. We are unable to continue the conversation with a "correct" history.

Would you be open to creating a stateful chat history solution while keeping stateless behaviour as the default?

https://github.com/theodo-group/LLPhant/blob/9ac8a5325605da1f6beca6fcb02929a756f8a3a9/src/Chat/AnthropicChat.php#L99

While I understand that this responsibility falls more on the user's side, the current approach limits what kinds of messages users can collect. I don't see why the user couldn't attach their own MessageBag (or whatever object is meant for the collection). Extensibility is lacking here.

MaximeThoonsen commented 1 month ago

@prykris @f-lombardo My two main goals are:

After that I'm quite open on a lot of stuff :).

I feel we are on an important topic, as agents and "chat with functions being called" will be mainstream really soon.

We can create a new chat method that is stateful, and we should indeed be more flexible to handle all the use cases. @prykris, in your message bag, what did you have in mind to put in it? The functions called?

For the stateful chat, what do we need?

Do you see anything else?

prykris commented 1 month ago

Simplicity remains achievable as long as users can choose between the core implementations for chat generation. My suggestion is to apply composition over inheritance and build on that foundation.

class PersistentChat {
    protected ChatInterface $chat;
}
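Fleshing that idea out a little: a minimal sketch of the composition approach, where PersistentChat wraps any chat implementation and records the exchange itself, leaving the wrapped chat stateless. The ChatInterface stub here is a deliberately reduced stand-in for LLPhant's interface, and the history shape is illustrative only:

```php
<?php
// Reduced stand-in for LLPhant's chat interface (assumption: one text-in,
// text-out method is enough to illustrate the wrapper).
interface ChatInterface
{
    public function generateText(string $prompt): string;
}

// Composition over inheritance: the wrapper owns the history; the inner
// chat stays stateless and swappable (OpenAI, Anthropic, ...).
class PersistentChat
{
    /** @var array<int, array{role: string, content: string}> */
    private array $history = [];

    public function __construct(private ChatInterface $chat)
    {
    }

    public function send(string $prompt): string
    {
        $this->history[] = ['role' => 'user', 'content' => $prompt];

        $answer = $this->chat->generateText($prompt);

        $this->history[] = ['role' => 'assistant', 'content' => $answer];

        return $answer;
    }

    /** Full conversation so far, usable to rebuild provider-specific payloads. */
    public function getHistory(): array
    {
        return $this->history;
    }
}
```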

My key requirements are:

For storing messages, the LinkedList data structure is ideal, with each node pairing a message and its metadata. This structure simplifies implementing a sliding window or querying conversation history based on token usage. It also allows for applying rules to the MessageBag, adjusting for different AI providers, and handling invalid states efficiently.
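A minimal sketch of that linked-list idea, assuming per-message token counts as the metadata; the class and method names are illustrative, not part of LLPhant. The sliding window walks backwards from the newest node until a token budget is exhausted:

```php
<?php
// One node per message, linked backwards, pairing content with metadata
// (here just a token count).
class MessageNode
{
    public ?MessageNode $prev = null;

    public function __construct(
        public readonly string $role,
        public readonly string $content,
        public readonly int $tokens,
    ) {
    }
}

class MessageHistory
{
    private ?MessageNode $tail = null;

    public function append(string $role, string $content, int $tokens): void
    {
        $node = new MessageNode($role, $content, $tokens);
        $node->prev = $this->tail;
        $this->tail = $node;
    }

    /**
     * Newest messages that fit within $tokenBudget, returned in chat order.
     * Stops at the first (newest-to-oldest) message that would overflow.
     */
    public function window(int $tokenBudget): array
    {
        $selected = [];
        for ($node = $this->tail; $node !== null; $node = $node->prev) {
            if ($tokenBudget < $node->tokens) {
                break;
            }
            $tokenBudget -= $node->tokens;
            $selected[] = [$node->role, $node->content];
        }

        return array_reverse($selected);
    }
}
```

Because each node carries its own metadata, the same walk can enforce provider-specific rules (for example, dropping a tool result whose preceding tool_calls message fell outside the window) before the payload is built.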