killrawr opened this issue 2 months ago
Hi @killrawr, there isn't such a feature in LLPhant. IMHO, adding methods like `$chat->addConnection` or `$chat->addSQLTool` could mess up the `ChatInterface`. I think one way of doing this would be to create some standard tools and add them to the chat via the existing `$chat->addTool` method.
As of now, the tools called are only executed, with no tool response added to the message array, meaning the LLM can't process the tool call's return value. For that reason, I extended the functionality myself. I created a `LinkedList` of nodes where each node holds a `Metadata` and a `Message` object. Neither class cares what you attach to it, so it was easy to add more properties to `Message` so that they are included in requests. I also extended the `Message` class with new static methods that create messages straight from a `CreateResponse` object, which was helpful.
Whenever I received a `Message` with the `ChatRole::Assistant` role, it was appended to the `LinkedList` as the last message. It's very important to add the assistant message to the history, and even more important to keep the original `tool_calls` value assigned to the message object. Otherwise, the request will fail with an API error telling you that every `ChatRole::Tool` message must follow a `ChatRole::Assistant` message carrying the `tool_calls` property.
Then I started processing each tool call (I initially made the mistake of assuming there is one tool call per message). The important bit is the `id` field of each entry in the `tool_calls` array: you will have to echo it back when answering with the tool results.
Having the function names and the JSON arguments the model wanted to execute was great, but LLPhant doesn't make it easy to retrieve the added tools from the `OpenAIChat` class: those properties are private, and no method is exposed to retrieve a `FunctionInfo`. So I had to write a separate class that mirrors those properties and exposes them to other parts of my application, since all those tasks previously belonged to `OpenAIChat`. Here is what I came up with:
```php
use LLPhant\Chat\FunctionInfo\FunctionInfo;
use LLPhant\Chat\OpenAIChat;
use OpenAI\Responses\Chat\CreateResponse;

class ToolManager
{
    /** @var FunctionInfo[] */
    protected array $tools = [];

    private OpenAIChat $chat;

    /**
     * @param FunctionInfo[] $tools
     */
    public function __construct(array $tools = [])
    {
        foreach ($tools as $tool) {
            $this->addTool($tool);
        }
    }

    public function addTool(FunctionInfo $functionInfo): void
    {
        $this->tools[$functionInfo->name] = $functionInfo;

        // Make the underlying chat class aware of the tools we have added.
        // If no chat is given, this class serves only as storage.
        if (isset($this->chat)) {
            $this->chat->addTool($functionInfo);
        }
    }

    public function getToolByName(string $name): ?FunctionInfo
    {
        return $this->tools[$name] ?? null;
    }

    /**
     * @return array<string, FunctionInfo> tool call id => tool carrying its JSON arguments
     */
    public function getToolsCalled(CreateResponse $response): array
    {
        $tools = [];
        foreach (($response->choices[0]->message->toolCalls ?? []) as $toolCall) {
            $tool = $this->getToolByName($toolCall->function->name);
            if ($tool === null) {
                continue; // the model referenced a tool we never registered
            }
            $tool = clone $tool;
            $tool->jsonArgs = $toolCall->function->arguments;
            $tools[$toolCall->id] = $tool;
        }

        return $tools;
    }

    /**
     * @return array<string, FunctionInfo>
     */
    public function getToolsCalledFromNode(MessageNode $messageNode): array
    {
        /** @var Message $message */
        $message = $messageNode->message;

        return $this->getToolsCalledFromMessage($message);
    }

    /**
     * @return array<string, FunctionInfo>
     */
    public function getToolsCalledFromMessage(Message $message): array
    {
        $tools = [];
        foreach ($message->tool_calls as $toolCall) {
            $tool = $this->getToolByName($toolCall['function']['name']);
            if ($tool === null) {
                continue;
            }
            $tool = clone $tool;
            $tool->jsonArgs = $toolCall['function']['arguments'];
            $tools[$toolCall['id']] = $tool;
        }

        return $tools;
    }

    public function setOpenAI(OpenAIChat $chat): void
    {
        $this->chat = $chat;
        $this->chat->setTools($this->tools);
    }
}
```
In the end, almost nothing of the original OpenAI wrapper this library offers was of use to me. I'm taking on technical debt, but it doesn't bother me; it still works. Here is my wrapper on top of the LLPhant wrapper:
```php
<?php

declare(strict_types=1);

namespace App\LLM\Chat;

use App\LLM\Chat;
use App\LLM\Chat\Message\MessageChain;
use App\LLM\Chat\Message\MessageNode;
use App\LLM\Chat\Message\Metadata;
use App\LLM\Tools\ToolManager;
use Exception;
use LLPhant\Chat\Enums\ChatRole;
use LLPhant\Chat\FunctionInfo\FunctionInfo;
use LLPhant\Chat\FunctionInfo\FunctionRunner;
use LLPhant\Chat\Message;
use LLPhant\Chat\OpenAIChat;
use Nette\InvalidStateException;
use OpenAI\Responses\Chat\CreateResponse;

class ChatSession
{
    /** @var Chat\Observers\Observer[] */
    protected array $observers = [];

    protected ToolManager $toolManager;

    protected bool $awaitsJson = false;

    public function __construct(
        protected OpenAIChat $chat,
        protected MessageChain $messages,
        ?ToolManager $toolManager = null,
    ) {
        $this->toolManager = $toolManager ?? new ToolManager;
        $this->toolManager->setOpenAI($this->chat);
    }

    /**
     * Add an observer that will be notified when a new message is added.
     */
    public function addObserver(Chat\Observers\Observer $observer): void
    {
        $this->observers[] = $observer;
    }

    public function getObservers(string $class): array
    {
        return array_filter($this->observers, function (Chat\Observers\Observer $observer) use ($class): bool {
            return $observer instanceof $class;
        });
    }

    /**
     * Notify all registered observers about a new message.
     */
    protected function messageAdded(MessageNode $node): void
    {
        /** @var Chat\Observers\MessageObserver $observer */
        foreach ($this->getObservers(Chat\Observers\MessageObserver::class) as $observer) {
            $observer->messageAdded($node);
        }
    }

    protected function toolCalled(FunctionInfo $tool, string $toolId): void
    {
        /** @var Chat\Observers\ToolObserver $observer */
        foreach ($this->getObservers(Chat\Observers\ToolObserver::class) as $observer) {
            $observer->toolCalled($tool, $toolId);
        }
    }

    protected function toolResponded(MessageNode $node, FunctionInfo $tool, string $toolId): void
    {
        /** @var Chat\Observers\ToolObserver $observer */
        foreach ($this->getObservers(Chat\Observers\ToolObserver::class) as $observer) {
            $observer->toolResponded($node, $tool, $toolId);
        }
    }

    public function getToolManager(): ToolManager
    {
        return $this->toolManager;
    }

    /**
     * Add a message node to the chain and notify observers.
     */
    public function addMessageNode(MessageNode $node): void
    {
        $this->messages->addNode($node);
        $this->messageAdded($node);
    }

    /**
     * Main entry point for invoking a prompt and generating a response.
     *
     * @return string|array if $json is true, the return value is the decoded JSON content
     *
     * @throws Exception
     */
    public function __invoke(string $prompt, bool $json = false): string|array
    {
        $this->addMessageNode(new MessageNode(
            Chat\Message\Message::user($prompt, $json),
            null
        ));

        $this->awaitsJson = $json;

        $content = $this->processLastUserMessage()->message->content;

        if ($json) {
            return json_decode($content, true, flags: JSON_THROW_ON_ERROR);
        }

        return $content;
    }

    public function getMessageChain(): MessageChain
    {
        return $this->messages;
    }

    public function replaceMessageChain(MessageChain $chain): MessageChain
    {
        $old = $this->messages;
        $this->messages = $chain;

        return $old;
    }

    public function processLastUserMessage(): MessageNode
    {
        /** @var MessageNode $lastNode */
        $lastNode = $this->messages->getTail();

        if ($lastNode->message->role !== ChatRole::User) {
            throw new InvalidStateException('Cannot process last user message, since the last message in the chain was not issued by role User');
        }

        return $this->processMessage();
    }

    /**
     * Central method for processing messages.
     */
    private function processMessage(): MessageNode
    {
        return $this->generateResponse();
    }

    /**
     * Generates a response from the LLM using the conversation history.
     */
    private function generateResponse(): MessageNode
    {
        $start = microtime(true);

        // Enable JSON response
        if ($this->awaitsJson) {
            $this->chat->setModelOption('response_format', ['type' => 'json_object']);
        }

        $messageNode = $this->handleResponse(
            $this->chat->generateChatOrReturnFunctionCalled($this->messages->all()),
            $start,
            microtime(true)
        );

        if ($this->awaitsJson) {
            $this->chat->setModelOption('response_format', ['type' => 'text']);
            $this->awaitsJson = false;
        }

        return $messageNode;
    }

    /**
     * Returns the OpenAIChat instance.
     */
    public function getChat(): OpenAIChat
    {
        return $this->chat;
    }

    private function handleResponse(FunctionInfo|string $responseOrFunctionInfo, float $start, float $end): MessageNode
    {
        $messageNode = $this->processResponse($this->chat->getLastResponse(), $start, $end);

        /** @var Chat\Message\Message $message */
        $message = $messageNode->message;

        $this->addMessageNode($messageNode);

        if ($responseOrFunctionInfo instanceof FunctionInfo) {
            return $this->handleToolCalls($this->toolManager->getToolsCalledFromMessage($message));
        }

        return $messageNode;
    }

    private function handleToolCalls(array $toolCalls): MessageNode
    {
        foreach ($toolCalls as $toolId => $tool) {
            $this->toolCalled($tool, $toolId);
            $this->toolResponded($this->handleToolCall($tool, $toolId), $tool, $toolId);
        }

        return $this->generateResponse();
    }

    /**
     * Execute the tool and push the results back onto the message chain.
     */
    private function handleToolCall(FunctionInfo $function, string $toolId): MessageNode
    {
        try {
            $start = microtime(true);
            $result = FunctionRunner::run($function);
            $end = microtime(true);

            // The only time we add a node without processing it
            $this->addMessageNode($node = new MessageNode(
                Chat\Message\Message::toolResponse($result, $toolId),
                Metadata::fromArray(['start' => $start, 'end' => $end, 'created_at' => time()])
            ));

            return $node;
        } catch (Exception $exception) {
            return new MessageNode(
                Message::assistant("Exception occurred while executing a tool '$function->name': ".$exception->getMessage()),
                null
            );
        }
    }

    private function processResponse(CreateResponse $response, float $start, float $end): MessageNode
    {
        $message = Chat\Message\Message::fromResponse($response);
        $metadata = Metadata::fromResponse($response, $start, $end);

        return new MessageNode($message, $metadata);
    }
}
```
So this code allows me to execute prompts and be confident that JSON will be returned. It's currently highly tailored to my use case, but if you find anything inspiring, I'm happy to help.

```php
$data = $chat('Generate example JSON', true);
```
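For context, passing `true` as the second argument flips the session into JSON mode for a single request and decodes the reply with `JSON_THROW_ON_ERROR`, so malformed model output surfaces as a `JsonException` instead of a silent `null`. A minimal sketch of that decode behavior (the `decodeStrict` helper is illustrative, not part of the wrapper):

```php
<?php
// Sketch of the strict decoding used in ChatSession::__invoke():
// JSON_THROW_ON_ERROR turns malformed model output into an exception
// instead of silently returning null.
function decodeStrict(string $content): array
{
    return json_decode($content, true, flags: JSON_THROW_ON_ERROR);
}

var_dump(decodeStrict('{"ok": true}')); // ['ok' => true]

try {
    decodeStrict('not json'); // what a model returns when it ignores JSON mode
} catch (JsonException $e) {
    echo "rejected: {$e->getMessage()}\n";
}
```

Combining the `response_format` model option with strict decoding means a caller either gets a usable array or a loud failure, never a half-parsed response.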
By the way, there are some rules your message array must adhere to; for example, you can't have two user messages one after another. So for me, storing those `Message` objects in a `LinkedList` made sense: I can apply rules directly to it, since I extend the `LinkedList` class with a `MessageChain` class and implement the domain-specific methods there. Being able to filter the linked list, where each node holds a reference to the message before and after it, is quite handy.
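As an illustration of why a linked list is convenient for such rules, here is a minimal, self-contained sketch (hypothetical `Node`/`Chain` classes, not the author's `MessageChain`) that merges two consecutive user messages instead of appending the second one:

```php
<?php
// Hypothetical doubly linked chain enforcing the "no two consecutive
// user messages" rule by merging at insertion time.
class Node
{
    public ?Node $prev = null;
    public ?Node $next = null;

    public function __construct(public string $role, public string $content)
    {
    }
}

class Chain
{
    private ?Node $head = null;
    private ?Node $tail = null;

    public function add(Node $node): void
    {
        // Merge instead of appending when two user messages would touch.
        if ($this->tail !== null && $this->tail->role === 'user' && $node->role === 'user') {
            $this->tail->content .= "\n".$node->content;

            return;
        }

        $node->prev = $this->tail;
        if ($this->tail !== null) {
            $this->tail->next = $node;
        } else {
            $this->head = $node;
        }
        $this->tail = $node;
    }

    /** @return string[] */
    public function roles(): array
    {
        $roles = [];
        for ($node = $this->head; $node !== null; $node = $node->next) {
            $roles[] = $node->role;
        }

        return $roles;
    }
}

$chain = new Chain();
$chain->add(new Node('user', 'Hello'));
$chain->add(new Node('user', 'Are you there?')); // merged into the previous user message
$chain->add(new Node('assistant', 'Hi!'));
// $chain->roles() is now ['user', 'assistant']
```

Because each node knows its neighbors, such rules can be enforced locally at insertion time rather than by re-scanning the whole message array before every request.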
@prykris wow, that is so awesome. As someone who works in computer science, projects like these make me really excited about implementing AI as a tool in the code I'm working on. Do you maybe want to set up a repo? I could contribute when I'm feeling inspired (I'm quite busy at the moment, but I'd love to contribute at some point).
@killrawr @prykris it's cool to see you guys starting to use tools. In the AutoPHP code, there are some examples of agentic behavior with tools. What you want to do, @killrawr, could be done with AutoPHP: you just need to add the tools and it will work. @prykris, in AutoPHP we reuse the answer from the tool to check that we have achieved the objective. I'm very open to having an easier way to use tools in LLPhant.
I was wondering, is there support for a table tool that can be used with the AI? An example of what I'm thinking, in PHP:
[code examples not recoverable: the `$result` from `generateText` for a SUCCESS case (ideally, every result returned keyed by primary key), a FAIL case (or an alternative `$result`), and another SUCCESS example]
I wasn't sure how to explain it, but I hope this describes what I was hoping for. Could someone possibly build this into the next iteration, or show me an existing example? :)