The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It allows users to chat with LLM models, execute structured function calls, and get structured output. It also works with models that are not fine-tuned for JSON output and function calling.
I was wondering if it is possible to pass a callback function (similar to `send_message_to_user_callback`) to `FunctionCallingAgent` that fires whenever a tool is used?

My use case is that I want some feedback on the client side about what the agent is doing, beyond just the end result.
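As a possible interim workaround, each tool function could be wrapped so that a user-supplied callback fires before the tool runs. This is a minimal, framework-independent sketch; the `with_tool_callback` wrapper and `on_tool_use` callback names are hypothetical and not part of llama-cpp-agent's API:

```python
import functools
from typing import Any, Callable

def with_tool_callback(tool: Callable[..., Any],
                       on_tool_use: Callable[[str, dict], None]) -> Callable[..., Any]:
    """Wrap a tool so on_tool_use fires with the tool's name and kwargs on every call."""
    @functools.wraps(tool)
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        on_tool_use(tool.__name__, kwargs)  # client-side feedback before the tool executes
        return tool(*args, **kwargs)
    return wrapper

# Hypothetical example tool
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

events: list = []
wrapped = with_tool_callback(get_weather, lambda name, kw: events.append((name, kw)))

print(wrapped(city="Paris"))  # prints "Sunny in Paris"
print(events)                 # prints [('get_weather', {'city': 'Paris'})]
```

Wrapping each tool this way before registering it with the agent would give the client a notification for every tool invocation, independent of whether the framework exposes a dedicated hook.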