The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It lets users chat with LLMs, execute structured function calls, and get structured output, and it also works with models that are not fine-tuned for JSON output or function calling.
The method FunctionCallingAgent.generate_response stores values in the variable 'result' but never returns it. Even if that was not the intended purpose of 'result', returning it adds useful functionality: it makes it easy to use the LLM's response programmatically.
I added the return statement at line 388 of function_calling_agent.py.
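A minimal sketch of the idea behind the change (the class and method names match the framework, but the body below is a simplified stand-in, not the actual implementation in function_calling_agent.py):

```python
class FunctionCallingAgent:
    """Simplified stand-in for the framework's agent class."""

    def generate_response(self, message: str):
        result = []
        # ... the real method runs the LLM and appends its output and any
        # function-call results to `result` here; this line is a placeholder ...
        result.append(f"response to: {message}")
        return result  # the added line: previously the method ended without returning


# With the return statement in place, callers can capture the response directly:
agent = FunctionCallingAgent()
output = agent.generate_response("What is the weather?")
print(output)
```

Without the `return`, the only way to get at the response was through side effects such as printed output; returning `result` lets calling code inspect or post-process it.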
A big thank you for your work.