Maximilian-Winter / llama-cpp-agent

The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLM models, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calling.

FunctionCallingAgent.generate_response lacking return statement #38

Closed: ebarrragn closed this issue 5 months ago

ebarrragn commented 5 months ago

The method FunctionCallingAgent.generate_response seems to store values in the variable 'result', but it never returns it. Even if that is not the intended purpose of 'result', returning it adds useful functionality, since it makes it easy to use the LLM's response programmatically.

I added the return statement at line 388 of function_calling_agent.py.

A big thank you for your work.
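The proposed change can be sketched as follows. This is a minimal, self-contained stand-in, not the real method body: the class, the message format, and the `echo:` content are illustrative assumptions. The point is only the added `return result` at the end.

```python
class FunctionCallingAgent:
    """Minimal stand-in for the real agent, used to illustrate the fix."""

    def generate_response(self, message):
        result = []
        # In the real agent, the LLM output and any structured
        # function-call results are collected into `result` here.
        # (This echo response is a placeholder for illustration.)
        result.append({"role": "assistant", "content": f"echo: {message}"})
        return result  # the added return makes the response usable programmatically


# With the return in place, callers can work with the response directly:
agent = FunctionCallingAgent()
response = agent.generate_response("What is the weather?")
print(response[0]["content"])
```

Without the `return`, the method still ran and populated `result`, but callers received `None` and had no direct way to consume the generated response.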

Maximilian-Winter commented 5 months ago

I will look into it and then decide.

Maximilian-Winter commented 5 months ago

@ebarrragn Actually, it isn't intended to return a response, but I added the return, as you did, for people who need it.

Maximilian-Winter commented 5 months ago

@ebarrragn It's in the latest release.