Maximilian-Winter / llama-cpp-agent

The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs), allowing users to chat with LLM models, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calls.

Condition in calling_agent seems to be wrong #10

Closed: palkokec closed this issue 6 months ago

palkokec commented 8 months ago

I'm not 100% sure, but I was experiencing issues using a llama_llm of type Llama together with generation settings of type LlamaLLMGenerationSettings. This combination always throws the error:

"Wrong generation settings for llama-cpp-python, use LlamaLLMGenerationSettings under llama_cpp_agent.llm_settings!"

Looking at the code, it seems line 131 here https://github.com/Maximilian-Winter/llama-cpp-agent/blame/a7442166e326645c5198113cc643bfbf00fe4ffa/src/llama_cpp_agent/function_calling_agent.py#L131

should be changed to:

if (isinstance(llama_llm, Llama) or isinstance(llama_llm, LlamaLLMSettings)) and isinstance(
                llama_generation_settings, LlamaCppGenerationSettings):

The intended condition is (A or B) and C, but without the parentheses the current code evaluates as A or (B and C), because Python's `and` binds more tightly than `or`.
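A minimal, self-contained sketch of the precedence problem (the classes below are empty stand-ins for the real framework types, defined only so the snippet runs on its own):

```python
# Stand-ins for the framework's types; the real classes live in
# llama_cpp_agent / llama-cpp-python.
class Llama: ...
class LlamaLLMSettings: ...
class LlamaCppGenerationSettings: ...


def check_unparenthesized(llama_llm, settings):
    # Mirrors the buggy condition: parses as A or (B and C),
    # so a Llama instance passes regardless of the settings type.
    return isinstance(llama_llm, Llama) or isinstance(
        llama_llm, LlamaLLMSettings) and isinstance(
        settings, LlamaCppGenerationSettings)


def check_parenthesized(llama_llm, settings):
    # The proposed fix: (A or B) and C — the settings type is
    # always checked.
    return (isinstance(llama_llm, Llama) or isinstance(
        llama_llm, LlamaLLMSettings)) and isinstance(
        settings, LlamaCppGenerationSettings)


wrong_settings = object()  # anything that is not LlamaCppGenerationSettings
print(check_unparenthesized(Llama(), wrong_settings))  # True  (bug: wrong settings accepted)
print(check_parenthesized(Llama(), wrong_settings))    # False (correct: rejected)
```

With the unparenthesized version, the settings check is silently skipped whenever the first operand is a Llama instance, which explains why the type validation misfires.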

Maximilian-Winter commented 8 months ago

Thanks, I plan to release a new version soon and will incorporate that.

krlohnes commented 6 months ago

This also seems to be an issue with the StructuredOutputAgent.

Maximilian-Winter commented 6 months ago

This is fixed in the latest release.