The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It allows users to chat with LLM models, execute structured function calls, and get structured output. It also works with models not fine-tuned for JSON output and function calls.
I'm not 100% sure, but I was experiencing this issue when using a `llama_llm` of type `Llama` together with generation settings of type `LlamaLLMGenerationSettings`.
This combination always throws the error:
"Wrong generation settings for llama-cpp-python, use LlamaLLMGenerationSettings under llama_cpp_agent.llm_settings!"
Looking at the code, the condition on line 131 here https://github.com/Maximilian-Winter/llama-cpp-agent/blame/a7442166e326645c5198113cc643bfbf00fe4ffa/src/llama_cpp_agent/function_calling_agent.py#L131
should be changed to:
`(A or B) and C` — without parentheses, the current code evaluates as `A or (B and C)`, since `and` binds tighter than `or` in Python.
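A minimal sketch of the precedence problem, using hypothetical truth values `A`, `B`, `C` standing in for the three conditions at line 131 (values chosen to expose the difference):

```python
# Python gives `and` higher precedence than `or`, so an
# unparenthesized `A or B and C` parses as `A or (B and C)`.
A, B, C = True, False, False

current = A or B and C     # parsed as A or (B and C) -> True
intended = (A or B) and C  # the check the code presumably meant -> False

print(current, intended)   # the two expressions disagree for these inputs
```

With `A` true and `C` false, the unparenthesized form short-circuits to `True` on `A` alone, while the intended form is `False` — which would explain the error always being raised for this combination of settings.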