Replaced OpenAI client with llama_cpp for local LLaMA model support.
Updated model path to point to local LLaMA model directory.
Ensured compatibility with BaseAgent configuration for LLaMA.
This change allows the chatbot to run using a local LLaMA model instead of relying on OpenAI's API, enabling offline functionality and reducing dependency on external services.
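A minimal sketch of what the swap might look like. The names `MODEL_PATH`, `make_client`, and `chat` are illustrative, not the actual BaseAgent wiring; the real integration may differ. It relies on `llama_cpp.Llama.create_chat_completion`, which accepts OpenAI-style role/content message dicts, which is what makes the client swap relatively painless:

```python
from typing import Any

MODEL_PATH = "./models/llama-model.gguf"  # hypothetical local model path

def make_client(model_path: str = MODEL_PATH) -> Any:
    # Imported lazily so this module can be loaded (e.g. for tests with
    # a stub client) even where llama_cpp is not installed.
    from llama_cpp import Llama
    # n_ctx sets the context window; tune it for the model you downloaded.
    return Llama(model_path=model_path, n_ctx=2048, verbose=False)

def chat(client: Any, user_message: str) -> str:
    # create_chat_completion mirrors OpenAI's chat format, so agent code
    # built around role/content message dicts needs minimal changes.
    response = client.create_chat_completion(
        messages=[{"role": "user", "content": user_message}]
    )
    return response["choices"][0]["message"]["content"]
```

Because `chat` only depends on the `create_chat_completion` method, any object exposing that interface (a real `Llama` instance or a test stub) can be passed in.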