anilmadishetty2498 opened 1 week ago
The Ollama models are not working with LocalLLM because `LocalLLM` is specifically designed to interface with OpenAI's API. The `LocalLLM` class initializes an OpenAI client and uses it to generate chat completions; there is no implementation or support for Ollama models in this class.
```python
class LocalLLM(LLM):
    def __init__(self, api_base: str, model: str = "", api_key: str = "", **kwargs):
        if not api_key:
            api_key = "dummy"

        self.model = model
        self.client = OpenAI(base_url=api_base, api_key=api_key).chat.completions
        self._invocation_params = kwargs
```
The `self.client` attribute is set to `OpenAI(base_url=api_base, api_key=api_key).chat.completions`, which means it is configured to use OpenAI's chat completion endpoint. Therefore, it cannot work with Ollama models without modification to support Ollama's API [1].
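For context, here is a minimal sketch of what a request to Ollama's native chat endpoint looks like, as opposed to the OpenAI-style `/v1/chat/completions` route that `LocalLLM`'s client targets. This is only an illustration, assuming an Ollama server running on the default local port with the `llama3` model pulled; it is not pandasai code.

```python
import requests

# Illustration only: Ollama's native chat API, which differs from the
# OpenAI-style /v1/chat/completions endpoint that LocalLLM's client calls.
# Assumes an Ollama server on localhost:11434 with the llama3 model pulled.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Say hello"}],
        "stream": False,  # ask for a single JSON response instead of a stream
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```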
System Info
python=3.11.7
🐛 Describe the bug
```python
from pandasai import Agent
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
db = Agent(scm_vc, config={"llm": llm})  # scm_vc: the user's DataFrame (not shown)
```
This setup is not working.
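In case it helps others reproduce, a self-contained variant of the above might look like the following; the DataFrame and prompt are placeholders standing in for `scm_vc`, and it assumes a local Ollama server with the `llama3` model pulled.

```python
import pandas as pd
from pandasai import Agent
from pandasai.llm.local_llm import LocalLLM

# Placeholder data standing in for scm_vc; any small DataFrame will do.
df = pd.DataFrame({"region": ["north", "south"], "sales": [120, 95]})

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
agent = Agent(df, config={"llm": llm})

# In the reported setup this call does not produce a response from the Ollama model.
print(agent.chat("Which region has the highest sales?"))
```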