Libr-AI / OpenFactVerification

Loki: Open-source solution designed to automate the process of verifying factuality
https://loki.librai.tech/
MIT License

ollama LOCAL_API_URL not working #26

Open papiche opened 2 months ago

papiche commented 2 months ago

I tried to register my ollama node in api_config.yaml

SERPER_API_KEY: null
OPENAI_API_KEY: null
ANTHROPIC_API_KEY: null
LOCAL_API_KEY: anykey
LOCAL_API_URL: http://127.0.0.1:11434

But I encounter an error:

python webapp.py --api_config api_config.yaml
== Init decompose_model with model: gpt-4o
[INFO]2024-09-11 20:58:57,178 __init__.py:61: == LLMClient is not specified, use default llm client.
Traceback (most recent call last):
  File "/home/frd/workspace/OpenFactVerification/webapp.py", line 84, in <module>
    factcheck_instance = FactCheck(
                         ^^^^^^^^^^
  File "/home/frd/workspace/OpenFactVerification/factcheck/__init__.py", line 63, in __init__
    setattr(self, key, LLMClient(model=_model_name, api_config=self.api_config))
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/frd/workspace/OpenFactVerification/factcheck/utils/llmclient/gpt_client.py", line 15, in __init__
    self.client = OpenAI(api_key=self.api_config["OPENAI_API_KEY"])
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/frd/miniconda3/lib/python3.12/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

What should I do? Thanks

PardonMySkillz commented 1 month ago

I am also encountering the same problem. Seems like ollama is currently not supported.