taketwo / llm-ollama

LLM plugin providing access to local Ollama models using the HTTP API
Apache License 2.0

'Connection refused' Error #3

Closed: haje01 closed this issue 9 months ago

haje01 commented 9 months ago

After installing llm-ollama, a simple command raises httpx.ConnectError: [Errno 111] Connection refused:

llm -m llama2:latest 'How much is 2+2?'
Traceback (most recent call last):
  File "/home/haje01/.local/bin/llm", line 8, in <module>
    sys.exit(cli())
  File "/home/haje01/.local/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/haje01/.local/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/haje01/.local/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/haje01/.local/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/haje01/.local/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/haje01/.local/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/haje01/.local/lib/python3.10/site-packages/llm/cli.py", line 799, in models_list
    for model_with_aliases in get_models_with_aliases():
  File "/home/haje01/.local/lib/python3.10/site-packages/llm/__init__.py", line 80, in get_models_with_aliases
    pm.hook.register_models(register=register)
  File "/home/haje01/.local/lib/python3.10/site-packages/pluggy/_hooks.py", line 501, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
  File "/home/haje01/.local/lib/python3.10/site-packages/pluggy/_manager.py", line 119, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
  File "/home/haje01/.local/lib/python3.10/site-packages/pluggy/_callers.py", line 138, in _multicall
    raise exception.with_traceback(exception.__traceback__)
  File "/home/haje01/.local/lib/python3.10/site-packages/pluggy/_callers.py", line 102, in _multicall
    res = hook_impl.function(*args)
  File "/home/haje01/.local/lib/python3.10/site-packages/llm_ollama.py", line 26, in register_models
    for model in ollama.list()["models"]:
  File "/home/haje01/.local/lib/python3.10/site-packages/ollama/_client.py", line 313, in list
    return self._request('GET', '/api/tags').json()
  File "/home/haje01/.local/lib/python3.10/site-packages/ollama/_client.py", line 53, in _request
    response = self._client.request(method, url, **kwargs)
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_client.py", line 814, in request
    return self.send(request, auth=auth, follow_redirects=follow_redirects)
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_client.py", line 901, in send
    response = self._send_handling_auth(
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_client.py", line 929, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_client.py", line 966, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_client.py", line 1002, in _send_single_request
    response = transport.handle_request(request)
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 227, in handle_request
    with map_httpcore_exceptions():
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/haje01/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 111] Connection refused
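
For context, the traceback shows the failure originating in ollama.list(), which issues GET /api/tags against the local Ollama server during model registration. The probe can be reproduced outside llm; a minimal sketch, assuming Ollama's default address of localhost:11434:

import httpx

# Same request the plugin makes during model registration
# (GET /api/tags, per the traceback above). The host and port
# are Ollama's defaults and are an assumption here.
try:
    response = httpx.get("http://localhost:11434/api/tags", timeout=5.0)
    response.raise_for_status()
    print(response.json())  # should list the locally installed models
except httpx.ConnectError:
    print("Ollama server is not reachable on localhost:11434")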

Thank you in advance!

taketwo commented 9 months ago

Please check that you can interact with the Ollama server directly through its HTTP API:

$ curl http://localhost:11434/api/chat -d '{
  "model": "llama2:latest",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
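
If the curl request fails with the same connection error, the server itself is unreachable: it may not be running (it can typically be started with ollama serve), or it may be listening on a non-default host or port. In the latter case the ollama Python client can be pointed at the right address; a minimal sketch, assuming the Client host argument and the OLLAMA_HOST environment variable behave as documented for the ollama package:

from ollama import Client

# Hypothetical non-default address; replace with wherever your server
# actually listens. Setting OLLAMA_HOST is assumed to have the same
# effect for the module-level ollama.list() call the plugin makes.
client = Client(host="http://192.168.1.10:11434")
print(client.list()["models"])  # the same call llm-ollama makes at startup
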
haje01 commented 9 months ago

My apologies...

Thanks!