nus-apr / auto-code-rover

A project-structure-aware autonomous software engineer aiming for autonomous program improvement. Resolved 30.67% of tasks (pass@1) on SWE-bench lite, with each task costing less than $0.7.

Ollama support (issue #35)

Closed: h-summit closed this issue 2 months ago

h-summit commented 2 months ago

When testing the llama3 model with ollama, I encountered an error indicating that the ollama server is unreachable:

httpx.ConnectError: [Errno 111] Connection refused

This issue arises because ollama.chat(model=self.name, messages=[]) invokes chat = _client.chat (defined in site-packages/ollama/__init__.py), where _client = Client(). The Client() constructor defaults to 'http://localhost:11434', which, inside a Docker container, refers to the container itself rather than the host machine, while ollama is installed on the host.
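
For illustration, here is a minimal sketch of the failing call and the library-level workaround, using the Client(host=...) parameter from the ollama Python package. The host.docker.internal address is an assumption about a Docker Desktop setup:

```python
import ollama
from ollama import Client

# The module-level ollama.chat() builds a default Client(), which
# targets http://localhost:11434. Inside a container, "localhost" is
# the container itself, so this fails with "Connection refused":
# ollama.chat(model="llama3", messages=[])

# An explicit Client can point at the Docker host instead. The address
# assumes host.docker.internal resolves to the host (true on Docker
# Desktop; on Linux it needs an extra flag, see below).
client = Client(host="http://host.docker.internal:11434")
response = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "ping"}],
)
print(response["message"]["content"])
```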

To resolve this, I propose two options:

I hope the maintainers acknowledge this issue. Since llama3 is a cost-effective option, its popularity is likely to increase, which could expose many more users to this connectivity problem.
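
For anyone hitting the same error, two common Docker-side workarounds, independent of any code change in ACR (the image name below is a placeholder):

```sh
# Linux: make host.docker.internal resolve to the host gateway
docker run --add-host=host.docker.internal:host-gateway <acr-image>

# Alternative (Linux only): share the host's network namespace, so
# localhost:11434 inside the container reaches the host's ollama
docker run --network=host <acr-image>
```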

yuntongzhang commented 2 months ago

Thank you for reporting this. I was only testing with ollama and ACR both running on the host machine. I will patch it soon, likely in the second way you mentioned.
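
For illustration only, a minimal sketch of what such a patch could look like: an environment-variable override for the server address. The OLLAMA_HOST variable name and the OllamaModel wrapper are hypothetical, not ACR's actual code:

```python
import os

from ollama import Client


class OllamaModel:
    """Hypothetical wrapper for illustration; not ACR's actual class."""

    def __init__(self, name: str):
        self.name = name
        # Allow the server address to be overridden from the environment,
        # falling back to the library's default of http://localhost:11434.
        host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
        self.client = Client(host=host)

    def chat(self, messages: list[dict]) -> str:
        response = self.client.chat(model=self.name, messages=messages)
        return response["message"]["content"]
```

Running the container with -e OLLAMA_HOST=http://host.docker.internal:11434 would then route requests to an ollama server on the host without any further code changes.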