nus-apr / auto-code-rover

A project-structure-aware autonomous software engineer aiming for autonomous program improvement. Resolved 30.67% of tasks (pass@1) on SWE-bench lite and 38.40% of tasks (pass@1) on SWE-bench verified, with each task costing less than $0.7.

Ollama support issue #35

Closed: h-summit closed this issue 6 months ago

h-summit commented 6 months ago

When testing the llama3 model with ollama, I encountered an error indicating that the ollama server is unreachable:

httpx.ConnectError: [Errno 111] Connection refused

This issue arises because ollama.chat(model=self.name, messages=[]) invokes chat = _client.chat (located in site-packages/ollama/__init__.py), where _client = Client(). The Client() constructor defaults to 'http://localhost:11434', which, inside a Docker container, refers to the container itself rather than the host machine, whereas I installed ollama on the host.
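For illustration, here is a minimal sketch of working around this from the calling side: constructing an explicit ollama.Client with a configurable host instead of relying on the module-level default. This is not ACR's actual code; the OLLAMA_HOST variable name (borrowed from ollama's own convention) and the host.docker.internal address are assumptions.

```python
import os

import ollama

# Minimal sketch: make the ollama endpoint configurable instead of
# relying on the Client() default of http://localhost:11434.
# OLLAMA_HOST is an assumed variable name, following ollama's own
# convention; adjust to whatever the project settles on.
host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

# Build an explicit client rather than calling the module-level
# ollama.chat, which is bound to a default Client() instance.
client = ollama.Client(host=host)

response = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response["message"]["content"])
```

With something like this in place, running ollama on the host and ACR in a container could be handled by setting e.g. `OLLAMA_HOST=http://host.docker.internal:11434` (Docker Desktop resolves `host.docker.internal` automatically; on Linux, pass `--add-host=host.docker.internal:host-gateway` to `docker run`).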

To resolve this, I propose two options:

I hope the maintainers acknowledge this issue. Considering that llama3 is a cost-effective option, its popularity is likely to grow, so this connectivity problem could affect many users.

yuntongzhang commented 6 months ago

Thank you for reporting this. I was only testing with ollama and ACR both running on the host machine. I will patch it soon, likely in the second way you mentioned.