Leon-Sander / Local-Multimodal-AI-Chat


ConnectionError: HTTPConnectionPool ( OLLAMA MODEL) #32

Closed: Paramjethwa closed this issue 1 month ago

Paramjethwa commented 1 month ago

ConnectionError: HTTPConnectionPool(host='ollama', port=11434): Max retries exceeded with url: /api/tags (Caused by NameResolutionError("<urllib3.connection.HTTPConnection object at 0x7fad59a1bb20>: Failed to resolve 'ollama' ([Errno -3] Temporary failure in name resolution)"))

Here is the traceback:

```
Traceback:
File "/usr/local/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
    result = func()
File "/usr/local/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 590, in code_to_exec
    exec(code, module.__dict__)
File "/app/app.py", line 171, in <module>
    main()
File "/app/app.py", line 71, in main
    st.session_state.model_options = list_model_options()
File "/app/app.py", line 48, in list_model_options
    ollama_options = list_ollama_models()
File "/app/utils.py", line 88, in list_ollama_models
    json_response = requests.get(url="http://ollama:11434/api/tags").json()
File "/usr/local/lib/python3.10/site-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
File "/usr/local/lib/python3.10/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.10/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
```
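For context, the call that fails in utils.py boils down to the sketch below (reconstructed from the traceback, not copied from the repo). The hostname `ollama` is the Docker Compose service name, so it only resolves inside the Compose network; outside of it (plain `streamlit run`, or a container not attached to that network) the DNS lookup fails exactly like above:

```python
import requests

# "ollama" is the Docker Compose service name; it only resolves inside the
# Compose network. Outside of it this lookup fails with NameResolutionError.
OLLAMA_BASE_URL = "http://ollama:11434"

def list_ollama_models():
    # GET /api/tags lists the models pulled into the local Ollama instance.
    json_response = requests.get(url=f"{OLLAMA_BASE_URL}/api/tags").json()
    # The repo's exact post-processing may differ; Ollama returns
    # {"models": [{"name": ...}, ...]}.
    return [model["name"] for model in json_response.get("models", [])]
```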

As soon as I start the app in the browser, this error pops up.

I have tried with both WSL2 and Windows VS Code; I get the same error.


Also, when I rerun the Streamlit app, I get this error too:

I feel like this is related to database initialization, but running docker compose should resolve that, since the docker-compose.yaml already runs the python_operation.py command.

AttributeError: st.session_state has no attribute "model_options". Did you forget to initialize it? More info: https://docs.streamlit.io/develop/concepts/architecture/session-state#initialization

```
Traceback:
File "/usr/local/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
    result = func()
File "/usr/local/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 590, in code_to_exec
    exec(code, module.__dict__)
File "/app/app.py", line 171, in <module>
    main()
File "/app/app.py", line 89, in main
    model_col.selectbox(label="Select a Model", options=st.session_state.model_options, key="model_to_use")
File "/usr/local/lib/python3.10/site-packages/streamlit/runtime/state/session_state_proxy.py", line 131, in __getattr__
    raise AttributeError(_missing_attr_error_message(key))
```

Leon-Sander commented 1 month ago

The second error is about session state initialization. I think that because the first error occurred, the session state variables were never initialized; after rerunning, the code tried to access the session state, but it wasn't there because of the earlier error.
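A defensive initialization pattern along these lines avoids the cascading AttributeError; this is a sketch, not the repo's actual code, and the import path is illustrative:

```python
import streamlit as st

from app import list_model_options  # the app's existing helper; import path is illustrative

if "model_options" not in st.session_state:
    try:
        st.session_state.model_options = list_model_options()
    except Exception as err:
        # If Ollama is unreachable, keep an empty list so widgets that read
        # st.session_state.model_options don't raise AttributeError on rerun.
        st.session_state.model_options = []
        st.error(f"Could not load the model list: {err}")
```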

Paramjethwa commented 1 month ago

I got this solved by looking on GitHub for a similar issue.

In short, you have to change the base URL to BASE_URL="http://172.19.80.1:11434" (the Windows host IP as seen from WSL).

This was one of the comments on that GitHub thread:

My setup is:

- Windows 10, where I installed ollama (with OllamaSetup.exe)
- WSL + Ubuntu, where I installed OpenDevin

Actually the issue is made up of the following issues:

1. You need to check that ollama is actually running: in Windows 10 (Command Prompt or PowerShell) try curl 127.0.0.1:11434. You should get an "ollama is running" message.

2. You need to understand that WSL is like a virtual machine, so "127.0.0.1" inside WSL does NOT mean connecting to Windows 10, but connecting to the virtual environment inside WSL.

3. You need to figure out the actual IP of the Windows 10 machine as seen from WSL. I did it with a traceroute www.google.com command, which gave me the following:

```
traceroute to www.google.com (142.250.180.132), 30 hops max, 60 byte packets
 1  DESKTOP-K5HF2NK.mshome.net (172.19.80.1)  0.315 ms  0.230 ms  0.207 ms
... more stuff ...
```

So my Windows 10 machine is seen from WSL + Ubuntu as 172.19.80.1, and my config.toml file in OpenDevin looks like:

```
LLM_MODEL="ollama/llama2"
LLM_API_KEY="na"
LLM_BASE_URL="http://172.19.80.1:11434"
LLM_EMBEDDING_MODEL="llama2"
WORKSPACE_DIR="./workspace"
```

Still, things do not work, because by default ollama only accepts connections from localhost. So, you need to add an environment variable OLLAMA_HOST="0.0.0.0" on your Windows 10 machine. You can test that quickly in PowerShell: just quit ollama, then open PowerShell and run:

```
$env:OLLAMA_HOST="0.0.0.0"
ollama serve
```

Now opening "localhost:3001" in the browser (on Windows 10) should give you a working OpenDevin.
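Applied to this repo, the same idea means pointing the Ollama base URL at the Windows host IP and first confirming it is reachable from WSL/Docker. A quick hedged check, run from WSL (replace 172.19.80.1 with the IP you found):

```python
import requests

# Windows host IP as seen from WSL (found via traceroute/ipconfig); adjust to your setup.
HOST_URL = "http://172.19.80.1:11434"

try:
    resp = requests.get(f"{HOST_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    names = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama reachable, models:", names)
except requests.exceptions.RequestException as err:
    print("Ollama not reachable:", err)
    print("Check that ollama is running on Windows and OLLAMA_HOST=0.0.0.0 is set.")
```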

codelearner-8 commented 1 month ago

(Screenshot from 2024-10-10 01-24-59 attached)

Hi Leon,

I have installed it manually on Ubuntu 22.04 and am getting the above error. I tried checking the functionality with "ollama serve" and it is running.

What should the Ollama host address be? I have attached a screenshot of my config.

Leon-Sander commented 1 month ago

If you have set up everything without Docker, you can simply set base_url: http://localhost:11434
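One way to keep both setups working is to read the base URL from an environment variable with a localhost default; this is a sketch, and the variable name below is an assumption, not something the repo necessarily defines:

```python
import os

# Manual install: defaults to localhost. Docker Compose: override with
# OLLAMA_BASE_URL=http://ollama:11434 (or the Windows-host IP under WSL).
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
```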