paulrobello / parllama

TUI for Ollama
MIT License

Timeout Error - Parllama - WSL2 #8

Open bamit99 opened 3 days ago

bamit99 commented 3 days ago

Describe the bug: I get an error when running Parllama with the -u option pointing at my Ollama instance.

To Reproduce Steps to reproduce the behavior:

1. Open Command Prompt.
2. Enter WSL.
3. Run parllama -u http://192.168.0.5:11434 to reproduce the error.

Expected behavior: The application should connect to the remote Ollama instance and start normally.

Screenshots: (screenshot attached)


Additional context

Error:

```
(parllama) amit@Razor:~/.parllama$ parllama -u http://192.168.0.5:11434
Settings folder /home/amit/.parllama
Traceback (most recent call last):
  File "/home/amit/anaconda3/envs/parllama/bin/parllama", line 8, in <module>
    sys.exit(run())
             ^^^^^
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/parllama/main.py", line 21, in run
    ParLlamaApp().run()
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/textual/app.py", line 1624, in run
    asyncio.run(run_app())
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/textual/app.py", line 1610, in run_app
    await self.run_async(
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/textual/app.py", line 1574, in run_async
    await asyncio.shield(app._shutdown())
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/textual/app.py", line 2831, in _shutdown
    await self._close_all()
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/textual/app.py", line 2809, in _close_all
    await self._prune_node(stack_screen)
  File "/home/amit/anaconda3/envs/parllama/lib/python3.11/site-packages/textual/app.py", line 3479, in _prune_node
    raise asyncio.TimeoutError(
TimeoutError: Timeout waiting for [ToastRack(id='textual-toastrack'), Tooltip(id='textual-tooltip'), Header(), Footer(), Static(id='StatusBar'), Static(id='PsStatusBar'), TabbedContent(id='tabbed_content')] to close; possible deadlock (consider changing App.CLOSE_TIMEOUT)
```

paulrobello commented 3 days ago

How do you have Ollama installed? Did you use the native Windows installer, or did you install it via the CLI inside WSL?

paulrobello commented 3 days ago

If you have Ollama installed via the native Windows installer, it will only listen on localhost by default. You need to set the system environment variable OLLAMA_HOST to 0.0.0.0:11434, then stop and restart your Ollama server.

If you installed Ollama inside WSL, setting this Ollama-specific env var before running any Ollama-related commands may also help:

export OLLAMA_HOST=0.0.0.0:11434
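The reason 0.0.0.0 matters here is the usual bind-address convention: when the variable is unset, a server binds only to loopback and is invisible from other network interfaces (including the WSL virtual network), while 0.0.0.0 means "listen on all interfaces". A minimal Python sketch of that lookup, purely illustrative (ollama_bind_addr is a made-up name, not Ollama's actual code):

```python
import os

def ollama_bind_addr(default: str = "127.0.0.1:11434") -> str:
    # Common server convention: honor the env var if set,
    # otherwise fall back to listening on loopback only.
    return os.environ.get("OLLAMA_HOST", default)
```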

After restarting your Ollama server with the updated env var, this command from WSL should work:

parllama -u "http://$(hostname).local:11434"
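Before pointing parllama at the host, it can help to confirm the endpoint is reachable from WSL at all, which separates a networking problem from an application problem. A small Python sketch of such a probe (check_ollama is a hypothetical helper, not part of parllama; a plain curl to the same URL works just as well):

```python
import urllib.request
import urllib.error

def check_ollama(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an HTTP server answers with 200 at base_url within timeout."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, unreachable host, or timeout.
        return False
```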

Thank you for taking the time to help work through this. I will update the Readme once we get everything sorted out.

paulrobello commented 2 days ago

I just released v0.2.4, which should address all your issues. Please check the WSL section of the Readme, and let me know if it solves your issue.

bamit99 commented 2 days ago

I am using the native Windows install, and it is listening on my home PC's static IP (not DHCP). I have already changed it via environment variables. I also use it with CrewAI/Fabric/MemGPT, so I know it's working. Let me try some more troubleshooting before getting back to you. I will try to capture a trace to find the issue.

bamit99 commented 2 days ago

TCPDump shows connections. The Ollama logs show the GET request but no response. It immediately times out with the same error as above.

(screenshots attached)

paulrobello commented 1 day ago

Another possible way to set the -u option for parllama:

 parllama -u "http://$(grep -m 1 nameserver /etc/resolv.conf | awk '{print $2}'):11434"

bamit99 commented 1 day ago

Let me try and come back!