chigkim / VOLlama

An accessible chat client for Ollama
GNU General Public License v3.0

Error 10061 when starting the app. #1

Closed: Tetrismonster closed this issue 7 months ago

Tetrismonster commented 8 months ago

Hey. I hope everything is going well. I tried this today and keep getting an error 10061 traceback: the target machine actively refused the connection. What could be causing this? Running it as admin didn't help. Also, can the app be used completely offline?

chigkim commented 8 months ago

That means it can't talk to Ollama. Make sure the Ollama server is running and listening correctly.
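If it helps, here is a minimal sketch for checking that the server is reachable outside of VOLlama. It assumes Ollama is listening on its default address, http://localhost:11434 (controlled by OLLAMA_HOST); /api/tags is Ollama's standard endpoint for listing installed models.

```python
# Quick sanity check: can we reach a local Ollama server at all?
# Assumes the default address http://localhost:11434; change BASE_URL
# if you have set OLLAMA_HOST to something else.
import json
import urllib.request

BASE_URL = "http://localhost:11434"

try:
    # The root endpoint replies with "Ollama is running" when the server is up.
    with urllib.request.urlopen(BASE_URL, timeout=5) as resp:
        print(resp.read().decode())

    # /api/tags lists the locally installed models (the same list VOLlama shows).
    with urllib.request.urlopen(f"{BASE_URL}/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
        print("Models:", [m["name"] for m in models])
except OSError as exc:
    # A WinError 10061 here means nothing is listening on that port.
    print("Could not reach Ollama:", exc)
```

If this script also fails with error 10061, the problem is on the Ollama side rather than in VOLlama.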

Tetrismonster commented 7 months ago

Hey. I'm trying to use Gemma with Ollama, and when using the client I get this issue.

[Window Title] Error

[Content]
[WinError 10054] An existing connection was forcibly closed by the remote host

Traceback (most recent call last):
  File "httpx\_transports\default.py", line 66, in map_httpcore_exceptions
  File "httpx\_transports\default.py", line 228, in handle_request
  File "httpcore\_sync\connection_pool.py", line 216, in handle_request
  File "httpcore\_sync\connection_pool.py", line 196, in handle_request
  File "httpcore\_sync\connection.py", line 101, in handle_request
  File "httpcore\_sync\http11.py", line 143, in handle_request
  File "httpcore\_sync\http11.py", line 113, in handle_request
  File "httpcore\_sync\http11.py", line 186, in _receive_response_headers
  File "httpcore\_sync\http11.py", line 224, in _receive_event
  File "httpcore\_backends\sync.py", line 124, in read
  File "contextlib.py", line 158, in __exit__
  File "httpcore\_exceptions.py", line 14, in map_exceptions
httpcore.ReadError: [WinError 10054] An existing connection was forcibly closed by the remote host

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "Model.py", line 162, in ask
  File "llama_index\core\llms\callbacks.py", line 99, in wrapped_gen
  File "llama_index\llms\ollama\base.py", line 141, in stream_chat
  File "contextlib.py", line 137, in __enter__
  File "httpx\_client.py", line 857, in stream
  File "httpx\_client.py", line 901, in send
  File "httpx\_client.py", line 929, in _send_handling_auth
  File "httpx\_client.py", line 966, in _send_handling_redirects
  File "httpx\_client.py", line 1002, in _send_single_request
  File "httpx\_transports\default.py", line 227, in handle_request
  File "contextlib.py", line 158, in __exit__
  File "httpx\_transports\default.py", line 83, in map_httpcore_exceptions
httpx.ReadError: [WinError 10054] An existing connection was forcibly closed by the remote host

[OK]

I start by typing ollama run gemma in a command prompt running with elevated privileges, and then run VOLlama once I get the message telling me to send a message. I can send maybe one message before I get that error. I leave the console window open while using VOLlama.

chigkim commented 7 months ago

Looks like VOLlama can't reach your Ollama server. Do you see the model in the list?

Tetrismonster commented 7 months ago

Hi. Yes, it appears as though I can get some output from the model. I can send about two messages, provided the first message is reasonably short, and then I get error 10054. For the sake of completeness, I use Gemma and have 32 GB of RAM and an RTX 2060 with 6 GB of video memory.

chigkim commented 7 months ago

Most likely it's an Ollama problem then. Look at the Ollama log. https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md
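In case it is useful, here is a small sketch for pulling the tail of the server log on Windows. The path below (%LOCALAPPDATA%\Ollama\server.log) is the default location described in that troubleshooting doc, so adjust it if your install differs.

```python
# Print the last lines of the Ollama server log on Windows.
# Assumes the default log location from the troubleshooting doc:
#   %LOCALAPPDATA%\Ollama\server.log
import os
from pathlib import Path

log_path = Path(os.environ["LOCALAPPDATA"]) / "Ollama" / "server.log"

if log_path.exists():
    lines = log_path.read_text(encoding="utf-8", errors="replace").splitlines()
    # Crashes (for example, running out of VRAM while loading a model)
    # usually show up near the end of the log.
    print("\n".join(lines[-40:]))
else:
    print("No log found at", log_path)
```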

Tetrismonster commented 7 months ago

Hi. I don’t want to jinx it, but it seems as though the latest update has resolved the problem. Will keep you updated and thanks so much for the help.