Open kevin-support-bot[bot] opened 3 hours ago
no
Is there any error in the docker container logs?
no, I think?
Could you check it in GitHub Codespaces and see if the issue still persists?
OK, thanks! But where can I find the tutorial on how to check it in GitHub Codespaces?
Simply clicking the above link is enough. It will build (~2 mins) and give the app link.
ok thanks.
And I checked again with Docker and the GUI at localhost:3000.
There is no reply after I send one prompt, and I cannot post anything new.
Could you run this litellm script to check whether the LLM works properly?
ok thanks!
I am checking the GitHub Codespace, and after that I will try the litellm script.
Sincere thanks.
Is this correct? I have waited for maybe 10 minutes with no output yet, and my DeepSeek API quota does not seem to be consumed.
litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7be990072090>: Failed to establish a new connection: [Errno 111] Connection refused'))
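The `Connection refused` on `localhost:11434` means nothing is listening on Ollama's default port in that environment. A quick stdlib check (a diagnostic sketch, not part of the linked litellm script; no litellm needed) makes that failure obvious before any model call:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If Ollama is not running in this environment, this prints False,
# which matches the ConnectionRefusedError in the traceback above.
print(port_open("localhost", 11434))
```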
Are you using WSL?
I am using a GitHub Codespace on macOS / Chrome.
In the GitHub Codespace: GPT can output, but DeepSeek (via Ollama) cannot.
In my own Docker: nothing can output.
As Deepseek is running in your system only, the codespace won't have access to that. Now could you check if GPT is working locally using development guide?
No, I am using the DeepSeek API.
gpt works very well
host='localhost', port=11434
Here Ollama is used.
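In litellm, the provider prefix on the model name decides where the request goes: `ollama/...` targets a local Ollama server (default `http://localhost:11434`), while `deepseek/...` targets DeepSeek's cloud API. A tiny illustration of that routing rule (a hypothetical helper, not litellm's actual code):

```python
def route_hint(model: str) -> str:
    """Hypothetical helper: guess where litellm will send a request,
    based on the provider prefix of the model name."""
    provider = model.split("/", 1)[0] if "/" in model else "openai"
    if provider == "ollama":
        # No prefix match for a cloud provider: litellm talks to a
        # local Ollama server, hence host='localhost', port=11434.
        return "local Ollama server at http://localhost:11434"
    return f"{provider} cloud API"

print(route_hint("ollama/deepseek-coder"))    # routed to local Ollama
print(route_hint("deepseek/deepseek-coder"))  # routed to DeepSeek's cloud API
```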
So Ollama serves a local API?
Then how can I use the DeepSeek API?
Yes, ollama is running locally. Could you check this litellm script?
OK, thanks. So if I want to use DeepSeek, I should use advanced mode?
Yes, with base_url = http://host.docker.internal:11434
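For reference, the advanced-mode settings discussed here would look roughly like this (the field names are approximate, and the model name is only an example; use whatever model you pulled into Ollama):

```
Custom Model: ollama/deepseek-coder              # example; any model pulled into Ollama
Base URL:     http://host.docker.internal:11434  # reaches the host's Ollama from inside Docker
API Key:      ollama                             # Ollama ignores the key; any placeholder works
```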
Yep, DeepSeek is OK now!
Thanks.
But why can't my own Docker setup run?
Did you change the base URL? Local LLM Guide
No, but actually I am using the API from the cloud. Why is the GitHub Codespace OK while my Docker fails to generate?
I run it with the following commands:
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.14-nikolaik
docker run -it --rm --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.14-nikolaik \
-e LOG_ALL_EVENTS=true \
-v /var/run/docker.sock:/var/run/docker.sock \
-p 3000:3000 \
--add-host host.docker.internal:host-gateway \
--name openhands-app \
docker.all-hands.dev/all-hands-ai/openhands:0.14
API from deepseek.com?
litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7be990072090>: Failed to establish a new connection: [Errno 111] Connection refused'))
Did any other error like this occur?
API from deepseek.com?
Yes, it can run only on the GitHub Codespace, not in my own Docker.
Is there any error when you run it with Docker?
If you use DeepSeek from the API, then the model name is deepseek/deepseek-coder.
There is no need to set base_url, as litellm handles that automatically.
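The advice above can be sketched as a minimal litellm call (an illustration, not the script linked earlier; it assumes a `DEEPSEEK_API_KEY` environment variable and litellm installed):

```python
import os

def ask_deepseek(prompt: str) -> str:
    """Send one prompt to the DeepSeek cloud API via litellm."""
    import litellm  # deferred import so the sketch can be read without litellm installed

    resp = litellm.completion(
        # The "deepseek/" provider prefix tells litellm to route to
        # DeepSeek's cloud API -- no base_url needed.
        model="deepseek/deepseek-coder",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__" and os.environ.get("DEEPSEEK_API_KEY"):
    print(ask_deepseek("Write a one-line hello world in Python."))
```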
Issue: https://github.com/All-Hands-AI/OpenHands/issues/5171
@NonvolatileMemory Is there any error in the logs?