All-Hands-AI / OpenHands

πŸ™Œ OpenHands: Code Less, Make More
https://all-hands.dev
MIT License

Not connecting to Textgen WebUI via OpenAI API (openai.APIConnectionError: Connection error.) #435

Closed jay-c88 closed 5 months ago

jay-c88 commented 5 months ago

**SETUP**

```
> opendevin-frontend@0.1.0 start
> vite

  VITE v5.1.6  ready in 488 ms

  ➜  Local:   http://localhost:3001/
  ➜  Network: use --host to expose
  ➜  press h + enter to show help

🌼 daisyUI 4.9.0
├─ ✔︎ 1 theme added          https://daisyui.com/docs/themes
╰─ ❀︎ Support daisyUI project: https://opencollective.com/daisyui

🌼 daisyUI 4.9.0
├─ ✔︎ 1 theme added          https://daisyui.com/docs/themes
╰─ ★ Star daisyUI on GitHub: https://github.com/saadeghi/daisyui
```


* Docker running:

```
jay@JPC:~$ docker ps
CONTAINER ID   IMAGE                       COMMAND               CREATED          STATUS          PORTS     NAMES
780b2e540c29   ghcr.io/opendevin/sandbox   "tail -f /dev/null"   33 seconds ago   Up 32 seconds             sandbox-default
```


---

**ISSUE**

As soon as I enter the prompt, it hangs for a minute (even unresponsive to CTRL+C):

```
==============
STEP 0

PLAN:
🔵 0 Write a bash script that prints 'hello world'
```

Then I get the first error, and it hangs again for a minute (also unresponsive):

```
Retrying llama_index.embeddings.openai.base.get_embeddings in 0.12605615291471606 seconds as it raised APIConnectionError: Connection error..
```
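(Aside: the odd-looking `0.126...` seconds comes from tenacity-style randomized exponential backoff, which llama_index uses for its retries. A rough stdlib sketch of that kind of delay schedule, with parameter values chosen purely for illustration:)

```python
import random

def backoff_delays(attempts: int, base: float = 0.25, cap: float = 60.0):
    """Yield one jittered, exponentially growing delay per retry attempt."""
    for n in range(attempts):
        # Full jitter: pick uniformly between 0 and the exponential ceiling.
        yield random.uniform(0, min(cap, base * (2 ** n)))

delays = list(backoff_delays(5))
```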


After another minute I get this error log before it repeats again from STEP 0:

```
AGENT ERROR:
Connection error.
```

```
Traceback (most recent call last):
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
    yield
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_transports/default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
    raise exc from None
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
    return self._connection.handle_request(request)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 143, in handle_request
    raise exc
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 93, in handle_request
    self._send_request_headers(**kwargs)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 151, in _send_request_headers
    with map_exceptions({h11.LocalProtocolError: LocalProtocolError}):
  File "/home/jay/anaconda3/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.LocalProtocolError: Illegal header value b'Bearer '
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/_base_client.py", line 926, in _request
    response = self._client.send(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_transports/default.py", line 232, in handle_request
    with map_httpcore_exceptions():
  File "/home/jay/anaconda3/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.LocalProtocolError: Illegal header value b'Bearer '
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/home/jay/OpenDevin/opendevin/controller/agent_controller.py", line 89, in step
    action = self.agent.step(self.state)
  File "/home/jay/OpenDevin/agenthub/langchains_agent/langchains_agent.py", line 122, in step
    self._initialize(state.plan.main_goal)
  File "/home/jay/OpenDevin/agenthub/langchains_agent/langchains_agent.py", line 118, in _initialize
    self._add_event(d)
  File "/home/jay/OpenDevin/agenthub/langchains_agent/langchains_agent.py", line 72, in _add_event
    self.memory.add_event(event)
  File "/home/jay/OpenDevin/agenthub/langchains_agent/utils/memory.py", line 69, in add_event
    self.index.insert(doc)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/indices/base.py", line 242, in insert
    self.insert_nodes(nodes, **insert_kwargs)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 329, in insert_nodes
    self._insert(nodes, **insert_kwargs)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 312, in _insert
    self._add_nodes_to_index(self._index_struct, nodes, **insert_kwargs)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 233, in _add_nodes_to_index
    nodes_batch = self._get_node_with_embedding(nodes_batch, show_progress)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 141, in _get_node_with_embedding
    id_to_embed_map = embed_nodes(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/indices/utils.py", line 138, in embed_nodes
    new_embeddings = embed_model.get_text_embedding_batch(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 146, in wrapper
    self.span_drop(id=id_, bound_args=bound_args, instance=instance, err=e)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 98, in span_drop
    h.span_drop(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/instrumentation/span_handlers/base.py", line 77, in span_drop
    span = self.prepare_to_drop_span(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/instrumentation/span_handlers/null.py", line 71, in prepare_to_drop_span
    raise err
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/instrumentation/dispatcher.py", line 144, in wrapper
    result = func(*args, **kwargs)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/core/base/embeddings/base.py", line 280, in get_text_embedding_batch
    embeddings = self._get_text_embeddings(cur_batch)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/embeddings/openai/base.py", line 427, in _get_text_embeddings
    return get_embeddings(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/tenacity/__init__.py", line 325, in iter
    raise retry_exc.reraise()
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/tenacity/__init__.py", line 158, in reraise
    raise self.last_attempt.result()
  File "/home/jay/anaconda3/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
  File "/home/jay/anaconda3/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/llama_index/embeddings/openai/base.py", line 180, in get_embeddings
    data = client.embeddings.create(input=list_of_text, model=engine, **kwargs).data
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/resources/embeddings.py", line 113, in create
    return self._post(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/_base_client.py", line 897, in request
    return self._request(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/_base_client.py", line 950, in _request
    return self._retry_request(
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
  [the previous two frames repeat several more times as the client retries]
  File "/home/jay/.local/share/virtualenvs/OpenDevin-JujB8J7i/lib/python3.11/site-packages/openai/_base_client.py", line 960, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
```

```
OBSERVATION:
Connection error.
```
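For what it's worth, the root `httpcore.LocalProtocolError: Illegal header value b'Bearer '` in the first traceback is what appears when the OpenAI client is handed an empty API key: the `Authorization` header becomes `Bearer ` with nothing after the space, which h11 refuses to send. A minimal stdlib illustration (no OpenAI client involved):

```python
# With an empty key, the Authorization header is "Bearer " + "" -- a value
# ending in a trailing space with no token, which h11 rejects as illegal.
api_key = ""  # e.g. LLM_API_KEY was never set, so the client got an empty string
auth_header = f"Bearer {api_key}".encode()
print(auth_header)  # b'Bearer '
```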


I've also tried with `LLM_EMBEDDING_MODEL="openai/mistral"`, which gives a different error without the hanging (a very long error log, but it still ends with the same `openai.APIConnectionError: Connection error.`).
rbren commented 5 months ago

`openai/mistral` is not a valid value for `LLM_EMBEDDING_MODEL`. It can be `"llama2"`, `"openai"`, `"azureopenai"`, or `"local"`.

It's also not a valid value for `LLM_MODEL`.

(And I don't think `openai/mistral` is a thing? They're competitors, IIUC.)

We can add more embedding providers here as well: https://github.com/OpenDevin/OpenDevin/blob/6e4089fb75789103b8f212dcad82dd51629f9f9a/agenthub/langchains_agent/utils/memory.py#L10

You need to either set `LLM_MODEL` to `openai` and set `LLM_API_KEY` to your OpenAI key, or you can try a local setup with Ollama.
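A hedged sketch (not OpenDevin's actual code) of the two checks this advice implies: the embedding-model name must be one of the listed values, and the API key must be non-empty, since an empty key is exactly what produces the `Illegal header value b'Bearer '` above:

```python
VALID_EMBEDDING_MODELS = {"llama2", "openai", "azureopenai", "local"}

def check_llm_settings(embedding_model: str, api_key: str) -> None:
    """Fail fast on the two misconfigurations discussed in this thread."""
    if embedding_model not in VALID_EMBEDDING_MODELS:
        raise ValueError(
            f"{embedding_model!r} is not a valid LLM_EMBEDDING_MODEL; "
            f"choose one of {sorted(VALID_EMBEDDING_MODELS)}"
        )
    if not api_key:
        # An empty key would yield an Authorization header of "Bearer " (no token).
        raise ValueError("LLM_API_KEY is empty; the client would send an illegal header")
```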

jay-c88 commented 5 months ago

> `openai/mistral` is not a valid value for `LLM_EMBEDDING_MODEL`. It can be `"llama2"`, `"openai"`, `"azureopenai"`, or `"local"`.
>
> It's also not a valid value for `LLM_MODEL`.
>
> (And I don't think `openai/mistral` is a thing? They're competitors, IIUC.)
>
> We can add more embedding providers here as well:
>
> https://github.com/OpenDevin/OpenDevin/blob/6e4089fb75789103b8f212dcad82dd51629f9f9a/agenthub/langchains_agent/utils/memory.py#L10
>
> You need to either set `LLM_MODEL` to `openai` and set `LLM_API_KEY` to your OpenAI key, or you can try a local setup with Ollama.

Ah yes, I see, thank you. I've set `LLM_EMBEDDING_MODEL="local"` now, and the hanging is gone at least, but the `openai.APIConnectionError: Connection error.` still exists. I'm trying to connect fully locally using textgen-webui, which supports the OpenAI API, and I was referring to this doc from LiteLLM to set the model name. So `LLM_MODEL="openai/mistral"` should be correct to make sure LiteLLM knows it should use the OpenAI API (and I've loaded a Mistral model). But I'm still getting the error (although it's not hanging anymore).
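(As context for readers: LiteLLM model strings follow a `provider/model` convention, so `openai/mistral` means "speak the OpenAI wire protocol to whatever endpoint is configured, and ask that server for a model named `mistral`", not an OpenAI-hosted model. A small sketch of how such a string splits:)

```python
model = "openai/mistral"
provider, _, model_name = model.partition("/")
# provider   -> "openai"  (the API family / wire protocol to use)
# model_name -> "mistral" (the model the local server has loaded)
print(provider, model_name)  # openai mistral
```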

I've also tried it with LM Studio, which supports the OpenAI API as well, but I hit the same `openai.APIConnectionError: Connection error.` problem. I'm not sure if this is an OpenDevin issue or where exactly the problem lies. I'll keep trying and open a new issue when I have more information. I've seen people get it working with Ollama, but using the OpenAI API locally seems to cause a lot of problems.

Thank you for the great work! I can't wait to use it locally, hopefully.


Edit: Solved. My (stupid) mistake was not related to OpenDevin. For future reference and anyone who runs into this issue:

I'm using WSL on Windows to run OpenDevin, but running the model in Windows, so WSL can't access Windows' localhost. Fixed by changing WSL's settings to mirror the network from the host machine. LM Studio now works fine, but Textgen-WebUI still has a problem (though not related to the connection or the API).

stratte89 commented 5 months ago

> `openai/mistral` is not a valid value for `LLM_EMBEDDING_MODEL`. It can be `"llama2"`, `"openai"`, `"azureopenai"`, or `"local"`. It's also not a valid value for `LLM_MODEL`. (And I don't think `openai/mistral` is a thing? They're competitors, IIUC.) We can add more embedding providers here as well: https://github.com/OpenDevin/OpenDevin/blob/6e4089fb75789103b8f212dcad82dd51629f9f9a/agenthub/langchains_agent/utils/memory.py#L10
>
> You need to either set `LLM_MODEL` to `openai` and set `LLM_API_KEY` to your OpenAI key, or you can try a local setup with Ollama.
>
> Ah yes, I see, thank you. I've set `LLM_EMBEDDING_MODEL="local"` now, and the hanging is gone at least, but the `openai.APIConnectionError: Connection error.` still exists. I'm trying to connect fully locally using textgen-webui, which supports the OpenAI API, and I was referring to this doc from LiteLLM to set the model name. So `LLM_MODEL="openai/mistral"` should be correct to make sure LiteLLM knows it should use the OpenAI API (and I've loaded a Mistral model). But I'm still getting the error (although it's not hanging anymore).
>
> I've also tried it with LM Studio, which supports the OpenAI API as well, but I hit the same `openai.APIConnectionError: Connection error.` problem. I'm not sure if this is an OpenDevin issue or where exactly the problem lies. I'll keep trying and open a new issue when I have more information. I've seen people get it working with Ollama, but using the OpenAI API locally seems to cause a lot of problems.
>
> Thank you for the great work! I can't wait to use it locally, hopefully.
>
> Edit: Solved. My (stupid) mistake was not related to OpenDevin. For future reference and anyone who runs into this issue: I'm using WSL on Windows to run OpenDevin, but running the model in Windows, so WSL can't access Windows' localhost. Fixed by changing WSL's settings to mirror the network from the host machine. LM Studio now works fine, but Textgen-WebUI still has a problem (though not related to the connection).

Hey, I got the same issue. Can you share how you mirrored the network from your host machine? GPT is not helping; it says that applications running within WSL can access localhost on Windows...

jay-c88 commented 5 months ago

Open the WSL config file `C:\Users\%username%\.wslconfig` (create one if it doesn't exist), and add this:

```
[wsl2]
networkingMode=mirrored
```

Then restart WSL completely (exit Docker and run `wsl --shutdown`), then restart everything.
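After the restart, a quick stdlib check from inside WSL can confirm whether the Windows-side server is now reachable on localhost (the port number below is a placeholder; use whatever port Textgen-WebUI or LM Studio is actually listening on):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (5000 is a placeholder port):
# print(port_open("localhost", 5000))
```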