OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/
GNU Affero General Public License v3.0
52.63k stars 4.65k forks

Seems to be trying to connect to OpenAI when I run --local; "please try a language model with a different architecture" #749

Closed clickbrain closed 7 months ago

clickbrain commented 11 months ago

Describe the bug

I have LM Studio running and have both mistral and code llama downloaded. When I try to run interpreter --local with either running I get the error about using a different architecture.

Here are all the messages that get spit out:

LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Python Version: 3.11.3 Pip Version: 23.0.1 OS Version and Architecture: macOS-14.0-arm64-arm-64bit CPU Info: arm RAM Info: 32.00 GB used: 20.46, free: 0.09

Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connection.py", line 174, in _new_conn conn = connection.create_connection( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/util/connection.py", line 95, in create_connection raise err File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection sock.connect(sa) ConnectionRefusedError: [Errno 61] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connectionpool.py", line 703, in urlopen httplib_response = self._make_request( ^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connectionpool.py", line 398, in _make_request conn.request(method, url, **httplib_request_kw) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connection.py", line 244, in request super(HTTPConnection, self).request(method, url, body=body, headers=headers) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/http/client.py", line 1283, in request self._send_request(method, url, body, headers, encode_chunked) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/http/client.py", line 1329, in _send_request self.endheaders(body, encode_chunked=encode_chunked) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/http/client.py", line 1278, in endheaders self._send_output(message_body, encode_chunked=encode_chunked) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/http/client.py", line 1038, in _send_output self.send(msg) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/http/client.py", line 976, in send self.connect() File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connection.py", line 205, in connect conn = self._new_conn() ^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connection.py", line 186, in _new_conn raise NewConnectionError( urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x1203da150>: Failed to establish a new connection: [Errno 61] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/requests/adapters.py", line 489, in send resp = conn.urlopen( ^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/connectionpool.py", line 787, in urlopen retries = retries.increment( ^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/urllib3/util/retry.py", line 592, in increment raise MaxRetryError(_pool, url, error or ResponseError(cause)) urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1203da150>: Failed to establish a new connection: [Errno 61] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_requestor.py", line 606, in request_raw result = _thread_context.session.request( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/requests/sessions.py", line 587, in request resp = self.send(prep, send_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/requests/sessions.py", line 701, in send r = adapter.send(request, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/requests/adapters.py", line 565, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1203da150>: Failed to establish a new connection: [Errno 61] Connection refused'))

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/respond.py", line 49, in respond for chunk in interpreter._llm(messages_for_llm): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/llm/convert_to_coding_llm.py", line 65, in coding_llm for chunk in text_llm(messages): ^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/llm/setup_text_llm.py", line 144, in base_llm return litellm.completion(params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 962, in wrapper raise e File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 899, in wrapper result = original_function(*args, *kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/timeout.py", line 53, in wrapper result = future.result(timeout=local_timeout_duration) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 456, in result return self.get_result() ^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py", line 401, in get_result raise self._exception File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/timeout.py", line 42, in async_func return func(args, kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/main.py", line 1403, in completion raise exception_type( ^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 3574, in exception_type raise e 
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/utils.py", line 2885, in exception_type raise original_exception File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/main.py", line 509, in completion raise e File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/litellm/main.py", line 491, in completion response = openai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create return super().create(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_resources/abstract/engine_apiresource.py", line 155, in create response, , api_key = requestor.request( ^^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_requestor.py", line 289, in request result = self.request_raw( ^^^^^^^^^^^^^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/openai/api_requestor.py", line 619, in request_raw raise error.APIConnectionError( openai.error.APIConnectionError: Error communicating with OpenAI: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1203da150>: Failed to establish a new connection: [Errno 61] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.11/bin/interpreter", line 8, in sys.exit(cli()) ^^^^^ File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 24, in cli cli(self) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/cli/cli.py", line 268, in cli interpreter.chat() File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 86, in chat for _ in self._streaming_chat(message=message, display=display): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 106, in _streaming_chat yield from terminal_interface(self, message) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/terminal_interface/terminal_interface.py", line 115, in terminal_interface for chunk in interpreter.chat(message, display=False, stream=True): File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 127, in _streaming_chat yield from self._respond() File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/core.py", line 162, in _respond yield from respond(self) File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/interpreter/core/respond.py", line 105, in respond raise Exception( Exception: Error communicating with OpenAI: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1203da150>: Failed to establish a new connection: [Errno 61] Connection refused'))

Reproduce

  1. Run LM Studio
  2. Choose a local model
  3. Start Server
  4. Run interpreter --local
  5. Tell interpreter what I want to build
  6. Get error

Expected behavior

To run the system against the local model.

Screenshots

No response

Open Interpreter version

0.1.14

Python version

3.11.3

Operating System name and version

MacOS Sonoma 14.0

Additional context

No response

Notnaton commented 11 months ago

Hi, could you run through these steps? This will help narrow down what the issue is.

  1. Start LM Studio
  2. Load model
  3. Start server

Now, copy the curl command from the server window and paste it into a terminal (see the attached screenshot).

Also, please tell me which server options are on/off.
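Before even running curl, it can help to confirm the server port is accepting TCP connections at all. A minimal sketch using only the standard library (port 1234 is an assumption based on LM Studio's default; check the Server tab for the actual value):

```python
import socket

def server_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 1234 is only LM Studio's default port; yours may differ.
    if server_reachable("localhost", 1234):
        print("Server is up")
    else:
        print("Connection refused: start the LM Studio server first")
```

If this prints "Connection refused", the Errno 61 tracebacks in this thread are expected, because nothing is listening where interpreter is posting.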

clickbrain commented 11 months ago

I got this message when I ran the curl command: {"error":"Unexpected endpoint or method. (GET /v1/chat/completions)"}%

In options, this option is the only one that is off: Cross-Origin-Resource-Sharing (CORS)

Notnaton commented 11 months ago

In options, this option is the only one that is off: Cross-Origin-Resource-Sharing (CORS)

Turn that one on and try with interpreter again

clickbrain commented 11 months ago

Thanks. I turned that on and ran it again and getting the same message when I attempt to chat.

Notnaton commented 11 months ago

How are you running interpreter? Is it in a virtual environment, like conda? It seems like the curl command went through, so the server is running, but for some reason interpreter can't connect to it.

clickbrain commented 11 months ago

Just running from command line in Terminal. Installed via PIP.

k2jama commented 11 months ago

I've been dealing with a similar issue as well. "openai.error.Timeout: Request timed out: HTTPConnectionPool(...)". Please help @Notnaton

Notnaton commented 11 months ago

Run pip show urllib3 and post the output

tacticalinsights commented 11 months ago

I faced a similar issue when trying to run interpreter on wsl. It worked fine when I tried it on my command prompt though.

Notnaton commented 11 months ago

@tacticalinsights Please run: pip show urllib3 And post the output here. I believe there is a version conflict going on here.

tacticalinsights commented 11 months ago

this is the output in wsl: (screenshot attached)

this is the output in cmd: (screenshot attached)

clickbrain commented 11 months ago

zsh: /opt/homebrew/bin/pip: bad interpreter: /opt/homebrew/opt/python@3.11/bin/python3.11: no such file or directory

Name: urllib3
Version: 1.26.15
Summary: HTTP library with thread-safe connection pooling, file post, and more.
Home-page: https://urllib3.readthedocs.io/
Author: Andrey Petrov
Author-email: andrey.petrov@shazow.net
License: MIT
Location: /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages
Requires:
Required-by: clickhouse-connect, docker, pinecone-client, requests

Notnaton commented 11 months ago

@clickbrain Let's try reinstalling Python. There seems to be something wrong with the installation... I don't use macOS so I'm not very familiar with how it works. If this looks correct, please run:

brew uninstall python@3.11
brew install python@3.11

I see brew has other relevant commands; if reinstalling doesn't work, try one of these:

brew doctor
brew link python@3.11

clickbrain commented 11 months ago

Thanks. I actually did that about 10 minutes after I sent you that message. I now have a fresh install of 3.12. Sadly though, I am still getting the same message with OI.

clickbrain commented 11 months ago

Is there a particular LLM that you guys know works well with LM Studio and OI? I am using variations on Mistral and Code Llama, but maybe there's an issue there?

Have you thought about trying Ollama as well as API server?

OK, So I uninstalled 3.12 that I did via download from python.org. I reinstalled via homebrew 3.11. I uninstalled Open Interpreter with PIP and then reinstalled it. It is telling me that

WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/openai-0.27.8.dist-info due to invalid metadata entry 'name' WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/openai-0.27.8.dist-info due to invalid metadata entry 'name'

when I install it.

clickbrain commented 11 months ago

I am going to try again and will let you know.

yaronbeen commented 11 months ago

Any updates on this issue?

Notnaton commented 11 months ago

@ericrallen Do you have any idea what's causing this?

clickbrain commented 10 months ago

Really would like to make this work. Any ideas @Notnaton or @ericrallen ?

Notnaton commented 10 months ago

Is there a particular LLM that you guys know works well with LM Studio and OI? I am using variations on Mistral and Code Llama, but may be an issue there?

Have you thought about trying Ollama as well as API server?

You can use interpreter --api_base http://localhost:port/v1 --api_key "" --model openai/local to connect to any LLM server

OK, So I uninstalled 3.12 that I did via download from python.org. I reinstalled via homebrew 3.11. I uninstalled Open Interpreter with PIP and then reinstalled it. It is telling me that

Python 3.12 is not supported

WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/openai-0.27.8.dist-info due to invalid metadata entry 'name' WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/openai-0.27.8.dist-info due to invalid metadata entry 'name'

@clickbrain Try:

pip install openai==0.28.0

If it still doesn't work: I think the best option now is to run interpreter in a virtual environment. There is something in the system that breaks it.

If using conda:

conda create --name NAME_OF_ENV python=3.11
conda activate NAME_OF_ENV
conda install pip
pip install open-interpreter
interpreter --local

Of course you can use other virtual environments too.
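The --api_base flag above points interpreter at any OpenAI-compatible /v1 endpoint. To isolate whether the problem is the server or interpreter, the same request can be sent directly from Python. A standard-library sketch (the port and the "local" model name are placeholders; use whatever LM Studio's server window shows):

```python
import json
import urllib.request

def probe_chat_endpoint(api_base: str) -> dict:
    """POST a minimal chat completion to an OpenAI-compatible server.

    api_base is e.g. "http://localhost:1234/v1"; 1234 is only LM Studio's
    default port, so substitute the one from its server window.
    """
    payload = {
        # Placeholder name; local servers typically answer with whichever
        # model is loaded, regardless of this field (an assumption).
        "model": "local",
        "messages": [{"role": "user", "content": "Say hi"}],
    }
    req = urllib.request.Request(
        f"{api_base}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

If this call succeeds while interpreter still fails, the problem is on interpreter's side (wrong api_base, broken environment); if it raises a connection error, the server itself is unreachable.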

clickbrain commented 10 months ago

OK, I have Open-Interpreter installed in a Conda env running Python 3.11, and the LM Studio server is running. I run interpreter --local and get the instructions (run LM Studio, etc.).

Then I get the same mess of errors:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

    Python Version: 3.11.6
    Pip Version: 23.3.1
    Open-interpreter Version: cmd:Interpreter, pkg: 0.1.17
    OS Version and Architecture: macOS-14.1.2-arm64-arm-64bit
    CPU Info: arm
    RAM Info: 32.00 GB, used: 15.77, free: 0.20

    Interpreter Info
    Vision: False
    Model: openai/gpt-4
    Function calling: False
    Context window: 3000
    Max tokens: 1000

    Auto run: False
    API base: http://localhost:1234/v1
    Local: True

    Curl output: [Errno 2] No such file or directory: 'curl http://localhost:1234/v1'

Traceback (most recent call last): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn sock = connection.create_connection( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection raise err File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/util/connection.py", line 73, in create_connection sock.connect(sa) ConnectionRefusedError: [Errno 61] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen response = self._make_request( ^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connectionpool.py", line 496, in _make_request conn.request( File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connection.py", line 395, in request self.endheaders() File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/http/client.py", line 1281, in endheaders self._send_output(message_body, encode_chunked=encode_chunked) File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/http/client.py", line 1041, in _send_output self.send(msg) File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/http/client.py", line 979, in send self.connect() File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connection.py", line 243, in connect self.sock = self._new_conn() ^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connection.py", line 218, in _new_conn raise NewConnectionError( urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x10dfdfa10>: Failed to establish a new connection: [Errno 61] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/requests/adapters.py", line 486, in send resp = conn.urlopen( ^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/connectionpool.py", line 844, in urlopen retries = retries.increment( ^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/urllib3/util/retry.py", line 515, in increment raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10dfdfa10>: Failed to establish a new connection: [Errno 61] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/openai/api_requestor.py", line 606, in request_raw result = _thread_context.session.request( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/requests/sessions.py", line 589, in request resp = self.send(prep, send_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/requests/sessions.py", line 703, in send r = adapter.send(request, kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/requests/adapters.py", line 519, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10dfdfa10>: Failed to establish a new connection: [Errno 61] Connection refused'))

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/respond.py", line 49, in respond for chunk in interpreter._llm(messages_for_llm): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/llm/convert_to_coding_llm.py", line 65, in coding_llm for chunk in text_llm(messages): ^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/llm/setup_text_llm.py", line 154, in base_llm return litellm.completion(params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/utils.py", line 962, in wrapper raise e File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/utils.py", line 899, in wrapper result = original_function(*args, *kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/timeout.py", line 53, in wrapper result = future.result(timeout=local_timeout_duration) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/concurrent/futures/_base.py", line 456, in result return self.get_result() ^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/concurrent/futures/_base.py", line 401, in get_result raise self._exception File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/timeout.py", line 42, in async_func return func(args, kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/main.py", line 1403, in completion raise exception_type( ^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/utils.py", line 3574, in exception_type raise e File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/utils.py", line 2885, in exception_type raise original_exception File 
"/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/main.py", line 509, in completion raise e File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/litellm/main.py", line 491, in completion response = openai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create return super().create(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/openai/api_resources/abstract/engine_apiresource.py", line 155, in create response, , api_key = requestor.request( ^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/openai/api_requestor.py", line 289, in request result = self.request_raw( ^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/openai/api_requestor.py", line 619, in request_raw raise error.APIConnectionError( openai.error.APIConnectionError: Error communicating with OpenAI: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10dfdfa10>: Failed to establish a new connection: [Errno 61] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Users/bradn/miniconda3/envs/openi/bin/interpreter", line 8, in sys.exit(start_terminal_interface()) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/core.py", line 21, in start_terminal_interface start_terminal_interface(self) File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 304, in start_terminalinterface interpreter.chat() File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/core.py", line 77, in chat for in self._streaming_chat(message=message, display=display): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/core.py", line 92, in _streaming_chat yield from terminal_interface(self, message) File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/terminal_interface/terminal_interface.py", line 115, in terminal_interface for chunk in interpreter.chat(message, display=False, stream=True): File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/core.py", line 113, in _streaming_chat yield from self._respond() File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/core.py", line 148, in _respond yield from respond(self) File "/Users/bradn/miniconda3/envs/openi/lib/python3.11/site-packages/interpreter/core/respond.py", line 115, in respond raise Exception( Exception: Error communicating with OpenAI: HTTPConnectionPool(host='localhost', port=1234): Max retries exceeded with url: /v1/chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10dfdfa10>: Failed to establish a new connection: [Errno 61] Connection refused'))

Please make sure LM Studio's local server is running by following the steps above.

If LM Studio's local server is running, please try a language model with a different architecture.

clickbrain commented 10 months ago

Just realized that the URL being accessed by OI is http://localhost:1234/v1

But LM Studio seems to be running on http://localhost:7878/v1

If you're using an OpenAI client, set openai.api_base (python), or the baseURL (node.js) property in your client configuration to "http://localhost:7878/v1".
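The root cause here turned out to be a simple port mismatch: interpreter was posting to port 1234 while the server listened on 7878. A quick sanity check is to compare the port of the configured base URL against the one the server reports (a sketch; the two URLs are just the values from this thread):

```python
from urllib.parse import urlparse

def port_of(api_base: str) -> int:
    """Extract the port from a base URL like http://localhost:7878/v1."""
    parsed = urlparse(api_base)
    # Fall back to the scheme's default port when none is given.
    return parsed.port or (443 if parsed.scheme == "https" else 80)

configured = "http://localhost:1234/v1"  # what interpreter was using
actual = "http://localhost:7878/v1"      # what LM Studio reported here
if port_of(configured) != port_of(actual):
    print(f"Port mismatch: interpreter uses {port_of(configured)}, "
          f"server listens on {port_of(actual)}")
```

Either change LM Studio's server port back to the expected one, or pass the real URL with --api_base as suggested earlier in the thread.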

clickbrain commented 10 months ago

OK, solved the problem. Sorry for the excessive messages. Changed the server port and we seem good to go. I'll see how it goes from here. Thank you for your help.

clickbrain commented 10 months ago

Not sure if I should post this here or not, but when running with a local LLM via LM Studio, despite being told via the config that it can access things on the OS, it refuses to do so. So, in other words, it will not open a web browser and will not access my calendar. It will, though, write Python code and then execute it via the prompt.

Is there a way to tell the local model that it can access macOS, run a browser, etc.?

This was running Zephyr beta 7B. I am going to try some other models to see if I get the same issue.

anandharshini commented 8 months ago

For connecting from WSL2 to a web app on the Windows host, see: https://learn.microsoft.com/en-us/windows/wsl/networking#accessing-windows-networking-apps-from-linux-host-ip

MikeBirdTech commented 7 months ago

New approach to local just was added and will be pushed to pip on the next release! Thanks!