OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/
GNU Affero General Public License v3.0

Executing commands in terminal #1305

Open JeromyJSmith opened 2 weeks ago

JeromyJSmith commented 2 weeks ago

Describe the bug

When executing commands in the terminal, it produces what looks like a Mad Lib of random, strange words that it uses as commands. It eventually figures out and runs the right command to accomplish the task, but then often crashes: the last line says it will start the HTTP server, and then it hangs. This is my first bug report, so if I could provide anything else or do this better, I'm happy to provide more.

Here is an example:


  I will initiate a simple HTTP server from the project directory now.

  groupBox.commands

  ---------------------------------------------------------------------------
  NameError                                 Traceback (most recent call last)
  Cell In[12], line 2
        1 print('##active_line1##')
  ----> 2 groupBox.commands

  NameError: name 'groupBox' is not defined

  It seems there was an error with the previous approach. I will correctly start a local
  HTTP server using Python in the appropriate directory containing the downloaded website
  files.

  Let's properly initiate the HTTP server now.

  indows.execute

  ---------------------------------------------------------------------------
  NameError                                 Traceback (most recent call last)
  Cell In[14], line 2
        1 print('##active_line1##')
  ----> 2 indows.execute

  NameError: name 'indows' is not defined

Was Open Interpreter helpful? (y/n):

  There was an issue with the previous execution attempt. Let me start a local HTTP
Thanks for your feedback! Would you like to send us this chat so we can improve?

(y/n):

  There was an issue with the previous execution attempt. Let me start a local HTTP
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/openai.py", line 736, in completion
    raise e
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/openai.py", line 655, in completion
    return self.streaming(
           ^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/openai.py", line 833, in streaming
    response = openai_client.chat.completions.create(**data, timeout=timeout)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 590, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/main.py", line 1099, in completion
    raise e
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/main.py", line 1072, in completion
    response = openai_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/openai.py", line 742, in completion
    raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 344, in fixed_litellm_completions
    yield from litellm.completion(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 3413, in wrapper
    raise e
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 3304, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/main.py", line 2447, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 9996, in exception_type
    raise e
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 8665, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/bin/interpreter", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 509, in main
    start_terminal_interface(interpreter)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 475, in start_terminal_interface
    interpreter.chat()
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 200, in chat
    for _ in self._streaming_chat(message=message, display=display):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 232, in _streaming_chat
    yield from terminal_interface(self, message)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/terminal_interface/terminal_interface.py", line 133, in terminal_interface
    for chunk in interpreter.chat(message, display=False, stream=True):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 271, in _streaming_chat
    yield from self._respond_and_store()
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/core.py", line 321, in _respond_and_store
    for chunk in respond(self):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/respond.py", line 78, in respond
    for chunk in interpreter.llm.run(messages_for_llm):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 263, in run
    yield from run_function_calling_llm(self, params)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
    for chunk in llm.completions(**request_params):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 347, in fixed_litellm_completions
    raise first_error
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/interpreter/core/llm/llm.py", line 328, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 11832, in __next__
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 9998, in exception_type
    raise original_exception
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 11772, in __next__
    chunk = next(self.completion_stream)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_streaming.py", line 43, in __next__
    return self._iterator.__next__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/openai/_streaming.py", line 72, in __stream__
    raise APIError(
openai.APIError: The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.

  There was an issue with the previous execution attempt. Let me start a local HTTP
  server using the correct Python code, serving the content from the "BelgiumCycles"
  directory on your desktop.

  I'll start the HTTP server now.
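For context, the command the model kept fumbling toward is a one-liner: `python3 -m http.server`, run from the site directory. The same thing can be sketched in plain Python; this is a minimal illustration of the intended task, not Open Interpreter's own code, and the port and directory choices are arbitrary:

```python
# Minimal sketch of what the model was attempting: serve the current
# directory over HTTP (equivalent to running `python3 -m http.server`).
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Bind to port 0 so the OS picks a free port (a fixed port like 8000 works too).
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
port = server.server_address[1]

# Serve in a background thread so this script can verify the server responds.
threading.Thread(target=server.serve_forever, daemon=True).start()

status = urllib.request.urlopen(f"http://127.0.0.1:{port}/").status
print(f"server responding on port {port} with status {status}")
server.shutdown()
```

Run it from the directory containing the downloaded site (e.g. the "BelgiumCycles" folder mentioned above) and the root URL serves those files.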

Reproduce

CleanShot 2024-06-14 at 23 40 55@2x

Expected behavior

I expected it to know how to run simple terminal commands, but it was coming up with strange word combinations that didn't make any sense at all: half-words, sometimes vaguely related to other operating systems, but nothing related to the actual command to run. It usually takes three or four tries before it realizes what it was trying to do and runs the right command, but then it freezes and I get the "Was Open Interpreter helpful?" message while it is hung up running the right command. This is a brand-new bug for me; I've been using Open Interpreter from the beginning. Y'all are awesome, thank you!
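One detail worth flagging from the traceback above: the underlying failure is a 401 `invalid_api_key` reporting the key as literally `x`, which may mean the real `OPENAI_API_KEY` never reached the request (the `x` value looks like a placeholder from the `fixed_litellm_completions` retry path, but that is a guess). A quick sanity check before launching `interpreter`, assuming the default OpenAI backend, could look like this:

```python
# Hedged sanity check: confirm OPENAI_API_KEY is set to something real.
# The "x" comparison mirrors the placeholder key visible in the 401 above
# (assumption: that value indicates an unset key, not the user's actual key).
import os

def check_openai_key() -> str:
    """Classify the configured key as 'present' or 'missing or placeholder'."""
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key or key == "x":
        return "missing or placeholder"
    return "present"

print(f"OPENAI_API_KEY is {check_openai_key()}")
```

If it reports "missing or placeholder", exporting the key in the shell before starting Open Interpreter would rule out authentication as a contributing factor, separate from the garbled-command bug itself.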

Screenshots

CleanShot 2024-06-14 at 23 22 41@2x

Open Interpreter version

0.2.6

Python version

3.12.4

Operating System name and version

macOS 14.5 (23F79)

Additional context

iMac (Retina 5K, 27-inch, 2019)

Processor: 3.6 GHz 8-Core Intel Core i9

Graphics: Radeon Pro 580X 8 GB

Memory: 128 GB 2667 MHz DDR4

Startup disk: MacOS

Serial number: C02Z70EFJV40

macOS: 14.5 (23F79)