OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/

The --os flag cannot use default.yaml #1293

Closed bincooo closed 3 weeks ago

bincooo commented 3 weeks ago

Describe the bug

As described in the title.

Reproduce

interpreter --model coze --auto_run --os

Expected behavior

l_interface
    for chunk in interpreter.chat(message, display=False, stream=True):
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/core.py", line 234, in _streaming_chat
    yield from self._respond_and_store()
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/core.py", line 282, in _respond_and_store
    for chunk in respond(self):
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/respond.py", line 69, in respond
    for chunk in interpreter.llm.run(messages_for_llm):
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 201, in run
    yield from run_text_llm(self, params)
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/llm/run_text_llm.py", line 20, in run_text_llm
    for chunk in llm.completions(**params):
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 232, in fixed_litellm_completions
    raise first_error
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/interpreter/core/llm/llm.py", line 213, in fixed_litellm_completions
    yield from litellm.completion(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/litellm/utils.py", line 3415, in wrapper
    raise e
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/litellm/utils.py", line 3306, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/litellm/main.py", line 2447, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/litellm/utils.py", line 9989, in exception_type
    raise e
  File "/usr/local/Caskroom/miniconda/base/envs/interpreter/lib/python3.11/site-packages/litellm/utils.py", line 9957, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=coze
 Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers
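
This error is raised by LiteLLM, which decides how to route a request from the provider prefix in the model string; a bare model=coze gives it nothing to route on. A minimal sketch of the call LiteLLM expects, assuming the third-party server speaks the OpenAI v1 protocol (the model name and URL below are placeholders):

import litellm

# The "openai/" prefix tells LiteLLM to use its OpenAI-compatible route;
# api_base points it at the third-party endpoint instead of api.openai.com.
response = litellm.completion(
    model="openai/coze",
    api_base="https://example.com/v1",
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)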

Screenshots

No response

Open Interpreter version

0.2.5

Python version

3.11.9

Operating System name and version

macOS 14

Additional context

No response

Notnaton commented 3 weeks ago

Which provider are you using? OpenAI, Hugging Face, Ollama...?

bincooo commented 3 weeks ago

A third-party service that implements the OpenAI v1 interface.

bincooo commented 3 weeks ago

Additionally, is it possible to add custom functions to the program? For instance, calling a weather API or querying a database.

Notnaton commented 3 weeks ago

Change the model to openai/coze and add --api_base https://example.com/v1 (or https://ip_address:port/v1).
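
With that change, the reproduce command becomes interpreter --model openai/coze --api_base https://example.com/v1 --auto_run --os. A minimal sketch of the same fix through the 0.2.x Python API, assuming your own endpoint URL in place of the placeholder:

from interpreter import interpreter

# Provider-prefixed model plus an explicit api_base, mirroring the
# --model and --api_base CLI flags above.
interpreter.llm.model = "openai/coze"
interpreter.llm.api_base = "https://example.com/v1"
interpreter.auto_run = True

interpreter.chat("list the files on my desktop")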

bincooo commented 3 weeks ago

Okay~