OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/
GNU Affero General Public License v3.0

interpreter.llm.completion = custom_language_model seems not working #1182

Open sq2100 opened 5 months ago

sq2100 commented 5 months ago

Describe the bug

I attempted to run the custom-model example from the documentation, which is supposed to just echo back what the user said, but it did not work.

def custom_language_model(openai_message):
    """
    OpenAI-compatible completions function (this one just echoes what the user said back).
    """
    users_content = openai_message[-1].get("content") # Get last message's content

    # To make it OpenAI-compatible, we yield this first:
    yield {"delta": {"role": "assistant"}}

    for character in users_content:
        yield {"delta": {"content": character}}

# Tell Open Interpreter to power the language model with this function

interpreter.llm.completion = custom_language_model

I also tried assigning interpreter.llm.completions = custom_language_model instead, but the parameters don't match up.

Reproduce

Run

def custom_language_model(openai_message):
    """
    OpenAI-compatible completions function (this one just echoes what the user said back).
    """
    users_content = openai_message[-1].get("content") # Get last message's content

    # To make it OpenAI-compatible, we yield this first:
    yield {"delta": {"role": "assistant"}}

    for character in users_content:
        yield {"delta": {"content": character}}

# Tell Open Interpreter to power the language model with this function

interpreter.llm.completion = custom_language_model

https://docs.openinterpreter.com/language-models/custom-models

Expected behavior

(this one just echoes what the user said back)

Screenshots

No response

Open Interpreter version

0.2.4

Python version

3.11

Operating System name and version

win 11

Additional context

No response

Delva0 commented 5 months ago

This may be a problem with the official documentation. Change the function custom_language_model to:

def custom_language_model(**params):
    """
    OpenAI-compatible completions function (this one just echoes what the user said back).
    """
    openai_message = params['messages']
    users_content = openai_message[-1].get("content")

    # To make it OpenAI-compatible, we yield this first:
    yield {"delta": {"role": "assistant"}}

    for character in users_content:
        yield {"delta": {"content": character}}

You can print params to inspect its structure. Also, this function is called in core.llm.run_text_llm.py.
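A minimal sketch of the corrected version, callable directly outside Open Interpreter to verify the echo behavior (the exact set of keys Open Interpreter passes in params is an assumption based on this thread; only "messages" is used here):

```python
def custom_language_model(**params):
    """OpenAI-compatible completions function (echoes the user's last message)."""
    messages = params["messages"]  # Open Interpreter passes the chat history here
    users_content = messages[-1].get("content")

    # OpenAI-compatible streams announce the assistant role first:
    yield {"delta": {"role": "assistant"}}

    # Then stream the content back one character at a time:
    for character in users_content:
        yield {"delta": {"content": character}}

# Exercise the generator directly, without Open Interpreter:
chunks = list(custom_language_model(messages=[{"role": "user", "content": "hi"}]))
echoed = "".join(c["delta"].get("content", "") for c in chunks)
assert echoed == "hi"
```

Because the function only reads params["messages"], any extra keyword arguments Open Interpreter supplies are absorbed harmlessly by **params.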

rbrisita commented 5 months ago

I ran into this problem myself. @Delva0 is correct: converting the function signature to accept **params fixes it. The parameters I am getting are:

def custom_language_model(messages, model, stream, max_tokens):

But yielding is not working for me at the moment.
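For reference, the same echo function written against the explicit parameter list reported above (the defaults for model, stream, and max_tokens are assumptions added so the function can be called directly for testing; they are unused by the echo logic):

```python
def custom_language_model(messages, model=None, stream=True, max_tokens=None):
    """Echo completion using the explicit parameters reported in this thread."""
    users_content = messages[-1].get("content")

    # Announce the assistant role first, then stream the echoed content:
    yield {"delta": {"role": "assistant"}}
    for character in users_content:
        yield {"delta": {"content": character}}

# Direct call to confirm the stream shape:
chunks = list(custom_language_model([{"role": "user", "content": "ok"}]))
text = "".join(c["delta"].get("content", "") for c in chunks)
assert text == "ok"
```

If yielding appears not to work inside Open Interpreter itself, printing each chunk from a direct call like this helps separate a bug in the function from a bug in how the chunks are consumed.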

rbrisita commented 1 month ago

Documentation has been updated. This issue should be closed. @KillianLucas