Open sq2100 opened 5 months ago
This may be a problem with the official documentation. Change the function custom_language_model to:
def custom_language_model(**params):
    """
    OpenAI-compatible completions function (this one just echoes what the user said back).
    """
    openai_message = params['messages']
    users_content = openai_message[-1].get("content")
    # To make it OpenAI-compatible, we yield this first:
    yield {"delta": {"role": "assistant"}}
    for character in users_content:
        yield {"delta": {"content": character}}
You can print params to inspect its structure. Also, this function is called in core.llm.run_text_llm.py.
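As a quick sanity check of the echo function above (assuming the params dict carries an OpenAI-style messages list, as described), the generator can be driven directly without Open Interpreter — the sample payload values here are made up for illustration:

```python
def custom_language_model(**params):
    """OpenAI-compatible completions function that echoes the user's last message."""
    users_content = params['messages'][-1].get("content")
    # OpenAI-style streams announce the assistant role first:
    yield {"delta": {"role": "assistant"}}
    for character in users_content:
        yield {"delta": {"content": character}}

# Drive the generator directly with a sample payload:
chunks = list(custom_language_model(
    messages=[{"role": "user", "content": "hi"}],
    model="custom", stream=True, max_tokens=100,
))
# Reassemble the streamed characters:
echoed = "".join(c["delta"].get("content", "") for c in chunks)
print(chunks[0])  # the role announcement delta
print(echoed)     # "hi"
```

This also makes it easy to print the chunk structure and compare it against what run_text_llm expects.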
I ran into this problem myself. @Delva0 is correct about converting the function signature to include **params. The parameters I am getting are:
def custom_language_model(messages, model, stream, max_tokens):
But yielding is not working for me at the moment.
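One way to cover both call shapes observed in this thread (keyword params in one report, the four named parameters in the other) is a signature that accepts either — the default values below are assumptions, not from the docs:

```python
def custom_language_model(messages=None, model=None, stream=None, max_tokens=None, **kwargs):
    """Echo completions function tolerant of either call convention."""
    users_content = messages[-1].get("content")
    yield {"delta": {"role": "assistant"}}
    for character in users_content:
        yield {"delta": {"content": character}}

# Works whether or not the extra parameters are passed:
out = "".join(
    chunk["delta"].get("content", "")
    for chunk in custom_language_model(
        messages=[{"role": "user", "content": "ok"}], model="x", stream=True
    )
)
print(out)  # "ok"
```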
Documentation has been updated. This issue should be closed. @KillianLucas
Describe the bug
I attempted to run the routine for the custom model, which is supposed to just echo back what the user said, but it was not successful.
I attempted to make modifications to
interpreter.llm.completions = custom_language_model
But the parameters don't match up.
Reproduce
Run
https://docs.openinterpreter.com/language-models/custom-models
Expected behavior
(this one just echoes what the user said back)
Screenshots
No response
Open Interpreter version
0.2.4
Python version
3.11
Operating System name and version
Windows 11
Additional context
No response