@vedantroy You are not using the local model correctly. The local server is not an OpenAI endpoint but an SGLang endpoint. What you described is calling the SGLang endpoint through its OpenAI-compatible API, and that usage is not supported yet.
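If you only need to query the local Yi-VL server, the supported path is SGLang's own frontend talking to the running endpoint rather than the OpenAI-compatible API. A minimal sketch (port 30000 is SGLang's default; the image path and question are placeholders):

import sglang as sgl

@sgl.function
def image_qa(s, image_path, question):
    # Send the image together with the question as the user turn.
    s += sgl.user(sgl.image(image_path) + question)
    # Let the model generate the assistant reply.
    s += sgl.assistant(sgl.gen("answer", max_tokens=256))

# Point the frontend at the locally launched SGLang server.
sgl.set_default_backend(sgl.RuntimeEndpoint("http://localhost:30000"))

state = image_qa.run(image_path="example.jpg", question="What is in this image?")
print(state["answer"])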
I ran into this problem as well, probably because the OpenAI-compatible endpoint for Yi-VL isn't supported yet. As a temporary workaround, I added a conversation template to conversation.py:
register_conv_template(
    Conversation(
        name="yi",
        system_message=(
            "This is a chat between an inquisitive human and an AI assistant. Assume the role of the AI assistant. Read all the images carefully, and respond to the human's questions with informative, helpful, detailed and polite answers."
            "这是一个好奇的人类和一个人工智能助手之间的对话。假设你扮演这个AI助手的角色。仔细阅读所有的图像,并对人类的问题做出信息丰富、有帮助、详细的和礼貌的回答。"
        ),
        roles=("<|im_start|>user", "<|im_start|>assistant"),
        sep="<|im_end|>",
        stop_str=["<|endoftext|>", "<|im_end|>", "###", "\n###"],
    )
)
I also hardcoded chat_template_name = "yi" and changed the image placeholder from <image>\n to <image_placeholder>\n in openai_api_adapter.py, and it works.
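For reference, the kind of edit in openai_api_adapter.py looks roughly like the sketch below; the exact variable names and surrounding code are assumptions, not a quote of the adapter:

# 1. Hardcode the newly registered template instead of auto-detecting one.
chat_template_name = "yi"

# 2. Yi-VL expects <image_placeholder> rather than the generic <image> token,
#    so rewrite the placeholder when converting the OpenAI-style request.
prompt = prompt.replace("<image>\n", "<image_placeholder>\n")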
This issue has been automatically closed due to inactivity. Please feel free to reopen it if needed.
I get the following error:
when running:
and using: