abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Mistral-instruct not using system prompt. #1832

Open AkiraRy opened 1 week ago

AkiraRy commented 1 week ago

Prerequisites

Please answer the following questions for yourself before submitting an issue.

I deleted everything else as it is not relevant.

TL;DR

Using the mistral-instruct formatting doesn't include the system prompt. I don't know if this was intentional or just overlooked. Here is the relevant part of the code:

@register_chat_format("mistral-instruct")
def format_mistral_instruct(
    messages: List[llama_types.ChatCompletionRequestMessage],
    **kwargs: Any,
) -> ChatFormatterResponse:
    eos = "</s>"
    stop = eos
    prompt = ""
    for message in messages:
        if (
            message["role"] == "user"
            and message["content"] is not None
            and isinstance(message["content"], str)
        ):
            prompt += "[INST] " + message["content"]
        elif message["role"] == "assistant" and message["content"] is not None:
            prompt += " [/INST]" + message["content"] + eos
    prompt += " [/INST]"
    return ChatFormatterResponse(prompt=prompt, stop=stop)
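As the snippet shows, the loop only handles `user` and `assistant` roles, so any `system` message is silently dropped. A common workaround (a sketch, not part of llama-cpp-python — the function name and the `\n\n` separator are my own choices) is to fold the system text into the first `[INST]` block, since the official Mistral Instruct template has no dedicated system slot:

```python
from typing import Any, Dict, List


def format_mistral_with_system(messages: List[Dict[str, Any]]) -> str:
    """Hypothetical variant of the formatter above that prepends any
    system message to the next user turn instead of discarding it."""
    eos = "</s>"
    system = ""
    prompt = ""
    for message in messages:
        role, content = message["role"], message["content"]
        if role == "system" and isinstance(content, str):
            # Remember the system text; it is merged into the next user turn.
            system = content
        elif role == "user" and isinstance(content, str):
            if system:
                prompt += "[INST] " + system + "\n\n" + content
                system = ""
            else:
                prompt += "[INST] " + content
        elif role == "assistant" and content is not None:
            prompt += " [/INST]" + content + eos
    prompt += " [/INST]"
    return prompt
```

For example, `[{"role": "system", "content": "Be brief."}, {"role": "user", "content": "Hi"}]` would render as `[INST] Be brief.\n\nHi [/INST]` instead of losing the system text entirely.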
ddh0 commented 19 hours ago

Hi @AkiraRy, the Mistral Instruct prompt format doesn't officially support a system prompt, so this is probably intentional.