Closed Whadup closed 3 months ago
Why not. We are adding it with the system prompt so it plays better with the truncation options, iirc. We can provide both options btw, but I don't feel like there is "a single right way" to manage model prompts at the moment.
Sure, there's not one single right way. Adding it in front of the system prompt feels weird to me tho, particularly for longer system prompts. In this fake example, I think the order of inputs is funky
The following are multiple choice questions on fish. You're a helpful assistant who only answers in rhyme. When you do not understand the question, politely ask for clarifications. Always remain respectful.
Idk, my intuition is that chat-models expect the prompt as the last instruction.
Oh true, got your point!
Yep that's a bug, should be examples[0]["content"] + instruction
Yeah, that seems like a good solution. Alternatively, we could prepend to the first message with role="user".
We have both options actually, depending on whether there is a system prompt or not. As mentioned above, they will play differently with truncation
https://github.com/huggingface/lighteval/blob/df21407d9f714bde9ecfb4dd8283afdc2150eec3/src/lighteval/few_shot_manager.py#L199
If we use a chat_template and have both a system prompt and an instruction, the instruction is prepended to the system prompt message. It should be prepended to the first user message instead.
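A minimal sketch of the proposed fix (hypothetical helper, not lighteval's actual code): prepend the instruction to the first message with role="user", leaving any system prompt untouched.

```python
def prepend_instruction(messages, instruction):
    """Return a copy of `messages` (list of {"role", "content"} dicts)
    with `instruction` prepended to the first user message."""
    out = [dict(m) for m in messages]  # shallow copies so the input is unchanged
    for m in out:
        if m["role"] == "user":
            m["content"] = instruction + m["content"]
            break  # only the first user turn gets the instruction
    return out

chat = [
    {"role": "system", "content": "You're a helpful assistant who only answers in rhyme."},
    {"role": "user", "content": "What do salmon eat?"},
]
fixed = prepend_instruction(
    chat, "The following are multiple choice questions on fish.\n"
)
print(fixed[0]["content"])  # system prompt unchanged
print(fixed[1]["content"])  # instruction now leads the user turn
```

This keeps the system prompt intact as the first message, so the chat template still renders it in the position chat models expect, and the instruction lands next to the actual question.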