Open natolambert opened 2 months ago
There is a start on this here: https://github.com/allenai/open-instruct/blob/main/eval/templates.py#L112 But a more formal integration with fastchat would be good. I also think supporting more flexible generation in general is worthwhile, e.g. Llama 3 with its multiple EOS tokens.
Realistically, maintaining our own templates is a losing battle; let's migrate to something maintained.
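For context on the multiple-EOS point: a minimal sketch of what handling several stop tokens looks like at decode time. The token ids below are illustrative placeholders, not the actual Llama 3 ids, and `truncate_at_eos` is a hypothetical helper, not anything in open-instruct or fastchat.

```python
# Sketch: truncating a generated sequence at the first of several EOS ids,
# as needed for models like Llama 3 that define more than one stop token
# (e.g. an end-of-text token and an end-of-turn token).

def truncate_at_eos(token_ids, eos_token_ids):
    """Return the prefix of token_ids up to (excluding) the first EOS id."""
    eos_set = set(eos_token_ids)
    for i, tok in enumerate(token_ids):
        if tok in eos_set:
            return token_ids[:i]
    return token_ids

# Illustrative ids: 900 and 901 stand in for the model's two EOS tokens.
print(truncate_at_eos([1, 2, 3, 901, 5], [900, 901]))  # → [1, 2, 3]
```

With Hugging Face transformers, `generate` already accepts a list for `eos_token_id`, which is the kind of flexibility a maintained library gives us for free.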