rendezqueue / rendezllama

CLI for llama.cpp with various commands to guide, edit, and regenerate tokens on the fly.
ISC License

feat(option): Instruction-response format #5

Closed grencez closed 1 year ago

grencez commented 1 year ago

Alpaca models use multi-line \n\n### Instruction:\n\n and \n\n### Response:\n\n prefixes for both characters. Newer "instruct mode" models have similar multi-line formats, so it might be worth supporting them.

That said, I haven't actually tried Alpaca and don't really want to special-case it, so this issue will be left in an ambiguous open+wontfix state for now.
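For reference, a minimal sketch of building an Alpaca-style prompt from those prefixes (the exact whitespace varies between fine-tunes, so treat this as an assumption rather than a spec):

```python
def alpaca_prompt(instruction: str, response: str = "") -> str:
    """Build an Alpaca-style prompt string.

    The prefix strings are an assumption based on common Alpaca
    fine-tunes; some variants use slightly different whitespace.
    """
    return (
        "\n\n### Instruction:\n\n" + instruction +
        "\n\n### Response:\n\n" + response
    )
```

Generation would continue from the trailing response prefix until the model emits EOS or the next instruction prefix.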

grencez commented 1 year ago

The right way of doing this would involve adding an intermediate representation (IR) for character names that appears only in the prompt (but not displayed by the CLI). Editing commands would need to work with the displayed text.

The newline antiprompt would also be changed to EOS.
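A hedged sketch of such an IR: display names map to format-specific markers that exist only in the prompt, and the mapping is inverted before text reaches the screen. The names and markers here are hypothetical, not rendezllama's actual ones:

```python
# Hypothetical mapping from CLI display names to prompt-side markers.
MARKERS = {
    "User": "\n\n### Instruction:\n\n",
    "Bot": "\n\n### Response:\n\n",
}

def to_prompt(display_text: str) -> str:
    """Replace 'Name: ' turn prefixes with prompt markers."""
    for name, marker in MARKERS.items():
        display_text = display_text.replace(name + ": ", marker)
    return display_text

def to_display(prompt_text: str) -> str:
    """Invert the mapping so the CLI shows plain names."""
    for name, marker in MARKERS.items():
        prompt_text = prompt_text.replace(marker, name + ": ")
    return prompt_text
```

Editing commands would then operate on the `to_display` form, with `to_prompt` applied before tokens are fed to the model.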

grencez commented 1 year ago

Maybe the instruction/response exchange can be formatted as a single instruction that evolves over time: the model generates the next dialogue line as the response, then that line is squashed into the instruction text and the process repeats.
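This evolving-instruction idea can be sketched as a loop, where `generate` stands in for the actual llama.cpp call (hypothetical) and the prefixes are the assumed Alpaca-style ones:

```python
def chat_loop(instruction: str, generate, turns: int = 3) -> str:
    """Evolve a single instruction over several turns.

    Each generated dialogue line is squashed back into the
    instruction text before the next turn. `generate` is a stand-in
    for the real model call (hypothetical).
    """
    for _ in range(turns):
        line = generate(
            "\n\n### Instruction:\n\n" + instruction +
            "\n\n### Response:\n\n"
        )
        # Squash the new dialogue line into the instruction and repeat.
        instruction += "\n" + line
    return instruction
```

The prompt always has exactly one instruction/response pair, so the model never sees a growing list of alternating sections.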

grencez commented 1 year ago

To close this out, we just need to stop printing the answer prompt every time.