ViperX7 / Alpaca-Turbo

Web UI to run alpaca model locally
GNU Affero General Public License v3.0

Support reverse prompting (-r flag in llama.cpp) #53

Closed jhayesdev closed 1 year ago

jhayesdev commented 1 year ago

Particularly for Vicuna you need to be able to set the reverse prompt so it doesn't continue blathering on after it's already answered you. Default llama.cpp instruct mode (-ins) uses "### Instruction: " as reverse prompt, but Vicuna needs "### Human:".

I'm not sure reverse prompts are implemented at all right now; please correct me if I'm wrong and I'm just not using them right. If the author isn't interested in doing it, I have 3 weeks of free time coming up and I'd be more than happy to do a PR once I can dig into it.
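For context, the core of reverse prompting is simple: stream tokens from the model and stop as soon as the accumulated output ends with the reverse-prompt string, so the model doesn't start writing the next "### Human:" turn itself. A minimal sketch of that stopping logic (the function name and token stream are hypothetical, not from Alpaca-Turbo or llama.cpp):

```python
def generate_with_reverse_prompt(token_stream, reverse_prompt):
    """Consume tokens until the accumulated text ends with reverse_prompt.

    token_stream: any iterable of text chunks from the model (hypothetical).
    reverse_prompt: the stop string, e.g. "### Human:" for Vicuna.
    Returns the generated text with the reverse prompt stripped off.
    """
    out = ""
    for tok in token_stream:
        out += tok
        if out.endswith(reverse_prompt):
            # Stop here and hide the reverse prompt from the user.
            return out[: -len(reverse_prompt)]
    return out


# Example: the model answers, then starts a new "### Human:" turn.
tokens = ["The answer", " is 42.", "\n### Hu", "man:"]
print(generate_with_reverse_prompt(tokens, "### Human:"))
```

A real implementation would also need to handle the reverse prompt being split across token boundaries mid-check (e.g. by holding back a partial suffix match before displaying it), but the sketch shows the basic idea.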

ViperX7 commented 1 year ago

The latest release, 0.6, adds support for reverse prompting.

@jhayesdev thanks for the offer. I'd be glad to check out any suggestions or PRs that would improve the project.