turboderp / exui

Web UI for ExLlamaV2

Permit setting default prompt format and max tokens in models tab #50

Open dagbdagb opened 7 months ago

dagbdagb commented 7 months ago

When loading a model, permit setting the default 'Prompt Format' and 'Max Tokens' for the model in question, so we don't have to set these for every new session. Is it possible to auto-detect the best template, or use some heuristic to pick it?
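For the auto-detection part, one rough heuristic could key off the `chat_template` shipped in the model's `tokenizer_config.json`, or the model directory name. A minimal sketch of the idea (the function name and the format labels below are placeholders, not exui's actual identifiers):

```python
import json
import os

def guess_prompt_format(model_dir: str) -> str:
    """Best-guess prompt format name for the model in model_dir (placeholder labels)."""
    # Read the chat template bundled with the tokenizer, if present.
    template = ""
    tok_cfg = os.path.join(model_dir, "tokenizer_config.json")
    if os.path.exists(tok_cfg):
        with open(tok_cfg, "r", encoding="utf-8") as f:
            template = json.load(f).get("chat_template") or ""

    name = os.path.basename(os.path.normpath(model_dir)).lower()

    # Very rough matching on template markers and model names.
    if "<|start_header_id|>" in template or "llama-3" in name or "llama3" in name:
        return "Llama 3 Instruct"
    if "<|im_start|>" in template or "chatml" in name:
        return "ChatML"
    if "[INST]" in template or "mistral" in name or "mixtral" in name:
        return "Mistral Instruct"
    return "Chat RP"  # fall back to the current default
```

This would only need to run once at load time to prefill the session defaults; the user could still override it per session.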

I notice Llama 3 Instruct doesn't work so well with the default 'Chat RP' prompt format. Every time. :-)