innightwolfsleep / text-generation-webui-telegram_bot

LLM telegram bot
MIT License

Instruct mode or system prompt sorely needed! #155

Open Ph0rk0z opened 11 months ago

Ph0rk0z commented 11 months ago

I am trying to spin this up for fun and to have my AI on telegram. Some things I notice:

  1. Instruct mode is not supported, so you can only talk to the model in its default mode. This causes quite a few problems, as "assistant" mode in many models is heavily censored: just say a bad word and it will lecture you. Normally one can fix this with instruct mode or by setting the system prompt, but here we only have the character definitions.
  2. Loading characters from the config file seems broken. I set a character .yaml in the config file, but only the default example still loads. I even deleted the example and copied my character into the "characters" folder in the extension, but I still get Chiharu. I don't wish to give the option of changing the character away from the default, but the wrong one is loading.

I have tried both api and normal mode. I also edited it like the user did here: https://github.com/innightwolfsleep/text-generation-webui-telegram_bot/issues/94#issuecomment-1670825815. Perhaps I can also try duplicating it in generator_params.json.

Thoughts?

edit: I find that the variable names in the config differ from those in the code. I got the preset and character to load by renaming them to match what's written in the code.

innightwolfsleep commented 11 months ago

#156 added context/user/bot prompt prefix and postfix options to the config files. These can be configured as a prompt template. (At least, I think so.)
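For reference, a minimal sketch of how such prefix/postfix fields could compose an Alpaca-style prompt. The key names and the `build_prompt` helper here are illustrative assumptions, not the actual config keys or code from #156:

```python
# Hypothetical template fields (illustrative names, not the real config keys).
ALPACA = {
    "context_prefix": "",
    "context_postfix": "\n\n",
    "user_prefix": "### Instruction:\n",
    "user_postfix": "\n\n",
    "bot_prefix": "### Response:\n",
}

def build_prompt(tpl, context, user_message):
    """Compose a single-turn prompt from prefix/postfix template fields."""
    return (
        tpl["context_prefix"] + context + tpl["context_postfix"]
        + tpl["user_prefix"] + user_message + tpl["user_postfix"]
        + tpl["bot_prefix"]
    )

# The prompt ends with the bot prefix so the model continues as the response.
prompt = build_prompt(ALPACA, "You are EvilGPT.", "Hello!")
```

The idea is that swapping the prefix/postfix strings switches the bot between plain chat formatting and an instruct template without code changes.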

Ph0rk0z commented 11 months ago

Thanks I will try it out.

innightwolfsleep commented 10 months ago

Added examples for different templates: https://github.com/innightwolfsleep/text-generation-webui-telegram_bot/discussions/158

Ph0rk0z commented 10 months ago

I tested it in the new repo and it doesn't really work as instruct mode since the example dialog isn't wrapped but it does work for a system prompt. I suppose one could manually edit the character card to fit an instruct template too.

innightwolfsleep commented 10 months ago

Can you give an example? Share a character, a config, and what you want to get.

p.s. Perhaps a manual character edit is needed. I need to investigate real cases.

p.p.s. I tested a few common prompt templates and they work fine for me. If you know other prompt templates, please share them and I'll try them.

Ph0rk0z commented 10 months ago

I just used Alpaca. The ### Instruction and ### Response markers don't wrap the sample dialog, so the model got confused and started outputting nothing. I used EvilGPT off of chub.ai to mess with. I think if I manually wrapped the examples in the instruct template it would work fine. It's not seamless, but at least now it's possible.
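The manual wrapping described above could look something like this. This is a sketch, assuming the card's example dialogue is stored as plain `User:`/`Char:` turns; the function name and format are illustrative, not part of the bot's code:

```python
def wrap_examples_alpaca(example_dialogue, user_name="User", char_name="Char"):
    """Rewrap 'User:'/'Char:' example turns into Alpaca-style
    ### Instruction / ### Response blocks."""
    blocks = []
    for turn in example_dialogue:
        if turn.startswith(user_name + ":"):
            text = turn[len(user_name) + 1:].strip()
            blocks.append("### Instruction:\n" + text)
        elif turn.startswith(char_name + ":"):
            text = turn[len(char_name) + 1:].strip()
            blocks.append("### Response:\n" + text)
    return "\n\n".join(blocks)

example = ["User: Say a bad word.", "Char: No lectures here."]
wrapped = wrap_examples_alpaca(example)
```

Applying something like this to the example dialogue before it is sent would keep the whole prompt consistent with the instruct template, which is what the plain-chat formatting breaks.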

I gave up and went with the system prompt only, which does 80% of what I need. Getting the model out of the "assistant" personality was what I was after, since a lot of models are most censored when using it. If you load up good models like Euryale or Airoboros in textgen and compare chat vs. chat-instruct with the default character, you can see what I'm talking about. It might be a bigger problem for small stuff like 7B/13B, as they are more template-bound than 70B; the latter just goes with whatever.

innightwolfsleep commented 10 months ago

Got it. Currently, the example dialogue isn't used as part of the context in my code... This is part of the truncation logic (the context is never truncated; the example, greeting, and current conversation messages are truncated to avoid buffer overflow).

As a workaround, I can advise moving the example into the context. But I really should think about fixing this... There is too much template variation -_-

Ph0rk0z commented 10 months ago

I watched the terminal in textgen; the example dialog is sent with the character prompt. It is formatted like the card, with Char: message\n User: message\n

I didn't try standalone with exllama or llama.cpp yet. I mostly run this to access the AI when away from home.

You are also missing a few of the new repetition penalty params, but it was trivial to add them.

innightwolfsleep commented 10 months ago

I think I will change the template implementation in the next few updates... Currently, the ": " between Char and the message is hardcoded.
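Making that hardcoded ": " a template parameter could be sketched like this (the function and parameter names are hypothetical, not the bot's actual code):

```python
def format_turn(name, message, separator=": ", suffix="\n"):
    """Format one chat turn. The separator and suffix would come from the
    template config instead of being hardcoded to ': ' and '\n'."""
    return f"{name}{separator}{message}{suffix}"

# Default chat style keeps the current behavior:
chat_turn = format_turn("Char", "Hello!")
# An instruct-style template could instead use a header-like "name":
instruct_turn = format_turn("### Response:", "Hello!", separator="\n", suffix="\n\n")
```

With the separator and suffix pulled from config, one formatter could cover both the plain chat style and header-based instruct templates.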