lmg-anon / mikupad

LLM Frontend in a single html file
https://lmg-anon.github.io/mikupad/mikupad.html
Creative Commons Zero v1.0 Universal

Update mikupad.html with simple "end of response string" injection support #61

Closed cpumaxx closed 1 month ago

cpumaxx commented 1 month ago

Added on behalf of lmg anon eyoocsuo:

Ok so uh, I just added extremely simple automated end-of-reply string injection for Instruct formatting in Mikupad. What it does: it adds two new input boxes where you can set a "user end string" and a "bot end string". When the Predict button is pressed, it checks whether a user end string is set and, if so, appends it to the context before predicting. When the prediction is finished, if a bot end string is present, it appends that to the context. And that's it. Seems to work? I made a model write the code, though I had to troubleshoot a bit. Does anyone want this? Idk. There's already a PR for more complete Instruct template support, but this works great for me to quickly experiment with logs. Here's a modified Mikupad with simple "end of response string" injection support.
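The flow described above can be sketched roughly like this. Note this is a minimal illustration of the described behavior, not the actual mikupad.html code; all names (`chat`, `userEndString`, `botEndString`, `predict`) are made up for the example:

```javascript
// Sketch of the "end of response string" injection flow described above.
// All identifiers are illustrative; the real mikupad.html implementation differs.
function chat(context, userEndString, botEndString, predict) {
  // If a user end string is set, append it to the context first.
  if (userEndString) {
    context += userEndString;
  }
  // Run the prediction on the augmented context.
  context += predict(context);
  // When the prediction finishes, append the bot end string if present.
  if (botEndString) {
    context += botEndString;
  }
  return context;
}
```

With a real backend, `predict` would be an asynchronous call to the completion endpoint; here it is shown as a plain function only to keep the sketch self-contained.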

All this does is let you set a custom "User End String" that gets appended to the context (and then sent to the backend) when you press the Chat button, and a custom "Bot End String" that gets appended when the bot finishes its response. Basically, this makes it really easy to keep your hands on the keyboard and just keep talking to a chatbot in Mikupad! The system prompt and the entire beginning of the context are up to you to fill in with correct formatting, which also makes it convenient when you're copy-pasting a log from a different frontend. The input boxes for the strings are right above the token count, and they only accept \n for newlines, which gets converted internally into real newlines. The shortcut I decided on for Chat is shift+enter, compared to ctrl+enter for Predict.
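The literal-\n handling mentioned above amounts to a one-line string replace. This is a guess at the approach based on the description, not the actual code (the function name is hypothetical):

```javascript
// Convert literal "\n" sequences typed into the end-string input boxes
// into real newline characters, as described above.
function unescapeNewlines(str) {
  return str.replace(/\\n/g, "\n");
}
```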

But yeah, I decided to have separate Predict and Chat buttons after all, since this lets you modify the bot's response (such as when choosing a different token probability) and continue the prediction.

Like I said, I'm opening this PR on behalf of someone else. I have only tested it for basic functionality and eyeballed it for anything obviously stupid. I will not be upset if it is rejected.

neCo2 commented 1 month ago

Saw that too. I feel like just adding a checkbox/toggle somewhere to switch between chat/completion would be much more elegant than having a separate "chat" button and function. I'll see if I can shoehorn something like this into the instruct templates once they're merged, since I don't think it makes sense for these two very related functions to be separated in the UI. But the final call's up to the maintainer, of course.

neCo2 commented 1 month ago

Added this functionality to the instruct-templates PR.

lmg-anon commented 1 month ago

This is a great idea, and it's indeed a better idea to integrate it into https://github.com/lmg-anon/mikupad/pull/56, and since this has already been done, I guess I will close this PR. Thank you for the contribution anyway!