StefanDanielSchwarz opened 2 months ago
PR for prompt templates for Command-R and Phi, which would make the text completion endpoint work with these models. Not a fix for this issue with the chat completion endpoint, but at least there would be an alternative.
I'll keep looking for a fix for this, too...
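For reference, the text-completion workaround hinges on formatting the prompt the way each model family expects. A minimal sketch of the two templates, assuming the commonly published formats for Command-R and Phi-3 (the token names below come from the respective model cards, not from the PR itself):

```python
# Hedged sketch: prompt builders for Command-R and Phi-style models,
# usable with a plain text completion endpoint. Token strings assume
# the formats published in the model cards.

def command_r_prompt(system: str, user: str) -> str:
    # Command-R wraps each turn in START/END_OF_TURN tokens with a
    # role token, and ends with an open CHATBOT turn for generation.
    return (
        "<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>" + system + "<|END_OF_TURN_TOKEN|>"
        "<|START_OF_TURN_TOKEN|><|USER_TOKEN|>" + user + "<|END_OF_TURN_TOKEN|>"
        "<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>"
    )

def phi_prompt(system: str, user: str) -> str:
    # Phi-3 style role tags terminated by <|end|>, with an open
    # <|assistant|> tag for the model to complete.
    return f"<|system|>\n{system}<|end|>\n<|user|>\n{user}<|end|>\n<|assistant|>\n"
```

With builders like these, the text completion endpoint can serve as an alternative until the chat completion path is fixed.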
Describe the bug
text-generation-webui API chat completion broke completely after I tried setting a custom "Generation Preset/Character Name". Even after clearing the field and deleting and re-adding the services, it still doesn't work.
For some reason, it now always loads the default Assistant character and uses that, without the home-llm system prompt and correct prompt template. I've spent about an hour hunting for a configuration file or database entry that might have changed and could be stuck on the old value, but I didn't find anything. I've restarted, deleted, and reconfigured both ooba and home-llm, but it remains broken.
Expected behavior
As before: when the "Generation Preset/Character Name" box is empty, none of ooba's default characters should be used.
Logs
ooba's console log:
As you can see, that's ooba's default Assistant prompt, not the home-llm prompt. And no prompt template is applied, although everything looks correct in the home-llm settings UI.
Update:
After almost two hours of troubleshooting, it's still broken despite a complete reinstall of ooba and home-llm from scratch! I don't think ooba keeps any hidden config files or registry entries, so I wonder whether home-llm does, and doesn't remove them when the integration is deleted?
Totally stumped right now, as everything worked perfectly before I touched that cursed "Generation Preset/Character Name" field. And now, no matter the model, as soon as I enable "Use chat completions endpoint", it drops the home-llm system prompt and prompt template and falls back to ooba's default Assistant.
Update 2:
Three hours later – still no luck. I'm trying to find out where home-llm stores its settings; I've searched the database and the file system. Any pointers? The integrations I set up must be persisted somewhere, after all...
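One place worth checking, assuming home-llm is installed as a regular Home Assistant custom integration: Home Assistant persists integration config entries as JSON under the `.storage` directory of its config folder. A hedged inspection sketch (the `/config` path assumes a standard container install; `jq` must be available):

```shell
# Assumed default Home Assistant config directory; override via HASS_CONFIG
CONF_DIR="${HASS_CONFIG:-/config}"
ENTRIES="$CONF_DIR/.storage/core.config_entries"

if [ -f "$ENTRIES" ]; then
    # Show any config entries whose domain mentions "llm"
    jq '.data.entries[] | select(.domain | test("llm"))' "$ENTRIES"
else
    echo "no $ENTRIES found; adjust CONF_DIR for your install"
fi
```

If a stale entry survives the integration's removal, it should show up (or linger) in that file.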
Also, if I switch Chat Mode to "Instruct", I can enter an instruct prompt template's name, but it's ignored; instead, home-llm's "Prompt Format" option is used – and not even consistently (only ChatML, Alpaca, and Mistral are applied; ALL the others result in Alpaca being used)!
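That "everything else becomes Alpaca" symptom smells like a lookup with a hard-coded fallback. A purely hypothetical illustration (this is not home-llm's actual code, and the template strings are just the well-known formats) of how such a bug would produce exactly this behavior:

```python
# Hypothetical illustration of a template lookup with a silent
# hard-coded fallback: any unknown format name degrades to Alpaca
# instead of raising an error.
TEMPLATES = {
    "chatml": "<|im_start|>user\n{msg}<|im_end|>\n<|im_start|>assistant\n",
    "alpaca": "### Instruction:\n{msg}\n\n### Response:\n",
    "mistral": "[INST] {msg} [/INST]",
}

def format_prompt(fmt: str, msg: str) -> str:
    # Unrecognized names fall through to Alpaca without any warning
    template = TEMPLATES.get(fmt.lower(), TEMPLATES["alpaca"])
    return template.format(msg=msg)
```

If something like this is happening, a format name such as "vicuna" would silently be rendered as Alpaca, which matches what I'm seeing.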