Open xydreen opened 1 year ago
oobabot seems to use the ooba server's API at /api/v1/stream, but /api/v1/chat-stream exists too.
I'm not sure what the difference is, or which one the webui itself uses versus the stream and chat-stream APIs, but it might be worth trying out.
It'll require some different work with how the input is formatted, I think.
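To make the "different input format" concrete, here's a rough sketch of how the two request payloads differ. This is based on the legacy text-generation-webui websocket API as I understand it; the exact field names and defaults are assumptions and should be checked against the server's api-examples:

```python
# Sketch of the two request shapes (assumption: legacy text-generation-webui
# API; field names may differ on your server version).

def build_stream_request(prompt: str, max_new_tokens: int = 250) -> dict:
    # /api/v1/stream takes one flattened prompt string; the caller is
    # responsible for baking persona, history, and stop strings into it.
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "stopping_strings": ["\nYou:"],
    }


def build_chat_stream_request(user_input: str, history: dict,
                              character: str = "Example") -> dict:
    # /api/v1/chat-stream instead takes structured chat state, and the
    # server applies the character / prompt template itself.
    return {
        "user_input": user_input,
        "history": history,  # e.g. {"internal": [...], "visible": [...]}
        "mode": "chat",
        "character": character,
        "max_new_tokens": max_new_tokens if False else 250,
    }
```

So the switch is less about the transport and more about moving prompt assembly from oobabot into the server.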
Yup, @jmoney7823956789378 is on it.
I think it would make sense to add a feature to use the chat-stream
API, as there's good work done there to make good prompts across a range of models.
(fwiw, that API didn't exist when this was written, hence us doing our own thing in the first place).
Setting this as one of the top features for an upcoming release. I think it would help a lot for people with different models, and I'm just curious about it. :)
I have some other commitments lately which might delay getting this done a bit, but in any case I'd welcome PRs around it. But either way it will get done at some point. :)
@chrisrude Any updates on this?
Hey guys, does anyone know what exactly would cause the bot's output to look dramatically different from the webui's output in chat mode? The webui in chat mode looks perfect and exactly how I want it. The temperature and other parameters are tuned identically, and the prompt format is proper.
The responses generated in webui chat mode take the persona into account much better than the bot's responses do. You can tell the bot uses the persona to some extent, but it's almost as if the temperature stays at the default regardless of the settings, unlike in the webui's chat mode.
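One thing worth checking: if the request payload doesn't carry every sampler setting, the API may fall back to its own defaults rather than the values shown in the webui sliders. A small hedged sketch of pinning the tuned values into each request (the parameter names follow the old API convention and are an assumption, not confirmed oobabot behavior):

```python
# Assumption: sampler fields omitted from the request fall back to
# server-side defaults, so mirror the webui's tuned values explicitly.

WEBUI_TUNED_PARAMS = {
    "temperature": 0.7,        # hypothetical values for illustration
    "top_p": 0.9,
    "top_k": 40,
    "repetition_penalty": 1.18,
}


def with_sampler_params(request: dict,
                        params: dict = WEBUI_TUNED_PARAMS) -> dict:
    # Merge the tuned sampler values into a request payload without
    # clobbering anything the caller already set explicitly.
    merged = dict(params)
    merged.update(request)
    return merged
```

If the bot's requests already include all of these and the output still differs, the next suspect would be the prompt itself, since the two code paths assemble it differently.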
Thanks!