Closed · piotr25691 closed this issue 6 months ago
Thank you for this. As of the latest version, the only remaining issue is with newlines in non-stream mode. A fix is on the way.
As it stands right now, the latest release (0.5.1) does not contain this hotfix, since it was cut a few commits before the hotfix was pushed. You might want to re-release 0.5.1, or publish a new 0.5.2 release, to fix the issue properly.
Okay.
Newlines are incorrectly handled by this library with the llama2 provider.

If stream is off: no newlines ever appear in the response. If stream is on: the AI response repeats itself several times in random places, which causes problems with character limits.

This should be fixed.
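For illustration, here is a minimal, hypothetical Python sketch of workarounds for both symptoms described above. It assumes the non-stream response arrives with newlines escaped as literal `\n` sequences (one plausible cause of "no newlines appear"), and that the stream repetition shows up as verbatim repeats of already-emitted text; neither assumption is confirmed by the library's actual behavior, and the function names are invented here.

```python
def normalize_newlines(text: str) -> str:
    """Hypothetical non-stream fix: if the provider returns newlines
    escaped as a literal backslash-n, unescape them so line breaks
    render. This cause is an assumption, not confirmed by the library."""
    return text.replace("\\n", "\n")


def dedupe_stream(chunks: list[str]) -> str:
    """Hypothetical stream fix: concatenate streamed chunks, skipping
    any chunk that is a verbatim repeat of text already emitted
    (the random-repetition symptom described in the issue)."""
    out = ""
    for chunk in chunks:
        if chunk and chunk in out:
            continue  # skip chunks that merely repeat earlier output
        out += chunk
    return out


if __name__ == "__main__":
    # Non-stream: escaped newline becomes a real line break.
    print(normalize_newlines("line one\\nline two"))
    # Stream: the repeated "Hello" chunk is dropped.
    print(dedupe_stream(["Hello", " world", "Hello", "!"]))
```

Note that substring-based dedup is deliberately crude: it can drop legitimately repeated short phrases, so a real fix would instead address the root cause in the provider's chunk handling.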