Vali-98 / ChatterUI

A simple frontend for LLMs built in React Native.
GNU Affero General Public License v3.0

Strange behavior #73

Closed: GameOverFlowChart closed this issue 1 week ago

GameOverFlowChart commented 2 weeks ago

I found a strange behavior. It happens with a specific model, or at least the way I trigger it only works with that one (I compared another model and it didn't happen there). During generation, some tokens or words seem to be repeated often, but once the generation is finished the text turns normal, which is why I think it has to do with the client, not the model. I'm running locally on Android, as always.

[Screenshot: ZomboDroid_28082024084003]

This reminds me of emojis: they are also only shown correctly once the generation is over.
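
A plausible explanation, though not confirmed anywhere in this thread: an emoji's UTF-8 bytes can be split across several streamed tokens, and the bytes of each individual chunk are not valid UTF-8 on their own, so a per-chunk decode shows replacement characters until the full sequence has arrived. A minimal TypeScript sketch of that effect (identifiers are illustrative, not from ChatterUI):

```ts
// Hypothetical illustration: an emoji whose UTF-8 bytes arrive in two
// streamed chunks. Decoding each chunk in isolation garbles it; a stateful
// streaming decode (or decoding the complete buffer at the end) renders it
// correctly, matching the "fixed once generation finishes" observation.
const emojiBytes = new TextEncoder().encode('🙂'); // 4 bytes: f0 9f 99 82
const chunkA = emojiBytes.slice(0, 2);             // arrives with token 1
const chunkB = emojiBytes.slice(2);                // arrives with token 2

// Naive per-chunk decode, as a streaming UI might do:
const naive = new TextDecoder();
console.log(naive.decode(chunkA) + naive.decode(chunkB)); // replacement chars, not the emoji

// Stateful streaming decode holds the incomplete sequence until it completes:
const streaming = new TextDecoder();
console.log(
    streaming.decode(chunkA, { stream: true }) + streaming.decode(chunkB)
); // "🙂"
```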

GameOverFlowChart commented 2 weeks ago

Also, don't mind the random |}; I was playing around with instruct formats when I found this behavior, but it happens regardless of the format used.

Vali-98 commented 2 weeks ago

Could you provide the model this occurs with? It's very strange indeed. The buffer used to show the output should only ever be concatenated to once per token.
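
For context, here is a minimal sketch of the append-once-per-token pattern described above, contrasted with a hypothetical bug that would produce the repeated words in the screenshot. None of this code is from ChatterUI; the names are placeholders:

```ts
// Sketch of a streaming display buffer. Names are illustrative only.
let displayBuffer = '';

// Expected pattern: each callback receives only the newly generated token,
// and the buffer is concatenated to exactly once per token.
function onToken(token: string) {
    displayBuffer += token;
    render(displayBuffer);
}

// Hypothetical bug that would look like the reported behavior: if the
// callback received the full text generated so far and appended it anyway,
// earlier words would repeat during streaming, while the final message
// (set from the complete response) would still look normal.
function onTokenBuggy(fullTextSoFar: string) {
    displayBuffer += fullTextSoFar; // repeats everything already shown
    render(displayBuffer);
}

function render(text: string) {
    // Stand-in for a React state update / UI refresh.
    console.log(text);
}
```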

GameOverFlowChart commented 2 weeks ago

> Could you provide the model this occurs with?

Llama-3.1-Storm-8B-Q4_K_S.gguf, which is from here: https://huggingface.co/bartowski/Llama-3.1-Storm-8B-GGUF

This behavior only seems to happen with short inputs (like in the screenshot). The system message was also empty.

Vali-98 commented 1 week ago

After some testing, I have been unable to reproduce this issue. I think I will close this as the problem itself is both very niche and minor. Feel free to reopen if this issue worsens.