Closed · IDiMooo closed this issue 1 month ago

Hello, I've been working on a conversational model based on Cyn from Murder Drones. I've been using Unsloth's Llama 3 notebook to fine-tune Llama 3.2 3B Instruct on my own dataset. The results are mostly great, but sometimes I get responses like the ones in the pictures, where the output looks like the chat template I used for training. I have that same chat template set in the Modelfile, so I don't know why this happens. It happens rarely, but often enough to notice.

Here is what my Modelfile looks like:

I should add that I am a silly person who jumped straight into a very hard task; I am very much a beginner.

Here are all the links that may be useful:
Sorry about the issue! Yep, this can sometimes happen. One option: instead of adding 1 EOS_TOKEN, add 3 or 5 to force the model to stop generating tokens.
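For context, in the standard Unsloth notebook this change lands in the dataset formatting function. A minimal sketch, assuming the notebook's `tokenizer` is already loaded and using hypothetical `instruction`/`response` column names:

```python
EOS_TOKEN = tokenizer.eos_token  # tokenizer loaded earlier in the notebook

def formatting_prompts_func(examples):
    # "instruction" and "response" are placeholder column names;
    # substitute whatever fields your dataset actually uses.
    texts = []
    for instruction, response in zip(examples["instruction"], examples["response"]):
        # Append EOS 3 times (try 5 if needed) instead of once, so training
        # gives the model a stronger end-of-sequence signal.
        texts.append(f"{instruction}\n{response}" + EOS_TOKEN * 3)
    return {"text": texts}

dataset = dataset.map(formatting_prompts_func, batched=True)
```

Generation still stops at the first EOS the model emits; the repetition only strengthens the stop signal learned during training.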
Thanks for answering! Someone on Reddit also suggested increasing the context length, which is `num_ctx`, I suppose. Should I try both and test it out?
Yes, try both and see if they work!
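For reference, `num_ctx` is set in the same Modelfile the model was created from. A minimal sketch, where the GGUF path and the 4096 value are placeholders:

```
FROM ./cyn-llama-3.2-3b.gguf

# Raise the context window (in tokens); Ollama's default is small (2048 in many versions).
PARAMETER num_ctx 4096
```

After editing, rebuild the model (e.g. `ollama create cyn -f Modelfile`) so the new parameter takes effect.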
It worked! Thank you for the help!