Closed: hahuyhoang411 closed this issue 1 month ago
I think the problem is related to our current llama.cpp version.
I converted the model on our internal cluster with the latest llama.cpp version and it works fine.
Can you help me bump the version? cc @nguyenhoangthuan99
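For context, a minimal sketch of what such a conversion step can look like when driven from a script; the llama.cpp checkout path, model directory, and output settings below are placeholders, not the actual cluster job, and the converter script's name has changed across llama.cpp versions:

```python
# Hypothetical wrapper around the llama.cpp HF-to-GGUF converter.
# Paths and the output type are examples; the script name varies by
# llama.cpp version (convert.py, convert-hf-to-gguf.py, ...).
import subprocess

LLAMA_CPP_DIR = "/opt/llama.cpp"        # assumed checkout of the bumped llama.cpp
MODEL_DIR = "/models/source-hf-model"   # assumed local snapshot of the HF repo
OUTFILE = "/models/out/model-f16.gguf"

subprocess.run(
    [
        "python",
        f"{LLAMA_CPP_DIR}/convert_hf_to_gguf.py",
        MODEL_DIR,
        "--outfile", OUTFILE,
        "--outtype", "f16",
    ],
    check=True,  # fail the job if conversion errors out (e.g. missing tokenizer)
)
```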
Problem solved with the version bump.
Problem
Sometimes the model's HF repo doesn't include tokenizer.model itself, which causes a "missing tokenizer" error during conversion.
Recommendation
@nguyenhoangthuan99: Add one more step to the CI converter to handle repos that ship without tokenizer.model (see the sketch after the reference link below).
https://github.com/ggerganov/llama.cpp/issues/2443
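A minimal sketch of what that extra CI step could look like, assuming the converter pulls models from the Hugging Face Hub; the use of huggingface_hub, the placeholder repo id, and the exact file names checked are illustrative assumptions, not the current pipeline:

```python
# Hypothetical pre-conversion check: make sure the source repo actually ships
# a tokenizer before the converter runs, so the job fails early with a clear
# message instead of a "missing tokenizer" error mid-conversion.
from huggingface_hub import list_repo_files

TOKENIZER_FILES = ("tokenizer.model", "tokenizer.json")  # assumed acceptable files

def has_tokenizer(repo_id: str) -> bool:
    """Return True if the repo contains at least one known tokenizer file."""
    files = set(list_repo_files(repo_id))
    return any(name in files for name in TOKENIZER_FILES)

if __name__ == "__main__":
    repo = "some-org/some-model"  # placeholder repo id
    if not has_tokenizer(repo):
        raise SystemExit(
            f"{repo} ships no tokenizer.model/tokenizer.json; "
            "fetch one from the base model repo before converting."
        )
    print(f"{repo}: tokenizer files found, proceeding with conversion.")
```

Failing fast like this keeps the missing-tokenizer error out of the conversion logs and makes the CI failure message actionable; whether to fail or to fall back to another tokenizer file is a separate decision for the converter.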
Tasks