Closed — Touch-Night closed this issue 1 year ago
Use the new "GGUF" format instead of the old "GGML" format. I have tested a "GGUF" language model, and those specific Chinese characters are displayed correctly in the response, with no missing characters.
"GGUF" is a new format introduced by the llama.cpp team on August 21st, 2023. It is a replacement for "GGML", which is no longer supported by llama.cpp.
Another method is to choose "llamacpp_HF" or "ctransformers" instead of "llama.cpp" as the model loader to load the "GGML" model file. This is useful because GGUF files on Hugging Face are still far fewer than GGML files.
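If you are unsure which loader to pick, you can check which format a file is by its header: GGUF files begin with the 4-byte ASCII magic "GGUF", while the older GGML family uses different magics. A minimal sketch (the helper name `is_gguf` is mine, not part of the webui):

```python
import os
import tempfile

def is_gguf(path: str) -> bool:
    # GGUF files start with the 4-byte ASCII magic "GGUF";
    # older GGML-family files use different magic bytes.
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

# Demo with a fake header (real model files are large binaries).
with tempfile.NamedTemporaryFile(delete=False, suffix=".gguf") as f:
    f.write(b"GGUF" + b"\x00" * 12)
    fake = f.name
print(is_gguf(fake))  # True for a file with the GGUF magic
os.unlink(fake)
```

A file that fails this check is not necessarily valid GGML, but a file that passes it can safely be given to a GGUF-capable loader.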
Thank you for the information
Describe the bug
(Translated from Chinese.) All models generate English text well. However, when a GGML model is loaded via the llama.cpp loader, the output visibly drops characters when generating Chinese text. I have tried multiple different GGML models and other front-ends, and I am sure it is a problem with text-generation-webui.
Is there an existing issue for this?
Reproduction
Load a Chinese ggml model and chat with it using Chinese, to make it respond in Chinese.
Screenshot
Logs
System Info