A pure C++ LLM acceleration library for all platforms, with Python bindings. ChatGLM-6B-class models can reach 10000+ tokens/s on a single GPU. Supports GLM, LLaMA, and MOSS base models, and runs smoothly on mobile devices.
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xce in position 4411: invalid continuation byte #489
Open
huqiangDu opened 2 months ago
When I used alpaca2flm.py to convert a llama2-13b model, I got this error.
My Python code is:
How can I fix it?
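This error usually means a file is being read as UTF-8 when it is actually encoded in a legacy Chinese codec such as GBK (0xce is a common GBK lead byte but an invalid UTF-8 continuation byte). Below is a minimal sketch of the failure mode and a decode-with-fallback workaround; `decode_with_fallback` is a hypothetical helper for illustration, not part of the conversion script:

```python
def decode_with_fallback(raw: bytes) -> str:
    """Try UTF-8 first; fall back to GBK; as a last resort,
    replace undecodable bytes instead of raising."""
    for enc in ("utf-8", "gbk"):
        try:
            return raw.decode(enc)
        except UnicodeDecodeError:
            continue
    return raw.decode("utf-8", errors="replace")

# "汉字" encoded as GBK (b"\xba\xba\xd7\xd6") is NOT valid UTF-8:
# 0xba cannot start a UTF-8 sequence, so a plain .decode() raises
# UnicodeDecodeError, just like the error reported above.
gbk_bytes = "汉字".encode("gbk")
print(decode_with_fallback(gbk_bytes))   # decoded via the GBK fallback
print(decode_with_fallback(b"hello"))    # plain ASCII/UTF-8 still works
```

If the offending file can be identified (e.g. a tokenizer or config file), re-saving it as UTF-8, or passing an explicit `encoding=` when the script opens it, is the cleaner fix.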