juncongmoo / pyllama

LLaMA: Open and Efficient Foundation Language Models
GNU General Public License v3.0

Inference Error: UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe7 in position 18: invalid continuation byte #80

Open MaiziXiao opened 1 year ago

MaiziXiao commented 1 year ago

Seems like the inference script is not working with Chinese characters sometimes:

```
Prompt: ['what are you taking about']
🦙LLaMA: what are you taking about? 什么地方有某种事物?...
please enter your prompts (Ctrl+C to exit): What is the meaing of working
Prompt: ['What is the meaing of working']
🦙LLaMA: What is the meaing of working memory?它的作用和作用域是什么?。...
please enter your prompts (Ctrl+C to exit): 人怎么学会游泳
Traceback (most recent call last):
  File "pyllama/inference.py", line 82, in <module>
    run(
  File "pyllama/inference.py", line 65, in run
    user_input = input("please enter your prompts (Ctrl+C to exit): ")
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe7 in position 18: invalid continuation byte
```
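
The traceback shows `input()` failing to decode the bytes coming from the terminal as UTF-8, which typically means the shell or locale is sending the Chinese prompt in another encoding (or the bytes arrive truncated mid-character). Below is a minimal workaround sketch, assuming that is the cause; the `reconfigure` call and the prompt loop are illustrative, not pyllama's actual `run()` code from `inference.py`:

```python
import sys

# Assumption: the crash comes from input() decoding terminal bytes as UTF-8.
# Reconfigure stdin so undecodable bytes are replaced instead of raising
# UnicodeDecodeError (reconfigure is available on Python 3.7+).
if hasattr(sys.stdin, "reconfigure"):
    sys.stdin.reconfigure(encoding="utf-8", errors="replace")

# Illustrative prompt loop mirroring the structure implied by the traceback;
# not the project's real code.
while True:
    try:
        user_input = input("please enter your prompts (Ctrl+C to exit): ")
    except (KeyboardInterrupt, EOFError):
        break
    prompts = [user_input]
    print(f"Prompt: {prompts}")
```

If the terminal really uses a non-UTF-8 locale (e.g. GBK on a Chinese Windows console), running with `PYTHONUTF8=1` or passing the terminal's actual encoding to `reconfigure()` should preserve the characters instead of replacing them.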