lm-sys / FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

Conversation replies are garbled text #1269

Open A-runaaaa opened 1 year ago

A-runaaaa commented 1 year ago

[screenshot]

Using the command python3 -m fastchat.serve.cli --model-path ... to start a conversation with the model, every reply comes back garbled. What could be the reason?

0xTong commented 1 year ago

I also faced a similar issue!

Kaka23333 commented 1 year ago

I faced this issue too. Does anyone have a solution?

0xTong commented 1 year ago

You need to merge the LLaMA base weights with the Vicuna delta weights before using the Vicuna model. I hope that helps.
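For reference, a rough sketch of that merge step using FastChat's apply_delta utility, with placeholder paths and assuming the v1.1 delta from Hugging Face (flag names can differ between FastChat versions, so check python3 -m fastchat.model.apply_delta --help):

```shell
# Recover the full Vicuna-7B weights by applying the released delta
# on top of the original LLaMA-7B weights (paths are placeholders).
python3 -m fastchat.model.apply_delta \
    --base-model-path /path/to/llama-7b \
    --target-model-path /path/to/vicuna-7b-v1.1 \
    --delta-path lmsys/vicuna-7b-delta-v1.1

# Then point the CLI at the merged checkpoint:
python3 -m fastchat.serve.cli --model-path /path/to/vicuna-7b-v1.1
```

Running the CLI against the delta weights alone, without merging them into the base model first, is a common cause of garbled replies.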

A-runaaaa commented 1 year ago

Using only the LLaMA weights does not produce garbled output, but using vicuna-7b-v1.1 does.

surak commented 1 year ago

This is related to the model. Did you fine-tune the model yourself? Did you instruction-tune it?