Open A-runaaaa opened 1 year ago
I also faced a similar issue!
I faced this issue too. Does anyone have a solution?
You need to merge the llama base weights with the vicuna delta weights to use the vicuna model. I hope that helps.
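As a sketch of what merging looks like, FastChat ships an `apply_delta` module for this; the paths below are placeholders you would replace with your own local directories, and the delta repo name assumes the v1.1 release:

```shell
# Merge the original llama base weights with the vicuna delta weights.
# --base-model-path: directory containing the original llama weights (placeholder)
# --target-model-path: output directory for the merged vicuna weights (placeholder)
# --delta-path: Hugging Face repo (or local dir) with the vicuna delta weights
python3 -m fastchat.model.apply_delta \
    --base-model-path /path/to/llama-7b \
    --target-model-path /path/to/vicuna-7b \
    --delta-path lmsys/vicuna-7b-delta-v1.1
```

After the merge finishes, point `--model-path` in the serve command at the target directory instead of the raw llama or delta weights.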
Using only llama's weights does not produce garbled output, but using vicuna-7b v1.1 results in garbled replies.
This is related to the model. Did you tune the model yourself? Was it instruction-tuned?
I launch the dialogue bot with the command `python3 -m fastchat.serve.cli --model-path`; why are all the replies garbled?