Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

Model outputs meaningless content #22

Closed altqxd closed 1 year ago

altqxd commented 1 year ago

After downloading the alpaca model from HuggingFace, I ran the command provided on the official site:

torchrun --nproc-per-node=1 demos/single_turn.py --llama_config /root/autodl-tmp/model/LLaMA2-Accessory/config/7B_params.json --tokenizer_path /root/autodl-tmp/model/LLaMA2-Accessory/config/tokenizer.model --pretrained_path /root/autodl-tmp/model/LLaMA2-Accessory/finetune/sg/alpaca

Gradio starts up normally, but the model outputs meaningless content.


ChrisLiu6 commented 1 year ago

Hi, note that we release delta patches of our trained checkpoints instead of off-the-shelf checkpoints to comply with LLaMA's license. Please follow the instructions here to merge the weights before inference.
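For reference, merging a delta patch is conceptually just adding the released delta tensors back onto the original LLaMA weights, key by key. The sketch below only illustrates that idea; the function name, checkpoint file layout, and paths are assumptions, so use the merge tool documented in the linked instructions for actual conversion.

```python
# Minimal sketch (not the repository's official script) of applying a delta
# patch to the original LLaMA weights. Paths and filenames are hypothetical.
import torch

def apply_delta(base_path: str, delta_path: str, output_path: str) -> None:
    """Add delta weights onto the original LLaMA checkpoint, key by key."""
    base = torch.load(base_path, map_location="cpu")
    delta = torch.load(delta_path, map_location="cpu")
    merged = {}
    for key, delta_tensor in delta.items():
        if key in base:
            # delta was released as (trained weight - original weight)
            merged[key] = base[key] + delta_tensor
        else:
            # parameters introduced by fine-tuning have no base counterpart
            merged[key] = delta_tensor
    torch.save(merged, output_path)

if __name__ == "__main__":
    apply_delta(
        "original_llama/consolidated.00.pth",  # official LLaMA weights (assumed layout)
        "alpaca_delta/consolidated.00.pth",    # released delta patch (assumed layout)
        "merged_alpaca/consolidated.00.pth",   # result to pass via --pretrained_path
    )
```

The merged checkpoint directory would then be the one passed to `--pretrained_path` in the torchrun command above.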

altqxd commented 1 year ago

> Hi, note that we release delta patches of our trained checkpoints instead of off-the-shelf checkpoints to comply with LLaMA's license. Please follow the instructions here to merge the weights before inference.

Thanks, I will try.