There is no need to download the native LLaMA model files and convert them with convert_llama_weights_to_hf.py. Just download the Hugging Face format files directly: git clone https://huggingface.co/decapoda-research/llama-7b-hf, then download the "lfs" files in your browser and put them into the llama-7b-hf folder.
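As a reference, here is a minimal sketch of loading the cloned Hugging Face format checkpoint with transformers; the local path, prompt, and generation settings are only illustrative assumptions, and it requires a transformers version that includes LLaMA support.

```python
# Minimal sketch: load a locally cloned llama-7b-hf directory with transformers.
# Assumes the repo was cloned to ./llama-7b-hf and the LFS weight files are in place.
from transformers import LlamaForCausalLM, LlamaTokenizer

model_path = "./llama-7b-hf"  # local clone of decapoda-research/llama-7b-hf

tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(model_path)

# Quick sanity check that the weights load and generate text.
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```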
Thank you. I'll try this method!
Hi~ I used llama.download to download the model, but the speed is very slow. Any advice for speeding it up? By the way, I used the command
pip install pyllama -U
to install pyllama. Is there any difference from the command you suggested? Thanks~