Closed — vetka925 closed this issue 1 year ago
I believe this may have something to do with installing into a root or notebook environment (just speculation). A quick fix could be to change `llama.hf.modeling_llama import LLaMAForC` to `pyllama.llama.hf.modeling_llama import LLaMAForC`. Or just install it in a conda env.
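A minimal sketch of that workaround, assuming nothing about pyllama's internals beyond the two module paths mentioned above (the `import_first_available` helper is hypothetical, not part of pyllama):

```python
import importlib

def import_first_available(*module_paths):
    """Try each dotted module path in order; return the first that imports."""
    for path in module_paths:
        try:
            return importlib.import_module(path)
        except ModuleNotFoundError:
            continue
    raise ModuleNotFoundError(f"none of {module_paths} could be imported")

# The two candidate locations mentioned in the thread; which one exists
# depends on how pyllama was installed.
try:
    mod = import_first_available("llama.hf.modeling_llama",
                                 "pyllama.llama.hf.modeling_llama")
    print("imported", mod.__name__)
except ModuleNotFoundError as err:
    print(err)
```

This avoids hard-coding one package layout when the install location is uncertain.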
Please try `pip install pyllama -U` and it should fix the issue. @vetka925
Doesn't help. Same error after `pip install pyllama -U`. I tried installing pyllama in a conda env in WSL.
That is weird. Can you list the files in your site-packages `llama` folder? @vetka925
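A quick way to do that check without hunting for the site-packages path by hand is to ask Python where the package actually lives. A minimal sketch (the `list_package_files` helper is hypothetical; a correct pyllama install is assumed to ship an `hf` subfolder, per the import above):

```python
import importlib.util
import os

def list_package_files(pkg_name):
    """Return sorted file names inside an installed package, or None if absent."""
    spec = importlib.util.find_spec(pkg_name)
    if spec is None or not spec.submodule_search_locations:
        return None
    pkg_dir = list(spec.submodule_search_locations)[0]
    return sorted(os.listdir(pkg_dir))

# If this prints a list without an `hf` entry, the install is incomplete.
print(list_package_files("llama"))
```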
Try running: `python -m llama.llama_quant decapoda-research/llama-7b-hf c4 --wbits 8 --save pyllama-7B8b.pt`
Got an error:

```
Traceback (most recent call last):
  File "/home/user/miniconda3/envs/transformers/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/user/miniconda3/envs/transformers/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/user/miniconda3/envs/transformers/lib/python3.10/site-packages/llama/llama_quant.py", line 16, in <module>
    from llama.hf.modeling_llama import LLaMAForC
ModuleNotFoundError: No module named 'llama.hf'
```