amrrs / llama-4bit-colab


Not working #3

Open · tiagorangel1 opened this issue 1 year ago

tiagorangel1 commented 1 year ago

I am getting this error:

# Error at the last step of the notebook (step 6)
Traceback (most recent call last):
  File "/content/GPTQ-for-LLaMa/llama_inference.py", line 108, in <module>
    model = load_quant(args.model, args.load, args.wbits)
  File "/content/GPTQ-for-LLaMa/llama_inference.py", line 27, in load_quant
    from transformers import LlamaConfig, LlamaForCausalLM 
ImportError: cannot import name 'LlamaConfig' from 'transformers' (/usr/local/lib/python3.9/dist-packages/transformers/__init__.py)
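
For context on what is failing here: the LLaMA classes were renamed (LLaMAConfig to LlamaConfig, and so on) when LLaMA support was merged into upstream transformers, so old and new builds disagree on the capitalisation. A quick diagnostic sketch, assuming it is run in the same Colab runtime:

# Diagnostic sketch: check which LLaMA class names the installed
# transformers build actually exposes.
import transformers

print(transformers.__version__)
print(hasattr(transformers, "LlamaConfig"))  # new capitalisation, what llama_inference.py imports
print(hasattr(transformers, "LLaMAConfig"))  # old capitalisation from the early LLaMA fork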
iboyles commented 1 year ago

Yes I am having the same issue.

Lauorie commented 1 year ago

Downloading (…)lve/main/config.json: 100% 427/427 [00:00<00:00, 60.5kB/s]
Loading model ... Done.
Downloading (…)okenizer_config.json: 100% 141/141 [00:00<00:00, 60.3kB/s]
Traceback (most recent call last):
  File "/content/GPTQ-for-LLaMa/llama_inference.py", line 114, in <module>
    tokenizer = AutoTokenizer.from_pretrained(args.model)
  File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/tokenization_auto.py", line 677, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported.

Tylersuard commented 1 year ago

Same issue here:

Loading model ... Done.
Traceback (most recent call last):
  File "/content/GPTQ-for-LLaMa/llama_inference.py", line 114, in <module>
    tokenizer = AutoTokenizer.from_pretrained(args.model)
  File "/usr/local/lib/python3.9/dist-packages/transformers/models/auto/tokenization_auto.py", line 677, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported.
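
This usually means the model folder's tokenizer_config.json still names the tokenizer class with the old capitalisation ("LLaMATokenizer"), which AutoTokenizer in newer transformers builds cannot resolve. A possible workaround, sketched under that assumption (the model path below is a hypothetical placeholder), is to bypass the Auto lookup:

# Workaround sketch, not the repo's official fix: load the concrete
# tokenizer class directly instead of going through AutoTokenizer.
from transformers import LlamaTokenizer  # needs a build with the renamed classes

model_dir = "path/to/your/llama-model"  # hypothetical placeholder
tokenizer = LlamaTokenizer.from_pretrained(model_dir)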

Tylersuard commented 1 year ago

I created a fix to solve the problem. @amrrs please accept and merge.

Tylersuard commented 1 year ago

Fix: in requirements.txt, change the last line to git+https://github.com/zphang/transformers@660dd6e2bbc9255aacd0e60084cf15df1b6ae00d#egg=transformers
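
In a Colab cell, that change amounts to reinstalling transformers from the pinned commit. A sketch, assuming a fresh runtime and the commit hash given above:

# Colab cell sketch: replace the installed transformers with the pinned build.
!pip uninstall -y transformers
!pip install "git+https://github.com/zphang/transformers@660dd6e2bbc9255aacd0e60084cf15df1b6ae00d#egg=transformers"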

Tylersuard commented 1 year ago

Ok, I followed the instructions and I am still getting this error:

Traceback (most recent call last):
  File "/content/GPTQ-for-LLaMa/llama_inference.py", line 108, in <module>
    model = load_quant(args.model, args.load, args.wbits)
  File "/content/GPTQ-for-LLaMa/llama_inference.py", line 27, in load_quant
    from transformers import LlamaConfig, LlamaForCausalLM
ImportError: cannot import name 'LlamaConfig' from 'transformers' (/usr/local/lib/python3.9/dist-packages/transformers/__init__.py)
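
One thing worth ruling out (an assumption about the setup, not a confirmed diagnosis): if transformers was already imported in the running Colab session, reinstalling it has no effect until the runtime restarts. This sanity check shows which build the session is actually using:

# Sanity check sketch: print the version and install path of the
# transformers module currently loaded in this runtime.
import transformers
print(transformers.__version__)
print(transformers.__file__)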

amrrs commented 1 year ago

@Tylersuard I merged your PR, does it fix your problem?

KastyaLimoneS commented 1 year ago

I have the same issue. I followed all the instructions.