Open JNUpython opened 1 year ago
Python 3.10.4 (main, Mar 31 2022, 08:41:55) [GCC 7.5.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import transformers
>>> transformers.__version__
'4.28.0.dev0'
The failing call is in transformers/modeling_utils.py:

2692     # Dispatch model with hooks on all devices if necessary
2693     if device_map is not None:
2694         dispatch_model(model, device_map=device_map, offload_dir=offload_folder, offload_index=offload_index)

i.e. from_pretrained passes an offload_index keyword argument to dispatch_model.
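Since transformers 4.28.0.dev0 passes `offload_index` unconditionally, the call above only works if the installed `accelerate.dispatch_model` actually accepts that parameter. A minimal sketch of how to check for the mismatch before calling; the `dispatch_model` below is a hypothetical stand-in for an older accelerate release, not the real function:

```python
import inspect

# Hypothetical stand-in for accelerate.dispatch_model as shipped in an
# older accelerate release, which has no offload_index parameter.
def dispatch_model(model, device_map=None, offload_dir=None):
    return model

# Inspect the signature to see whether offload_index would be accepted.
params = inspect.signature(dispatch_model).parameters
print("offload_index" in params)  # False for the old-style signature above
```

Running the same check against the real `accelerate.dispatch_model` on an affected machine would also print False, which is exactly why the TypeError below is raised.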
same problem
same problem
The tokenizer class you load from this checkpoint is not the same type as the class this function is called from. It may result in unexpected tokenization. The tokenizer class you load from this checkpoint is 'LLaMATokenizer'. The class this function is called from is 'LlamaTokenizer'.
Loading checkpoint shards: 100%|█████████████████████| 33/33 [00:17<00:00, 1.93it/s]
Traceback (most recent call last):
  File "/root/autodl-tmp/alpaca-lora/generate.py", line 184, in
fire.Fire(main)
File "/root/miniconda3/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/root/miniconda3/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/root/miniconda3/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/root/autodl-tmp/alpaca-lora/generate.py", line 37, in main
model = LlamaForCausalLM.from_pretrained(
File "/root/miniconda3/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2694, in from_pretrained
dispatch_model(model, device_map=device_map, offload_dir=offload_folder, offload_index=offload_index)
TypeError: dispatch_model() got an unexpected keyword argument 'offload_index'