Ccheers opened this issue 1 year ago
Which version of peft are you using? I cloned peft, ran `python3 -m pip install --user -e .` in the folder, and got no issue.
The version is peft 0.4.0.dev0.
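For reference, a quick way to confirm which peft version is actually being imported (useful when an editable install might be shadowed by another copy); this is just a generic check, not something from the thread:

```python
import peft

# Print the version of the peft package that Python actually imports,
# e.g. "0.4.0.dev0" for an editable install from a recent main checkout.
print(peft.__version__)
```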
@Ccheers have you managed to get past this error?
I am getting the exact same error message. I think the issue is running on a machine without a GPU. I will try a machine with a GPU and see if that fixes the error.
I can run it on a Mac M1 in CPU mode. These are my requirement versions:
I bumped into the same issue. There was a commit in peft (https://github.com/huggingface/peft/commit/fcff23f005fc7bfb816ad1f55360442c170cd5f5) that removed `device_map` handling for 4/8-bit int. I just replaced `cls(**kwargs)` with `cls()` in peft/utils/config.py:114 (https://github.com/huggingface/peft/blob/main/src/peft/utils/config.py#L114). I stepped through with a debugger and `kwargs` was `{"device_map": 0}`.
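A minimal sketch of a less invasive variant of that workaround (a hypothetical helper, not part of peft), assuming the config classes are dataclasses as in peft 0.3/0.4: filter out unknown keys such as `device_map` instead of dropping all kwargs with `cls()`:

```python
import dataclasses

def init_config_ignoring_unknown_kwargs(cls, **kwargs):
    # Keep only the keyword arguments that the (dataclass-based) config
    # actually declares as fields; stray keys like "device_map" are dropped.
    accepted = {f.name for f in dataclasses.fields(cls)}
    return cls(**{k: v for k, v in kwargs.items() if k in accepted})

# Inside peft/utils/config.py:114 this would be used roughly as:
#   config = init_config_ignoring_unknown_kwargs(cls, **kwargs)
# instead of:
#   config = cls(**kwargs)
```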
$ python generate.py --load_8bit --base_model 'decapoda-research/llama-7b-hf' --lora_weights 'tloen/alpaca-lora-7b'
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please run

python -m bitsandbytes

and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
bin D:\python3.10\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so
D:\python3.10\lib\site-packages\bitsandbytes\cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
'NoneType' object has no attribute 'cadam32bit_grad_fp32'
CUDA SETUP: Loading binary D:\python3.10\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.so...
argument of type 'WindowsPath' is not iterable
The tokenizer class you load from this checkpoint is not the same type as the class this function is called from. It may result in unexpected tokenization. The tokenizer class you load from this checkpoint is 'LLaMATokenizer'. The class this function is called from is 'LlamaTokenizer'.
Loading checkpoint shards: 100%|████████████████████| 33/33 [00:21<00:00, 1.55it/s]
Traceback (most recent call last):
  File "C:\Users\chengscai\GolandProjects\alpaca-lora\generate.py", line 218, in <module>
    fire.Fire(main)
  File "D:\python3.10\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "D:\python3.10\lib\site-packages\fire\core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "D:\python3.10\lib\site-packages\fire\core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "C:\Users\chengscai\GolandProjects\alpaca-lora\generate.py", line 69, in main
    model = PeftModel.from_pretrained(
  File "D:\python3.10\lib\site-packages\peft\peft_model.py", line 167, in from_pretrained
    PeftConfig.from_pretrained(model_id, subfolder=kwargs.get("subfolder", None), **kwargs).peft_type
  File "D:\python3.10\lib\site-packages\peft\utils\config.py", line 114, in from_pretrained
    config = cls(**kwargs)
TypeError: PeftConfig.__init__() got an unexpected keyword argument 'device_map'
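For anyone hitting this without wanting to patch peft, a rough caller-side sketch (an assumption on my part, not a fix from this thread): since the traceback shows `device_map` being forwarded into `PeftConfig.__init__()`, keep `device_map` on the base-model load and out of the adapter load. Model names below are just the ones from the command above:

```python
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM

# Put device_map (and 8-bit loading) on the base model only.
base_model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Load the LoRA adapter without a device_map kwarg, so nothing unexpected
# is forwarded to PeftConfig.__init__().
model = PeftModel.from_pretrained(
    base_model,
    "tloen/alpaca-lora-7b",
)
```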