bin /rhome/yangyj/anaconda3/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda117.so
CUDA SETUP: CUDA runtime path found: /rhome/yangyj/anaconda3/lib/libcudart.so.11.0
CUDA SETUP: Highest compute capability among GPUs detected: 8.6
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary /rhome/yangyj/anaconda3/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda117.so...
WARNING:root:Use the checkpoint in HF hub, stored in the subfolder='gsm8k' in target model.
Loading checkpoint shards: 100%|██████████| 3/3 [00:08<00:00, 2.80s/it]
Traceback (most recent call last):
  File "/rhome/yangyj/LoftQ-main/test_gsm8k.py", line 281, in <module>
    evaluation(model_args, data_args)
  File "/rhome/yangyj/LoftQ-main/test_gsm8k.py", line 128, in evaluation
    model = PeftModel.from_pretrained(model,
  File "/rhome/yangyj/anaconda3/lib/python3.10/site-packages/peft/peft_model.py", line 278, in from_pretrained
    config = PEFT_TYPE_TO_CONFIG_MAPPING[
  File "/rhome/yangyj/anaconda3/lib/python3.10/site-packages/peft/config.py", line 134, in from_pretrained
    config = config_cls(**kwargs)
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'loftq_config'
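The `TypeError` suggests the installed peft release predates the `loftq_config` field that newer LoftQ adapter checkpoints store in their `adapter_config.json`; upgrading peft is the usual fix. As a stop-gap, some users strip the unrecognized key from the downloaded adapter config before calling `PeftModel.from_pretrained`. A minimal sketch of that workaround (the toy config below only mimics peft's adapter config format; verify this against your actual peft version before relying on it):

```python
import json

# Toy adapter_config.json carrying a key written by a newer peft release,
# to demonstrate stripping fields an older LoraConfig would reject.
cfg = {"peft_type": "LORA", "r": 64, "loftq_config": {}}
with open("adapter_config.json", "w") as f:
    json.dump(cfg, f)

# Load the config, drop the key the installed (older) peft release
# does not recognize, and write it back in place.
with open("adapter_config.json") as f:
    cfg = json.load(f)
cfg.pop("loftq_config", None)
with open("adapter_config.json", "w") as f:
    json.dump(cfg, f, indent=2)
```

After this edit, `PeftModel.from_pretrained` no longer receives the unknown keyword, at the cost of discarding the LoftQ metadata the newer checkpoint carried.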
~/LoftQ-main$ python test_gsm8k.py \ --model_name_or_path /rhome/yangyj/LoftQ --batch_size 16 \
--batch_size: command not found
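The `--batch_size: command not found` message means the shell ran that flag as a standalone command, which typically happens when a line-continuation backslash is followed by a trailing space: the backslash then escapes the space instead of the newline, and the next line executes on its own. The backslash must be the very last character on the line. A minimal demonstration with `echo` standing in for the real script:

```shell
# With the backslash immediately before the newline, the shell joins the
# two lines into one command and echo receives both arguments; a space
# after the backslash would instead terminate the command and make the
# shell try to run "16" (or "--batch_size 16") as its own command.
echo --batch_size \
  16
```

Re-running `test_gsm8k.py` with clean continuations (no whitespace after any `\`) should make `--batch_size 16` reach the script instead of the shell.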