device_map2:{'': 0}
Traceback (most recent call last):
File "/data/zjj/Qwen/Qwen-VL/finetune.py", line 386, in <module>
train()
File "/data/zjj/Qwen/Qwen-VL/finetune.py", line 306, in train
model = transformers.AutoModelForCausalLM.from_pretrained(
File "/home/wz/anaconda3/envs/zjj_qwen/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 511, in from_pretrained
return model_class.from_pretrained(
File "/home/wz/anaconda3/envs/zjj_qwen/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2945, in from_pretrained
model = quantizer.convert_model(model)
File "/home/wz/anaconda3/envs/zjj_qwen/lib/python3.10/site-packages/optimum/gptq/quantizer.py", line 201, in convert_model
self._replace_by_quant_layers(model, layers_to_be_replaced)
File "/home/wz/anaconda3/envs/zjj_qwen/lib/python3.10/site-packages/optimum/gptq/quantizer.py", line 229, in _replace_by_quant_layers
QuantLinear = dynamically_import_QuantLinear(
TypeError: dynamically_import_QuantLinear() got an unexpected keyword argument 'disable_exllamav2'
Hi, I ran into a similar problem while fine-tuning Qwen-VL-Chat-Int4. I've already changed the device_map and disabled fp16, but now I'm hitting another error — have you seen this one?
TypeError: dynamically_import_QuantLinear() got an unexpected keyword argument 'disable_exllamav2'
If convenient, could you share the versions of your optimum and transformers libraries? Thanks!
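Not the original poster, but for what it's worth: this `TypeError` usually indicates a version mismatch — a newer optimum passes the `disable_exllamav2` keyword to `dynamically_import_QuantLinear`, which older auto-gptq releases don't accept. A first step is to check which versions are actually installed, then align optimum and auto-gptq (the exact compatible pair depends on your setup; the snippet below is a minimal sketch for inspecting versions, not a fix by itself):

```python
# Sketch: report installed versions of the libraries on the GPTQ loading path.
# A mismatch between optimum (caller) and auto-gptq (callee) can produce the
# "unexpected keyword argument 'disable_exllamav2'" TypeError seen above.
from importlib import metadata

def installed_versions(packages=("transformers", "optimum", "auto-gptq")):
    """Return {distribution name: version string, or None if not installed}."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

if __name__ == "__main__":
    for pkg, ver in installed_versions().items():
        print(f"{pkg}: {ver or 'not installed'}")
```

If auto-gptq turns out to be older than the optimum release you have, upgrading auto-gptq (or downgrading optimum to a release that predates the `disable_exllamav2` argument) should make the two agree again.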