Closed TonyAlbertWan closed 1 year ago
Found another one, at line 149 of train.py:
from peft import LoraConfig

peft_config = LoraConfig(
    target_modules=r'.*language_model.*\.(query_key_value)',
    # target_modules=r'.*language_model.*\.(q_proj|v_proj)',
    inference_mode=args.inference_mode,
    r=args.lora_r,
    lora_alpha=args.lora_alpha,
    lora_dropout=args.lora_dropout,
)
With this change, tuning the multimodal model works.
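For context on why the target_modules pattern matters: when target_modules is a string, PEFT treats it as a regular expression and matches it (with re.fullmatch) against the names from model.named_modules(). A minimal standalone sketch of that matching, using hypothetical module names for illustration rather than the actual mPLUG-Owl layer names:

```python
import re

# The pattern from train.py: select the fused QKV projection, but only
# inside the language-model branch (not the vision encoder).
pattern = r'.*language_model.*\.(query_key_value)'

# Hypothetical module names for illustration; the real names come from
# model.named_modules() of the actual checkpoint.
names = [
    "language_model.transformer.h.0.self_attention.query_key_value",
    "language_model.model.layers.0.self_attn.q_proj",
    "vision_model.encoder.layers.0.self_attn.query_key_value",
]

# PEFT applies a string-valued target_modules via re.fullmatch.
matched = [n for n in names if re.fullmatch(pattern, n)]
print(matched)
```

Only the first name matches: q_proj does not match the (query_key_value) group, and the vision_model name fails the language_model part of the pattern. This is why the commented-out q_proj|v_proj variant is needed for LLaMA-style language models instead.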
As mentioned in https://github.com/X-PLUG/mPLUG-Owl/issues/141, when tuning the multimodal pretrained model we can use BloomTokenizerFast. I changed the code at line 141 of train.py and added some necessary packages. However, after making these changes, I got the following error:

How can I fix this? Are there any other issues I should be aware of? Thanks!