QingruZhang / AdaLoRA

AdaLoRA: Adaptive Budget Allocation for Parameter-Efficient Fine-Tuning (ICLR 2023).
MIT License

from_pretrained parameters #2

Closed fxb392 closed 1 year ago

fxb392 commented 1 year ago

```python
config = AutoConfig.from_pretrained(
    model_args.config_name if model_args.config_name else model_args.model_name_or_path,
    cache_dir=model_args.cache_dir,
    revision=model_args.model_revision,
    use_auth_token=True if model_args.use_auth_token else None,
    cls_dropout=training_args.cls_dropout,
    apply_lora=model_args.apply_lora,
    lora_type=model_args.lora_type,
    lora_module=model_args.lora_module,
    lora_alpha=model_args.lora_alpha,
    lora_r=model_args.lora_r,
    apply_adapter=model_args.apply_adapter,
    adapter_type=model_args.adapter_type,
    adapter_size=model_args.adapter_size,
    reg_loss_wgt=model_args.reg_loss_wgt,
    masking_prob=model_args.masking_prob,
)
```

What does the parameter `cls_dropout` mean? I can't find it in the Hugging Face docs.

QingruZhang commented 1 year ago

Hi, this is the dropout probability for the DeBERTa model, which is applied in this optimized dropout module. The argument is added in this line. Hope this answers your question.
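For context, here is a minimal, hypothetical sketch of how a config-level `cls_dropout` is typically consumed by a DeBERTa-style classification head. The `ClassificationHead` class and the fallback to `hidden_dropout_prob` are illustrative assumptions; `StableDropout` is the optimized dropout module in the Hugging Face DeBERTa implementation, and the exact wiring in this repository's modified `transformers` examples may differ:

```python
# Minimal sketch (not this repo's exact code): how a config-level `cls_dropout`
# typically reaches the classification head of a DeBERTa-style model.
import torch.nn as nn
from transformers.models.deberta.modeling_deberta import StableDropout  # optimized dropout module


class ClassificationHead(nn.Module):
    def __init__(self, config):
        super().__init__()
        # Fall back to the regular hidden dropout when cls_dropout is not set.
        drop_out = getattr(config, "cls_dropout", None)
        drop_out = config.hidden_dropout_prob if drop_out is None else drop_out
        self.dropout = StableDropout(drop_out)
        self.classifier = nn.Linear(config.hidden_size, config.num_labels)

    def forward(self, pooled_output):
        # Apply dropout to the pooled [CLS] representation before the final projection.
        return self.classifier(self.dropout(pooled_output))
```

Passing `cls_dropout=training_args.cls_dropout` to `AutoConfig.from_pretrained` simply attaches the attribute to the config object, so the classification head can read it later as sketched above.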

fxb392 commented 1 year ago

OK, thank you.