QingruZhang / AdaLoRA

AdaLoRA: Adaptive Budget Allocation for Parameter-Efficient Fine-Tuning (ICLR 2023).
MIT License

TypeError: __init__() got an unexpected keyword argument 'rankallocator' #4

Closed fxb392 closed 1 year ago

fxb392 commented 1 year ago

When initializing the QuestionAnsweringTrainer, I encountered this error. Why doesn't the __init__() method of QuestionAnsweringTrainer have a rankallocator parameter?

Error: TypeError: __init__() got an unexpected keyword argument 'rankallocator'

trainer = QuestionAnsweringTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    eval_examples=eval_examples if training_args.do_eval else None,
    tokenizer=tokenizer,
    data_collator=data_collator,
    post_process_function=post_processing_function,
    compute_metrics=compute_metrics,
    rankallocator=rankallocator,
    model_args=model_args,
    tb_writter=tb_writter,
)

    class QuestionAnsweringTrainer(Trainer):
        def __init__(self, *args, eval_examples=None, post_process_function=None, **kwargs):
            super().__init__(*args, **kwargs)
            self.eval_examples = eval_examples
            self.post_process_function = post_process_function
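
The TypeError comes from super().__init__(*args, **kwargs): the stock transformers Trainer does not define rankallocator, model_args, or tb_writter, so they fall through **kwargs and are rejected. Below is a minimal sketch of a subclass that consumes these extra arguments itself; this is a hypothetical workaround, not the repo's code (the repo's revised trainer, mentioned below, accepts these arguments directly):

    from transformers import Trainer

    class QuestionAnsweringTrainer(Trainer):
        def __init__(self, *args, eval_examples=None, post_process_function=None,
                     rankallocator=None, model_args=None, tb_writter=None, **kwargs):
            # Consume the extra arguments here so only stock Trainer kwargs reach super().
            super().__init__(*args, **kwargs)
            self.eval_examples = eval_examples
            self.post_process_function = post_process_function
            # Hypothetical attribute names; the AdaLoRA-modified Trainer stores these itself.
            self.rankallocator = rankallocator
            self.model_args = model_args
            self.tb_writter = tb_writter
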
fxb392 commented 1 year ago

@QingruZhang The model_args and tb_writter arguments have the same problem.

    rankallocator=rankallocator,
    model_args=model_args,
    tb_writter=tb_writter,
QingruZhang commented 1 year ago

Hi, please follow the instructions here to install the dependencies and the updated transformers, which contains the revised trainer definition. Then you can use the scripts provided in the scripts folder to run QA tasks. Hope this answers your question.

fxb392 commented 1 year ago

Could you tell me which version of transformers (Trainer) you installed? I have upgraded transformers to the latest version (4.30.2), but I still can't initialize the QuestionAnsweringTrainer. The parent Trainer also doesn't have rankallocator, model_args, or tb_writter.
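
One way to see which transformers is actually being imported (a quick diagnostic sketch, not from the repo): the PyPI release 4.30.2 does not define these arguments, whereas an editable install from the repo resolves to a local path.

    # Diagnostic sketch: confirm which transformers is imported and where it lives.
    import inspect
    import transformers
    from transformers import Trainer

    print(transformers.__version__)   # the stock 4.30.2 release does not accept these kwargs
    print(transformers.__file__)      # after `pip install -e .` this should point into the repo's source tree
    # True only if the installed (revised) Trainer defines rankallocator explicitly:
    print("rankallocator" in inspect.signature(Trainer.__init__).parameters)
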

fxb392 commented 1 year ago

Furthermore, I didn't find the code that adds LoRA to the model.

QingruZhang commented 1 year ago

Hi, LoRA is added by the updated transformers. Thus, you need to install it from source by running cd NLG_QA followed by pip install -e ., as shown in the README there.
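
After reinstalling, a quick sanity check that LoRA weights were actually injected (a hedged sketch: it assumes `model` is the model object built by the QA script and that the injected parameters follow the usual `lora_` naming used by loralib):

    # Sanity-check sketch: list the injected low-rank parameters (assumes `model` exists).
    lora_params = [name for name, _ in model.named_parameters() if "lora_" in name]
    print(f"Found {len(lora_params)} LoRA parameters, e.g. {lora_params[:4]}")
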

fxb392 commented 1 year ago

OK, thank you a lot, I got it.