QingruZhang / AdaLoRA

AdaLoRA: Adaptive Budget Allocation for Parameter-Efficient Fine-Tuning (ICLR 2023).
MIT License

TypeError: unsupported operand type(s) for *: 'Parameter' and 'NoneType' #21

Open · tf2bb opened this issue 5 months ago

tf2bb commented 5 months ago

Hi, thanks for this awesome work! When I ported it to my own model, the following error occurred:

    Traceback (most recent call last):
      File "/opt/data/private/Pancreas/bd_dice_sam-med2d_6_0.9236_code/train.py", line 509, in <module>
        main(args)
      File "/opt/data/private/Pancreas/bd_dice_sam-med2d_6_0.9236_code/train.py", line 435, in main
        train_losses, train_iter_metrics = train_one_epoch(args, model, optimizer, train_loader, epoch, criterion, trainset, criterion_CE, rankallocator, compute_orth_regu)
      File "/opt/data/private/Pancreas/bd_dice_sam-med2d_6_0.9236_code/train.py", line 189, in train_one_epoch
        rankallocator.update_and_mask(model, epoch)
      File "/opt/data/private/Pancreas/AdaLoRA-main/loralib/loralib/adalora.py", line 313, in update_and_mask
        self.update_ipt(model)
      File "/opt/data/private/Pancreas/AdaLoRA-main/loralib/loralib/adalora.py", line 221, in update_ipt
        self.ipt[n] = (p * p.grad).abs().detach()
    TypeError: unsupported operand type(s) for *: 'Parameter' and 'NoneType'
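For context, the TypeError means p.grad is None for one of the lora_ parameters at the moment update_ipt reads it. This typically happens when update_and_mask is called before loss.backward() has populated the gradients, or after they have been cleared (for example by optimizer.zero_grad(set_to_none=True)). Below is a minimal sketch of the intended call order, loosely following the RankAllocator example in this repo's README; model, criterion, train_loader, and optimizer are placeholders taken from the traceback above, and the hyperparameter values are illustrative only:

    from loralib import RankAllocator, compute_orth_regu

    # Sketch only: assumes `model` already wraps its target layers with
    # loralib's AdaLoRA modules and only LoRA parameters are trainable.
    rankallocator = RankAllocator(
        model, lora_r=12, target_rank=8,
        init_warmup=500, final_warmup=1500, mask_interval=10,
        beta1=0.85, beta2=0.85, total_step=3000,
    )

    global_step = 0
    for inputs, targets in train_loader:
        loss = criterion(model(inputs), targets)
        # Add the orthogonality regularization on the LoRA factors.
        (loss + compute_orth_regu(model, regu_weight=0.1)).backward()
        optimizer.step()
        # update_and_mask reads p.grad, so it must run after backward()
        # and before the gradients are zeroed for the next step.
        rankallocator.update_and_mask(model, global_step)
        optimizer.zero_grad()
        global_step += 1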

konioy commented 3 months ago

Hello, I also encountered the same issue. How did you solve it?

konioy commented 3 months ago

I solved the problem:

    for n, p in model.named_parameters():
        if "lora_" in n:
            p.retain_grad()

But the ipt, exp_avg_ipt, and exp_avg_unc statistics are all 0.
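Note that p.retain_grad() only has an effect on non-leaf tensors; an ordinary nn.Parameter leaf keeps its .grad by default. So if ipt, exp_avg_ipt, and exp_avg_unc stay at 0, the LoRA parameters are most likely still receiving no gradient, e.g. because they are frozen or not part of the computation graph of the loss. A small diagnostic sketch (my own, not part of loralib) that can be run once right after loss.backward():

    # Diagnostic: report the gradient status of every LoRA parameter.
    for n, p in model.named_parameters():
        if "lora_" in n:
            if not p.requires_grad:
                print(f"{n}: requires_grad=False (frozen, never gets a grad)")
            elif p.grad is None:
                print(f"{n}: grad is None (not reached by backward())")
            else:
                print(f"{n}: grad norm = {p.grad.norm().item():.3e}")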

QingruZhang commented 3 months ago

Thanks for your update! We will fix this error.