ZixuanKe / PyContinual

PyContinual (An Easy and Extendible Framework for Continual Learning)

Bug occurs when you delete unblock_attention in args #4

Closed Si1verBul13tzxc closed 2 years ago

Si1verBul13tzxc commented 2 years ago

It seems that after you delete `unblock_attention` in `args`, this bug occurs when running B-CL:

```
Traceback (most recent call last):
  File "run.py", line 186
    appr.train(task, train_dataloader, valid_dataloader, num_train_steps, train, valid)
  File "/content/PyContinual/src/approaches/classification/bert_adapter_capsule_mask.py", line 60, in train
    global_step = self.train_epoch(t, train, iter_bar, optimizer, t_total, global_step)
  File "/content/PyContinual/src/approaches/classification/bert_adapter_capsule_mask.py", line 137, in train_epoch
    p.grad.data *= self.model.get_view_for_tsv(n, t)  # open for general
  File "/content/PyContinual/src/networks/classification/bert_adapter_capsule_mask.py", line 193, in get_view_for_tsv
    if not self.args.unblock_attention:
AttributeError: 'Namespace' object has no attribute 'unblock_attention'
```

ZixuanKe commented 2 years ago

Hi, thank you so much for your interest!

Yes, this argument is not used. You can simply remove the "if" statement. I have updated the code base as well.
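A minimal sketch of the failure and two ways to avoid it, assuming `args` is the parsed `argparse.Namespace` (the variable names here are illustrative, not the repository's actual code):

```python
from argparse import Namespace

# Stand-in for the parsed command-line arguments after the
# `unblock_attention` flag has been removed from the parser.
args = Namespace(task="asc")

# The original check in get_view_for_tsv did a direct attribute access:
#     if not self.args.unblock_attention:
# which raises AttributeError once the flag no longer exists.

# Option 1 (what the updated code base does): delete the check entirely,
# since the argument is unused.

# Option 2: guard the access with getattr so a missing flag defaults
# to False instead of crashing.
unblock_attention = getattr(args, "unblock_attention", False)
print(unblock_attention)  # -> False
```

Option 1 is cleaner for a dead flag; `getattr` with a default is the usual defensive pattern when an optional argument may or may not be registered.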

Thanks again for letting me know, and feel free to reopen if you find any other bugs. Zixuan

Si1verBul13tzxc commented 2 years ago

Hi, thank you for your patience!