Hi, Thank you so much for your interest!
Yes. This argument is not used. You can simply remove the "if" statement. I updated the code base as well.
Thanks again for letting me know, and feel free to reopen if you find any other bugs. Zixuan
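For reference, a minimal sketch of what dropping the guard looks like. This is illustrative only: the real get_view_for_tsv computes a per-parameter mask, which is replaced here by a placeholder value, and the Namespace is my own stand-in for the repo's args.

```python
# Minimal sketch, assuming a plain argparse.Namespace for args; the mask
# computation is a placeholder and is NOT the repo's actual code.
from argparse import Namespace

args = Namespace(task="asc")  # 'unblock_attention' deliberately absent

def get_view_for_tsv_old(args, n, t):
    # Old version: guarded by an argument that no longer exists, so this
    # raises AttributeError once the argument is deleted from run.py.
    if not args.unblock_attention:
        return 1.0  # placeholder for the real per-parameter mask
    return 1.0

def get_view_for_tsv_fixed(args, n, t):
    # Suggested fix: drop the obsolete "if" and always compute the mask.
    return 1.0  # placeholder for the real per-parameter mask

print(get_view_for_tsv_fixed(args, "bert.encoder.layer.0.attention", 0))  # 1.0
```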
Hi, Thank you for your patience!
It seems that after deleting unblock_attention from args, this bug occurs when running B-CL:

Traceback (most recent call last):
  File "run.py", line 186, in <module>
    appr.train(task,train_dataloader,valid_dataloader,num_train_steps,train,valid)
  File "/content/PyContinual/src/approaches/classification/bert_adapter_capsule_mask.py", line 60, in train
    global_step=self.train_epoch(t,train,iter_bar, optimizer,t_total,global_step)
  File "/content/PyContinual/src/approaches/classification/bert_adapter_capsule_mask.py", line 137, in train_epoch
    p.grad.data*=self.model.get_view_for_tsv(n,t) #open for general
  File "/content/PyContinual/src/networks/classification/bert_adapter_capsule_mask.py", line 193, in get_view_for_tsv
    if not self.args.unblock_attention:
AttributeError: 'Namespace' object has no attribute 'unblock_attention'
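If editing the network file is inconvenient, one workaround (my own suggestion, not the repo's official fix) is to read the flag defensively with getattr, so a missing argument falls back to a default instead of raising:

```python
# Hypothetical defensive variant of the failing check; 'args' here simulates
# the Namespace after 'unblock_attention' has been deleted from run.py.
from argparse import Namespace

args = Namespace()
if not getattr(args, "unblock_attention", False):
    print("flag absent, falling back to the default branch")  # no AttributeError
```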