catalyst-team / catalyst

Accelerated deep learning R&D
https://catalyst-team.com
Apache License 2.0

catalyst.contrib.nn.modules.LAMA: "mask" parameter not used #1332

Closed lainisourgod closed 2 years ago

lainisourgod commented 3 years ago

🐛 Bug report

  1. In TemporalDropLastWrapper: any layer wrapped with it never receives the mask. I guess forward should just pass the mask through to the wrapped net, i.e. x_out = self.net(x, mask). Also, I guess "drop last" means dropping the last step of the net's output, so x = x[:, :-1, :] should go after net(x, mask), not before.
  2. In TemporalAttentionPooling: the mask parameter is not used at all, so the code works incorrectly by not removing PAD values from the input.
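To make the two points concrete, here is a minimal PyTorch sketch of the proposed fixes. The class and parameter names mirror the issue's wording, but the bodies are illustrative assumptions, not the actual catalyst.contrib implementation:

```python
import torch
from torch import nn


class TemporalDropLastWrapper(nn.Module):
    """Point 1 (sketch): forward the mask to the wrapped net, then drop
    the last time step of the *output* rather than of the input."""

    def __init__(self, net: nn.Module):
        super().__init__()
        self.net = net

    def forward(self, x: torch.Tensor, mask: torch.Tensor = None) -> torch.Tensor:
        x_out = self.net(x, mask)   # pass the mask through, as suggested
        return x_out[:, :-1, :]     # drop the last step after the net


class MaskedAttentionPooling(nn.Module):
    """Point 2 (hypothetical sketch): actually use the mask so PAD
    positions receive exactly zero attention weight."""

    def __init__(self, features_in: int):
        super().__init__()
        self.attn = nn.Linear(features_in, 1)

    def forward(self, x: torch.Tensor, mask: torch.Tensor = None) -> torch.Tensor:
        # x: [batch, time, features]; mask: [batch, time], 1 = real, 0 = PAD
        scores = self.attn(x).squeeze(-1)                     # [batch, time]
        if mask is not None:
            # -inf before softmax -> weight of PAD positions is exactly 0
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=1).unsqueeze(-1)  # [batch, time, 1]
        return (weights * x).sum(dim=1)                       # [batch, features]
```

With this, pooling a padded batch gives the same result as pooling only the real time steps, which is the behavior the report says is currently missing.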
github-actions[bot] commented 3 years ago

Hi! Thank you for your contribution! Please re-check all issue template checklists - unfilled issues would be closed automatically. And do not forget to join our slack for collaboration.

Scitator commented 3 years ago

🤔 nice catch, it may be so! that's why this is contrib :) It would be great if you could contribute a hotfix for this part 🚀

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.