Open · grigorn opened this issue 10 months ago
Hi. I am trying to prune MViTv2 with random pruning with the code below. The forward pass after pruning does not work; I think num_heads and the pruned dimensions are not being calculated and updated correctly. Comparing num_heads before and after pruning, layers with 1 head became layers with 0 heads, and some number has also been subtracted from out_features.

torch_pruning 1.3.2
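The reporter's script and the printed num_heads lists were attached to the original issue and are not reproduced here. Below is a minimal stand-in sketch of the reproduction, assuming torchvision's video `mvit_v2_s` and Torch-Pruning's `MetaPruner` with `RandomImportance`; the pruning ratio and ignored layers are illustrative assumptions, not the reporter's exact settings:

```python
import torch
import torch_pruning as tp
from torchvision.models.video import mvit_v2_s

# Stand-in for the attached script; exact arguments are assumptions.
model = mvit_v2_s()  # random init is enough to reproduce the shape issue
example_inputs = torch.randn(1, 3, 16, 224, 224)  # (B, C, T, H, W) video clip

pruner = tp.pruner.MetaPruner(
    model,
    example_inputs,
    importance=tp.importance.RandomImportance(),  # random pruning
    pruning_ratio=0.5,
    ignored_layers=[model.head],  # keep the classifier output intact
)

def show_num_heads(net):
    # Print the head count of every attention module that exposes one.
    for name, m in net.named_modules():
        if hasattr(m, "num_heads"):
            print(name, m.num_heads)

show_num_heads(model)  # initial num_heads per attention block
pruner.step()
show_num_heads(model)  # the reporter observed 1-head layers ending up with 0 heads
model(example_inputs)  # the forward pass fails here
```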
Hi @grigorn. Thanks for the issue and example. I will take a look at this bug after the CVPR deadline.

Hi @VainF, did you look at this issue?

Hi @VainF, we are also facing a similar issue with models whose attention_head_dim is any number greater than 1. The pruned dimensions do not work in the forward pass for Attention blocks.
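Not a confirmed fix for this issue, but Torch-Pruning's transformer examples (e.g. prune_timm_vit.py) handle custom attention by passing a num_heads mapping to `MetaPruner`, so that qkv channels are pruned in whole-head groups, and by writing the resulting head counts back onto the modules afterwards. A sketch of that approach adapted to the snippet above; the `qkv`, `num_heads`, and `head_dim` attribute names on the MViTv2 attention modules are assumptions:

```python
# Map each packed qkv Linear to its head count so the pruner groups channels per head.
head_counts = {}
for m in model.modules():
    if hasattr(m, "num_heads") and hasattr(m, "qkv"):  # assumed attribute names
        head_counts[m.qkv] = m.num_heads

pruner = tp.pruner.MetaPruner(
    model,
    example_inputs,
    importance=tp.importance.RandomImportance(),
    pruning_ratio=0.5,
    ignored_layers=[model.head],
    num_heads=head_counts,   # prune qkv channels head-by-head
    prune_num_heads=False,   # keep the number of heads, shrink each head's dim
    prune_head_dims=True,
)
pruner.step()

# The pruner tracks the new head counts but does not touch module attributes,
# so sync them back before running a forward pass.
for m in model.modules():
    if hasattr(m, "num_heads") and hasattr(m, "qkv"):
        m.num_heads = pruner.num_heads[m.qkv]
        if hasattr(m, "head_dim"):
            m.head_dim = m.qkv.out_features // (3 * m.num_heads)
```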