VainF / Torch-Pruning

[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
https://arxiv.org/abs/2301.12900
MIT License

Question about positional embedding when pruning SAM model #337

Open tasakim opened 5 months ago

tasakim commented 5 months ago

Hi there! It seems that the positional encoding in SAM is too complex for torch_pruning to trace: https://github.com/czg1225/SlimSAM/issues/10. In fact, tracing enters an infinite loop in the `_fix_dependency_graph_non_recursive` function. Is there any solution other than removing the positional embedding? Or how can I analyze the cause of this problem? Looking forward to your reply!
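(Not an official fix, just a possible workaround sketch.) Since SAM's image encoder stores its positional embedding as a plain `nn.Parameter` rather than a module, one option is to leave it out of the dependency graph entirely and slice it by hand to match whichever channel indices the pruner kept. The shapes and index choice below are illustrative assumptions (SAM ViT-B uses a `(1, 64, 64, 768)` pos_embed), not taken from the issue:

```python
import torch

# Illustrative stand-in for SAM ViT-B's learnable positional embedding,
# shape (1, H, W, C) with C = embedding dim.
pos_embed = torch.nn.Parameter(torch.randn(1, 64, 64, 768))

# Hypothetical example: suppose the pruner kept every other channel,
# halving the embedding dim from 768 to 384.
kept_idx = torch.arange(0, 768, 2)

# Slice the parameter along the channel dim to match the pruned width,
# instead of letting the dependency graph try to trace it.
pruned_pos_embed = torch.nn.Parameter(pos_embed.data[..., kept_idx].clone())
print(pruned_pos_embed.shape)  # torch.Size([1, 64, 64, 384])
```

In practice `kept_idx` would come from the pruner's actual channel selection for the patch-embedding layer, so the positional embedding stays consistent with the pruned token width.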

Sarthak-22 commented 2 days ago

Hi, did you find a solution to this problem?