Hi there! It seems that the positional encoding in SAM is too complex for torch_pruning to handle.
https://github.com/czg1225/SlimSAM/issues/10
In fact, there is an infinite loop in the `_fix_dependency_graph_non_recursive` function. Is there any solution other than removing the positional embedding? Or how can I analyze the cause of this problem? Looking forward to your reply!