Open BAI-721 opened 2 years ago
@BAI-721 This looks like a customized nn module. You can implement a replacement function and put it in the replace_module dict:
https://github.com/microsoft/nni/blob/2499be70fcf6cd74e50813a6a3dfd945a363feb6/nni/compression/pytorch/speedup/compress_modules.py#L11
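To illustrate the suggestion, here is a minimal, self-contained sketch of the registration pattern: the speedup pass looks up a module's type name in a dict and calls the registered function to build a compressed replacement. The names below (replace_module, replace_mymodule, MyCustomModule) mirror the linked compress_modules.py but are stand-ins, not the exact NNI API; the shape of masks in particular is an assumption.

```python
# Hypothetical sketch of the replace_module registration pattern used by
# nni/compression/pytorch/speedup/compress_modules.py. The dict and the
# mask representation (a list of kept indices) are assumptions for
# illustration, not the exact NNI signatures.

class MyCustomModule:
    """Stand-in for a custom nn.Module with an out_features attribute."""
    def __init__(self, out_features):
        self.out_features = out_features

def replace_mymodule(module, masks):
    # Build a smaller module sized by how many output channels the
    # pruning masks keep.
    kept = len(masks)
    return MyCustomModule(kept)

# The speedup code looks up the module's type name in this dict and calls
# the registered function to build the compressed replacement.
replace_module = {
    'MyCustomModule': lambda module, masks: replace_mymodule(module, masks),
}

old = MyCustomModule(out_features=64)
new = replace_module[type(old).__name__](old, masks=[0, 3, 7])
print(new.out_features)  # → 3
```

In real NNI code the replacement function would copy the surviving weights into the new, smaller module before returning it.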
@BAI-721 - does the above suggestion work for you? We will close the issue if there are no further questions, thanks.
Dear maintainers,
First of all, thank you very much for answering a question that had confused me for a long time. Now I have another problem, as shown in Fig. 1.
I set aten::sum as add_python in the jit_translate file, but the problem in Fig. 2 still occurs.
Looking forward to your reply. Thanks.
Bai
Hello @BAI-721, I can't see your figures; it seems they were not uploaded...
@BAI-721 could you upload the figures again? Thanks.
I use the FPGM method to prune the FPN network structure, with dummy_input = torch.ones(3, 3, 64, 64).to(device) or dummy_input = torch.randn([3, 3, 64, 64]).to(device). I set aten::sum as add_python in the jit_translate file. However, a new problem occurs:

Traceback (most recent call last):
  File "E:/pythonProject/NNIprune.py", line 89, in <module>
    m_speedup.speedup_model()
  File "E:\Anaconda\lib\site-packages\nni\compression\pytorch\speedup\compressor.py", line 507, in speedup_model
    self.infer_modules_masks()
  File "E:\Anaconda\lib\site-packages\nni\compression\pytorch\speedup\compressor.py", line 353, in infer_modules_masks
    self.update_direct_sparsity(curnode)
  File "E:\Anaconda\lib\site-packages\nni\compression\pytorch\speedup\compressor.py", line 215, in update_direct_sparsity
    func, dummy_input, in_masks, in_constants=in_constants, batch_dim=self.batch_dim)
  File "E:\Anaconda\lib\site-packages\nni\compression\pytorch\speedup\infer_mask.py", line 80, in __init__
    self.output = self.module(dummy_input)
TypeError: add() received an invalid combination of arguments - got (Tensor), but expected (Tensor input, Tensor other, *, Number alpha, Tensor out)

Looking forward to your reply.
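The TypeError reported above suggests the translated add operation is being invoked with a single tensor, while torch.add expects two operands (input, other, and keyword-only alpha). A stdlib-only sketch of the same call-shape mismatch, with a mock add function that mirrors the signature in the error message (the mock and module_forward are assumptions for illustration, not NNI or PyTorch code):

```python
# Stdlib-only reproduction of the call-shape mismatch behind the TypeError.
# The mock 'add' mirrors the reported signature add(input, other, *, alpha);
# the translated module forwards only one positional argument, so the second
# operand is missing at call time.

def add(input, other, *, alpha=1):
    return input + alpha * other

def module_forward(dummy_input):
    # Only one operand is forwarded -- the second was lost in translation.
    return add(dummy_input)

try:
    module_forward(3.0)
except TypeError as e:
    print("TypeError:", e)
```

This points at the jit_translate mapping for the op: the translation needs to capture and pass both operands (and alpha, if present), not just the first input.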
Describe the issue:

/usr/local/lib/python3.7/dist-packages/nni/compression/pytorch/speedup/compressor.py in replace_submodule(self, unique_name, reindex_dim, reindex)
    441     if not m_type in replace_module:
    442         raise RuntimeError(
--> 443             "Has not supported replacing the module: {}".format(m_type))
    444     _logger.info("replace module (name: %s, op_type: %s)",
    445                  g_node.name, m_type)

RuntimeError: Has not supported replacing the module: rSoftMax
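The RuntimeError says no replacement function is registered for rSoftMax. Since rSoftMax (the softmax used in ResNeSt's split-attention block) only reshapes and applies softmax and holds no prunable weights, one plausible workaround, following the maintainer's replace_module suggestion, is to register an identity replacement. The sketch below uses a stand-in dict and class; in NNI the registry lives in nni/compression/pytorch/speedup/compress_modules.py, and whether an identity replacement is sufficient for your model is an assumption to verify.

```python
# Hypothetical workaround: register a replacement function for 'rSoftMax'
# that returns the module unchanged, since it has no parameters to shrink.
# replace_module here is a stand-in for NNI's registry dict.

replace_module = {}  # stand-in for the registry in compress_modules.py

def no_replace(module, masks):
    """Keep the module as-is; it has no prunable parameters."""
    return module

replace_module['rSoftMax'] = no_replace

class rSoftMax:  # stand-in for the real ResNeSt module
    pass

m = rSoftMax()
assert replace_module[type(m).__name__](m, masks=None) is m
print("rSoftMax kept unchanged")
```

If the surrounding convolutions are pruned, the channel counts flowing through rSoftMax change, so you may instead need a replacement that adjusts its radix/cardinality bookkeeping rather than a pure identity.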
Environment: PyTorch
Configuration:
Log message:
How to reproduce it?: