crawlingcub opened this issue 3 years ago
Hello @crawlingcub, this happens because speedup does not support aten::cat in v2.3. We have refactored speedup on the master branch; you can give that a try.
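For context, here is a minimal pure-Python sketch (not NNI's actual implementation) of what backward shape inference through a concatenation has to do: the pruning mask on the cat output can only be split back onto the inputs if the channel width each input contributed is known. An op without such a registered handler is exactly what triggers the RuntimeError above. The branch widths below are the ones InceptionV3's Mixed_5b block would produce, used here purely for illustration.

```python
def split_cat_mask(out_mask, input_widths):
    """Split a channel mask of a concat output back onto its inputs.

    out_mask     -- list of 0/1 flags, one per concatenated output channel
    input_widths -- number of channels each cat input contributed, in order
    """
    assert len(out_mask) == sum(input_widths), "mask/width mismatch"
    masks, start = [], 0
    for width in input_widths:
        # Each input receives the contiguous slice of the output mask
        # that corresponds to the channels it contributed.
        masks.append(out_mask[start:start + width])
        start += width
    return masks

# Mixed_5b concatenates four branches (64 + 64 + 96 + 32 = 256 channels).
branch_widths = [64, 64, 96, 32]
out_mask = [1] * 256
out_mask[0] = 0  # pretend channel 0 of the first branch was pruned
per_branch = split_cat_mask(out_mask, branch_widths)
```

Without the per-input widths (metadata the speedup pass has to record per op type), the split is ambiguous, which is why unsupported ops fail rather than guess.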
Describe the issue:
Running autocompress with the level pruner on an InceptionV3 model leads to this error:
RuntimeError: Has not supported infering input shape from output shape for module/function: aten::cat, Mixed_5b.aten::cat.222
Config used:
[{'sparsity': 0.6656322824445615, 'op_types': ['Linear']}, {'sparsity': 0.6198551248545695, 'op_types': ['Conv2d']}]
Error log:
Traceback (most recent call last):
  ...
  File "/home/ubuntu/projects/mltesting/fuzzer/compression/nni_lib.py", line 128, in init_pruner
    pruner.compress()
  File "/home/ubuntu/anaconda3/envs/pytorch_latest_p37/lib/python3.7/site-packages/nni/algorithms/compression/pytorch/pruning/auto_compress_pruner.py", line 212, in compress
    m_speedup.speedup_model()
  File "/home/ubuntu/anaconda3/envs/pytorch_latest_p37/lib/python3.7/site-packages/nni/compression/pytorch/speedup/compressor.py", line 183, in speedup_model
    self.infer_modules_masks()
  File "/home/ubuntu/anaconda3/envs/pytorch_latest_p37/lib/python3.7/site-packages/nni/compression/pytorch/speedup/compressor.py", line 140, in infer_modules_masks
    self.infer_module_mask(module_name, None, mask=mask)
  File "/home/ubuntu/anaconda3/envs/pytorch_latest_p37/lib/python3.7/site-packages/nni/compression/pytorch/speedup/compressor.py", line 120, in infer_module_mask
    self.infer_module_mask(_module_name, module_name, out_shape=input_cmask)
  File "/home/ubuntu/anaconda3/envs/pytorch_latest_p37/lib/python3.7/site-packages/nni/compression/pytorch/speedup/compressor.py", line 111, in infer_module_mask
    .format(m_type, module_name))
RuntimeError: Has not supported infering input shape from output shape for module/function: `aten::cat`, Mixed_5b.aten::cat.222
Environment:
- NNI version: 2.3
- Training service (local|remote|pai|aml|etc.): local
- Client OS: Ubuntu 18.04
- Python version: 3.7
- PyTorch/TensorFlow version: PyTorch 1.8.1
- Using conda/virtualenv/venv?: conda
- Running in Docker?: no
Let me know if you need more details/the full error log.

Hello, I have also been pruning InceptionV3 recently, using the AMC pruning algorithm, but my program always gets killed. Did your autocompress-based pruning succeed? If convenient, could I add you to discuss?
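As a side note on the config above: a config_list like this is just a list of plain dictionaries, and malformed entries are a common source of confusing pruner failures. A small hypothetical sanity check (not part of NNI) that could be run before handing the list to a pruner:

```python
def validate_config_list(config_list):
    """Check that each entry has a sparsity in (0, 1) and non-empty op_types.

    Hypothetical helper for illustration only -- NNI performs its own,
    more thorough validation internally.
    """
    for cfg in config_list:
        assert 0.0 < cfg['sparsity'] < 1.0, f"bad sparsity: {cfg['sparsity']}"
        assert cfg.get('op_types'), "each entry needs a non-empty 'op_types'"
    return True

# The config from this issue: prune Linear and Conv2d layers at the
# sparsities the auto-compress search proposed.
config_list = [
    {'sparsity': 0.6656322824445615, 'op_types': ['Linear']},
    {'sparsity': 0.6198551248545695, 'op_types': ['Conv2d']},
]
assert validate_config_list(config_list)
```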
Hello, my friend, I am also pruning InceptionV3. If possible, I would like to add you on WeChat or QQ to ask some questions. Thank you, and I look forward to your reply.
Hi @dingguodong-826 - you can join the wechat group from https://github.com/microsoft/nni.