Closed wangyidong3 closed 2 years ago
Hello,
Thanks for sharing this great repo. I trained and tested with my own dataset, and the prediction results are quite good.
But when I exported the ONNX file with the command:
python3 tools/convert2onnx.py local_configs/topformer/topformer_tiny_512x512_80k_2x8_drive_inviol1k.py --input-img results/0109.png --shape 512 512 --checkpoint results/tiny_20k/latest.pth --output-file results/tiny_80k/tiny_512_512.onnx --show
then tried the deploy script with the command:
python tools/deploy_test.py local_configs/topformer/topformer_tiny_512x512_80k_2x8_drive_inviol1k.py results/tiny_80k/tiny_512_512.onnx --backend=onnxruntime --show
it shows this error:
  File "tools/deploy_test.py", line 297, in <module>
    main()
  File "tools/deploy_test.py", line 268, in main
    results = single_gpu_test(
  File "/home/yidong/anaconda3/envs/mask2former/lib/python3.8/site-packages/mmsegmentation-0.19.0-py3.8.egg/mmseg/apis/test.py", line 91, in single_gpu_test
    result = model(return_loss=False, **data)
  File "/home/yidong/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/yidong/anaconda3/envs/mask2former/lib/python3.8/site-packages/mmcv/parallel/data_parallel.py", line 42, in forward
    return super().forward(*inputs, **kwargs)
  File "/home/yidong/.local/lib/python3.8/site-packages/torch/nn/parallel/data_parallel.py", line 165, in forward
    return self.module(*inputs[0], **kwargs[0])
  File "/home/yidong/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/yidong/anaconda3/envs/mask2former/lib/python3.8/site-packages/mmcv/runner/fp16_utils.py", line 98, in new_func
    return old_func(*args, **kwargs)
  File "/home/yidong/anaconda3/envs/mask2former/lib/python3.8/site-packages/mmsegmentation-0.19.0-py3.8.egg/mmseg/models/segmentors/base.py", line 110, in forward
    return self.forward_test(img, img_metas, **kwargs)
  File "/home/yidong/anaconda3/envs/mask2former/lib/python3.8/site-packages/mmsegmentation-0.19.0-py3.8.egg/mmseg/models/segmentors/base.py", line 92, in forward_test
    return self.simple_test(imgs[0], img_metas[0], **kwargs)
  File "tools/deploy_test.py", line 84, in simple_test
    self.sess.run_with_iobinding(self.io_binding)
  File "/home/yidong/anaconda3/envs/mask2former/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 276, in run_with_iobinding
    self._sess.run_with_iobinding(iobinding._iobinding, run_options)
RuntimeError: Error in execution: Got invalid dimensions for input: input for the following indices
index: 2 Got: 1080 Expected: 512
index: 3 Got: 1878 Expected: 512
Please fix either the inputs or the model.
Are there any requirements on the input images or shapes when exporting the ONNX file? The config sets img_scale = (1920, 1080) and crop_size = (512, 512).
I also tried some other shape sizes, but I am still confused by this error. Any help would be much appreciated.
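For reference, the dimension check onnxruntime performs is essentially a per-axis comparison of the bound input against the shape declared in the ONNX graph. This is not the actual onnxruntime code, just a self-contained sketch of that validation, reproducing the message from the traceback above:

```python
def validate_input_dims(expected, got):
    """Compare an input's actual dims against the shape declared in the
    ONNX graph. A None in `expected` marks a dynamic axis that accepts
    any size; a fixed integer must match exactly."""
    mismatches = [
        (i, g, e)
        for i, (e, g) in enumerate(zip(expected, got))
        if e is not None and e != g
    ]
    if mismatches:
        detail = " ".join(
            f"index: {i} Got: {g} Expected: {e}" for i, g, e in mismatches
        )
        raise RuntimeError(f"Got invalid dimensions for input: {detail}")


# The model above was exported with a static 1x3x512x512 input, so a
# resized-but-not-cropped 1878x1080 frame is rejected on axes 2 and 3:
try:
    validate_input_dims((1, 3, 512, 512), (1, 3, 1080, 1878))
except RuntimeError as exc:
    print(exc)
```

An ONNX graph exported with `dynamic_axes` would carry `None` (symbolic) dims instead of fixed 512s and accept the full-size frame; a graph exported with a static `--shape` will not.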
We didn't use the 'deploy_test.py' script ourselves, so there was some incompatibility. We have just updated the script and hope it solves your problem.
Just run:
python tools/deploy_test.py local_configs/topformer/topformer_tiny_512x512_80k_2x8_drive_inviol1k.py results/tiny_80k/tiny_512_512.onnx --backend=onnxruntime --show --shape 512 512
Note that the 'data_pipeline' used during deployment is different from the one used during evaluation. If you have any other questions, feel free to ask under this issue.
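To illustrate the difference: during evaluation mmseg typically resizes with the aspect ratio kept (so a 1920x1080 frame stays near native resolution), while the deployment pipeline must force every frame to the exact shape baked into the ONNX graph. The transform names below follow common mmseg config conventions and are illustrative only; the exact pipeline in this repo may differ:

```python
# Evaluation pipeline (sketch): keep-ratio resize toward img_scale.
# Fine for the PyTorch model, but a 1878x1080 tensor will not fit an
# ONNX graph exported with a static 512x512 input.
eval_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='MultiScaleFlipAug',
         img_scale=(1920, 1080),
         transforms=[dict(type='Resize', keep_ratio=True),
                     dict(type='ImageToTensor', keys=['img'])]),
]

# Deployment pipeline (sketch): force every frame to the shape passed
# via --shape at export time, matching the graph's static input dims.
deploy_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='MultiScaleFlipAug',
         img_scale=(512, 512),
         transforms=[dict(type='Resize', keep_ratio=False),
                     dict(type='ImageToTensor', keys=['img'])]),
]
```

This is why passing `--shape 512 512` to `deploy_test.py` matters: it makes the preprocessing agree with the dimensions the exported graph expects.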