hustvl / TopFormer

TopFormer: Token Pyramid Transformer for Mobile Semantic Segmentation, CVPR2022

Is batchsize>1 supported in convert2onnx.py file? #13

Open tensorflowt opened 2 years ago

tensorflowt commented 2 years ago

Hi! Thank you very much for open-sourcing this project, it's great! I ran into a problem while verifying its performance. I want to test with batchsize=2, so I tried to convert the .pth model to an ONNX model. I set the batch size in the script (screenshot) and ran the conversion script (screenshot), but the generated model still has batchsize=1 (screenshot). What do I need to do to export an ONNX model with batchsize=2? Thank you very much, looking forward to your reply!
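
For reference, a minimal sketch of how a batch size other than 1 (or a dynamic batch dimension) can be exported with plain `torch.onnx.export`. This is not the repo's `convert2onnx.py`; the model here is a stand-in placeholder, and the input shape, names, and opset are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Stand-in module; in practice this would be the TopFormer model
# loaded from its .pth checkpoint (hypothetical placeholder here).
model = nn.Conv2d(3, 19, kernel_size=1)
model.eval()

# Tracing with a batch-2 dummy input bakes batch size 2 into the graph.
dummy = torch.randn(2, 3, 512, 512)  # (batch, channels, height, width)

# dynamic_axes marks the batch dimension as symbolic, so the exported
# model accepts any batch size at inference time regardless of the
# dummy input's batch size.
torch.onnx.export(
    model,
    dummy,
    "model_dynamic_batch.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```

If the export script builds its own dummy input internally, changing only a config value may not be enough; the dummy tensor passed to the exporter (or the `dynamic_axes` argument) is what determines the batch dimension of the resulting ONNX graph.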

speedinghzl commented 2 years ago

Your question is mainly about the MMSegmentation framework and ONNX export. It would be better to raise an issue in the corresponding repositories.