PigBroA opened this issue 1 year ago
It's really weird, and I don't know why at the moment. Usually, I use the first command to run inference on a single image.
Hi
I tried to run inference with the segmentation model (mask2former_beit_adapter_large_896_80k_cityscapes_ss) and noticed something weird.
(1) Run image_demo.py as CUDA_VISIBLE_DEVICES=1 python image_demo.py configs/cityscapes/mask2former_beit_adapter_large_896_80k_cityscapes_ss.py released/mask2former_beit_adapter_large_896_80k_cityscapes.pth.tar data/sample.png
(2) Run image_demo.py as python image_demo.py configs/cityscapes/mask2former_beit_adapter_large_896_80k_cityscapes_ss.py released/mask2former_beit_adapter_large_896_80k_cityscapes.pth.tar data/sample.png --device cuda:1
There are two kinds of command; both are meant to run inference on CUDA device 1, but their outputs are different.
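For what it's worth, the two commands address the GPU differently: `CUDA_VISIBLE_DEVICES=1` masks the visible devices so that `cuda:0` inside the process is physical GPU 1, while `--device cuda:1` with no mask addresses physical GPU 1 directly. Both should land on the same physical GPU, which can be checked with a small sketch of the remapping (the `physical_gpu` helper below is hypothetical, written only to illustrate how CUDA's index remapping works):

```python
import os

def physical_gpu(logical_index, visible_devices=None):
    """Map a logical CUDA device index (as seen inside the process) to the
    physical GPU index, given a CUDA_VISIBLE_DEVICES-style string.
    An empty/None mask means no remapping (logical == physical)."""
    mask = (visible_devices if visible_devices is not None
            else os.environ.get("CUDA_VISIBLE_DEVICES", ""))
    if not mask:
        return logical_index
    visible = [int(x) for x in mask.split(",")]
    if logical_index >= len(visible):
        raise IndexError("device index out of range for the visible set")
    return visible[logical_index]

# Command (1): CUDA_VISIBLE_DEVICES=1, script defaults to cuda:0
print(physical_gpu(0, "1"))  # physical GPU 1

# Command (2): no mask, --device cuda:1
print(physical_gpu(1, ""))   # also physical GPU 1
```

Since both commands resolve to the same physical GPU, the diverging outputs are more likely caused by something else (e.g. nondeterministic CUDA kernels), but that's only a guess.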
Here is (1) output
Here is (2) output
Please let me know the reason if you know what the problem is.
Thank you
@PigBroA
May I ask whether your Python version is 3.7.11, your torch version is 1.9.0, and your mmcv version is 1.4.2? Thank you!
@InterstellarFang I can't remember exactly, but I think I used higher versions than the ones you mention. Thanks. (Maybe Python was 3.8.15, torch was 1.9.1, and mmcv was 1.7.0, but I can't be sure.)