Open Luo-Z13 opened 1 year ago
@zytx121
Hi @Luo-Z13, thank you for reporting this bug. It may have been caused by the GPU not being selected successfully.
You can try
CUDA_VISIBLE_DEVICES=3 python demo/image_demo.py
to specify the 4th GPU (physical GPU 3).
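For context on why this works: CUDA_VISIBLE_DEVICES masks and renumbers the GPUs a process can see, so with CUDA_VISIBLE_DEVICES=3 the process sees a single device, cuda:0, which is physical GPU 3. The helper below is a minimal illustrative sketch (not part of mmrotate or PyTorch) of how a logical device index maps back to a physical GPU id:

```python
import os

def physical_gpu(logical_index, visible=None):
    """Map a logical CUDA device index (as seen inside the process)
    back to the physical GPU id, given a CUDA_VISIBLE_DEVICES-style
    string. Purely illustrative; real mapping is done by the driver."""
    if visible is None:
        visible = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    ids = [int(i) for i in visible.split(",") if i.strip()]
    if not ids:
        # No mask set: logical and physical indices coincide.
        return logical_index
    return ids[logical_index]

# With CUDA_VISIBLE_DEVICES=3 the process sees one GPU, cuda:0,
# which is physical GPU 3:
print(physical_gpu(0, visible="3"))    # → 3
print(physical_gpu(1, visible="1,3"))  # → 3
```

This is also why keeping the script's default of 'cuda:0' and restricting visibility with the environment variable is usually safer than hard-coding 'cuda:3' in the code.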
Thanks, it works.
Prerequisite
Task
I'm using the official example scripts/configs for the officially supported tasks/models/datasets.
Branch
1.x branch https://github.com/open-mmlab/mmrotate/tree/1.x
Environment
I followed the tutorial (https://mmrotate.readthedocs.io/en/1.x/get_started.html) step by step to perform testing. During inference validation, I changed the line
parser.add_argument('--device', default='cuda:0', help='Device used for inference')
to
parser.add_argument('--device', default='cuda:3', help='Device used for inference')
While the former produced normal prediction results for demo.jpg, the modified device setting produced empty output. What could be the reason for this?
Reproduces the problem - code sample
Reproduces the problem - command or script
python demo/image_demo.py \
    demo/demo.jpg \
    oriented-rcnn-le90_r50_fpn_1x_dota.py \
    oriented_rcnn_r50_fpn_1x_dota_le90-6d2b2ce0.pth \
    --out-file result.jpg > demo/demo_img_cuda3.txt
Reproduces the problem - error message
Contents of demo_img_cuda3.txt:
Loads checkpoint by local backend from path: oriented_rcnn_r50_fpn_1x_dota_le90-6d2b2ce0.pth
05/26 20:44:19 - mmengine - WARNING - Visualizer backend is not initialized because save_dir is None.
result: <DetDataSample() at 0x7f2d32826790>
done!
Additional information
I only used image_demo.py for inference. I just changed the device from 'cuda:0' to 'cuda:3', and then the result is empty; I get a normal result when using 'cpu'.