Closed by CVHub520 10 months ago
I think I've identified the issue: it appears the model wasn't downloaded completely.
The inference command should look like this:
bash tools/dist.sh test seg/configs/ovsam/ovsam_coco_rn50x16_point.py 8
where 8 is the number of GPUs to use.
Your issue occurs because Python treats ./tools
as the project root and therefore cannot find the config path. If you prefer your original command, which only supports single-GPU inference, prepend PYTHONPATH=.
to it.
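To illustrate why the PYTHONPATH=. prefix helps, here is a minimal, self-contained sketch in a throwaway demo project (the demo/, pkg/, and run.py names are hypothetical, not part of this repo): a script launched from a tools/ subdirectory gets that subdirectory, not the project root, on sys.path, so root-level packages only resolve once PYTHONPATH=. is added.

```shell
# Hypothetical demo layout (illustrative only, not the repo's files):
# demo/pkg/config.py holds a value; demo/tools/run.py imports it the
# same way the repo's tools/ scripts import seg/configs.
mkdir -p demo/tools demo/pkg
echo "VALUE = 42" > demo/pkg/config.py
cat > demo/tools/run.py <<'EOF'
from pkg.config import VALUE  # resolves only if the project root is on sys.path
print(VALUE)
EOF

# Without PYTHONPATH, Python puts demo/tools (the script's directory) on
# sys.path, so 'pkg' is not found. PYTHONPATH=. adds the current working
# directory (the project root) and the import succeeds.
(cd demo && PYTHONPATH=. python3 tools/run.py)
```

The same pattern applies to the repo: run the command from the project root with PYTHONPATH=. so seg/configs/... resolves.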
That sounds reasonable. Please let me know if you have any other questions.
Hello,
When I attempted to execute the test case using the following command:
I encountered the error below. Could you please guide me on how to resolve it? Any assistance would be greatly appreciated.
Error Details:
Thank you!