I am trying to run inference with the model on multiple GPUs, but I get an "unauthorized access to GPU" error. Do I need to configure the repo accordingly, and how do I use CUDA_VISIBLE_DEVICES?
I ran inference on multiple GPUs on vast.ai and didn't have problems. The only thing I did was make dist_test.sh executable.
You can try either `chmod +x dist_test.sh` or `chmod 777 dist_test.sh` (the `+x` form is the safer choice, since 777 grants write access to everyone).
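To show how the two pieces fit together, here is a minimal sketch: `CUDA_VISIBLE_DEVICES` is just an environment variable that CUDA reads at initialization, so you set it before launching the script. The `dist_test.sh` below is a stand-in echo script created for demonstration only, not the real launcher from the repo.

```shell
# Create a stand-in for dist_test.sh (for demonstration only; in the
# real repo this file already exists and launches distributed testing).
printf '#!/bin/sh\necho "Using GPUs: $CUDA_VISIBLE_DEVICES"\n' > dist_test.sh

# Make the script executable, as suggested above.
chmod +x dist_test.sh

# Expose only GPUs 0 and 1 to the process; CUDA will see them
# renumbered as devices 0 and 1 inside the process.
CUDA_VISIBLE_DEVICES=0,1 ./dist_test.sh
```

Note that `CUDA_VISIBLE_DEVICES` must be set in the environment of the launched process (prefix it on the command line or `export` it first); setting it after CUDA has initialized has no effect.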