czczup / ViT-Adapter

[ICLR 2023 Spotlight] Vision Transformer Adapter for Dense Predictions
https://arxiv.org/abs/2205.08534
Apache License 2.0

Inference in batches and multiple GPU #163

Open OmergottliebAB opened 9 months ago

OmergottliebAB commented 9 months ago
  1. I am trying to run model inference on multiple GPUs, but I get an error about unauthorized access to a GPU. Do I need to configure the repo accordingly, and how do I use `CUDA_VISIBLE_DEVICES`?
  2. Can I run inference on batches of images?
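For the `CUDA_VISIBLE_DEVICES` part of question 1: it is an environment variable read by the CUDA runtime when it initializes, so it must be set before the process (or any CUDA-using import) touches the GPU. A minimal sketch of setting it from Python; the comment about device remapping reflects standard CUDA behavior, not anything specific to this repo:

```python
import os

# Restrict which GPUs the process can see. Must happen before any
# CUDA-using library (e.g. torch) initializes the driver.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"

# After this, a framework like PyTorch enumerates only physical GPUs 0
# and 1, remapped to cuda:0 and cuda:1 inside the process.
print(os.environ["CUDA_VISIBLE_DEVICES"])  # → 0,1
```

Setting the variable on the command line (`CUDA_VISIBLE_DEVICES=0,1 python test.py ...`) is equivalent and avoids import-order pitfalls.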
edoproch commented 9 months ago

I ran inference on multiple GPUs on vast.ai and didn't have problems. The only thing I did was give execute permission to dist_test.sh. You can try either `chmod +x dist_test.sh` or `chmod 777 dist_test.sh`.
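Putting the two suggestions together, a multi-GPU run would look roughly like the snippet below. The config and checkpoint paths are placeholders, and here a stub `dist_test.sh` is created so the commands are runnable anywhere; in the real repo the script already exists and wraps the distributed launcher:

```shell
# Stand-in for the repo's launcher (real repo: skip this line).
printf '#!/bin/sh\necho "launching on $3 GPUs: $CUDA_VISIBLE_DEVICES"\n' > dist_test.sh

# Fix "permission denied" when invoking the launcher directly.
chmod +x dist_test.sh

# Expose only GPUs 0 and 1 to the process, then launch on 2 GPUs.
CUDA_VISIBLE_DEVICES=0,1 ./dist_test.sh configs/example.py checkpoint.pth 2
```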