Closed 25b3nk closed 2 years ago
Sorry, we only run single-GPU inference in our research. If you want to run multi-GPU inference in your engineering project, you can refer to the training code and implement it yourself.
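For what it's worth, if the demo model is a standard PyTorch `nn.Module`, the simplest way to spread an inference batch across several GPUs is `torch.nn.DataParallel`, which splits dimension 0 of the input across available devices. This is a minimal sketch, not the repo's own code: `nn.Linear` stands in for the real model, and the actual demo may need its checkpoint loaded before wrapping. Note that `DataParallel` only helps when each GPU can hold a full model copy; if the model itself is too large for one GPU, you would need model sharding (e.g. `device_map` style placement or FSDP) instead.

```python
import torch
import torch.nn as nn

# Stand-in for the demo's model; replace with the real network
# and load its checkpoint before wrapping.
model = nn.Linear(8, 2)
if torch.cuda.is_available():
    model = model.cuda()

# DataParallel replicates the model on each visible GPU and splits
# the batch (dim 0) across them; with 0 or 1 GPUs it degrades
# gracefully to a single-device forward pass.
model = nn.DataParallel(model)
model.eval()

x = torch.randn(4, 8)
if torch.cuda.is_available():
    x = x.cuda()

with torch.no_grad():
    out = model(x)

print(tuple(out.shape))
```

If each GPU holds the whole model but the batch is what overflows memory, halving the per-GPU batch this way (or simply reducing the batch size) is usually enough.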
I am trying to run inference on my machine with two GPUs, but the demo script uses only a single GPU and runs out of memory. Please let me know how to configure the demo script to run on a multi-GPU setup.