Open yangmmm opened 3 weeks ago
How can I run inference on multiple GPUs?
Multi-GPU inference is not currently supported; single-GPU inference is sufficient for our datasets.
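That said, since each input is processed independently, a common workaround is to shard the input list and launch one single-GPU inference process per device, each pinned to its own GPU via `CUDA_VISIBLE_DEVICES`. A minimal sketch, assuming a hypothetical single-GPU entry point `infer.py` with an `--input-list` flag (neither is part of this repo's actual CLI):

```python
import os
import subprocess

def shard(items, num_shards):
    """Split items into num_shards contiguous, roughly equal chunks."""
    k, r = divmod(len(items), num_shards)
    shards, start = [], 0
    for i in range(num_shards):
        end = start + k + (1 if i < r else 0)
        shards.append(items[start:end])
        start = end
    return shards

def launch_per_gpu(input_files, num_gpus):
    """Run one inference subprocess per GPU, each on its own data shard."""
    procs = []
    for gpu_id, chunk in enumerate(shard(input_files, num_gpus)):
        # Pin the child process to a single device; inside it,
        # the framework sees only one GPU (cuda:0).
        env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu_id))
        procs.append(subprocess.Popen(
            ["python", "infer.py", "--input-list", ",".join(chunk)],
            env=env,
        ))
    for p in procs:
        p.wait()
```

Each process then merges its outputs afterward; this avoids any code changes to the inference script itself.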