Closed wmkai closed 2 years ago
I tried to change `samples_per_gpu` from 2 to 1 in the config file, but the elapsed time in the inference log barely changes. Am I doing something wrong?
The change in elapsed time does not precisely indicate the batch size per GPU. It would be better to print the batch size during testing to confirm it.
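For example, a minimal sketch of how you could check the effective per-GPU batch size by inspecting one batch from the test `DataLoader` (this is a hypothetical standalone snippet using plain PyTorch, not the exact mmdetection test script; `samples_per_gpu` here is just a local variable mirroring the config value):

```python
# Hedged sketch: print the actual per-GPU batch size seen at test time.
# The dataset and samples_per_gpu value are illustrative assumptions.
import torch
from torch.utils.data import DataLoader, TensorDataset

samples_per_gpu = 1  # the value set in the config file
dataset = TensorDataset(torch.zeros(8, 3, 32, 32))  # dummy test images

loader = DataLoader(dataset, batch_size=samples_per_gpu)
first_batch = next(iter(loader))[0]
print("batch size per GPU:", first_batch.shape[0])  # should equal samples_per_gpu
```

If the printed value does not match the config, the config file being edited may not be the one actually loaded by the test command.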
I see, thanks a lot.