megvii-model / CrowdDetection


Failed to request multiple GPUs during inference #6

Closed zehuichen123 closed 4 years ago

zehuichen123 commented 4 years ago

test.py provides an API for multi-GPU testing. However, when I set -d 4, the program seems to request memory only on GPU 0, which leads to an OOM error.

11 16:42:50[mgb] ERR cudaMalloc failed while requesting 57933824 bytes (55.250MiB) of memory; error: out of memory(last_err=2(out of memory) device=0 mem_free=29.312MiB mem_tot=24220.312MiB)
11 16:42:50[mgb] could not allocate memory on device 0; try to gather free blocks from child streams, got 0.00MiB(0 bytes).

Have you encountered this problem before?

zehuichen123 commented 4 years ago

I solved it by setting os.environ in each worker process.
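
For reference, here is a minimal sketch of that kind of fix, assuming one worker process is spawned per GPU. The worker function and the use of CUDA_VISIBLE_DEVICES are illustrative assumptions, not the actual test.py code; the key point is that the environment variable must be set before the framework creates a CUDA context in that process.

```python
# Sketch: pin each worker process to a single GPU by setting
# CUDA_VISIBLE_DEVICES in os.environ before any CUDA context is created.
import os
import multiprocessing as mp


def worker(gpu_id, num_gpus):
    # Must run before the framework initializes CUDA in this process,
    # so device 0 inside the process maps to the intended physical GPU.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    # ... build the model and run this worker's share of the test set ...
    print(f"worker {gpu_id}/{num_gpus} bound to physical GPU {gpu_id}")


if __name__ == "__main__":
    num_gpus = 4  # e.g. what -d 4 is meant to request
    procs = [mp.Process(target=worker, args=(i, num_gpus)) for i in range(num_gpus)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

With this, each process only sees its assigned GPU, so allocations no longer pile up on device 0.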