unsky / FPN

Feature Pyramid Networks for Object Detection

I want to know the GPU memory usage? #8

Closed 10183308 closed 6 years ago

10183308 commented 7 years ago

I set the batch size to 320, but the GPU memory usage is 4417 MiB.

+------------------------------------------------------+
| NVIDIA-SMI 352.39     Driver Version: 352.39         |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX TIT...  Off  | 0000:02:00.0     Off |                  N/A |
| 22%   38C    P8    15W / 250W |    154MiB / 12287MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX TIT...  Off  | 0000:03:00.0      On |                  N/A |
| 30%   71C    P2    97W / 250W |   4481MiB / 12279MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0     20213    C   python                                         128MiB |
|    1      8878    G   /usr/bin/X                                      36MiB |
|    1     20213    C   python                                        4417MiB |
+-----------------------------------------------------------------------------+

I want to know, is this normal?

unsky commented 7 years ago

@10183308 The image size affects GPU memory usage much more than the batch size does.
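For intuition, here is a rough back-of-the-envelope sketch (not this repository's code; the backbone channel counts, strides, and ROI-head sizes are illustrative assumptions) of why activation memory grows with the input resolution much faster than with the number of sampled ROIs:

```python
# Illustrative estimate only: compares how backbone activation memory scales
# with image size versus how ROI-head memory scales with the ROI batch size.
# All layer shapes below are assumptions, not the actual FPN configuration.

def backbone_activation_mb(height, width,
                           channels_per_level=(64, 256, 512, 1024, 2048),
                           strides=(2, 4, 8, 16, 32),
                           bytes_per_float=4):
    """Approximate memory (MB) of the backbone feature maps for one image."""
    total = 0.0
    for c, s in zip(channels_per_level, strides):
        total += (height // s) * (width // s) * c * bytes_per_float
    return total / 2**20

def roi_head_activation_mb(num_rois, pooled=7, channels=256, fc_dim=1024,
                           bytes_per_float=4):
    """Approximate memory (MB) of pooled ROI features plus two fc activations."""
    per_roi = pooled * pooled * channels + 2 * fc_dim
    return num_rois * per_roi * bytes_per_float / 2**20

# Doubling the ROI batch size adds only a few tens of MB...
print(roi_head_activation_mb(320), roi_head_activation_mb(640))
# ...while going from a 600x1000 input to 800x1333 adds far more,
# since every backbone feature map scales with H*W.
print(backbone_activation_mb(600, 1000), backbone_activation_mb(800, 1333))
```

So with a fixed (moderate) input resolution, a larger ROI batch size only moves the memory usage a little, which is why 4417 MiB can be normal even with batch size 320.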