thtrieu / darkflow

Translate darknet to tensorflow. Load trained weights, retrain/fine-tune using tensorflow, export constant graph def to mobile devices

High RAM usage after loading inference model into GPU #1015

Open shivSD opened 5 years ago

shivSD commented 5 years ago

I trained a YOLOv2 architecture on custom images using darknet and froze the graph with darkflow. The model weights (.pb) file is 268 MB, but once we load it onto the GPU for inference it consumes 7.93 GB. I know TensorFlow allocates the buffers for the output data at each stage up front. Can somebody please explain why TensorFlow uses so much memory?

By the way, running the same model with the darknet framework takes 2.9 GB of RAM.
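For context, the frozen graph is loaded along the usual TF1 lines; the .pb path below is a placeholder, not the exact file from this setup, and the memory behavior is the same regardless:

```python
# Minimal TF1 sketch of loading a frozen darkflow .pb for inference.
# "built_graph/yolov2-custom.pb" is a placeholder path.
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("built_graph/yolov2-custom.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name="")

# Creating the session is the step where TensorFlow grabs GPU memory:
# by default TF1 maps nearly all free GPU memory for its allocator,
# independent of the graph's actual buffer needs.
sess = tf.Session(graph=graph)
```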

yotamin commented 4 years ago

Resolved by inserting the following at the beginning:

```python
import tensorflow as tf

gpu_fraction = 0.05  # this will control the fraction of GPU memory used

config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = gpu_fraction
session = tf.Session(config=config)
```

You will still get a 'ran out of memory' warning, but checking the GPU memory usage will show it's only using as much as you allocated.
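If you'd rather not hard-code a fraction, an alternative (not from this thread, but standard TF1) is to let the allocator grow on demand:

```python
import tensorflow as tf

config = tf.ConfigProto()
# Start with a small allocation and grow GPU memory usage as needed,
# instead of mapping a fixed fraction (or, by default, almost all) up front.
config.gpu_options.allow_growth = True
session = tf.Session(config=config)
```

Note that darkflow itself also exposes a `gpu` option (e.g. `"gpu": 0.5` in the options dict passed to `TFNet`, or `--gpu 0.5` on the command line), which, if I read the source right, feeds the same per-process memory fraction setting.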