GeorgeSeif / Semantic-Segmentation-Suite

Semantic Segmentation Suite in TensorFlow. Implement, train, and test new Semantic Segmentation models easily!

Using GPU? #189

Closed jsolves closed 5 years ago

jsolves commented 5 years ago

- What are your command line arguments? `python train.py --model=BiSeNet --dataset=XXX --crop_height=480 --crop_width=480 --batch_size=4`
- Have you written any custom code? No
- What have you done to try and solve this issue? Read "FAQ" and "Issues"
- TensorFlow version? gpu 1.9.0

Describe the problem

I run train.py, but the GPU utilization shown by nvidia-smi is insignificant. I changed the line "with tf.device('/cpu:0'):" to "with tf.device('/gpu:0'):" in train.py, but nothing changed. Is there a way to enable GPU utilization that I haven't found? Or is there no significant difference between GPU and CPU training?

Thanks in advance.

ryohachiuma commented 5 years ago

Did you correctly install the GPU version of TensorFlow?
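One quick way to check this is to ask TensorFlow itself which devices it can see. The sketch below assumes the TF 1.x API (the issue reports tensorflow-gpu 1.9.0); `device_lib.list_local_devices()` lists every device TensorFlow registered at import time:

```python
# Sketch only: check whether the installed TensorFlow build can see a GPU.
# Assumes the TF 1.x API; with a CPU-only build (plain "tensorflow" instead
# of "tensorflow-gpu") no GPU device will ever appear here, and tf.device
# placement in train.py cannot change that.

def gpu_status():
    """Return a short human-readable summary of the GPUs TensorFlow sees."""
    try:
        from tensorflow.python.client import device_lib
    except ImportError:
        return "tensorflow is not installed in this environment"
    names = [d.name for d in device_lib.list_local_devices()]
    gpus = [n for n in names if "GPU" in n]
    if gpus:
        return "GPU devices visible: " + ", ".join(gpus)
    return "no GPU visible to TensorFlow (CPU-only build or broken CUDA setup)"

if __name__ == "__main__":
    print(gpu_status())
```

If this reports no visible GPU, training silently falls back to the CPU no matter what `tf.device('/gpu:0')` says, which would explain the near-zero utilization in nvidia-smi.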

jsolves commented 5 years ago

Somehow, my conda tensorflow-gpu package had disappeared. After reinstalling it, I can now see GPU utilization.

Thank you very much!