tensorflow / nmt

TensorFlow Neural Machine Translation Tutorial
Apache License 2.0
6.37k stars 1.96k forks

GPU utilization problem #253

Open amirj opened 6 years ago

amirj commented 6 years ago

I'm training an NMT model with a 50k vocabulary and millions of training examples. During training, nmt uses all available CPUs while GPU utilization stays low, fluctuating between 1% and 60%. It looks like loading and processing the data is the bottleneck. What's going on?
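For context, the CPU-side work the input pipeline repeats every step is roughly "tokenize a sentence and look each word up in the vocabulary". A minimal, purely illustrative sketch of that work (the names `UNK_ID`, `build_vocab`, and `sentence_to_ids` are hypothetical, not nmt's actual API):

```python
# Hypothetical sketch of per-example CPU preprocessing in a text pipeline:
# split a sentence into words and map each word to an integer id.

UNK_ID = 0  # id reserved for out-of-vocabulary words (assumption)

def build_vocab(words):
    # Assign ids 1..N; id 0 stays reserved for unknowns.
    return {w: i + 1 for i, w in enumerate(words)}

def sentence_to_ids(sentence, vocab):
    # Words missing from the vocab fall back to UNK_ID.
    return [vocab.get(w, UNK_ID) for w in sentence.split()]

vocab = build_vocab(["the", "cat", "sat"])
print(sentence_to_ids("the cat sat on the mat", vocab))  # → [1, 2, 3, 0, 1, 0]
```

With a 50k vocabulary and millions of sentences, doing this on the fly for every batch can keep the CPUs busy while the GPU waits for input.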

amirj commented 6 years ago

Any idea how to solve this?

xtknight commented 6 years ago

Please see here.

https://github.com/tensorflow/nmt/issues/254

amirj commented 6 years ago

@xtknight Thanks for your answer. It seems the CPU cycles are being wasted pre-processing the input files (loading vocabularies, mapping words to ids, ...), so the GPU is not fed fast enough. Would moving my input files to TFRecords solve the problem? What do you think?
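One way this could look, as a minimal sketch: serialize the already-tokenized (src ids, tgt ids) pairs to a TFRecord file once, offline, then read them back with `tf.data`, decoding on multiple CPU threads and prefetching so input work overlaps with GPU compute. The file path and feature keys (`src`, `tgt`) are illustrative, and the sketch uses current `tf.io`/eager-style names; in the TF 1.x era of this thread you would use `tf.python_io.TFRecordWriter` and an iterator instead. This is not nmt's actual input code.

```python
import tensorflow as tf

def write_examples(path, pairs):
    # Serialize (src_ids, tgt_ids) pairs once, offline, so training-time
    # input work is cheap record decoding rather than text preprocessing.
    with tf.io.TFRecordWriter(path) as writer:
        for src, tgt in pairs:
            ex = tf.train.Example(features=tf.train.Features(feature={
                "src": tf.train.Feature(int64_list=tf.train.Int64List(value=src)),
                "tgt": tf.train.Feature(int64_list=tf.train.Int64List(value=tgt)),
            }))
            writer.write(ex.SerializeToString())

def parse(record):
    # Decode one serialized Example back into dense id tensors.
    feats = tf.io.parse_single_example(record, {
        "src": tf.io.VarLenFeature(tf.int64),
        "tgt": tf.io.VarLenFeature(tf.int64),
    })
    return tf.sparse.to_dense(feats["src"]), tf.sparse.to_dense(feats["tgt"])

write_examples("/tmp/train.tfrecord", [([1, 2, 3], [4, 5]), ([6], [7, 8])])
dataset = (tf.data.TFRecordDataset("/tmp/train.tfrecord")
           .map(parse, num_parallel_calls=4)  # decode on multiple CPU threads
           .prefetch(2))                      # overlap input with GPU compute
for src, tgt in dataset:                      # eager iteration (TF 2.x style)
    print(src.numpy(), tgt.numpy())
```

The `prefetch` at the end is what actually hides input latency: while the GPU consumes batch N, the CPU threads are already producing batch N+1.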

xtknight commented 6 years ago

I'm not sure, but I think the CPU usage comes from elements of the basic LSTM cell's loop running on the CPU.
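One way to check a claim like this, rather than guess, is to turn on device placement logging and see which device each op actually runs on. A minimal sketch (in the TF 1.x era of this thread the equivalent was `tf.Session(config=tf.ConfigProto(log_device_placement=True))`; the call below is the current-style API):

```python
import tensorflow as tf

# Log the device each op executes on; look for /GPU:0 vs /CPU:0 in the output.
tf.debugging.set_log_device_placement(True)

a = tf.random.uniform([256, 256])
b = tf.random.uniform([256, 256])
c = tf.matmul(a, b)  # the log line for MatMul shows its placement
print(c.shape)
```

Running the actual training graph with this enabled would show whether the LSTM cell's ops (or only the loop-control and input ops) are being placed on the CPU.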