gollakousik123 opened this issue 6 years ago
Hi @gollakousik123,
If you are working on Linux, you can use `nvidia-smi` to check your GPU usage.
Or you can set `log_device_placement` to True
so that you can see whether the model is assigned to a GPU:
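As a quick sketch, you could also poll `nvidia-smi` from Python while training runs (the helper name `query_gpu_usage` is mine, not part of PointNet; only the standard library is assumed):

```python
# Minimal sketch: shell out to nvidia-smi to check GPU utilization.
# nvidia-smi itself ships with the NVIDIA driver install.
import shutil
import subprocess

def query_gpu_usage():
    """Return nvidia-smi's output as a string, or None if the tool is missing."""
    if shutil.which("nvidia-smi") is None:
        return None
    result = subprocess.run(["nvidia-smi"], capture_output=True, text=True)
    return result.stdout

usage = query_gpu_usage()
print(usage if usage is not None else "nvidia-smi not found (no NVIDIA driver?)")
```

Run this in a second terminal (or a loop) during training; if the GPU is actually in use, the utilization column will be non-zero and the python process will appear in the process table.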
https://github.com/charlesq34/pointnet/blob/d64d2398e55b24f69e95ecb549ff7d4581ffc21e/train.py#L131
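For reference, a minimal sketch of that setting with the TF 1.x API (the API generation the pointnet repo targets; under TF 2.x you would need `tf.compat.v1`):

```python
# Sketch of the log_device_placement session config, as in pointnet's train.py.
# TF 1.x API; this is a config fragment, not a full training script.
import tensorflow as tf

config = tf.ConfigProto()
config.log_device_placement = True   # log which device each op is placed on
config.allow_soft_placement = True   # fall back to CPU if a GPU kernel is missing
sess = tf.Session(config=config)
# At session run time, TensorFlow prints placement lines such as:
#   MatMul: (MatMul): /job:localhost/replica:0/task:0/device:GPU:0
```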
Best, Charles
Thank you so much for the quick response!
How long would it take if I trained it on the CPU?
In my experience with PointNet about 20 times longer.
OK. Can you tell me how to use its trained model?
I'm using an NVIDIA GTX 980M. When I run the code, I get this line at the beginning: `Tensor("Placeholder_2:0", shape=(), dtype=bool, device=/device:GPU:0)`. How can I tell whether it is using my GPU for the training process or not? I trained it for two but I still didn't get the results.