ray-project / ray_lightning

Pytorch Lightning Distributed Accelerators using Ray
Apache License 2.0
211 stars 34 forks

`ray_ddp` showing no use of GPU #177

Closed JiahaoYao closed 2 years ago

JiahaoYao commented 2 years ago
```
(train_mnist pid=11589) GPU available: False, used: False
(train_mnist pid=11589) TPU available: False, using: 0 TPU cores
(train_mnist pid=11589) IPU available: False, using: 0 IPUs
(train_mnist pid=11589) HPU available: False, using: 0 HPUs
```

When we use delayed GPU allocation, the logging code that prints this banner is here:

https://github.com/Lightning-AI/lightning/blob/56ff89743b42f276ce123aad70341f2c62cf5cf9/src/pytorch_lightning/trainer/trainer.py#L1751-L1761
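The banner is misleading because the Lightning `Trainer` is constructed on the driver process, which Ray has not assigned any GPUs to; GPUs only become visible inside the Ray workers once `num_gpus` is set on them. A minimal sketch of the mechanism, using a hypothetical `gpu_visible` helper in place of `torch.cuda.is_available()` so it runs without CUDA:

```python
import os


def gpu_visible() -> bool:
    """Hypothetical stand-in for torch.cuda.is_available(): a process
    only sees GPUs if CUDA_VISIBLE_DEVICES names at least one device."""
    devices = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return devices not in ("", "-1")


# Driver process: Ray has not assigned this process any GPUs yet,
# so Lightning's banner prints "GPU available: False" here.
os.environ["CUDA_VISIBLE_DEVICES"] = ""
print(gpu_visible())  # False -- the misleading banner case

# Ray worker: scheduling a task/actor with num_gpus=1 makes Ray set
# CUDA_VISIBLE_DEVICES before the training function runs, so the
# worker really does train on the GPU despite the driver's banner.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"
print(gpu_visible())  # True
```

So the log output is accurate for the driver but says nothing about the workers that actually run `train_mnist`.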

JiahaoYao commented 2 years ago

@amogkam one issue for discussion later!

amogkam commented 2 years ago

Yeah, I'm not sure if this is avoidable. Maybe we can just log a message saying to ignore this output?
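One way such a message could be surfaced is a one-time warning emitted before the trainer prints its banner. This is only a sketch of the suggestion above, not existing `ray_lightning` code; `warn_about_banner` is a hypothetical name:

```python
import warnings


def warn_about_banner() -> None:
    # Hypothetical helper the plugin could call on the driver before
    # Trainer setup, so users know the device banner is expected.
    warnings.warn(
        "PyTorch Lightning prints 'GPU available: False' on the driver "
        "because GPUs are assigned to Ray workers lazily; the Ray workers "
        "will still train on GPUs. This banner can be ignored.",
        UserWarning,
    )
```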