Closed raozhongyu closed 2 years ago
I also trained the transfuser model, and for that one GPU utilization reaches 90%.
Can you say more about the model training with 20% utilization? Which model did you train? What batch size did you use? How much GPU memory was occupied?
I trained the NEAT model with torch 1.7.0 and a batch size of 16. GPU memory usage is about 10 GB. In addition, GPU utilization is normal when I train other models, such as the AIM-BEV model.
The GPU utilization for NEAT is expected to be lower than for other models, since the NEAT dataloader involves a lot more computation, but I can't tell you by what factor. If you have a GPU with more memory you can try increasing the batch size, or you can try to make `__getitem__` more efficient.
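One common way to make `__getitem__` cheaper is to cache per-sample results so the heavy preprocessing runs once instead of every epoch. A minimal sketch of that idea, with `expensive_transform` as a hypothetical stand-in for NEAT's per-sample preprocessing (the names here are illustrative, not from the repo):

```python
# Sketch: cache the expensive per-sample work so each index is
# computed only once, instead of on every __getitem__ call.

CALLS = {"count": 0}  # counts how often the heavy work actually runs

def expensive_transform(index):
    """Hypothetical stand-in for costly image/label preprocessing."""
    CALLS["count"] += 1
    return index * 2  # placeholder result

class SlowDataset:
    """Recomputes the transform on every access, every epoch."""
    def __init__(self, n):
        self.n = n

    def __len__(self):
        return self.n

    def __getitem__(self, index):
        return expensive_transform(index)

class CachedDataset(SlowDataset):
    """Memoizes per-index results so repeated accesses reuse them."""
    def __init__(self, n):
        super().__init__(n)
        self._cache = {}

    def __getitem__(self, index):
        if index not in self._cache:
            self._cache[index] = expensive_transform(index)
        return self._cache[index]
```

In a real PyTorch setup you can also raise `num_workers` and set `pin_memory=True` on the `DataLoader`, so CPU-side preprocessing overlaps with GPU compute; caching only helps if the transformed samples fit in RAM.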
I tried to train the model, but GPU utilization is only 20% with CUDA 10.1 and torch 1.7. Is that normal?