autonomousvision / neat

[ICCV'21] NEAT: Neural Attention Fields for End-to-End Autonomous Driving
MIT License

The gpu-Util is low #12

Closed raozhongyu closed 1 year ago

raozhongyu commented 2 years ago

I tried to train the model, but the GPU utilization is only 20% with CUDA 10.1 and torch 1.7. Is that normal?

raozhongyu commented 2 years ago

I also trained the transfuser model, for which the GPU utilization can reach 90%.

ap229997 commented 2 years ago

Can you say more about the model training with 20% utilization? Which model did you train? What batch size did you use? How much GPU memory was occupied?

raozhongyu commented 2 years ago

I trained the neat model with torch 1.7.0 and a batch size of 16. The GPU memory usage is about 10 GB. In addition, GPU utilization is normal when I train other models, such as the AIM-bev model.

ap229997 commented 2 years ago

The GPU utilization for NEAT is expected to be lower than for other models, since the NEAT dataloader involves a lot more computation, but I can't say by what factor. If you have a GPU with more memory, you can try increasing the batch size, or you can try to make `__getitem__` more efficient.
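A minimal sketch of the suggestion above, under the assumption that the bottleneck is per-sample CPU work in the dataloader (the dataset and shapes here are hypothetical stand-ins, not NEAT's actual pipeline): precompute or cache the expensive part once, keep `__getitem__` cheap, and use multiple workers with pinned memory so the GPU is not left waiting on the CPU.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CachedDataset(Dataset):
    """Toy dataset: do the expensive per-sample preprocessing once in
    __init__ instead of repeating it on every __getitem__ call."""
    def __init__(self, n=64):
        # Hypothetical stand-in for heavy preprocessing (e.g. decoding,
        # resizing, label generation) that would otherwise run per fetch.
        self.samples = [torch.full((3, 8, 8), float(i)) for i in range(n)]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        # Cheap: just return the precomputed tensor.
        return self.samples[idx]

loader = DataLoader(
    CachedDataset(),
    batch_size=16,
    num_workers=2,                          # overlap remaining CPU work with GPU compute
    pin_memory=torch.cuda.is_available(),   # faster host-to-device copies when a GPU exists
)

shapes = [tuple(batch.shape) for batch in loader]
```

If the preprocessed samples are too large to hold in RAM, the same idea applies by caching them to disk once and memory-mapping or lazily loading them in `__getitem__`.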