
code-to-text OOM on a single V100 #82

Closed KevinHuuu closed 3 years ago

KevinHuuu commented 3 years ago

Hi there! I am running the code-to-text experiment with the default settings (following the README) on a single V100 with 16 GB of GPU memory and got an OOM. Just curious, what GPU count/settings did you use to train the code summarization model fine-tuned from CodeBERT?

Thanks in advance!

guoday commented 3 years ago

We used 2 GPUs with 16 GB each. Maybe you can reduce the batch size by half.
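
For reference, here is a minimal sketch (a generic PyTorch loop, not the repository's run.py) of how halving the per-step batch size while accumulating gradients over two micro-batches keeps the effective batch size unchanged on a single 16 GB GPU. The model, tensor sizes, and data below are placeholders for illustration only.

```python
# Hedged sketch: halve the per-step batch size and accumulate gradients
# over two micro-batches so the effective update still matches a full batch.
import torch
import torch.nn as nn

# Toy stand-ins for the CodeBERT-based seq2seq model and the training data.
model = nn.Linear(256, 128)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss_fn = nn.MSELoss()

full_batch_size = 32           # assumed original batch size
accum_steps = 2                # roughly halves activation memory per step
micro_batch_size = full_batch_size // accum_steps

def micro_batches(num_batches):
    """Yield random micro-batches; replace with the real data loader."""
    for _ in range(num_batches):
        yield torch.randn(micro_batch_size, 256), torch.randn(micro_batch_size, 128)

optimizer.zero_grad()
for step, (inputs, targets) in enumerate(micro_batches(8), start=1):
    loss = loss_fn(model(inputs), targets)
    # Scale the loss so the accumulated gradient equals one full-batch update.
    (loss / accum_steps).backward()
    if step % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```

The simpler fix is what the reply suggests: pass half the original value to the training script's batch-size argument. Gradient accumulation is only needed if you also want to preserve the effective batch size used in the original setup.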