facebookresearch / SymbolicMathematics

Deep Learning for Symbolic Mathematics

Is it possible to do evaluation on multiple gpus? #1

Closed jyuno426 closed 1 year ago

jyuno426 commented 4 years ago

Thanks for your great work!

I want to evaluate the pre-trained models with beam size 10 on multiple GPUs, because evaluation runs out of memory on a single GPU, even though the GPU is a TITAN X (Pascal) with 12 GB of memory.

However, when I run evaluation on multiple GPUs, it raises the following error:

AttributeError: 'DistributedDataParallel' object has no attribute 'generate_beam'

What can I do in this situation?

Thanks!

glample commented 4 years ago

Hi,

The multi-GPU code replicates the model (i.e. each GPU has its own copy of the model); the model is never split across GPUs to reduce memory usage. So if evaluation does not fit on 1 GPU, it will not fit on multiple GPUs either.
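As a side note on the `AttributeError` itself: a wrapper like `DistributedDataParallel` only exposes the forward pass, not custom methods defined on the wrapped model, but the underlying model remains reachable through the wrapper's `module` attribute. A minimal pure-Python analogue (illustrative names only, not the actual torch code):

```python
class Model:
    def generate_beam(self, expr, beam_size):
        # stand-in for the real beam search: return beam_size copies
        return [expr] * beam_size

class DistributedWrapper:
    # Like DistributedDataParallel, this wrapper only forwards the
    # forward pass; custom methods on the wrapped model are not exposed.
    def __init__(self, module):
        self.module = module

    def __call__(self, expr):
        return self.module.generate_beam(expr, 1)[0]

model = DistributedWrapper(Model())
# model.generate_beam(...) would raise AttributeError, as in the issue;
# unwrapping via .module reaches the original method:
hyps = model.module.generate_beam("x + 1", beam_size=10)
```

This unwrapping does not help with the memory problem, though, since each GPU still holds a full copy of the model.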

Instead, can you try to simply decrease the batch size?