facebookresearch / EGG

EGG: Emergence of lanGuage in Games
MIT License

Mixed precision training #167

Closed eugene-kharitonov closed 3 years ago

eugene-kharitonov commented 3 years ago

Enables mixed-precision training when --fp16 is specified.

Description

Uses `torch.cuda.amp` primitives in `Trainer`.
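
For readers unfamiliar with the API, a minimal sketch of how these primitives are typically wired into a training step is shown below. This is illustrative only; the model, optimizer, and loss are placeholders, not the actual `Trainer` internals from this PR:

```python
import torch
from torch.cuda.amp import GradScaler, autocast

model = torch.nn.Linear(128, 10).cuda()            # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = GradScaler()  # rescales the loss so fp16 gradients do not underflow

def train_step(batch: torch.Tensor, labels: torch.Tensor) -> float:
    optimizer.zero_grad()
    with autocast():  # forward pass runs in mixed (fp16/fp32) precision
        logits = model(batch)
        loss = torch.nn.functional.cross_entropy(logits, labels)
    scaler.scale(loss).backward()  # backward on the scaled loss
    scaler.step(optimizer)         # unscales grads; skips the step on inf/nan
    scaler.update()                # adjusts the loss scale for the next step
    return loss.item()
```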

Related Issue (if any)

https://github.com/facebookresearch/EGG/issues/165

Motivation and Context

- Reduces memory consumption, which can be useful when running REINFORCE on under-powered GPUs.
- Should give a speed-up on modern GPUs (e.g., Volta), although I did not test that.

How Has This Been Tested?

1. Manually checked convergence on the channel (Zipfian) game.
2. Checked that it unlocks a ~2x larger batch size on `language_bottleneck/image_classification`.
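
A rough way to reproduce the memory observation (an illustrative measurement with a toy model, not the exact procedure used here) is to compare peak GPU allocations with autocast on and off:

```python
import torch
from torch.cuda.amp import autocast

model = torch.nn.Linear(4096, 4096).cuda()  # toy stand-in for a real game
x = torch.randn(8192, 4096, device="cuda")

def peak_mib(use_amp: bool) -> float:
    """Peak GPU memory (MiB) for one forward/backward pass."""
    torch.cuda.reset_peak_memory_stats()
    with autocast(enabled=use_amp):
        loss = model(x).pow(2).mean()
    loss.backward()  # backward stays outside autocast, per the PyTorch docs
    model.zero_grad(set_to_none=True)
    return torch.cuda.max_memory_allocated() / 2**20

print(f"fp32 peak: {peak_mib(False):.0f} MiB")
print(f"amp  peak: {peak_mib(True):.0f} MiB")
```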

eugene-kharitonov commented 3 years ago

Hello @msclar -- you could try rebasing onto this branch and trying it out, if you want to experiment with larger batches :)