lxyzler closed this issue 4 years ago
@lxyzler The batch size is defined in the config files; see, for example, https://github.com/aim-uofa/AdelaiDet/blob/0157227f966eda93c1299a402537b616207ba226/configs/BAText/TotalText/attn_R_50.yaml#L9.
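To reduce the batch size without editing the base config, a minimal override sketch (assuming the detectron2-style `_BASE_` config merging that AdelaiDet inherits, and that the linked line sets `SOLVER.IMS_PER_BATCH`):

```yaml
# lower_batch.yaml — hypothetical override config
# Inherit everything from the base config and only shrink the batch size
# so training fits into available GPU memory.
_BASE_: "attn_R_50.yaml"
SOLVER:
  IMS_PER_BATCH: 2   # fewer images per iteration → lower peak GPU memory
```

Passing this file via `--config-file` should behave like the original config with a smaller batch; the learning rate may need scaling down proportionally.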
Thank you very much @tianzhi0549, it works.
```
CUDA out of memory. Tried to allocate 136.00 MiB (GPU 3; 10.76 GiB total capacity; 1.72 GiB already allocated; 74.56 MiB free; 2.01 GiB reserved in total by PyTorch)
```