Closed — Aiden0609 closed this issue 1 year ago
Hi, thanks for your interest in our work. Yes, you can reproduce DocTr with 4 1080 Ti GPUs. In our setup, the batch size per GPU is 3, for a total batch size of 12. Hope this helps.
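For clarity, here is a minimal sketch of how the per-GPU batch size and the total (effective) batch size relate under data-parallel training. The function name is illustrative and not from the DocTr codebase:

```python
def effective_batch_size(per_gpu_batch: int, num_gpus: int) -> int:
    """Total number of samples processed per optimizer step
    across all data-parallel GPUs (illustrative helper, not
    part of the DocTr code)."""
    return per_gpu_batch * num_gpus

# The setup described above: 4 GPUs, batch size 3 on each GPU.
print(effective_batch_size(per_gpu_batch=3, num_gpus=4))  # -> 12
```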
Thank you! That consumes quite a lot of GPU memory; I'll check whether my hardware can handle it.
DocTr does not include any BatchNorm layers, so you can also train the network on two 1080 Ti GPUs. We did not ablate the batch size; a batch size of 6 may even perform better.
Hi, splendid work you've done! I've been considering reproducing it, but I don't know what hardware is recommended to successfully train the Geo and Illu networks, so I'd like to know how much GPU memory your training process consumed. Did you use 4 1080 Ti GPUs? Thanks, I really need to know this!