LYKlyk closed this issue 4 years ago.
Environment: Ubuntu, RTX 2080 Ti (11 GB)

When I try to train on the MSRA dataset, the training speed is very slow and the batch size can only be set to 2. How can I increase the batch size? Why can a card with 11 GB of memory only handle a batch size of 2? How can I improve my training speed? How many epochs do you need to train? Thank you for your response.
The batch size can be increased through the batch_size variable used with the MSRAHandDataset class in datamsra.py. Since the dataset is huge, you would need better hardware if you wish to increase the batch size and load the complete dataset into memory. We trained our network for 3 epochs, after which it was found to overfit. You can find these and other details in the Implementation section of our paper.
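For illustration, here is a minimal sketch of how the batch size is usually wired into a PyTorch DataLoader. The MSRAHandDataset constructor arguments below are assumptions made for the sake of the example, not the repo's exact signature:

```python
from torch.utils.data import DataLoader

# Hypothetical import path; in this repo the class lives in datamsra.py.
from datamsra import MSRAHandDataset

# Constructor arguments are illustrative assumptions, not the repo's exact API.
dataset = MSRAHandDataset(root="path/to/cvpr15_MSRAHandGestureDB", mode="train")

# A larger batch_size needs proportionally more GPU memory; on an 11 GB card
# you may have to keep it small or reduce the input resolution / model size.
loader = DataLoader(
    dataset,
    batch_size=2,        # raise this if your GPU memory allows
    shuffle=True,
    num_workers=4,       # parallel workers often help when data loading is slow
    pin_memory=True,     # faster host-to-GPU transfer with CUDA
)

for batch in loader:
    ...  # training step goes here
```

If GPU memory rather than loading speed is the bottleneck, gradient accumulation can emulate a larger effective batch size without extra memory; if data loading is the slow part, increasing num_workers often helps.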