WeiKin007 closed this issue 3 years ago.
batch_size is the number of samples processed per step.
The total number of samples used for training is batch_size × steps_per_epoch × epochs.
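To make the formula concrete, here is a minimal sketch with hypothetical numbers (none of these values come from the thread):

```python
# Hypothetical numbers chosen only to illustrate the formula above.
batch_size = 10
steps_per_epoch = 100
epochs = 5

samples_per_epoch = batch_size * steps_per_epoch  # samples the model sees in one epoch
total_samples = samples_per_epoch * epochs        # samples seen over the whole run

print(samples_per_epoch)  # 1000
print(total_samples)      # 5000
```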
Sorry, that is kind of confusing. Say I have 1000 images for training, my batch size is set to 1, and it takes 60 seconds per epoch to process the whole dataset. If I set my batch size to 10, shouldn't the time taken per epoch decrease, since I am processing more samples per batch?
I accidentally closed it.
If you increase batch_size while steps_per_epoch stays the same, each epoch processes more samples (batch_size × steps_per_epoch), so the time spent per epoch will increase.
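A quick sketch of why that happens, again with made-up numbers: with steps_per_epoch held fixed, a larger batch_size means more samples pass through the model each epoch, so each epoch takes longer.

```python
# Hypothetical comparison: steps_per_epoch is held fixed, so raising batch_size
# raises the number of samples processed per epoch proportionally.
steps_per_epoch = 100

for batch_size in (1, 2, 10):
    samples_per_epoch = batch_size * steps_per_epoch
    print(f"batch_size={batch_size}: {samples_per_epoch} samples per epoch")
```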
I am using this package to train a model on a custom dataset of images. I have been training my models with batch size = 1, but when I increase the batch size to, say, 2, the time per epoch increases, which goes against my understanding of how batch processing works. Or do batches work differently in this package?