jihadbourassi opened this issue 5 years ago
I did not understand the BATCH_SIZE variable, so instead I used more STEPS_PER_EPOCH and VALIDATION_STEPS. Try using, say, 300 steps and 50 validation steps. This puts more data through each epoch before every validation pass.
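If it helps, here is a minimal sketch of how those two values are typically set in the Matterport Mask_RCNN setup, by subclassing `Config`. The `PlateConfig` name is just a placeholder, and the `mrcnn.config` import path assumes a recent checkout of the repo (older checkouts keep config.py at the root):

```python
from mrcnn.config import Config

class PlateConfig(Config):   # placeholder subclass name
    NAME = "plate"
    STEPS_PER_EPOCH = 300    # training steps run in each epoch
    VALIDATION_STEPS = 50    # validation batches run at the end of each epoch
```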
You can set the batch size in your config file, under class Config, in config.py. It's `config.IMAGES_PER_GPU`.

Actually, your batch size, `config.BATCH_SIZE`, is derived by multiplying `config.IMAGES_PER_GPU * config.GPU_COUNT`. So if you're working on a single GPU and you've set your `config.GPU_COUNT` to 1, then `BATCH_SIZE` should be equal to `IMAGES_PER_GPU`.
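To make that concrete, here is a minimal sketch assuming the same Matterport `Config` class (the `PlateConfig` name is again just a placeholder, and the import path assumes a recent checkout):

```python
from mrcnn.config import Config

class PlateConfig(Config):
    NAME = "plate"
    GPU_COUNT = 1        # single-GPU (or CPU) machine
    IMAGES_PER_GPU = 2   # images fed to the GPU per training step

config = PlateConfig()
# Config.__init__ derives the effective batch size from the two values above
print(config.BATCH_SIZE)  # -> 2, i.e. IMAGES_PER_GPU * GPU_COUNT
```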
When I change the value of IMAGES_PER_GPU and GPU_COUNT in config.py, there is still no change in the batch size. Please help, what can I do about this?
@Prithvi0167 That is because `BATCH_SIZE` is only assigned during initialization; you can see it in the `__init__` function in config.py, on line 216: `self.BATCH_SIZE = self.IMAGES_PER_GPU * self.GPU_COUNT`.

To re-calculate the batch size after you have created your config:

config = YourConfig()
config.BATCH_SIZE = config.IMAGES_PER_GPU * config.GPU_COUNT

Or, alternatively, you can just declare it directly in your config.
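In other words (a sketch under the same assumptions as the earlier snippets): if you change the values on a config object that has already been constructed, the derived `BATCH_SIZE` is stale until you recompute it, because `__init__` has already run:

```python
config = PlateConfig()   # __init__ sets BATCH_SIZE from the values known at this point

config.IMAGES_PER_GPU = 4   # changing the input afterwards does not update BATCH_SIZE...
config.BATCH_SIZE = config.IMAGES_PER_GPU * config.GPU_COUNT   # ...so refresh the derived value
```

Setting `IMAGES_PER_GPU` in the subclass itself (as in the earlier sketch) avoids the manual step, since `__init__` then derives `BATCH_SIZE` from the value you actually want.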
Hello, I am trying to train the Mask R-CNN model for license plate detection. I followed the balloon detection example and started training without changing any of the hyperparameters (except multiprocessing=False, since I am using a CPU). Watching the training, I noticed that the loss started high in the first epochs and jumps up and down a lot from one step to the next, and it only got down to 0.70 by the 23rd epoch (out of 30), with val_loss still above 1. So I figured I should increase the batch size (it was left at batch size = 1), since my image dataset is pretty diverse, in order to reduce the variance of the loss. But I can't seem to do so: every time I change the batch size in config.py and launch training, it shows me BATCH_SIZE = 1. I have currently restarted training with the weights generated from the 23rd epoch.
I am new to neural networks, so please bear with me. Any help will be much appreciated, thank you all very much and have a good day.