Closed: golunovas closed this issue 4 years ago
Hey, @golunovas
Could you please provide a use case for an odd batch_size? It seems that the best way to handle the problem is to add an explicit check to AutoAlbument that ensures batch_size is an even number before running the search phase. Otherwise, problems related to an unequal number of augmented and non-augmented images may arise.
Well, I ran into the issue on the last batch of an epoch: I ran a search with the generated `search.yaml` config, where `drop_last` wasn't set to true for the dataloader, and the dataset had an odd number of samples. An odd batch size would lead to exactly the same failure. IMO, the easiest solution is to set `drop_last` to true by default and require an even batch size in the config.
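For illustration, here is a minimal pure-Python sketch (not the actual PyTorch `DataLoader`) of how `drop_last` avoids the short final batch; the helper name `batch_indices` is made up for this example:

```python
def batch_indices(n_samples, batch_size, drop_last=False):
    """Yield lists of sample indices, one list per batch."""
    batches = []
    for start in range(0, n_samples, batch_size):
        batch = list(range(start, min(start + batch_size, n_samples)))
        if drop_last and len(batch) < batch_size:
            break  # discard the incomplete final batch
        batches.append(batch)
    return batches

# 7 samples with batch_size=2: the final batch is a singleton unless dropped.
print(batch_indices(7, 2))                  # [[0, 1], [2, 3], [4, 5], [6]]
print(batch_indices(7, 2, drop_last=True))  # [[0, 1], [2, 3], [4, 5]]
```

With `drop_last=True`, every batch the search phase sees has exactly `batch_size` samples, so the odd-final-batch case never occurs.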
I have added the `drop_last: True` parameter to config files created by `autoalbument-create`. Example configs now also contain this parameter. The fixed version 0.0.4 is also uploaded to PyPI.
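As a sketch, the relevant part of a generated `search.yaml` would then look something like this (the exact nesting under `data`/`dataloader` is an assumption about the config layout, not copied from the repository):

```yaml
data:
  dataloader:
    batch_size: 16   # should be even: the search phase splits each batch in half
    shuffle: true
    drop_last: true  # discard an incomplete final batch
```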
It seems like the issue is coming from here. https://github.com/albumentations-team/autoalbument/blob/dbeffa703df6d2468e4bf43a1628543f7ece366b/autoalbument/faster_autoaugment/search.py#L254
As far as I understand, it requires an even batch size; otherwise, it fails here: https://github.com/albumentations-team/autoalbument/blob/dbeffa703df6d2468e4bf43a1628543f7ece366b/autoalbument/faster_autoaugment/search.py#L201
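To make the failure mode concrete, here is a minimal pure-Python sketch of the split-in-half step (mimicking the semantics of `torch.Tensor.chunk(2)`; the helper name `chunk_in_half` is made up). With an odd batch, the two halves have unequal lengths, so any subsequent element-wise pairing of augmented and non-augmented halves breaks:

```python
def chunk_in_half(batch):
    """Split a batch into two halves; mimics tensor.chunk(2),
    where the first chunk gets the extra element for odd lengths."""
    mid = (len(batch) + 1) // 2
    return batch[:mid], batch[mid:]

a, b = chunk_in_half(list(range(8)))
assert len(a) == len(b) == 4  # even batch: halves match, pairing works

a, b = chunk_in_half(list(range(7)))
print(len(a), len(b))  # 4 3 -> element-wise ops on (a, b) would fail
```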