tensorflow / tpu

Reference models and tools for Cloud TPUs.
https://cloud.google.com/tpu/
Apache License 2.0

Does this posterize augmentation work the same as the PIL implementation? #642

Open CoinCheung opened 4 years ago

CoinCheung commented 4 years ago

https://github.com/tensorflow/tpu/blob/ea5d379424e4121d29d12ff611ec6a0705e01e94/models/official/efficientnet/autoaugment.py#L223

I noticed that the above line cuts off some bits on both sides of each byte of each pixel. However, the PIL implementation seems to only cut off one side.

By the way, I noticed that RandAugment uses M=10 as the magnitude hyper-parameter. If so, wouldn't the input image be converted to all zeros?

CoinCheung commented 4 years ago

Same with the solarize function. Did you use M=10 in the experiments?

BarretZoph commented 4 years ago

Hi, the autoaugment version also only cuts things off on one side. By shifting to the right and then back to the left, only the right-side (low-order) bits are cut off.
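
For reference, here is a minimal standalone sketch (not the repository code) showing that a right shift followed by a left shift only zeroes the low-order bits, which matches a mask-based posterize in the spirit of PIL's `ImageOps.posterize`:

```python
import numpy as np

def posterize_shift(pixels, bits_to_keep):
    # Right shift discards the low-order bits; the left shift then
    # restores the magnitude, filling the vacated positions with zeros.
    shift = 8 - bits_to_keep
    return np.left_shift(np.right_shift(pixels, shift), shift)

def posterize_mask(pixels, bits_to_keep):
    # Mask-based version: zero out the low-order bits directly.
    mask = 0xFF & ~((1 << (8 - bits_to_keep)) - 1)
    return pixels & mask

pixels = np.arange(256, dtype=np.uint8)
assert np.array_equal(posterize_shift(pixels, 4), posterize_mask(pixels, 4))
```

Both versions keep the top `bits_to_keep` bits of each byte untouched; the high-order side is never affected.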

We experimented with different M magnitudes (up to 28 on ImageNet), which can be seen in our paper here: https://arxiv.org/abs/1909.13719.