google-research / augmix

AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty

augmentations used in augmix #22

Closed kiranchari closed 3 years ago

kiranchari commented 3 years ago

Hi,

I have a couple of questions about the augmentations used in AugMix:

  1. The AugMix paper mentions that contrast augmentations were removed from AugMix because they would overlap with one of the tested corruptions (Contrast), but I see that AutoContrast is still used in the code: https://github.com/google-research/augmix/blob/master/augmentations.py#L141

  2. I am curious how or why the augmentations in AugMix improve performance on these corruptions, since the connection between the two is not immediately clear. Do you have a take on this, perhaps from an ablation study of the augmentations in AugMix?

Thank you.

hendrycks commented 3 years ago
  1. AutoContrast in PIL is not statistically similar to contrast reduction.
  2. Unfortunately no. There aren't any augmentations that do anything statistically similar to Gaussian noise, but for some reason they still help.


kiranchari commented 3 years ago

Both AutoContrast and histogram equalization (augmentations.equalize) can vary the distribution of contrast across images. Perhaps this provides some contrast robustness?

hendrycks commented 3 years ago

AutoContrast in PIL is a complicated histogram-based method that scales the image to take up the full range from min to max values, while the "contrast" corruption in ImageNet-C merely squishes inputs closer to their mean.

https://github.com/tensorflow/tpu/blob/8462d083dd89489a79e3200bcc8d4063bf362186/models/official/efficientnet/autoaugment.py#L285

import numpy as np

def contrast(x, severity=1):
    # ImageNet-C "contrast" corruption: squish pixel values toward the
    # per-image channel means. Smaller c means a stronger corruption.
    c = [0.4, .3, .2, .1, .05][severity - 1]
    x = np.array(x) / 255.
    means = np.mean(x, axis=(0, 1), keepdims=True)
    return np.clip((x - means) * c + means, 0, 1) * 255