facebookresearch / SLIP

Code release for SLIP: Self-supervision meets Language-Image Pre-training
MIT License

Any reason for slightly non-standard val augmentation? #20

Open mitchellnw opened 2 years ago

mitchellnw commented 2 years ago

Hello, I am wondering whether there is a reason to use these transforms for validation augmentation (https://github.com/facebookresearch/SLIP/blob/main/main.py#L172-L177), when I more often see the standard ones from the PyTorch ImageNet example (https://github.com/pytorch/examples/blob/main/imagenet/main.py#L223-L230).
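
For context, the linked pytorch/examples pipeline is the common ImageNet evaluation recipe: resize the shorter side to 256, then take a centered 224×224 crop (followed by `ToTensor` and ImageNet-mean/std normalization). A minimal sketch of just the geometry of that recipe, in pure Python with helper names of my own (not from either repo):

```python
def resize_shorter_side(w, h, target):
    """Scale (w, h) so the shorter side equals `target`, preserving aspect ratio."""
    if w <= h:
        return target, round(h * target / w)
    return round(w * target / h), target

def center_crop_box(w, h, size):
    """Return the (left, top, right, bottom) box of a centered size x size crop."""
    left = (w - size) // 2
    top = (h - size) // 2
    return left, top, left + size, top + size

# Standard ImageNet eval: Resize(256) then CenterCrop(224).
# A 640x480 image first becomes 341x256, then a 224x224 center crop is taken,
# discarding a ~14% border on each side.
w, h = resize_shorter_side(640, 480, 256)   # -> (341, 256)
box = center_crop_box(w, h, 224)            # -> (58, 16, 282, 240)
```

Resizing directly to the crop size instead (i.e. `Resize(224)` before `CenterCrop(224)`) keeps the full shorter-side extent and crops nothing vertically, so the two recipes feed the model slightly different fields of view, which can shift top-1 accuracy by a few tenths of a point.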