tristandeleu / pytorch-meta

A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch
https://tristandeleu.github.io/pytorch-meta/
MIT License

Is not normalizing in the helper functions a problem? #154

Open brando90 opened 2 years ago

brando90 commented 2 years ago

I noticed that we have this: https://github.com/tristandeleu/pytorch-meta/blob/d55d89ebd47f340180267106bde3e4b723f23762/torchmeta/datasets/helpers.py#L165

    defaults = {
        'transform': Compose([Resize(84), ToTensor()])
    }

i.e. there is no normalization, e.g. using:

    normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])

Might this be a problem? I assume it could be when comparing against papers that do normalize, but if we only compare results within our own paper, or reproduce results using the torchmeta transform as-is, then things should be fine...?
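For reference, here is a minimal sketch of what `transforms.Normalize` does per channel: `output = (input - mean) / std`, applied after `ToTensor()` has scaled pixels to `[0, 1]`. The mean/std values are the standard ImageNet statistics quoted above; whether they are appropriate for miniImageNet-style data is exactly the question here. This is plain Python for illustration, not torchmeta code.

```python
# Standard ImageNet channel statistics (assumed values from the issue above).
IMAGENET_MEAN = [0.485, 0.456, 0.406]
IMAGENET_STD = [0.229, 0.224, 0.225]

def normalize_pixel(channels, mean=IMAGENET_MEAN, std=IMAGENET_STD):
    """Normalize one RGB pixel, channel-wise, as Normalize would.

    `channels` holds values in [0, 1], i.e. the range after ToTensor().
    """
    return [(c - m) / s for c, m, s in zip(channels, mean, std)]

# A pixel equal to the channel means maps to zero in every channel:
print(normalize_pixel([0.485, 0.456, 0.406]))  # -> [0.0, 0.0, 0.0]
```

If normalization is wanted, it should be possible to override the default by passing a custom `transform` (e.g. `Compose([Resize(84), ToTensor(), normalize])`) to the helper, since the helpers treat `defaults` as overridable keyword arguments; I have not verified this against every helper, though.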