Could you explain how this hyperparameter is obtained, please?
On line 425 of AET5_Improved_Mixmatch.py, the matrix transform uses the parameters
transforms.Normalize((0., 0., 16., 0., 0., 16., 0., 0.), (1., 1., 20., 1., 1., 20., 0.015, 0.015)).
And if the dataset is self-built, do these parameters need to be changed?
Personally, I did not fine-tune it; I just used the same parameters from AET, which is our self-supervised learning method. I think these parameters can be applied directly to most datasets.
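For context, transforms.Normalize just applies (x - mean) / std per channel, so here it standardizes the 8 flattened entries of the 3x3 transformation matrix that AET regresses (the ninth entry is fixed). A minimal sketch of that arithmetic, using a hypothetical parameter vector (the matrix values are illustrative, not taken from the code):

```python
# Means and stds from the AET setting quoted above.
mean = (0., 0., 16., 0., 0., 16., 0., 0.)
std = (1., 1., 20., 1., 1., 20., 0.015, 0.015)

# Hypothetical flattened transformation parameters [a, b, tx, c, d, ty, p1, p2]:
# an identity-like matrix with a 16-pixel translation and slight perspective.
params_vec = [1.0, 0.0, 16.0, 0.0, 1.0, 16.0, 0.001, 0.001]

# transforms.Normalize performs exactly this per-channel standardization.
normalized = [(p - m) / s for p, m, s in zip(params_vec, mean, std)]
print(normalized)
```

Note how a 16-pixel translation maps to 0 after normalization, i.e. the means/stds are chosen so that typical sampled transformations land in a roughly standardized range.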
```python
unlabel_dataset = AET_Memory_Dataloader(
    dataset_dir=data.train_path,
    dataset_mean=TRAIN_MEAN,
    dataset_std=TRAIN_STD,
    shift=params['shift'],
    degrees=params['rot'],
    shear=params['shear'],
    train_label=True,
    translate=(params['translate'], params['translate']),
    scale=(params['shrink'], params['enlarge']),
    fillcolor=(128, 128, 128),
    resample=PIL.Image.BILINEAR,
    matrix_transform=transforms.Compose([
        transforms.Normalize((0., 0., 16., 0., 0., 16., 0., 0.),
                             (1., 1., 20., 1., 1., 20., 0.015, 0.015)),
    ]),
    transform_pre=transform_train,
    rand_state=params['seed'],
    valid_size=0,
    num_classes=num_classes,
    extra_path=data.extra_path,
    patch_length=24,
)
```
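Regarding the self-built dataset question: the means and stds describe the distribution of the sampled transformation parameters, so if your image size or sampling ranges (shift, rot, shear, scale) differ substantially, you could re-estimate them empirically by sampling many transformations and taking per-element statistics. A rough sketch under assumed ranges (all ranges and the matrix parameterization below are hypothetical simplifications, not the dataloader's exact construction):

```python
import math
import random

random.seed(0)

def sample_matrix():
    # Hypothetical sampling ranges; substitute your own shift/rot/shear/scale.
    angle = math.radians(random.uniform(-180, 180))
    shear = math.radians(random.uniform(-30, 30))
    scale = random.uniform(0.8, 1.2)
    tx = random.uniform(-32, 32)
    ty = random.uniform(-32, 32)
    # Simplified affine parameterization: first 8 entries of the 3x3 matrix,
    # with the perspective terms left at zero for a pure affine transform.
    return [scale * math.cos(angle), -scale * math.sin(angle + shear), tx,
            scale * math.sin(angle),  scale * math.cos(angle + shear), ty,
            0.0, 0.0]

# Estimate per-element mean and std over many sampled transformations.
samples = [sample_matrix() for _ in range(10_000)]
n = len(samples)
columns = list(zip(*samples))
means = [sum(col) / n for col in columns]
stds = [(sum((v - m) ** 2 for v in col) / n) ** 0.5
        for col, m in zip(columns, means)]
```

The resulting means/stds would then replace the tuple passed to transforms.Normalize. That said, as noted above, the original AET values worked across datasets without fine-tuning, so re-estimating is likely only necessary for very different image sizes or transformation ranges.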