Closed: onlytailei closed this issue 7 years ago
Data augmentation is partly done within the data loader instead of a dedicated module. As far as I remember, the main augmentation not applied here is chromatic augmentation, i.e. warping of color values and contrast. That augmentation seems pointless to me, because we already apply a normalization on color values during loading, which sets the mean intensity to 0 and the contrast (std) to 1.
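To illustrate why (this is my own sketch, not code from the repo): per-image standardization maps any affine intensity change `a*x + b` (with `a > 0`) back to the same standardized image, so brightness/contrast jitter applied before normalization is undone by it.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(3, 8, 8))  # fake CHW image

def standardize(x):
    # zero mean, unit std, computed per channel
    return (x - x.mean(axis=(1, 2), keepdims=True)) / x.std(axis=(1, 2), keepdims=True)

# contrast * 1.7, brightness + 40
shifted = 1.7 * img + 40.0
print(np.allclose(standardize(img), standardize(shifted)))  # True
```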
Every other augmentation, like translation, cropping and such, can be found here: https://github.com/ClementPinard/FlowNetPytorch/blob/master/flow_transforms.py
However, if you still feel the need to test intensity/contrast data augmentation, it would be very easy to do so with a function like this:
```python
import numpy as np

class RandomNormalize(object):
    def __init__(self, mean, std):
        """Args:
            mean (tuple of 2): min and max mean to apply to input
            std (tuple of 2): min and max std to apply to input
        """
        self.mean = mean
        self.std = std

    def __call__(self, tensor):
        # draw a random mean and std per channel and apply them in place
        for t in tensor:
            m = np.random.uniform(*self.mean)
            s = np.random.uniform(*self.std)
            t.mul_(s).add_(m)
        return tensor
```
And to apply it to your dataset:

```python
input_transform = transforms.Compose([
    flow_transforms.ArrayToTensor(),
    transforms.Normalize(mean=[0, 0, 0], std=[255, 255, 255]),
    normalize,
    RandomNormalize((-1, 1), (0.5, 1.5))
])
```
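For reference, here is a NumPy-only sketch of what the transform does (the function name and defaults are mine; the real transform above operates in place on a torch tensor): each channel gets its own random scale and shift drawn from the given ranges.

```python
import numpy as np

def random_normalize(arr, mean_range=(-1, 1), std_range=(0.5, 1.5), rng=None):
    # NumPy equivalent of RandomNormalize above: scale then shift
    # each channel by factors drawn uniformly from the given ranges.
    rng = rng or np.random.default_rng()
    out = arr.copy()
    for t in out:              # iterates over channels; t is a view into out
        m = rng.uniform(*mean_range)
        s = rng.uniform(*std_range)
        t *= s
        t += m
    return out

x = np.zeros((3, 4, 4))
y = random_normalize(x, rng=np.random.default_rng(42))
# starting from zeros, each channel becomes a constant equal to its
# random mean, so all values stay inside (-1, 1)
print(y.min() >= -1 and y.max() <= 1)  # True
```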
Have you tried to implement the data augmentation part of the original FlowNet?