When you "prepare random unit tensor" in the VAT code, you use this :
d = torch.rand(x.shape).to( torch.device('cuda' if torch.cuda.is_available() else 'cpu'))d = _l2_normalize(d)
It mean that d is strictly positive. It's intended?
Sometimes we need to have a partially negative perturbation.
Why you didn't use something like the following code?
d = torch.rand(x.shape) - 0.5d = d.to('cuda' if torch.cuda.is_available() else 'cpu')d = _l2_normalize(d)
Hello,
When you "prepare random unit tensor" in the VAT code, you use this :
d = torch.rand(x.shape).to( torch.device('cuda' if torch.cuda.is_available() else 'cpu'))
d = _l2_normalize(d)
This means that every component of d is non-negative. Is this intended? Sometimes we need a partially negative perturbation.
Why didn't you use something like the following code?
d = torch.rand(x.shape) - 0.5
d = d.to('cuda' if torch.cuda.is_available() else 'cpu')
d = _l2_normalize(d)
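For reference, here is a minimal, self-contained sketch of the zero-mean alternative I had in mind. The _l2_normalize helper below is my own assumption of a per-sample L2 normalization, not necessarily the repository's exact implementation, and torch.randn_like gives a sign-symmetric direction directly:

import torch

def _l2_normalize(d, eps=1e-8):
    # Flatten each sample, divide by its L2 norm, restore the original shape.
    d_flat = d.view(d.shape[0], -1)
    norm = d_flat.norm(dim=1, keepdim=True) + eps
    return (d_flat / norm).view_as(d)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
x = torch.zeros(4, 3, 32, 32, device=device)  # dummy batch, for illustration only

# torch.randn_like draws from N(0, 1), so the direction is symmetric around
# zero and components can be negative, unlike torch.rand which samples [0, 1).
d = torch.randn_like(x)
d = _l2_normalize(d)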