torch / torch7

Buggy cmul behavior on Tensors with 0-strides. #1067

Open · yzhuang opened 7 years ago

yzhuang commented 7 years ago

To reproduce:

th> a = torch.Tensor({1.0})
                                                                      [0.0001s]
th> b = torch.Tensor({2.0, 2.0, 2.0})
                                                                      [0.0001s]
th> a:expandAs(b):cmul(b)
 8
 8
 8
[torch.DoubleTensor of size 3]

I would expect the result to contain three 2s, not three 8s.

I can understand why this happens: there is only a single slot in the underlying storage, and it gets multiplied by 2 three times, once for each element of the expanded view. However, the behavior is very counter-intuitive.
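A possible workaround (just a sketch, assuming the standard torch7 tensor methods clone and the out-of-place torch.cmul) is to materialize the expanded view before multiplying, so that each element of the result has its own storage slot:

a = torch.Tensor({1.0})
b = torch.Tensor({2.0, 2.0, 2.0})

-- clone() copies the 0-stride view into a contiguous size-3 tensor,
-- so the subsequent in-place cmul no longer writes through a single aliased slot
c = a:expandAs(b):clone():cmul(b)   -- expected: 2 2 2

-- out-of-place variant: the result is allocated separately from the expanded view
d = torch.cmul(a:expandAs(b), b)    -- expected: 2 2 2

Both forms avoid multiplying through the 0-stride view, which is what causes the single underlying value to be scaled by 2 three times.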

fmassa commented 7 years ago

Indeed, that is a long-standing problem and has already been reported in https://github.com/torch/torch7/issues/289. If you check the behaviour of numpy, by default it does not allow expanded (broadcast) arrays to be mutated, see for example here. But if we hack around that by modifying the flags of the array, then numpy's behaviour is similar to torch's.

import numpy as np

a = np.array([1])
b = np.broadcast_to(a, (3,))   # broadcast view with 0-strides, read-only by default
b.flags.writeable = True       # hack: force the view to be writable
b *= 2                         # in-place multiply through the aliased storage slot
print(b)

yields

[8 8 8]