FluxML / Torch.jl

Sensible extensions for exposing torch in Julia.

2D tensors should be reversed? #36

Open bjosv79 opened 4 years ago

bjosv79 commented 4 years ago

Hi, when implementing a linear layer, the following code does not produce the expected output:

import Torch: tensor
W = collect(reshape(1.0f0:6.0f0, (3, 2)))  # 3x2 Float32 matrix (1.f0 is not a valid Julia literal)
x = reshape([1.0f0; 1.0f0], (2, 1))
expected = W * x
output = tensor(W, dev = 0) * tensor(x, dev = 0)
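For reference, here is what the CPU-side computation above yields, sketched with plain Julia arrays (no Torch.jl or GPU required); `reshape` fills `W` column-major, which is the layout the GPU result should match:

```julia
# Plain-Julia illustration of the expected result from the report above.
# Julia arrays are column-major, so reshape(1:6, (3, 2)) fills columns first.
W = collect(reshape(1.0f0:6.0f0, (3, 2)))  # [1 4; 2 5; 3 6]
x = reshape([1.0f0; 1.0f0], (2, 1))        # 2x1 column of ones
expected = W * x                           # 3x1 result: sums of W's rows
println(expected)                          # Float32[5.0; 7.0; 9.0;;]
```

If the 2-d tensor path reverses the dimensions (as with a row-major layout), the multiplication either errors on a shape mismatch or returns values that disagree with `expected`.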

I noticed that 2-d and n-d tensors are treated differently, and to me this looks like the root cause: https://github.com/FluxML/Torch.jl/blob/master/src/tensor.jl#L143-L149

Is there a good reason to treat them differently?

ToucheSir commented 4 years ago

See https://github.com/FluxML/Torch.jl/issues/21 and https://github.com/FluxML/Torch.jl/blob/f4da6dcbf521872efdd17b9599104a5ba739df0d/src/ops.jl#L10. This should hopefully be easier to do now that the native library build instructions have been cleaned up.