Jutho / TensorOperations.jl

Julia package for tensor contractions and related operations
https://jutho.github.io/TensorOperations.jl/stable/

@ncon wrapped in a function always returns the same reference when cache is enabled #83

Closed: mhauru closed this issue 4 years ago

mhauru commented 4 years ago
using LinearAlgebra
using TensorOperations

# Toggle between these two lines to see the difference:
#TensorOperations.disable_cache()
TensorOperations.enable_cache()

op1 = randn(2, 2)
op2 = randn(2, 2)
op3 = randn(2, 2)

# Contract op with op3 over their shared index.
f(op) = @ncon((op, op3), ([-1 3], [3 -3]))

b = f(op1)
bcopy = deepcopy(b)
c = f(op2)
@show(norm(b - bcopy))  # nonzero: b changed after the second call
@show(norm(b - c))      # zero: b and c are the same array

outputs e.g.

norm(b - bcopy) = 2.4838965781832383
norm(b - c) = 0.0

In other words, when I wrap an @ncon call in a function, repeated calls to that function always return a reference to the same array, whose contents continue to track the result of the latest call. To show that this really depends on wrapping the call in a function:

b = @ncon((op1, op3), ([-1 3], [3 -3]))
bcopy = deepcopy(b)
c = @ncon((op2, op3), ([-1 3], [3 -3]))
@show(norm(b - bcopy))
@show(norm(b - c))

outputs e.g.

norm(b - bcopy) = 0.0
norm(b - c) = 2.628150532683183

Disabling the cache fixes this.
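
In the meantime, here is a workaround sketch (f_safe is just an illustrative name, not anything from the package): either disable the cache globally, or copy the result out of the potentially shared buffer.

# Workaround 1: turn the cache off entirely.
TensorOperations.disable_cache()

# Workaround 2: copy the result so it no longer aliases a cached buffer.
f_safe(op) = copy(@ncon((op, op3), ([-1 3], [3 -3])))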

Jutho commented 4 years ago

Yes, a huge oversight on my part. I treat the last contraction separately in the code, specifically so as not to cache the end result, but I nonetheless used the cache for that allocation. This should now be fixed on master.
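
Schematically, the bug pattern looks like this (a toy sketch; BUFFER_CACHE and get_buffer! are illustrative stand-ins, not the actual TensorOperations internals):

using LinearAlgebra

# Toy stand-in for a global buffer cache.
const BUFFER_CACHE = Dict{Any,Array}()

# Return a reusable buffer for the given eltype and size, allocating on first use.
get_buffer!(T, dims) = get!(() -> Array{T}(undef, dims), BUFFER_CACHE, (T, dims))

function contract_buggy(A, B)
    C = get_buffer!(eltype(A), (size(A, 1), size(B, 2)))
    mul!(C, A, B)   # result is written into the shared cached buffer
    return C        # bug: every call with these sizes returns the very same array
end

function contract_fixed(A, B)
    C = similar(A, size(A, 1), size(B, 2))  # fresh allocation for the end result
    mul!(C, A, B)
    return C
end

With contract_buggy, b = contract_buggy(op1, op3) and c = contract_buggy(op2, op3) satisfy b === c, which reproduces the aliasing reported above; contract_fixed allocates the end result freshly, as the fix now does.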

mhauru commented 4 years ago

Thanks!