RaimelMedina closed this issue 1 month ago
Hi, could you give us more details?
Sorry for the late reply. I was trying to prepare a minimal working example where the issue emerges. It seems the pivot optimization does not manage to find a good set of initial pivots, so perhaps this could be the problem...
I attach the code and an .h5 file with an example of a problematic tensor.
Thanks once again!
Thank you for the minimal example. It helped a lot. There seems to be an issue in the test script rather than in TCI.jl.
The following code assumes row-major ordering, but Julia uses column-major ordering, like Fortran:

```julia
f(x::Vector{Int}) = vector[parse(Int, join(x .- 1); base=2) + 1]
```

We can avoid this issue simply by reshaping the vector and indexing it directly:

```julia
tensor = reshape(vector, localdims...)
f(x::Vector{Int}) = tensor[x...]
```
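To make the ordering mismatch concrete, here is a small Python/NumPy sketch (a hypothetical 3-site example with local dimension 2, not the tensor from the issue; NumPy's `order="F"` mimics Julia's column-major `reshape`):

```python
import numpy as np

# Hypothetical 3-site example with local dimension 2 (8 entries).
vector = np.arange(8.0)
localdims = (2, 2, 2)

# Buggy convention: treat the multi-index x (1-based, as in Julia) as a
# big-endian bit string -- i.e. a row-major (C-order) linear index.
def f_rowmajor(x):
    bits = "".join(str(xi - 1) for xi in x)
    return vector[int(bits, 2)]

# Julia's reshape is column-major (Fortran order); emulate it with order="F".
tensor = np.reshape(vector, localdims, order="F")
def f_colmajor(x):
    return tensor[tuple(xi - 1 for xi in x)]

x = [2, 1, 1]  # an index where the two conventions disagree
print(f_rowmajor(x), f_colmajor(x))  # → 4.0 1.0
```

Any TCI run fed with the row-major `f` therefore interpolates a scrambled tensor, which explains why nothing downstream matched.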
Enclosed is my fix (please find the comment "HERE IS THE FIX"): test_fixed.jl.zip
Oh, I apologize for this mistake :( No matter what I tried, nothing worked, and look where the error was... Thanks again, and sorry for all the trouble.
@RaimelMedina You do not need to apologize! If you want to handle quantics indices, you could use QuanticsGrids.jl, which provides conversions between different representations, such as quantics and linear grid indices.
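For illustration, the quantics-to-linear conversion can be sketched as follows. This is a conceptual Python sketch, not the QuanticsGrids.jl API, and the bit ordering (first digit most significant) is an assumption for illustration; check the package documentation for the actual conventions.

```python
# Conceptual sketch of quantics <-> linear grid-index conversion. NOT the
# QuanticsGrids.jl API; the big-endian digit ordering is an assumption.
def linear_to_quantics(m, R):
    """Map a 1-based grid index m in 1..2^R to R quantics digits in {1, 2}."""
    bits = [((m - 1) >> (R - 1 - r)) & 1 for r in range(R)]
    return [b + 1 for b in bits]

def quantics_to_linear(q):
    """Inverse map: quantics digits in {1, 2} back to the 1-based grid index."""
    m = 0
    for d in q:
        m = 2 * m + (d - 1)
    return m + 1

print(linear_to_quantics(6, 3))       # → [2, 1, 2]
print(quantics_to_linear([2, 1, 2]))  # → 6
```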
Hi again :)
More than an issue, this is something that a colleague and I have observed when trying to compress certain objects with TCI and comparing the result with what we would obtain from SVD.
We are trying to learn a 14-qubit state whose wave-function coefficients are all positive and non-zero. As an example: when applying tci2 to the state, we see that tci2 converges to maxbonddim 80, even though the maxbonddim parameter is set to 128. Moreover, the pivoterrors are all 0, yet the relative error of the entries is 40%! We also see that the usual SVD compression of the tensor yields an MPS with maxbonddim ≈ 5.
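The SVD side of this comparison can be checked independently of TCI; below is a hedged Python/NumPy sketch that counts the significant singular values across every left/right bipartition of a state vector (using an illustrative product state, since the actual 14-qubit tensor is not reproduced here):

```python
import numpy as np

# Sketch: estimate the exact MPS bond dimensions of a state vector via SVD
# across each left/right bipartition. The state below is an illustrative
# product state, NOT the 14-qubit state from the issue.
def svd_bond_dims(state, n, tol=1e-10):
    dims = []
    for cut in range(1, n):
        mat = state.reshape(2**cut, 2**(n - cut))
        s = np.linalg.svd(mat, compute_uv=False)
        dims.append(int(np.sum(s / s[0] > tol)))
    return dims

# A product state: every bipartition has bond dimension 1.
n = 6
single = np.array([0.6, 0.8])
state = single
for _ in range(n - 1):
    state = np.kron(state, single)
print(svd_bond_dims(state, n))  # → [1, 1, 1, 1, 1]
```

Comparing these SVD bond dimensions against the ranks TCI reports, bipartition by bipartition, can show whether the discrepancy is uniform or localized at particular cuts.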
Is there a way to check where the issue comes from here? Why would tci2 fail? Is there anything we can try to make it work?
We would greatly appreciate any suggestions or insights you can offer.
Thanks!