Closed: sakurairihito closed this issue 1 month ago
@sakurairihito
Thank you.
Point 1: We should provide an option to choose whether or not we normalize the error in SVD/QR and CI. Point 2: This may be a bug. Point 3: This may be a bug.
@rittermarc What do you think?
@rittermarc
I agree with all of your points. I would consider 2. and 3. bugs, and we should fix them; thank you for pointing them out! I am implementing a fix.
There are several potential improvements for the `compress!` function, especially when using SVD.

1. Normalize singular values in `_factorize`: We need to consider normalizing these values by the maximum singular value, as we discussed with one of the developers. Proposed change: see the sketch below. I am not sure what the corresponding maximum value is when using CI.
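A minimal sketch of what this could look like, assuming a plain `LinearAlgebra` SVD. The function name `truncated_svd` and its keyword arguments are hypothetical; this only illustrates truncating relative to the largest singular value and is not the actual `_factorize` code:

```julia
using LinearAlgebra

# Illustrative only: truncate an SVD by comparing each singular value to the
# *maximum* singular value, rather than to an absolute threshold.
function truncated_svd(A::AbstractMatrix; tolerance::Real=0.0, maxbonddim::Int=typemax(Int))
    F = svd(A)                                   # thin SVD; F.S is sorted in descending order
    smax = first(F.S)                            # largest singular value sets the scale
    keep = count(s -> s > tolerance * smax, F.S)
    keep = max(1, min(keep, maxbonddim))         # respect the bond-dimension cap, keep at least one
    return F.U[:, 1:keep], Diagonal(F.S[1:keep]), F.Vt[1:keep, :]
end
```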
2. Avoid recompression during left canonicalization: When making a tensor train left canonical by sweeping from left to right using SVD, we should avoid recompressing it. The current implementation sets the tolerance or the maximum bond dimension to finite values during this sweep, so these must be fixed; see the sketch below.
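As an illustration of this point (again a hedged sketch, not the library's actual code, and assuming the cores are stored as a `Vector{Array{Float64,3}}`): the left-to-right canonicalization sweep keeps all singular values, so no tolerance or bond-dimension cap is applied there, and truncation happens only in the subsequent compression sweep.

```julia
using LinearAlgebra

# Hypothetical sketch: bring a tensor train (a vector of 3-leg cores) into
# left-canonical form by a left-to-right SVD sweep *without* any truncation.
function leftcanonicalize!(tt::Vector{Array{Float64,3}})
    for n in 1:length(tt)-1
        Dl, d, Dr = size(tt[n])
        F = svd(reshape(tt[n], Dl * d, Dr))
        k = length(F.S)                        # keep everything: no recompression here
        tt[n] = reshape(F.U, Dl, d, k)         # left-canonical (isometric) core
        rest = Diagonal(F.S) * F.Vt            # S*Vt is pushed into the next core
        dnext = size(tt[n+1], 2)
        tt[n+1] = reshape(rest * reshape(tt[n+1], Dr, :), k, dnext, :)
    end
    return tt
end
```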
3. Contract singular values with the left unitary matrix: In the right-to-left sweep in `_factorize`, i.e. when the canonical center moves from right to left, we need to contract the singular value matrix with the left unitary matrix of the SVD. This could improve the accuracy of the recompression of the TT. However, the current implementation contracts the singular value matrix with the right unitary matrix of the SVD.
Proposed change in `_factorize`: see the sketch below.
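The following is only a sketch of the proposed behavior under the same assumptions as above (plain `LinearAlgebra`, cores stored as a `Vector{Array{Float64,3}}`, hypothetical function name), not the actual `_factorize` code: in the right-to-left sweep, `Vt` becomes the new right-canonical core and the truncated singular values are contracted with the left unitary `U`, which is then absorbed into the neighboring core on the left.

```julia
using LinearAlgebra

# Hypothetical sketch of the right-to-left truncation sweep: at each bond the
# retained singular values are contracted with the LEFT unitary U and absorbed
# into the neighboring core on the left; Vt becomes the new right-canonical core.
function rightsweep_truncate!(tt::Vector{Array{Float64,3}};
                              tolerance::Float64=0.0, maxbonddim::Int=typemax(Int))
    for n in length(tt):-1:2
        Dl, d, Dr = size(tt[n])
        F = svd(reshape(tt[n], Dl, d * Dr))
        smax = first(F.S)                               # truncate relative to the largest value (cf. point 1)
        k = max(1, min(count(s -> s > tolerance * smax, F.S), maxbonddim))
        tt[n] = reshape(F.Vt[1:k, :], k, d, Dr)         # right-canonical core
        US = F.U[:, 1:k] * Diagonal(F.S[1:k])           # S goes with the left unitary
        dprev = size(tt[n-1], 2)
        tt[n-1] = reshape(reshape(tt[n-1], :, Dl) * US, :, dprev, k)
    end
    return tt
end
```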