ITensor / ITensorNetworks.jl

A package with general tools for working with higher-dimensional tensor networks based on ITensor.

Improvements in `OpSum` to `TTN` conversion #117

Open mtfishman opened 5 months ago

mtfishman commented 5 months ago

Follow-up to #116:

mtfishman commented 2 months ago

A comment on the representation of the symbolic TTN object:

Seems like this data structure could be a `DataGraph` with a graph structure matching the `IndsNetwork`/`TTN` graph structure and a `SparseArrayDOK` stored on the vertices, where the number of dimensions of each vertex array is the degree of that vertex and the elements are `Scaled{coefficient_type,Prod{Op}}`. Does that sound right to you?

I suppose one thing that needs to be stored is the meaning of each dimension of the `SparseArrayDOK` on the vertices, since you want to know which dimension corresponds to which neighbor. So, interestingly, the best representation may be an `ITensor`, or maybe a `NamedDimsArray` wrapping a `SparseArrayDOK`, where the dimension names are the edges of the graph.

_Originally posted by @mtfishman in https://github.com/mtfishman/ITensorNetworks.jl/pull/166#discussion_r1589141729_
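To make the idea above concrete, here is a minimal Base-only sketch of a per-vertex DOK (dictionary-of-keys) sparse array whose dimensions are labeled by the incident edges of the graph. All names here (`EdgeNamedDOK`, `setentry!`, the string operator labels) are hypothetical illustrations, not the actual `DataGraphs.jl`, `NamedDimsArray`, or `SparseArrayDOK` APIs; entries are `(coefficient, ops)` pairs standing in for `Scaled{coefficient_type,Prod{Op}}`.

```julia
# Hypothetical sketch of the proposed per-vertex storage: a sparse
# DOK array with one dimension per incident edge, where each dimension
# carries the name of the edge it corresponds to, so lookups can be
# done by neighbor rather than by positional dimension order.
struct EdgeNamedDOK{N,C}
  dimnames::NTuple{N,Symbol}                          # one name per incident edge
  dims::NTuple{N,Int}                                 # extent along each dimension
  data::Dict{NTuple{N,Int},Tuple{C,Vector{String}}}   # index -> (coefficient, op product)
end

function EdgeNamedDOK(dimnames::NTuple{N,Symbol}, dims::NTuple{N,Int},
                      ::Type{C}) where {N,C}
  return EdgeNamedDOK{N,C}(dimnames, dims,
                           Dict{NTuple{N,Int},Tuple{C,Vector{String}}}())
end

# Set an entry addressed by edge name, not by dimension position, so the
# caller never needs to know the internal ordering of the dimensions.
function setentry!(a::EdgeNamedDOK{N,C}, entry::Tuple{C,Vector{String}};
                   named_inds...) where {N,C}
  I = ntuple(d -> named_inds[a.dimnames[d]], N)
  a.data[I] = entry
  return a
end

# Example: a degree-2 vertex whose dimensions are named after its two
# incident edges, holding the term 0.5 * S+ S-.
a = EdgeNamedDOK((:edge_to_left, :edge_to_right), (3, 3), Float64)
setentry!(a, (0.5, ["S+", "S-"]); edge_to_left = 1, edge_to_right = 2)
```

This is the same bookkeeping a `NamedDimsArray` wrapping a `SparseArrayDOK` would provide, with the edge-to-dimension map made explicit.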

mtfishman commented 2 months ago

Regarding the data structure used in the `svd_bond_coefs(...)` function:

This could be a DataGraph with that data on the edges of the graph.

I also wonder if `Dict{QN,Matrix{coefficient_type}}` could be a block diagonal `BlockSparseMatrix` where those matrices are the diagonal blocks and the `QN`s are the sector labels of the graded axes.

_Originally posted by @mtfishman in https://github.com/mtfishman/ITensorNetworks.jl/pull/166#discussion_r1589149589_
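The `Dict`-to-block-diagonal correspondence can be sketched in plain Base Julia (this is not the actual `BlockSparseArrays` API; the function name and the string sector labels are illustrative). A real `BlockSparseMatrix` would never store the off-diagonal zero blocks; here they are explicit dense zeros just to show the layout, and the returned index ranges play the role of the graded-axis bookkeeping:

```julia
# Flatten a Dict of per-sector coefficient matrices into one
# block-diagonal matrix, recording which sector label owns which
# diagonal block via index ranges (the "graded axes" of the text).
function blockdiag_with_sectors(blocks::Dict{Q,Matrix{T}}) where {Q,T}
  qns = sort!(collect(keys(blocks)); by = string)   # fix an ordering of sectors
  sizes = [size(blocks[q]) for q in qns]
  A = zeros(T, sum(first, sizes), sum(last, sizes))
  ranges = Dict{Q,Tuple{UnitRange{Int},UnitRange{Int}}}()
  r = c = 0
  for (q, (nr, nc)) in zip(qns, sizes)
    A[r+1:r+nr, c+1:c+nc] = blocks[q]               # place the diagonal block
    ranges[q] = (r+1:r+nr, c+1:c+nc)                # remember its sector label
    r += nr; c += nc
  end
  return A, ranges
end

# Example: two sectors, a 2x2 block and a 1x1 block, giving a 3x3
# block-diagonal matrix.
blocks = Dict("Sz=0" => [1.0 2.0; 3.0 4.0], "Sz=1" => fill(5.0, 1, 1))
A, ranges = blockdiag_with_sectors(blocks)
```

Going the other way (slicing `A` at `ranges[q]`) recovers the original `Dict`, which suggests the two representations carry the same information and differ only in how the sector structure is exposed.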