mtfishman opened 3 years ago
I'm thinking of using `A ⊗ B` as a notation for lazily combining ITensors (thought of as a tensor product, so you can think of it as tensors producted together and waiting to get contracted). This could also be used to lazily contract an MPO with an MPS, like: `ϕ = ψ₁ + ψ₂ + H ⊗ ψ`.
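To make the idea concrete, here is a minimal sketch of what a lazy `⊗` could look like. This is not the ITensors implementation; `TensorProduct`, `contract`, and the use of `kron` on plain matrices are all stand-ins for illustration.

```julia
using LinearAlgebra  # for `kron`, standing in for an actual tensor contraction

# Hypothetical lazy tensor-product wrapper: `⊗` records its arguments
# instead of computing anything, and `contract` materializes the result.
struct TensorProduct{A,B}
  a::A
  b::B
end

# Lazy constructor: builds the wrapper, defers the computation.
⊗(a, b) = TensorProduct(a, b)

# Materialize on demand; for plain matrices `kron` plays the role of
# the real contraction/tensor-product machinery.
contract(t::TensorProduct) = kron(t.a, t.b)

A = [1 0; 0 1]
B = [0 1; 1 0]
L = A ⊗ B        # nothing is computed yet
C = contract(L)  # the 4×4 product is computed here
```

The point of the wrapper is that a sum like `ψ₁ + ψ₂ + H ⊗ ψ` can inspect the lazy term and pick a good contraction/addition strategy before anything is materialized.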
The tensor product `⊗` may also create a tensor network object, which could store things like an adjacency list for the network, built by analyzing which tensors have indices in common, but that is a story for another issue.
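As a sketch of that adjacency-list idea (again hypothetical, with indices stood in by symbols rather than actual ITensor `Index` objects): two tensors are adjacent exactly when they share an index.

```julia
# Each tensor is represented here only by its list of index labels.
tensors = [
  [:i, :j],      # tensor 1
  [:j, :k],      # tensor 2 shares :j with tensor 1
  [:k, :l, :i],  # tensor 3 shares :k with 2 and :i with 1
]

# Build the adjacency list: connect tensors with a nonempty index overlap.
adjacency = [Int[] for _ in tensors]
for a in eachindex(tensors), b in eachindex(tensors)
  if a < b && !isempty(intersect(tensors[a], tensors[b]))
    push!(adjacency[a], b)
    push!(adjacency[b], a)
  end
end
# adjacency == [[2, 3], [1, 3], [1, 2]]
```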
A neat generalization of #528 is to add up any objects that we can form a density matrix for. So it would be pretty straightforward to add up arbitrary mixtures of `MPS` and `MPO*MPS`, all in a single call (basically by merging the density matrix `MPO*MPS` contraction algorithm with the new density matrix `MPS` addition algorithm). We could use a notation like `ϕ = +(ψ₁, ψ₂, (H, ψ₃))`, where you can think of `(H, ψ₃)` as a lazily contracted MPO and MPS, and perhaps we could even move towards `H * ψ` being a lazy contraction of an MPO and an MPS. This could also work for adding up MPOs and products of MPOs.
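A toy version of that mixed addition, under stated assumptions: the types `MPS` and `LazyApply` below are stand-ins (vectors and matrices rather than tensor trains), and `add_terms` naively materializes each term instead of fusing the two density-matrix algorithms; it only illustrates the dispatch structure of a single-call sum over mixed terms.

```julia
# Hypothetical stand-ins, not the ITensors types.
abstract type AbstractState end

struct MPS <: AbstractState
  data::Vector{Float64}  # stand-in for the actual tensor-train data
end

struct LazyApply <: AbstractState
  H::Matrix{Float64}     # stand-in for an MPO
  ψ::MPS
end

# Materialize one term: an MPS is itself; a lazy pair applies H to ψ.
materialize(ψ::MPS) = ψ.data
materialize(t::LazyApply) = t.H * t.ψ.data

# Single-call addition over an arbitrary mixture of terms. A real
# implementation would fuse contraction and addition via density matrices
# instead of materializing each term first.
add_terms(terms...) = sum(materialize, terms)

ψ₁ = MPS([1.0, 0.0])
ψ₂ = MPS([0.0, 1.0])
H  = [0.0 1.0; 1.0 0.0]
ϕ  = add_terms(ψ₁, ψ₂, LazyApply(H, ψ₂))  # == [2.0, 1.0]
```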