google / TensorNetwork

A library for easy and efficient manipulation of tensor networks.

get_path not giving optimal path for "optimal" algorithm #873

Closed. mganahl closed this issue 4 years ago.

mganahl commented 4 years ago

get_path is not returning the optimal path for algorithm=opt_einsum.paths.dynamic_programming. The returned path is optimal at leading order in the contraction cost, but when several paths are degenerate at leading order and differ only at subleading order (e.g. two contraction orders with the same leading power of the bond dimension D but different lower-order costs), it does not always return the cheapest one. I think this is an issue in opt_einsum, but I'm posting it here so we can keep track of it internally as well.

import functools

import numpy as np
import opt_einsum
import tensornetwork as tn
from tensornetwork.contractors.opt_einsum_paths.utils import get_path

backend = 'numpy'
alg = functools.partial(opt_einsum.paths.dynamic_programming, memory_limit=None)

# MPS-MPO network: bond dimension D, MPO bond dimension M, physical dimension d.
D = 100
M = 10
d = 4
mps = tn.Node(np.random.rand(D, d, D), backend=backend)
mpsc = tn.Node(np.random.rand(D, d, D), backend=backend)
mpo = tn.Node(np.random.rand(M, D, D), backend=backend)
L = tn.Node(np.random.rand(M, M, d, d), backend=backend)

# Connect the edges of the network.
mpo[0] ^ L[0]
mpo[1] ^ mps[0]
mpo[2] ^ mpsc[0]
mps[1] ^ L[2]
mpsc[1] ^ L[3]

names = ['mps', 'mps.conj()', 'mpo', 'L']
nodes = [mps, mpsc, mpo, L]
np.random.seed(132)
np.random.shuffle(nodes)  # note: names is not shuffled along with nodes
path, _ = get_path(nodes=nodes, algorithm=alg)

# Replay the path: each pair in path removes two entries from the current
# list and appends the contracted result at the end.
for p in path:
    n1 = names.pop(np.max(p))
    n2 = names.pop(np.min(p))
    names.append(n1 + '-' + n2)
print(path)
print(names)
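
For reference, the same network can be fed to opt_einsum directly to compare its exhaustive 'optimal' search against the dynamic-programming path. This is only a sketch: the einsum equation is my encoding of the network above, and it assumes an opt_einsum version that accepts 'dp' as a path alias.

# Sketch: compare total FLOP counts of the two optimizers in opt_einsum.
# Indices: m = mpo-L bond, a/b = mpo-mps/mpsc bonds, x/y = L-mps/mpsc legs;
# i, n, j are the open legs.
eq = 'axi,byj,mab,mnxy->inj'
tensors = [np.random.rand(D, d, D), np.random.rand(D, d, D),
           np.random.rand(M, D, D), np.random.rand(M, M, d, d)]
for opt in ('optimal', 'dp'):
    path, info = opt_einsum.contract_path(eq, *tensors, optimize=opt)
    print(opt, path, info.opt_cost)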
mganahl commented 4 years ago

Turns out there's a bug in the test.
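
The bug is likely that names is not shuffled together with nodes, so the printed labels no longer correspond to the actual nodes. A minimal fix for the test, assuming that is indeed the bug, is to apply one permutation to both lists:

# Sketch of a fix: shuffle nodes and names with the same permutation.
perm = np.random.permutation(len(nodes))
nodes = [nodes[i] for i in perm]
names = [names[i] for i in perm]
path, _ = get_path(nodes=nodes, algorithm=alg)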