dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

Alternative Greedy Algorithms #233

Open dgasmith opened 1 month ago

dgasmith commented 1 month ago

An interesting paper citing us crossed my desk: https://arxiv.org/pdf/2405.09644

I have not had the chance to dig into it yet, but it is worth looking through to see if there are additional algorithms to implement.

jcmgray commented 1 month ago

Yes, in my experience, and from their results, optimizing the combine function (i.e. sampling among sum, max, and min for combining `size_a` and `size_b`) is subsumed by optimizing a single `costmod` coefficient in a score like:

`size_ab / costmod - (size_a + size_b) * costmod`

which is what cotengra does in its hyper-greedy approach. I actually recently added a separate random-greedy optimizer with `costmod` sampling to https://github.com/jcmgray/cotengrust (written in Rust), which dramatically reduces the path-finding time too.
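For reference, the scoring idea above can be sketched as a small self-contained greedy path finder. This is a toy illustration, not cotengra's actual implementation: the cost proxy (total intermediate size) and the `costmod` sampling range are arbitrary choices made for the sketch.

```python
import itertools
import math
import random


def greedy_path(inputs, output, size_dict, costmod=1.0):
    """Greedily contract pairs, at each step picking the pair (A, B) that
    minimizes  size_ab / costmod - (size_a + size_b) * costmod.

    inputs:    list of index collections, one per tensor (e.g. ["ab", "bc"])
    output:    indices of the final result (e.g. "ac")
    size_dict: dimension of each index
    Returns (path, cost): an opt_einsum-style list of position pairs, and a
    crude cost proxy (total size of all intermediates produced).
    """
    terms = [frozenset(ix) for ix in inputs]
    out = frozenset(output)
    size = lambda ix: math.prod(size_dict[i] for i in ix)

    path, cost = [], 0
    while len(terms) > 1:
        best = None
        for i, j in itertools.combinations(range(len(terms)), 2):
            ia, ib = terms[i], terms[j]
            # Indices that must survive: in the output or in some other term.
            keep = out.union(*(terms[k] for k in range(len(terms))
                               if k not in (i, j)))
            iab = (ia | ib) & keep
            score = size(iab) / costmod - (size(ia) + size(ib)) * costmod
            if best is None or score < best[0]:
                best = (score, i, j, iab)
        _, i, j, iab = best
        path.append((i, j))
        cost += size(iab)
        terms = [t for k, t in enumerate(terms) if k not in (i, j)] + [iab]
    return path, cost


def random_greedy(inputs, output, size_dict, trials=16, seed=0):
    """Rerun the greedy search with log-uniformly sampled costmod values
    and keep the cheapest path found (sampling range is a made-up choice)."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        costmod = 2.0 ** rng.uniform(-3, 3)
        path, cost = greedy_path(inputs, output, size_dict, costmod)
        if best is None or cost < best[1]:
            best = (path, cost)
    return best[0]
```

For a matrix chain like `greedy_path(["ab", "bc", "cd"], "ad", {"a": 2, "b": 3, "c": 4, "d": 5})`, the default `costmod=1.0` picks the cheap `bc,cd` contraction first, returning the path `[(1, 2), (0, 1)]`.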

As a side note, I might open a separate issue about potentially interfacing with these Rust functions.