dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

Find optimal path for caching constant intermediates #199

Open fangzhangmnm opened 2 years ago

fangzhangmnm commented 2 years ago

For example:

`i,ij,jk->k` where the sizes are `[5]`, `[5,100]`, `[100,5]`, and the second and third tensors are constants. Then it is a good idea to cache `ij,jk->ik`.
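A minimal sketch of the idea with plain NumPy (the tensor names and sizes follow the example above): contract the two constant tensors once offline, then each call only pays for the small variable contraction.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.random((5, 100))   # constant
C = rng.random((100, 5))   # constant

# Cache the constant-only intermediate ij,jk->ik once, offline.
# This costs 5*100*5 multiplies a single time.
BC = np.einsum('ij,jk->ik', B, C)          # shape (5, 5)

def apply(x):
    # Per-call work is now only a (5,) x (5,5) product (~25 multiplies)
    # instead of re-contracting through the size-100 index every time.
    return np.einsum('i,ik->k', x, BC)

x = rng.random(5)
expected = np.einsum('i,ij,jk->k', x, B, C)
assert np.allclose(apply(x), expected)
```

Note that a cost-only optimizer would avoid `ij,jk->ik` (it is the most expensive pairwise step here), which is why constants change the picture once the one-time cost can be amortized.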

dgasmith commented 2 years ago

I believe the following will help you: https://dgasmith.github.io/opt_einsum/getting_started/sharing_intermediates/
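For reference, the linked page describes the `shared_intermediates` context manager; a usage sketch for the example in this issue might look like the following. Note the caveat raised below: whether the constant-only product is actually shared depends on the path the optimizer picks, since sharing only deduplicates intermediates that happen to recur.

```python
import numpy as np
from opt_einsum import contract, shared_intermediates

rng = np.random.default_rng(0)
B = rng.random((5, 100))   # constant across calls
C = rng.random((100, 5))   # constant across calls
xs = [rng.random(5) for _ in range(3)]

# Inside this context, intermediates produced by contract() are memoized,
# so any repeated sub-contraction across the three calls is computed once.
# However, if the chosen path contracts x with B first, the intermediates
# depend on x and nothing constant-only gets reused.
with shared_intermediates():
    results = [contract('i,ij,jk->k', x, B, C) for x in xs]
```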

fangzhangmnm commented 2 years ago

I meant doing the optimization under the assumption that the constant tensors can be contracted beforehand, offline. The functionality above only caches the intermediates; the path planning might not consider which tensors are constant.
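A related sketch: `contract_expression` accepts a `constants=` argument, letting the constant tensors be passed as arrays (and the variable operand as a shape) so constant-only work can be folded in when the expression is built. Whether the chosen path actually isolates the constant subproduct still depends on the optimizer, which is the gap this issue raises.

```python
import numpy as np
from opt_einsum import contract_expression

rng = np.random.default_rng(0)
B = rng.random((5, 100))   # constant
C = rng.random((100, 5))   # constant

# Operand 0 is given only as a shape (it varies per call); operands 1 and 2
# are marked constant and baked into the reusable expression.
expr = contract_expression('i,ij,jk->k', (5,), B, C, constants=[1, 2])

x = rng.random(5)
assert np.allclose(expr(x), np.einsum('i,ij,jk->k', x, B, C))
```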

dgasmith commented 2 years ago

Yes, cache/intermediate aware paths are something we have discussed but have not implemented. It isn't clear that there is a general approach to solve the problem as straightforward approaches become combinatorial in nature.

Happy to take a PR which attempts this functionality!