dgasmith closed this issue 5 years ago.
Thanks! We definitely are looking for some advancements on contraction algorithms. If you want to make a PR, we would love to include it! :)
Sure, we can look into it. It might be easiest to add a single algorithm at first to your contractors to see how the implementation goes. What about starting with the "optimal" algorithm?
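For context, the "optimal" strategy exhaustively searches every pairwise contraction order and keeps the cheapest. Here is a minimal pure-Python sketch of that idea; it is illustrative only (not opt_einsum's actual implementation), and the flop-cost model is simplified to a product over the dimensions involved.

```python
# Illustrative exhaustive ("optimal") contraction-path search.
# Not opt_einsum's implementation; simplified cost model.
from itertools import combinations

def contract_cost(t1, t2, sizes, keep):
    """Cost of pairwise-contracting two tensors (sets of index labels)."""
    union = t1 | t2
    cost = 1
    for idx in union:
        cost *= sizes[idx]  # one multiply-add per element of the index space
    result = union & keep   # indices still needed downstream survive
    return cost, result

def optimal_path(inputs, output, sizes):
    """Search all pairwise contraction orders; return (best_cost, best_path)."""
    inputs = [frozenset(s) for s in inputs]
    output = frozenset(output)
    best = [float("inf"), []]

    def recurse(tensors, cost_so_far, path):
        if len(tensors) == 1:
            if cost_so_far < best[0]:
                best[0], best[1] = cost_so_far, path
            return
        for i, j in combinations(range(len(tensors)), 2):
            # Indices on other tensors or in the output must be kept.
            keep = output.union(*(t for k, t in enumerate(tensors)
                                  if k not in (i, j)))
            cost, new = contract_cost(tensors[i], tensors[j], sizes, keep)
            total = cost_so_far + cost
            if total >= best[0]:
                continue  # prune branches already worse than the best found
            rest = [t for k, t in enumerate(tensors) if k not in (i, j)]
            recurse(rest + [new], total, path + [(i, j)])

    recurse(inputs, 0, [])
    return best[0], best[1]
```

For the matrix chain `"ij,jk,kl->il"` with sizes `i=8, j=4, k=6, l=8`, this finds that contracting `jk` with `kl` first is cheapest, which matches the usual matrix-chain intuition.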
That would be awesome! Feel free to email me if you need any help.
@dgasmith Out of interest: Are there any plans to do opt_einsum's path optimization in e.g. cython in the future? I am guessing it could make a significant difference to the performance of, say, the optimal path algorithm.
This is something I would love to do and it would help our downstream packages. Making an issue on the opt_einsum GitHub would be a good place to get started, and we could figure out how to implement it there. My bet is that most of the bottlenecks at this point are in the cost evaluations and set logic, which could have extremely fast C implementations.
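To make that concrete, the inner loop of a path search evaluates the same tiny pattern for every candidate contraction: a bit of set algebra plus a product over dimension sizes. These operations run millions of times during a search, which is why a C port could pay off. A hedged sketch (hypothetical helper names, simplified cost model, not opt_einsum's actual `flop_count`):

```python
# Sketch of the per-candidate work inside a path optimizer.
# Names and the cost model are illustrative, not opt_einsum's API.
def flop_cost(indices, sizes, has_inner):
    """Operation count for one contraction over the given index labels."""
    cost = 1
    for idx in indices:
        cost *= sizes[idx]
    # A summed (inner) contraction does a multiply and an add per element.
    return 2 * cost if has_inner else cost

def candidate(t1, t2, keep, sizes):
    """Evaluate one candidate pairwise contraction: result indices and cost."""
    union = t1 | t2               # all indices touched
    result = union & keep         # indices that survive the contraction
    has_inner = bool(union - result)  # any summed-over index?
    return result, flop_cost(union, sizes, has_inner)
```

Each call is only a handful of set operations and multiplies, so the Python interpreter overhead dominates; the same logic on bitmasks in C would be orders of magnitude cheaper per call.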
Apologies for not making progress on this issue, my time is a bit swamped at the moment. I should be able to get back to this issue in September (!).
We've added most of the deterministic algorithms from opt_einsum as of https://github.com/google/TensorNetwork/pull/173. Any upgrades to opt_einsum will now be immediately incorporated into this library. ^-^
We had some issues getting the random contractors integrated, as they don't appear to exist in the pip package. (See https://github.com/dgasmith/opt_einsum/issues/93) Once that is fixed, integrating them should be one-liners.
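For anyone curious what the random contractors do: the basic idea is to run a greedy search many times with randomized choices and keep the cheapest path found. A pure-Python sketch of that idea, under the same simplified cost model as above (illustrative only, not opt_einsum's implementation):

```python
# Sketch of a "random-greedy" contractor: repeat a randomized greedy
# search and keep the best path. Not opt_einsum's implementation.
import random
from itertools import combinations

def greedy_trial(inputs, output, sizes, rng):
    tensors = [frozenset(s) for s in inputs]
    out = frozenset(output)
    path, total = [], 0
    while len(tensors) > 1:
        candidates = []
        for i, j in combinations(range(len(tensors)), 2):
            keep = out.union(*(t for k, t in enumerate(tensors)
                               if k not in (i, j)))
            union = tensors[i] | tensors[j]
            cost = 1
            for idx in union:
                cost *= sizes[idx]
            candidates.append((cost, i, j, union & keep))
        # Randomize among the cheapest few candidates instead of always
        # taking the single best -- that is the "random" in random-greedy.
        candidates.sort(key=lambda c: c[0])
        cost, i, j, new = rng.choice(candidates[:2])
        total += cost
        tensors = [t for k, t in enumerate(tensors) if k not in (i, j)] + [new]
        path.append((i, j))
    return total, path

def random_greedy(inputs, output, sizes, trials=32, seed=0):
    rng = random.Random(seed)
    return min(greedy_trial(inputs, output, sizes, rng) for _ in range(trials))
```

The appeal is that each trial is cheap (greedy scaling), while the repetition recovers much of the quality of an exhaustive search on networks too large for the optimal algorithm.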
The opt_einsum repository has been working on paths for tensor contractions for a few years now and has come up with a variety of algorithms tuned for several use cases. A few algorithms to check out:
We originally powered the optimization algorithms in the NumPy einsum function, but have since expanded to a number of other projects such as Pyro. Hopefully you will find these algorithms useful as well!