dgasmith / opt_einsum

⚡️ Optimizing einsum functions in NumPy, TensorFlow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/

Document state of PyTorch upstreaming in the README #231

Closed · opened by bmillwood, closed 4 weeks ago

bmillwood commented 4 months ago

The README mentions that some of these optimizations already exist upstream in NumPy, but that you need to pass `optimize=True` to access them. This is useful!
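For reference, a minimal sketch of what that looks like on the NumPy side (assuming a reasonably recent NumPy, where `np.einsum` accepts the `optimize` keyword):

```python
# Sketch: NumPy's built-in einsum only optimizes the contraction order
# when optimize=True (or "greedy"/"optimal") is passed explicitly.
import numpy as np

a = np.random.rand(10, 20)
b = np.random.rand(20, 30)
c = np.random.rand(30, 40)

slow = np.einsum("ij,jk,kl->il", a, b, c)                 # default: no path optimization
fast = np.einsum("ij,jk,kl->il", a, b, c, optimize=True)  # upstream contraction-order optimization
```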

The docs for `torch.einsum` suggest that it automatically uses opt_einsum when it is available (see also the discussion at https://github.com/dgasmith/opt_einsum/pull/205). It would be helpful to mention that here too, and to say whether an explicit `import opt_einsum` is needed to get this behaviour (I believe it isn't?). It might also be worth mentioning `torch.backends.opt_einsum.is_available()` and `torch.backends.opt_einsum.enabled`, or anything else that seems relevant or useful. (I think the torch docs could also be improved here, and may submit an issue or PR there, but I think it would be useful to say something here regardless.)
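For example, something along these lines should show whether torch picked it up (a rough sketch based on my reading of the torch docs for torch >= 1.13; I haven't verified every detail):

```python
# Sketch: torch.einsum picks up opt_einsum automatically when it is installed;
# no explicit `import opt_einsum` should be needed in user code.
import torch

print(torch.backends.opt_einsum.is_available())  # True if opt_einsum could be imported
print(torch.backends.opt_einsum.enabled)         # True by default; set to False to opt out

x = torch.randn(10, 20)
y = torch.randn(20, 30)
z = torch.randn(30, 40)
out = torch.einsum("ij,jk,kl->il", x, y, z)  # uses opt_einsum's path calculation when enabled
```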

Doing something like this for every supported opt_einsum backend might be quite a task, but let's not let perfect be the enemy of good :)

janeyx99 commented 4 months ago

Hehe, I wrote the torch einsum docs regarding the use of opt_einsum, so let me know where you think the docs could be improved! (Feel free to open an issue on torch and cc me.)

dgasmith commented 4 weeks ago

Closing in favor of https://github.com/pytorch/pytorch/issues/127109.