Closed: bmillwood closed this issue 4 weeks ago
hehe I wrote the torch einsum docs regarding using opt_einsum, so let me know where you think the docs could be improved! (feel free to open an issue on torch and cc me)
Closing in favor of https://github.com/pytorch/pytorch/issues/127109.
The README mentions that some of these optimizations already exist upstream in `numpy`, but that you need to pass `optimize=True` to access them. This is useful!

The docs for `torch.einsum` suggest that it automatically uses `opt_einsum` if it is available (see also the discussion at https://github.com/dgasmith/opt_einsum/pull/205). It would be helpful to also mention that here, and to say whether it's necessary to explicitly import `opt_einsum` to get this behaviour (I believe not?), potentially also mentioning `torch.backends.opt_einsum.is_available()` and `torch.backends.opt_einsum.enabled`, or anything else that seems relevant / useful. (I think the torch docs could also be improved here, and I may submit an issue or PR there, but I think it would be useful to say something here regardless.)

Doing something like this for every supported opt_einsum backend might be quite a task, but let's not let perfect be the enemy of good :)
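For concreteness, here is a small sketch of the two behaviours discussed above: numpy's opt-in `optimize=True` flag versus torch's automatic use of opt_einsum, inspected via `torch.backends.opt_einsum`. The shapes and einsum expression are made up for illustration; the torch part is guarded so the snippet still runs where torch (or opt_einsum) isn't installed.

```python
import numpy as np

# Illustrative three-operand contraction; the chosen shapes are arbitrary.
a = np.random.rand(8, 16)
b = np.random.rand(16, 32)
c = np.random.rand(32, 4)

# numpy requires opting in: optimize=True lets it reorder the contraction,
# while the default evaluates left to right. Results agree either way.
naive = np.einsum("ij,jk,kl->il", a, b, c)
opt = np.einsum("ij,jk,kl->il", a, b, c, optimize=True)
assert np.allclose(naive, opt)

try:
    import torch
    # torch picks up opt_einsum automatically when it is installed;
    # these flags report and control that behaviour.
    print("opt_einsum available:", torch.backends.opt_einsum.is_available())
    print("opt_einsum enabled:", torch.backends.opt_einsum.enabled)
except ImportError:
    print("torch not installed; skipping the torch.backends check")
```

Note that in torch you never pass an `optimize=` argument to `einsum` itself; path optimization is toggled globally through `torch.backends.opt_einsum.enabled`.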