dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, TensorFlow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

When used with PyTorch, is it possible to apply gradient checkpointing? #168

Closed by seongwook-ham 2 years ago

seongwook-ham commented 3 years ago

When used with PyTorch, is it possible to apply gradient checkpointing?
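For context, one plausible reading of the question: since `opt_einsum.contract` dispatches to `torch.einsum` when given PyTorch tensors, a contraction can be wrapped in `torch.utils.checkpoint.checkpoint` like any other differentiable function, so that intermediates are recomputed during the backward pass instead of being stored. The sketch below is a hypothetical illustration of that combination, not an answer from the maintainers; it assumes `torch` and `opt_einsum` are installed, and the shapes and subscripts are made up for the example.

```python
import torch
from torch.utils.checkpoint import checkpoint
from opt_einsum import contract

def einsum_block(a, b, c):
    # A chained contraction; opt_einsum picks the pairwise
    # contraction order and dispatches to torch.einsum.
    return contract("ij,jk,kl->il", a, b, c)

a = torch.randn(8, 8, requires_grad=True)
b = torch.randn(8, 8, requires_grad=True)
c = torch.randn(8, 8, requires_grad=True)

# checkpoint() discards the intermediates of einsum_block in the
# forward pass and re-runs it during backward, trading compute
# for activation memory.
out = checkpoint(einsum_block, a, b, c, use_reentrant=False)
out.sum().backward()
```

Whether this helps in practice depends on the contraction: checkpointing only saves memory when the intermediate tensors inside the block are large relative to its inputs and output.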

dgasmith commented 2 years ago

Apologies, it isn't clear what you are asking here. I'm going to close this without further details.