dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, TensorFlow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

When used with PyTorch, is it possible to apply gradient checkpointing? #168

Closed: seongwook-ham closed this issue 3 years ago

seongwook-ham commented 3 years ago

When used with PyTorch, is it possible to apply gradient checkpointing?
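The issue gives no example, but a minimal sketch of the kind of usage being asked about would be wrapping `opt_einsum.contract` in `torch.utils.checkpoint.checkpoint`, so the contraction's intermediates are recomputed during backward instead of stored. The tensor shapes, the contraction string, and the `use_reentrant=False` flag here are illustrative assumptions, not anything from the thread:

```python
import torch
from torch.utils.checkpoint import checkpoint
import opt_einsum as oe

# Two small matrices with gradients enabled
a = torch.randn(8, 8, requires_grad=True)
b = torch.randn(8, 8, requires_grad=True)

def contract_fn(x, y):
    # opt_einsum dispatches to torch operations when given torch tensors,
    # so the result stays on the autograd graph
    return oe.contract("ij,jk->ik", x, y)

# Checkpointing: the forward result is recomputed in backward rather than
# keeping intermediate activations in memory
out = checkpoint(contract_fn, a, b, use_reentrant=False)
out.sum().backward()
```

For a single two-operand contraction this saves little, but for long contraction paths with large intermediates, checkpointing the whole `contract` call trades recompute time for peak memory.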

dgasmith commented 3 years ago

Apologies, it isn't clear what you are asking here. I'm going to close without further details.