dgasmith / opt_einsum

⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
https://dgasmith.github.io/opt_einsum/
MIT License

cuTensor with cupy #144

Closed · cnk113 closed this issue 4 years ago

cnk113 commented 4 years ago

Hello,

Are you considering integrating cuTensor at this point? It looks like it was discussed before in #97. cuTensor is released with CUDA 11 and seems to give a massive speedup. Also, the cutensor module in cupy looks like it's fully integrated at this point.

Best

jcmgray commented 4 years ago

The way things are set up, if cupy (or any other library) has implemented it inside cupy.tensordot and cupy.einsum, then there are no changes required on opt_einsum's end! Just supply cupy arrays, or pass backend='cupy' to a ContractExpression.
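For illustration, a minimal sketch of both dispatch routes (the subscripts and shapes here are arbitrary; contract and contract_expression are opt_einsum's standard entry points):

```python
import cupy as cp
import opt_einsum as oe

a = cp.random.rand(8, 8, 8)
b = cp.random.rand(8, 8, 8)

# Backend inferred automatically from the cupy array types;
# cupy.tensordot / cupy.einsum are used under the hood.
out = oe.contract('ijk,jkl->il', a, b)

# Or build a reusable expression and name the backend explicitly.
expr = oe.contract_expression('ijk,jkl->il', a.shape, b.shape)
out = expr(a, b, backend='cupy')
```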

It's currently a little fiddly to install cupy with cutensor, but I have tested that this works. One thing to note: opt_einsum can create high-dimensional intermediates, and cutensor currently only supports up to 12 dimensions.
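As a rough sketch of what such a test might look like, assuming a cupy build that ships the cuTENSOR bindings (CUPY_ACCELERATORS is cupy's switch for routing operations through cuTENSOR; exact setup may vary by cupy version):

```python
import os
# Ask cupy to route eligible operations through cuTENSOR; set this
# before cupy is imported. (Assumes a cupy build with cuTENSOR support.)
os.environ.setdefault('CUPY_ACCELERATORS', 'cutensor')

import cupy as cp
import opt_einsum as oe

a = cp.random.rand(16, 16, 16)
b = cp.random.rand(16, 16, 16)
c = cp.random.rand(16, 16)

# Note: intermediates with more than 12 indices fall outside cuTENSOR's
# current limits, so very high-rank contractions may not be accelerated.
out = oe.contract('abc,bcd,de->ae', a, b, c)
print(out.shape)  # (16, 16)
```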