SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022
MIT License

cpp version of nattenav layer #36

Closed AntonyDrovalov closed 2 years ago

AntonyDrovalov commented 2 years ago

Hello! Do you plan to release a C++ version of the natten layer (without CUDA), for fast inference on CPU?

alihassanijr commented 2 years ago

Hello and thanks for your interest. Absolutely, it's been on our to-do list. We'll eventually push out a CPU version and deprecate the pure torch implementation. I'll keep this issue open until we do, so you'll get notified when that PR gets merged.

alihassanijr commented 2 years ago

I'm closing this issue now because we're moving our extension to its own separate repository, and it is going to be pip-installable from now on. We're still working on a CPU implementation; it is definitely on our list of planned features. You are welcome to open an issue there if you'd like.

As for pip installs: if you're using CUDA, you can refer to our website and get a pip install command for your specific torch version.

`pip install natten` will eventually serve as our CPU-only build, like most other extensions.
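
For anyone landing here later, the two install paths described above boil down to roughly the following. The CUDA command itself is not reproduced here because the exact wheel index depends on your torch and CUDA versions, which is why the maintainers point to the website:

```shell
# CPU-only build, straight from PyPI (as described above):
pip install natten

# CUDA builds: copy the exact command from the NATTEN website; it points pip
# at a prebuilt wheel index matching your specific torch and CUDA versions.
```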

Thanks again for your interest.

alihassanijr commented 2 years ago

@AntonyDrovalov we now have CPU kernels in v0.14.2! https://github.com/SHI-Labs/NATTEN/tree/v0.14.2
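
For completeness, a minimal sketch of running the module on CPU once natten >= 0.14.2 is installed. The constructor keywords (`dim`, `num_heads`, `kernel_size`) and the channels-last `(B, H, W, C)` input layout are assumptions about the natten API of that release; check the README of the version you actually installed, since the interface has changed over time.

```python
# Minimal CPU-only sketch for natten >= 0.14.2 (the first release with CPU
# kernels). Assumes NeighborhoodAttention2D accepts dim/num_heads/kernel_size
# keyword arguments and operates on (batch, height, width, channels) tensors;
# treat this as illustrative rather than the canonical API.
import torch
from natten import NeighborhoodAttention2D

# A small neighborhood attention layer: 64 channels, 4 heads, 7x7 neighborhood.
na = NeighborhoodAttention2D(dim=64, num_heads=4, kernel_size=7)
na.eval()

# A single 14x14 feature map in channels-last layout, on CPU.
x = torch.randn(1, 14, 14, 64)

with torch.no_grad():
    y = na(x)

print(y.shape)  # expected: torch.Size([1, 14, 14, 64])
```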