SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022
MIT License

When will the code be released? #2

Closed linjing7 closed 2 years ago

linjing7 commented 2 years ago

Hi, I think your work shows surprising performance, but parallelism is a concern of mine. When will you release your code?

alihassanijr commented 2 years ago

Hi, thank you for your interest.

We are currently preparing the public release of our code, training scripts, config files, and checkpoints. We aim to release the code itself in the coming days, so please stay tuned.

As for parallelism, we wrote our own CUDA kernel to compute NA, and it is quite fast in its current form. It is not yet as fast as it theoretically could be, but those are optimizations we plan to explore in the near future.
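To illustrate what NA computes, here is a naive pure-PyTorch sketch of the 1D case, assuming an odd kernel size no larger than the sequence length. This is only an illustration of the operation, not our CUDA kernel; the function name and shapes are hypothetical.

```python
import torch
import torch.nn.functional as F

def neighborhood_attention_1d(q, k, v, kernel_size=7):
    """Naive 1D neighborhood attention (illustrative sketch, not the repo kernel).

    q, k, v: (batch, length, dim). Each query attends to a window of
    `kernel_size` keys centered on it; near the edges the window shifts
    inward so every query still sees exactly `kernel_size` keys.
    Assumes an odd `kernel_size` and `kernel_size <= length`.
    """
    B, L, D = q.shape
    half = kernel_size // 2
    # Start index of each query's window, shifted inward at the boundaries.
    starts = torch.arange(L).clamp(min=half, max=L - 1 - half) - half  # (L,)
    idx = starts[:, None] + torch.arange(kernel_size)[None, :]         # (L, K)

    k_win = k[:, idx]  # gather per-query key windows: (B, L, K, D)
    v_win = v[:, idx]  # gather per-query value windows: (B, L, K, D)

    # Scaled dot-product attention restricted to each window.
    attn = torch.einsum("bld,blkd->blk", q, k_win) / D ** 0.5
    attn = F.softmax(attn, dim=-1)
    return torch.einsum("blk,blkd->bld", attn, v_win)

# Tiny usage example.
x = torch.randn(2, 16, 32)
out = neighborhood_attention_1d(x, x, x, kernel_size=5)
print(out.shape)  # torch.Size([2, 16, 32])
```

The gather step above materializes the key/value windows explicitly, which is memory-hungry; avoiding that materialization is exactly why a fused kernel helps.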

I'm going to close this issue now, but feel free to reopen it if you have other questions.

linjing7 commented 2 years ago

Okay, looking forward to your code :).

nnzhangup commented 2 years ago

Looking forward to your code.