SHI-Labs / NATTEN

Neighborhood Attention Extension. Bringing attention to a neighborhood near you!
https://shi-labs.com/natten/

XLA support #101

Open nom opened 6 months ago

nom commented 6 months ago

Hi, fantastic work with NATTEN. Is support for XLA on your roadmap? It would be great to enable neighborhood attention on TPUs and other non-Nvidia GPUs.

alihassanijr commented 6 months ago

Hello and thank you for your interest.

We definitely foresee expanding NATTEN to more backends and frameworks, and recent changes to the API reduce the dependency on torch to ease that process.

While our top priority at the moment is feature completeness for our CUDA backend, we will definitely consider native backends first (porting or rewriting our kernels for other accelerators such as Apple Silicon and AMD GPUs); depending on demand, we may also consider higher-level platforms such as XLA and Triton.
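For readers unfamiliar with what such a port would involve: neighborhood attention restricts each query to a fixed-size local window of keys and values, which is what NATTEN's fused kernels compute efficiently. Below is a naive, unoptimized sketch of that computation in plain NumPy (1D case, single head; function name and edge-clamping details are illustrative, not NATTEN's API) to show the kind of operation an XLA or Triton backend would need to express:

```python
import numpy as np

def neighborhood_attention_1d(q, k, v, kernel_size=3):
    """Naive 1D neighborhood attention: each query attends only to a
    local window of kernel_size keys/values centered on it, with the
    window clamped at sequence boundaries.
    q, k, v: arrays of shape (seq_len, dim). Illustrative only."""
    seq_len, dim = q.shape
    half = kernel_size // 2
    out = np.empty((seq_len, dim), dtype=np.float64)
    for i in range(seq_len):
        # Clamp the window start so every query sees exactly
        # kernel_size tokens, even near the edges.
        start = min(max(i - half, 0), seq_len - kernel_size)
        ks = k[start:start + kernel_size]        # (kernel_size, dim)
        vs = v[start:start + kernel_size]
        scores = ks @ q[i] / np.sqrt(dim)        # (kernel_size,)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                 # softmax over the window
        out[i] = weights @ vs
    return out
```

A fused kernel avoids materializing the per-window slices and score vectors; the loop above is only meant to pin down the semantics.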