cmhungsteve / Awesome-Transformer-Attention

An ultra-comprehensive paper list of Vision Transformer/Attention, including papers, code, and related websites

Transformers in Flax #22

Closed conceptofmind closed 2 years ago

conceptofmind commented 2 years ago

Hi,

I previously worked with lucidrains to implement 18 different Vision Transformers and PaLM in Google's Flax/JAX. I am going to release versions in DeepMind's Haiku soon as well. If this is of any interest, I can open a PR and add them to the corresponding places on the list.

Vision Transformers: https://github.com/conceptofmind/vit-flax

PaLM: https://github.com/conceptofmind/PaLM-flax
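For context, all of these Vision Transformer variants build on the same scaled dot-product self-attention core. A minimal NumPy sketch of that mechanism is below (illustrative only, not code from the linked repos; the shapes and names are assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # q, k, v: (num_tokens, head_dim); in a ViT, tokens are image patches
    scale = 1.0 / np.sqrt(q.shape[-1])
    weights = softmax(q @ k.T * scale)  # (num_tokens, num_tokens)
    return weights @ v                  # (num_tokens, head_dim)

rng = np.random.default_rng(0)
tokens, dim = 4, 8  # hypothetical sizes for illustration
q, k, v = (rng.standard_normal((tokens, dim)) for _ in range(3))
out = attention(q, k, v)
print(out.shape)  # (4, 8)
```

The Flax versions wrap the same computation in `flax.linen` modules with learned projections and multiple heads, but the attention math is identical.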

Thank you,

Enrico

cmhungsteve commented 2 years ago

Hello,

I added your Vision Transformer repo to the list. PaLM is a pure language Transformer, so it is outside the scope of this repo.

Thanks.

conceptofmind commented 2 years ago

Hi @cmhungsteve,

Thank you for adding the Vision Transformer to the list.

The vit-flax repository linked above currently includes Flax implementations of the Transformer architectures from a number of Vision Transformer papers.

I will open a PR for the matching sections and provide an update when all the Haiku versions are complete.

Thank you,

Enrico