-
Dear author, thank you for providing such a lightweight reinforcement learning library. Currently, I am hoping to integrate your attention mechanism into other reinforcement learning algorithms. I enc…
-
Hello author, I am a beginner with the YOLO model and would like to ask you some questions. In your article, you mention the CPAM module you used, and also that you used other attention modu…
-
I hope this message finds you well. I recently read your impressive paper on [FLatten Transformer: Vision Transformer using Focused Linear Attention], and I must say I was truly amazed by your work.
…
-
See project proposal [here](https://andre-martins.github.io/pages/project-examples-for-deep-structured-learning-fall-2019.html).
-
### Model description
"Attention Is All You Need" is a landmark 2017 research paper authored by eight scientists working at Google, responsible for expanding 2014 attention mechanisms proposed by Bah…
-
When reading the code in your nice_stand.py file, I didn't see self-attention or graph attention mechanisms being used, although you describe this part in your paper.
![Image 1](https://github.com/eeyhsong/NICE-…
-
Flash Attention can only be used with fp16 and bf16, not with fp32. Therefore, we should make flash attention optional in our codebase so that one can deactivate it during inference in exchange for higher precision.
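A minimal sketch of what such a switch could look like, assuming a PyTorch codebase; the `attention` helper and its `use_flash` flag are illustrative names, not the repository's actual API:

```python
import torch
import torch.nn.functional as F

def attention(q, k, v, use_flash: bool = True):
    # Illustrative flag, not the repository's actual API: flash kernels
    # require fp16/bf16, so the flash path casts down and back up, while
    # the fallback keeps the computation in the original fp32 precision.
    if use_flash and q.is_cuda:
        out = F.scaled_dot_product_attention(
            q.to(torch.bfloat16), k.to(torch.bfloat16), v.to(torch.bfloat16)
        )
        return out.to(q.dtype)  # restore the caller's dtype
    # fp32 path: PyTorch dispatches to the non-flash kernels here.
    return F.scaled_dot_product_attention(q, k, v)
```

At inference time one would then call `attention(q, k, v, use_flash=False)` to trade speed for full fp32 precision.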