-
### 🐛 Describe the bug
PyTorch does not support any Flash Attention backend on Windows.
### Versions
This is a report of a regression between 2.1.2 and 2.2.0+
cc @peterjc123 @mszhanyi @s…
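Where Flash Attention is unavailable (as in the Windows builds this report describes), `torch.nn.functional.scaled_dot_product_attention` still works by falling back to another backend. A minimal sketch, checking the fused call against a hand-written reference (the shapes and tolerance here are illustrative assumptions, not from the report):

```python
import torch
import torch.nn.functional as F

# (batch, heads, seq_len, head_dim)
q = torch.randn(2, 4, 8, 16)
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)

# SDPA dispatches to whichever backend is available; on platforms without
# Flash Attention it falls back to the mem-efficient or math kernels.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)

# reference implementation of causal attention for comparison
scale = q.shape[-1] ** -0.5
scores = (q @ k.transpose(-2, -1)) * scale
causal = torch.ones(8, 8, dtype=torch.bool).tril()
scores = scores.masked_fill(~causal, float("-inf"))
ref = scores.softmax(dim=-1) @ v

assert torch.allclose(out, ref, atol=1e-5)
```

Whichever backend is selected, the numerical result should agree with the reference up to floating-point tolerance; only speed and memory use differ.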
-
Hello,
I'm searching for a way to visualize the attention maps of the pre-trained models, but I haven't found any solution yet. Has someone already done this successfully?
Thank you !
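One generic way to get attention maps out of a model is to request the per-head attention weights from the attention module itself. A minimal sketch using PyTorch's built-in `nn.MultiheadAttention` (the shapes are illustrative; the pre-trained models in question may expose weights differently):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
mha = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
x = torch.randn(1, 10, 16)  # (batch, seq_len, embed_dim)

# need_weights=True returns the attention map; average_attn_weights=False
# keeps one (seq_len, seq_len) map per head instead of averaging them.
_, attn = mha(x, x, x, need_weights=True, average_attn_weights=False)
print(attn.shape)  # (batch, num_heads, seq_len, seq_len)

# sanity check: each row is a softmax distribution over key positions
assert torch.allclose(attn.sum(dim=-1), torch.ones(1, 4, 10), atol=1e-5)
```

Each `attn[0, h]` slice can then be rendered as a heatmap, e.g. with matplotlib's `imshow`. For models that wrap their attention layers, a forward hook on the attention module is a common way to capture the same tensor.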
-
Hello, I have sent an email to your mailbox to apply for access to the data, attaching a statement that we will only use it for scientific research, but I have not received a reply so far. Please pay attent…
-
Hi, I'm working on the attention mechanism for face recognition models. I'm using the ir model as a backbone, but I don't know much about the details of the implementation of grad-cam; what exactly sh…
-
Hi,
I'd like to compile `projects/pt1/examples/torchscript_stablehlo_backend_tinybert.py` into torch-mlir, so I did the modification:
```
--- a/projects/pt1/examples/torchscript_stablehlo_backend…
-
# 🐛 Bug
If I use `MemoryEfficientAttentionFlashAttentionOp` as my attention op for memory-efficient attention, and also use an attention bias, it gives me errors :(
## Command
```
import math
impor…
-
**Chart title**: "Population density (2021)"
**Chart type**: linear gauge
**Chart scale**: split the value range into 6 sections
or alternatively, the following proposa…
-
Hi, I would like to ask about the Deformable Attention mechanism in the paper.
I looked at the paper "Deformable DETR: Deformable Transformers for End-to-End Object Detection", and the Deformable Atten…
-
### Describe the bug
The environment setup is complete, and the first run produces results. After that, it never finishes and usually gets stuck in the state below. Can the status message below be ignored??
Generating outputs: 0%| …
-
Hi,
I wonder whether we can manually verify attention mask patterns during testing. While I can visualize masks by printing them as strings, I'd like to add proper test assertions.
- How to assert…
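One option is to assert on the mask tensor directly rather than on its string rendering. A minimal sketch for a causal mask, assuming the convention that `True` means "may attend" (some APIs invert this, so the convention itself is an assumption to verify against your library):

```python
import torch

def causal_mask(n: int) -> torch.Tensor:
    # lower-triangular boolean mask: position i may attend to positions <= i
    return torch.tril(torch.ones(n, n, dtype=torch.bool))

mask = causal_mask(4)

# exact structural assertion for a small fixed size
expected = torch.tensor([
    [True,  False, False, False],
    [True,  True,  False, False],
    [True,  True,  True,  False],
    [True,  True,  True,  True],
])
assert torch.equal(mask, expected)

# property-based assertions that scale to any size
assert mask.diagonal().all()            # every token attends to itself
assert not mask.triu(diagonal=1).any()  # no token attends to the future
```

The property-based checks are usually the more robust choice in a test suite, since they express the intended pattern without hard-coding a sequence length.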