-
Hi,
I wonder if we can manually verify attention mask patterns during testing. While I can visualize masks by printing them as strings, I'm looking to add proper test assertions.
- How to assert…
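One minimal sketch of such an assertion, assuming PyTorch boolean masks and `torch.equal`; the `causal_mask` helper below is hypothetical, standing in for whatever builds the mask under test:

```python
import torch

def causal_mask(size: int) -> torch.Tensor:
    # Hypothetical helper: True where attention is allowed (lower triangle).
    return torch.tril(torch.ones(size, size, dtype=torch.bool))

def test_causal_mask_pattern():
    mask = causal_mask(4)
    expected = torch.tensor([
        [True,  False, False, False],
        [True,  True,  False, False],
        [True,  True,  True,  False],
        [True,  True,  True,  True],
    ])
    # Exact pattern assertion instead of eyeballing a printed string.
    assert torch.equal(mask, expected)
```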
-
Hi, thanks for the wonderful work!
I have a question about the optimization objective. It seems that the optimized additional residual information can make the network pay more attention to the spe…
-
Nice job! However, when I visualize the attention map following #3, I get the result below. Can someone help me? Thanks
![20_40](https://user-images.githubusercontent.com/17820697/89883075-42f…
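For what it's worth, a minimal sketch of overlaying an attention map on the input image, assuming matplotlib and a single 2-D map; the normalization and colormap choices here are assumptions, not the repo's method:

```python
import matplotlib.pyplot as plt

def show_attention(image, attn, alpha=0.5):
    # image: (H, W, 3) array; attn: (h, w) attention map for one query/head.
    attn = (attn - attn.min()) / (attn.max() - attn.min() + 1e-8)  # scale to [0, 1]
    plt.imshow(image)
    # Stretch the (possibly lower-resolution) map over the full image extent.
    plt.imshow(attn, cmap='jet', alpha=alpha,
               extent=(0, image.shape[1], image.shape[0], 0))
    plt.axis('off')
    plt.show()
```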
-
I ran the file `models/AVT_ConvLSTM_Sub-Attention/main_inference.py` following the README, but got an error:
> Given groups=1, weight of size [256, 3, 72, 3], expected input[16, 4, …
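For context, this error usually means the first convolution expects 3 input channels but received a tensor with a different channel count (here 4, e.g. RGBA instead of RGB input). A minimal PyTorch sketch reproducing the same class of error, with shapes matching the message:

```python
import torch
import torch.nn as nn

# Conv layer whose weight has shape [256, 3, 72, 3]: 256 output channels,
# 3 expected input channels, kernel size (72, 3).
conv = nn.Conv2d(in_channels=3, out_channels=256, kernel_size=(72, 3))

x = torch.randn(16, 4, 72, 3)  # batch of 16 with 4 channels instead of 3
try:
    conv(x)
except RuntimeError as err:
    print(err)  # same "Given groups=1, weight of size [256, 3, 72, 3], ..." error
```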
-
### Is your feature request related to a problem? Please describe. ✍️
I have noticed the following in the website's navbar:
1. **Logo**: There is no hover effect or animation on the logo, making …
-
## Description
I attempted to compile a Hugging Face model (https://huggingface.co/OpenGVLab/InternViT-6B-448px-V1-5, which includes both the model architecture code …
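The description is truncated, so the exact compiler isn't visible; as a hedged sketch, this is how one would load this repo's custom architecture code and compile it, assuming `torch.compile` is the compiler in question (the dtype choice is also an assumption):

```python
import torch
from transformers import AutoModel

# trust_remote_code=True is required because the repository ships its own
# modeling code alongside the weights.
model = AutoModel.from_pretrained(
    "OpenGVLab/InternViT-6B-448px-V1-5",
    torch_dtype=torch.bfloat16,   # assumption: half precision for a 6B ViT
    trust_remote_code=True,
).eval()

compiled = torch.compile(model)  # assumption: torch.compile is the compiler meant
```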
-
Hello @DeepRNN!
I took a look at the attentions that the model generates in test mode.
I did the following: in `base_model.py:200`, I changed the code as follows:
```
memory, output, scor…
```
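The snippet above is cut off; purely for illustration, here is a generic pattern for collecting per-step attention scores during decoding (every name below is hypothetical, not the repository's actual API):

```python
import torch

def decode_with_attention(model, memory, max_steps):
    # Hypothetical single-step decoder: each step returns the token output,
    # the attention scores over `memory`, and the recurrent state.
    outputs, attention_maps = [], []
    state = None
    for _ in range(max_steps):
        output, scores, state = model.step(memory, state)  # hypothetical API
        outputs.append(output)
        attention_maps.append(scores)
    # Stack per-step results for later visualization of the attention maps.
    return torch.stack(outputs), torch.stack(attention_maps)
```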
-
```
# attention
q = q * self.scale # Normalization.
attn_logits = torch.einsum('bnd,bld->bln', q, k)
attn = self.softmax(attn_logits)
attn…
```
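The snippet is truncated after the softmax; below is a self-contained sketch completing this einsum layout, with q as (B, N, D) queries against (B, L, D) keys. The softmax axis and the final value-weighting step are assumptions, since the excerpt doesn't show them:

```python
import torch

def attention(q, k, v, scale):
    # q: (B, N, D); k and v: (B, L, D) -- layout implied by 'bnd,bld->bln'.
    q = q * scale                                     # normalization, as in the excerpt
    attn_logits = torch.einsum('bnd,bld->bln', q, k)  # (B, L, N)
    attn = attn_logits.softmax(dim=-1)                # assumption: softmax over the query axis
    return torch.einsum('bln,bld->bnd', attn, v)      # assumed weighted sum -> (B, N, D)
```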
-
Dear authors,
I wanted to express my gratitude to you again! Your work immensely inspired me.
I was wondering if you could kindly explain the relationship between the variables `offset`, `batch…