cmsflash / efficient-attention
An implementation of the efficient attention module.
https://arxiv.org/abs/1812.01243
MIT License · 269 stars · 26 forks
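Several of the issues below ask how the efficient attention computation works (e.g. its relation to scaled dot-product attention and the normalization it uses). As background, the paper linked above replaces the quadratic-cost product softmax(QKᵀ)V with separate softmaxes on the queries and keys, so the keys and values can be aggregated into a small global context matrix first. The following is a minimal NumPy sketch of that idea, not the repository's actual PyTorch module; the function and variable names are my own.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def efficient_attention(Q, K, V):
    """Efficient attention per Shen et al. (arXiv:1812.01243), single head.

    Q, K: (n, d_k) query/key features; V: (n, d_v) value features.
    Cost is O(n * d_k * d_v) instead of O(n^2) in the sequence length n.
    """
    q = softmax(Q, axis=1)   # normalize each query over its d_k channels
    k = softmax(K, axis=0)   # normalize each key channel over all n positions
    context = k.T @ V        # (d_k, d_v) global context, aggregated once
    return q @ context       # (n, d_v) output, no n-by-n attention map
```

Note that this produces no explicit n×n attention map (the subject of issue #12): the per-position maps can only be recovered afterwards as q @ k.T if needed for visualization.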
Issues
#13 · efficient-attention applly in Cross Attention · open · stanny880913 · 1 month ago · 1 comment
#12 · Shape of attention map · closed · aarontyliu · 1 year ago · 1 comment
#11 · dimension problem · open · feimadada · 1 year ago · 1 comment
#10 · in_channels, key_channels, head_count, value_channels · closed · feimadada · 1 year ago · 2 comments
#9 · About Normalization · closed · VoyageWang · 1 year ago · 3 comments
#8 · why no activate function? · closed · micklexqg · 1 year ago · 4 comments
#7 · How to replicate attention maps in object detection · closed · chandlerbing65nm · 1 year ago · 5 comments
#6 · How to use efficient attention class · closed · chandlerbing65nm · 2 years ago · 7 comments
#5 · How to apply scaled dot-product attention by efficient attention? · closed · wangyue7777 · 2 years ago · 2 comments
#4 · Can you apply masks in this attention model? · closed · rongcuid · 2 years ago · 1 comment
#3 · Different query positions on the same image · closed · horanyinora · 3 years ago · 1 comment
#2 · Question about the paper - what is PSMNet (baseline)? · closed · oliver-batchelor · 3 years ago · 1 comment
#1 · About module parameters · closed · TRillionZxY · 3 years ago · 4 comments