lucidrains / flash-cosine-sim-attention
Implementation of fused cosine similarity attention in the same style as Flash Attention
MIT License · 206 stars · 11 forks
Issues
#10 · failed building wheel for flash-cosine-sim-attention · dogukantai · opened 1 year ago · 0 comments
#9 · Can not import debug · bitdom8 · opened 1 year ago · 0 comments
#8 · Training Loss and Experiments · conceptofmind · closed 3 months ago · 31 comments
#7 · Support head dimension 16? · alann-github · closed 2 years ago · 1 comment
#6 · Import fails · kradonneoh · closed 2 years ago · 3 comments
#5 · Pip Install Fails · kradonneoh · closed 2 years ago · 3 comments
#4 · GPU Benchmarks · conceptofmind · closed 3 months ago · 25 comments
#3 · make install fails · antorsae · closed 2 years ago · 4 comments
#2 · Performance compared to plain version · inspirit · closed 2 years ago · 6 comments
#1 · why the flash-cosine-sim-attention is slower than plain-cosine transformer? · 3bobo · closed 2 years ago · 2 comments