facebookresearch/xformers
Hackable and optimized Transformers building blocks, supporting a composable construction.
https://facebookresearch.github.io/xformers/
8.41k stars · 597 forks
Issues (sorted by: Newest)
#1115 · Does the latest xformers wheel uploaded now support cuda11.8 and cuda12.4? · leij0318 · opened 17 hours ago · 1 comment
#1114 · scaled_dot_product_attention output is different from memory_efficient_attention · aenoca · opened 3 days ago · 0 comments
#1113 · Enabling softcap option · SpyrosMouselinos · closed 3 days ago · 1 comment
#1112 · CUTLASS Fused multi head attention · yoon5862 · opened 3 days ago · 0 comments
#1111 · what version about torch in FLUX bf16? · Zhuangvictor0 · closed 3 days ago · 1 comment
#1110 · compile for rocm w/ gfx1032 card · brcisna · opened 6 days ago · 4 comments
#1109 · No operator found for `memory_efficient_attention_forward` with inputs: · brcisna · opened 1 week ago · 1 comment
#1108 · Enable complete BlockDiagonalGappyKeysMask and BlockDiagonalPaddedKey Support on CK · qianfengz · closed 1 week ago · 0 comments
#1107 · `memory_efficient_attention` is slower than `scaled_dot_product_attention` of PyTorch? · QinlongHuang · opened 1 week ago · 2 comments
#1104 · unable to pip install 0.0.28.post1 on windows · ajkessel · closed 5 days ago · 5 comments
#1102 · no module 'xformers'. Processing without... · ZeroCool22 · opened 1 week ago · 2 comments
#1099 · Enable AMD CI · danthe3rd · opened 2 weeks ago · 0 comments
#1098 · Prebuilt wheel for Windows · KohakuBlueleaf · opened 2 weeks ago · 3 comments
#1097 · why conflicting release notes? · BBC-Esq · opened 2 weeks ago · 1 comment
#1096 · Hey, you! The people that made this thing. You know that Torch v 2.4.1 is out, right?? · Xephier102 · opened 2 weeks ago · 8 comments
#1095 · xformers Installation failed · wzgrx · closed 2 weeks ago · 1 comment
#1094 · Tiny upstream pr · qianfengz · closed 2 weeks ago · 3 comments
#1093 · BlockDiagonalGappyKeysMask backward support · cumulo-autumn · opened 3 weeks ago · 2 comments
#1092 · [ROCm] update rocm_ci workflow from rocm fork · tenpercent · closed 3 weeks ago · 0 comments
#1091 · [ROCm] tweak rocm wheel build workflow to avoid ci timeouts · tenpercent · closed 3 weeks ago · 1 comment
#1090 · flash attn bug · zhw-zhang · closed 1 month ago · 15 comments
#1089 · `ScaledDotProduct` with attention mask returns different result as standard attention · amyxlu · opened 1 month ago · 3 comments
#1088 · Fix wrong docstring arg name · kit1980 · closed 1 month ago · 1 comment
#1087 · Use weights_only for load · kit1980 · closed 1 month ago · 0 comments
#1086 · Sparse attention will not reduce peak memory usage · ThisisBillhe · closed 1 month ago · 2 comments
#1085 · Add torch compile support for ck attention op · jianyuh · closed 1 month ago · 0 comments
#1084 · Error in [conda install xformers -c xformers/label/dev] · 11whitewater · opened 1 month ago · 6 comments
#1083 · Build with CUDA 12.6 and VS 2022 17.11.1 broken on Windows · levicki · opened 1 month ago · 14 comments
#1082 · Improvement in ROCM fmha-backward · qianfengz · closed 1 month ago · 0 comments
#1081 · Build rocm wheels · tenpercent · closed 1 month ago · 2 comments
#1080 · How to get Q @ K^T similarity? · volcverse · opened 1 month ago · 0 comments
#1079 · 🚀 Precompiled xFormers for CUDA 12.4 and PyTorch 2.4 Compatibility · sashaok123 · opened 1 month ago · 13 comments
#1078 · Update README.md · spdraptor · closed 1 month ago · 0 comments
#1077 · Update README.md · spdraptor · closed 1 month ago · 2 comments
#1076 · Local attention mask size mismatch · samuelwheeler · closed 1 month ago · 1 comment
#1075 · WHY Performance regression with xformers 0.27.post2 + cuda 2.4.0+cu121 on old NVIDIA GPU? · motolo · closed 1 month ago · 4 comments
#1074 · `fmha.cutlass.FwOp` is 2x slower than `fmha.flash.FwOp` · Luke20000429 · opened 2 months ago · 3 comments
#1073 · Windows build of xformers cannot work on pytorch>=2.2 now. · KohakuBlueleaf · closed 2 months ago · 14 comments
#1072 · What is the counterpart of xformers' attn_bias in flash_attn_func? · complexfilter · opened 2 months ago · 1 comment
#1071 · Support for ARM64 Architecture · KumoLiu · opened 2 months ago · 0 comments
#1070 · No support for pythorch 2.3.1+rocm6 · srijan789 · opened 2 months ago · 4 comments
#1069 · USE_FLASH_ATTENTION was not enabled for build · MrHorakhty · closed 1 month ago · 1 comment
#1068 · When will xformers support Flash Attention 3? · complexfilter · opened 2 months ago · 4 comments
#1067 · Remove _check_large_shapes checking in fmha/ck.py · qianfengz · closed 2 months ago · 2 comments
#1066 · Can you make it work with Pytorch 2.2.0? · simsim314 · opened 2 months ago · 1 comment
#1065 · Rotary Embedding Not being Registered · r4hul77 · opened 2 months ago · 0 comments
#1064 · Device error on 0.0.27.dev844 · Cospui · opened 2 months ago · 5 comments
#1063 · I am unable to install the xformers · MrHorakhty · opened 3 months ago · 1 comment
#1062 · Server breakdown when installing xformers both manually and from source via ninja · YacratesWyh · opened 3 months ago · 0 comments
#1061 · Update for complete functional support of ck fmha forward/backward · qianfengz · closed 3 months ago · 7 comments