catie-aq/flashT5
A fast implementation of T5/UL2 in PyTorch using Flash Attention
Apache License 2.0 · 59 stars · 7 forks
Issues
#6 · Triton issues · prd-hung-trinh · opened 1 week ago · 0 comments
#5 · Tried training a model · amazingvince · opened 1 month ago · 2 comments
#4 · is Flash Attention a requirement? · SoshyHayami · opened 3 months ago · 1 comment
#3 · Does the training support BF16 or fp16? · jamesharrisivi · opened 4 months ago · 10 comments
#2 · Use github warning · mishig25 · closed 4 months ago · 1 comment
#1 · Requirements · danil31219as · closed 5 months ago · 1 comment