OctoberChang / X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
BSD 3-Clause "New" or "Revised" License · 135 stars · 28 forks
Issues
| # | Title | Author | Status | Comments |
|---|-------|--------|--------|----------|
| #20 | How to generate the file X.trn.pnz | shunshun-lala | opened 2 years ago | 1 |
| #19 | about the pretrained models | shunshun-lala | closed 2 years ago | 1 |
| #18 | Clarification on different configurations | ThejaniYapa | closed 11 months ago | 0 |
| #17 | No space left on device | Khalid-Usman | closed 2 years ago | 0 |
| #16 | What is the non-linear scoring function sigma to obtain final ranking? | vinaysetty | closed 3 years ago | 3 |
| #15 | I wonder whether the cluster number K is manual setup or auto-calculated? | Gpwner | closed 3 years ago | 1 |
| #14 | about the dataset | Liyx98 | closed 3 years ago | 4 |
| #13 | Neural label embeddings | vinaysetty | closed 3 years ago | 1 |
| #12 | Changed code to run indexer | DarshanPatel11 | closed 3 years ago | 0 |
| #11 | Class Weight | siyanew | closed 3 years ago | 1 |
| #10 | multi-label classification / paperswithcode dataset | ghost | opened 3 years ago | 1 |
| #9 | Assertion error in evaluation assert tY.shape == pY.shape fails | vinaysetty | closed 3 years ago | 2 |
| #8 | modeling.py : difference between RobertaForXMLC (etc.) class and transformers.RobertaForSequenceClassification? | simonlevine | closed 3 years ago | 1 |
| #7 | label_map.txt in dataset | gaurav-krishna | opened 3 years ago | 3 |
| #6 | fix typo | WrRan | closed 3 years ago | 0 |
| #5 | single-gpu eval fixes | simonlevine | closed 3 years ago | 0 |
| #4 | Update preprocess.py | simonlevine | closed 3 years ago | 0 |
| #3 | Issue with training stage | simonlevine | closed 3 years ago | 4 |
| #2 | Update transformer.py | simonlevine | closed 4 years ago | 0 |
| #1 | how 桶 ("bucket") | renmada | closed 4 years ago | 0 |