Issues
BinWang28 / SBERT-WK-Sentence-Embedding
IEEE/ACM TASLP 2020: SBERT-WK: A Sentence Embedding Method By Dissecting BERT-based Word Models
Apache License 2.0 · 177 stars · 27 forks
#14 Computing embedding for train is extremely slow (anaivebird, opened 3 years ago, 1 comment)
#13 How can I get the sentence representations from SBERT-WK (boscoj2008, opened 3 years ago, 3 comments)
#12 Biasing Attention (g-luo, closed 3 years ago, 4 comments)
#11 If I want to use a Chinese dataset, what can I do? (SomeoneNotLikeYou, closed 3 years ago, 1 comment)
#10 Remove unused variable (JohnGiorgi, opened 4 years ago, 0 comments)
#9 Is `q` ever used? (JohnGiorgi, closed 3 years ago, 1 comment)
#8 Vectorize unmask sum computation (JohnGiorgi, closed 4 years ago, 1 comment)
#7 Vectorize reshape operation (JohnGiorgi, closed 4 years ago, 1 comment)
#6 Why not use this method to fine-tune BERT? (hlang8160, closed 3 years ago, 2 comments)
#5 Comparison with sentence-transformers (MastafaF, opened 4 years ago, 6 comments)
#4 Is the BERT listed in the experimental results on STS tasks (p. 8, Table 3) fine-tuned? (slvher, closed 4 years ago, 2 comments)
#3 Time and memory consumption (Thomzoy, closed 3 years ago, 8 comments)
#2 Missing file problem (None403, closed 4 years ago, 6 comments)
#1 Can you explain this calculation? (noowad93, closed 4 years ago, 4 comments)