jeromerony/dml_cross_entropy - issues
Code for the paper "A unifying mutual information view of metric learning: cross-entropy vs. pairwise losses" (ECCV 2020 - Spotlight)
https://arxiv.org/abs/2003.08983
BSD 3-Clause "New" or "Revised" License
164 stars, 18 forks
#12 · Why is |z_i-z_j| both a tightness term and a diversity term? · Irish-kw · opened 1 year ago · 0 comments
#11 · Query about SPCE implementation · yixiao-huang · opened 1 year ago · 0 comments
#10 · CVE-2007-4559 Patch · TrellixVulnTeam · closed 1 year ago · 0 comments
#9 · Link between the center loss and conditional cross entropy · needylove · closed 2 years ago · 4 comments
#8 · Some comparisons · arita37 · opened 2 years ago · 1 comment
#7 · Getting OOM after first evaluation · zz827 · closed 3 years ago · 0 comments
#6 · Clarification on SCE loss · lindsey98 · closed 3 years ago · 2 comments
#5 · Understanding the results · silvaphl · closed 3 years ago · 8 comments
#4 · Trained models? · XiSHEN0220 · closed 3 years ago · 0 comments
#3 · Some overseen advantages of the pairwise losses · ducha-aiki · closed 4 years ago · 3 comments
#2 · The implementation of losses · liquor233 · closed 4 years ago · 3 comments
#1 · In class SmoothCrossEntropy, the temperature is not used · ArsenLuca · closed 4 years ago · 1 comment