yuanli2333 / Hadamard-Matrix-for-hashing

CVPR2020/TNNLS2023: Central Similarity Quantization/Hashing for Efficient Image and Video Retrieval
MIT License

pairwise loss #8

Closed hbellafkir closed 4 years ago

hbellafkir commented 4 years ago

hi,

The paper doesn't use any pairwise loss, yet the code implements one. Is this a joke? Why are you implementing something different from what the paper claims?

thanks in advance

hbellafkir commented 4 years ago

OK, lambda1, which weights the pairwise loss, is set to 0, so the pairwise term is effectively disabled. I was very upset at first because it's not the first time I've read a paper where the code tells another story.

Kenny-Li2023 commented 3 years ago

> OK, lambda1, which weights the pairwise loss, is set to 0, so the pairwise term is effectively disabled. I was very upset at first because it's not the first time I've read a paper where the code tells another story.

Do you know why the author uses pairwise loss?

hbellafkir commented 3 years ago

They set lambda1 to 0 for training, so they don't use the pairwise loss.
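
Roughly, the training objective combines a central-similarity term, a pairwise term weighted by lambda1, and a quantization term, so with lambda1 = 0 the pairwise part contributes nothing. A minimal sketch of that composition (the function and variable names here are illustrative, not the repo's exact code):

```python
import torch
import torch.nn.functional as F

def csq_style_loss(codes, centers, similarity, lambda1=0.0, lambda2=1e-4):
    """Sketch of a CSQ-style objective: central similarity + lambda1 * pairwise
    + lambda2 * quantization. With lambda1 = 0 the pairwise term drops out.
    The exact form of each term is an assumption, not the author's code.

    codes      : (N, K) real-valued outputs in (-1, 1), e.g. after tanh
    centers    : (N, K) target hash centers with entries in {-1, +1}
    similarity : (N, N) pairwise label-similarity matrix in {0, 1}
    """
    # Central similarity term: BCE between codes and centers, both mapped to [0, 1].
    central = F.binary_cross_entropy((codes + 1) / 2, (centers + 1) / 2)

    # Generic inner-product pairwise term; contributes nothing when lambda1 == 0.
    inner = codes @ codes.t() / codes.size(1)            # in [-1, 1]
    pairwise = ((inner - (2 * similarity - 1)) ** 2).mean()

    # Quantization term: push every code entry toward the binary vertices {-1, +1}.
    quantization = (codes.abs() - 1).pow(2).mean()

    return central + lambda1 * pairwise + lambda2 * quantization
```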

Kenny-Li2023 commented 3 years ago

> They set lambda1 to 0 for training, so they don't use the pairwise loss.

Why is Q_loss in the code completely different from L_Q in the paper?
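
As I read it, the paper writes L_Q with a log-cosh around |h| - 1, while Q_loss in the code looks like a plain squared penalty. Both are zero only when every entry is exactly +/-1, but they are not the same function. A quick side-by-side sketch of my reading (not the author's code; the reduction over bits and samples is a guess):

```python
import torch

def lq_paper_style(codes):
    # Paper's form, as I read it: log cosh(|h| - 1).
    # Near |h| = 1 it behaves like x^2 / 2; far from the optimum it grows
    # roughly linearly instead of quadratically.
    return torch.log(torch.cosh(codes.abs() - 1)).sum(dim=1).mean()

def q_loss_code_style(codes):
    # Plain squared penalty toward the binary vertices {-1, +1}; a common
    # substitute with the same minimizers but a different shape.
    return (codes.abs() - 1).pow(2).mean()
```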