stanford-futuredata / ColBERT

ColBERT: state-of-the-art neural search (SIGIR'20, TACL'21, NeurIPS'21, NAACL'22, CIKM'22, ACL'23, EMNLP'23)
MIT License

Why are the labels initialised to zeros? #298

Closed satya77 closed 8 months ago

satya77 commented 8 months ago

In training.py the labels are set to a vector of zeros (https://github.com/stanford-futuredata/ColBERT/blob/8fb3abbeead17c506a323de7108603d559c061b1/colbert/training/training.py#L73) and then used in the loss: `loss = nn.CrossEntropyLoss()(scores, labels[:scores.size(0)])`. Why all zeros? Does that mean nothing in the batch is relevant? And why are the target_scores not used?

okhat commented 8 months ago

I think this is probably answered in 2-3 old issues.

Basically, the zeros are class indices, not relevance scores: a label of 0 tells the cross-entropy loss that position 0 holds the relevant (positive) passage for each query, while the remaining positions are negatives.
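A minimal sketch of this setup (shapes and variable names are illustrative, not taken from the repo): each query is scored against `nway` candidate passages, with the positive always placed in column 0, so the target class index for every row is 0.

```python
import torch
import torch.nn as nn

batch_size, nway = 4, 8

# One row of scores per query; by convention the positive passage's
# score sits at column 0 and the negatives fill the remaining columns.
scores = torch.randn(batch_size, nway)

# Class index 0 for every row = "the correct class is column 0",
# i.e. the positive passage. These are indices, not relevance scores.
labels = torch.zeros(batch_size, dtype=torch.long)

loss = nn.CrossEntropyLoss()(scores, labels)
```

The loss is minimized when each query's positive score dominates its row, which is exactly the contrastive objective; no explicit per-passage relevance scores are needed for this term.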