naver / splade

SPLADE: sparse neural search (SIGIR21, SIGIR22)

When do you drop a term? #1

Closed hguan6 closed 2 years ago

hguan6 commented 2 years ago

I understand that the log-saturation function and the regularization loss suppress the weights of frequent terms. But when do you drop a term (i.e., set its weight to zero)? Is it when the logit is less than or equal to zero, so that the log(1 + ReLU(.)) function outputs zero?

sclincha commented 2 years ago

Hi @hguan6. Terms with a weight of zero are not considered. More precisely, term weights that do not pass the ReLU (i.e., ReLU(w) = 0) remain zero after the log, so we discard them.

Best
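For anyone landing here later, the behavior described above can be sketched in a few lines of NumPy (the function name and example logits are illustrative, not from the SPLADE codebase):

```python
import numpy as np

def splade_term_weights(logits):
    """Apply SPLADE's log-saturation, log(1 + ReLU(x)), element-wise.

    Any logit <= 0 is clamped to 0 by the ReLU, and log1p(0) = 0,
    so the corresponding term gets weight 0 and is dropped from the
    sparse representation.
    """
    return np.log1p(np.maximum(logits, 0.0))

# Hypothetical logits for four vocabulary terms.
logits = np.array([-1.5, 0.0, 0.4, 2.0])
weights = splade_term_weights(logits)

# Only terms with strictly positive weight are kept.
kept = np.nonzero(weights > 0)[0]
```

Here the first two terms (logits -1.5 and 0.0) end up with weight exactly zero and are discarded; only the terms with positive logits survive.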

hguan6 commented 2 years ago

@sclincha Thank you!