theeluwin / pytorch-sgns

Skipgram Negative Sampling implemented in PyTorch
MIT License

myConfusion #17

Open tutou-pifeng opened 7 months ago

tutou-pifeng commented 7 months ago

Why can the code below, from this project, be used as the "loss"?

```python
oloss = t.bmm(ovectors, ivectors).squeeze().sigmoid().log().mean(1)
nloss = t.bmm(nvectors, ivectors).squeeze().sigmoid().log().view(-1, context_size, self.n_negs).sum(2).mean(1)
```

In my understanding, a loss should be "prediction" minus "actual result". But in the code above, `oloss` is built from the prediction alone, with no operation involving the actual result.

theeluwin commented 7 months ago

I think of a loss as any quantity that needs to be optimized. From this perspective, the goal of the given code is to maximize the difference in likelihood between the positive and negative cases. There is no explicit "actual result" term because the targets are implicit in how the pairs are constructed: observed (word, context) pairs should score high, and randomly sampled negative pairs should score low.
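To make the reply concrete, here is a minimal NumPy sketch of the standard skip-gram negative-sampling objective (Mikolov et al.): maximize log sigma(o·i) over true context words and log sigma(-n·i) over sampled negatives, then minimize the negated sum. The function and variable names here are illustrative, not the repository's API; the repo's batched `t.bmm` version computes the same quantity per example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(ivec, ovecs, nvecs):
    """Negative-sampling loss for one center word.

    ivec:  (d,)   input (center) word vector
    ovecs: (c, d) context (positive) word vectors
    nvecs: (k, d) sampled negative word vectors
    """
    pos = np.log(sigmoid(ovecs @ ivec))     # log-likelihood of true pairs
    neg = np.log(sigmoid(-(nvecs @ ivec)))  # note the sign flip on negatives
    # Minimizing the negated sum pushes positives toward the center
    # word and negatives away from it -- no explicit target is needed.
    return -(pos.mean() + neg.mean())

rng = np.random.default_rng(0)
ivec = rng.normal(size=8)
ovecs = rng.normal(size=(2, 8))
nvecs = rng.normal(size=(5, 8))

base = sgns_loss(ivec, ovecs, nvecs)
# Moving the context vectors toward the center vector lowers the loss,
# which is exactly the behavior gradient descent will produce.
better = sgns_loss(ivec, ovecs + 0.5 * ivec, nvecs)
```

So the "actual result" the questioner is looking for never appears as a separate tensor: the positive term plays the role of label 1 and the negated dot product in the negative term plays the role of label 0.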