SPOClab-ca / BENDR


about the loss function #3

Closed stickOverCarrot closed 3 years ago

stickOverCarrot commented 3 years ago

Thank you very much for your contribution; I am really interested in self-supervised training for EEG. My only question is about the calculation of the loss function. In the paper, the denominator is computed from the cosine similarity between the output of the transformer on one side and the 20 distractors plus the input of the transformer on the other. In the code, however, the denominator is computed from the cosine similarity between the input of the transformer plus the 20 distractors on one side and the output of the transformer on the other. In other words, the input and the output switch positions. Are both calculations equivalent, or why did you change the calculation in the code? Thanks!

kostasde commented 3 years ago

I believe both are the same; the argument order to cosine similarity is irrelevant. The important thing in the section of code I believe you are describing is that the comparison between c and z is located at the right index (the zeroth).
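
As a minimal sketch of the point being made here (this is not the repository's actual code; the feature dimension, variable names, and temperature value are illustrative assumptions), cosine similarity is symmetric in its arguments, and placing the true latent at index 0 of the candidate set lets the cross-entropy target simply be 0:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
feat_dim = 512          # assumed feature dimension, for illustration only
num_distractors = 20

c = torch.randn(feat_dim)                              # transformer output (context) at one position
z_true = torch.randn(feat_dim)                         # transformer input (true latent) at that position
distractors = torch.randn(num_distractors, feat_dim)   # negatives drawn from other positions

# Candidate set with the true latent at index 0, as discussed above.
candidates = torch.cat([z_true.unsqueeze(0), distractors], dim=0)   # (21, feat_dim)

# Argument order does not matter: both orderings give identical similarities.
logits_a = F.cosine_similarity(c.unsqueeze(0), candidates, dim=-1)  # cos(c, candidates)
logits_b = F.cosine_similarity(candidates, c.unsqueeze(0), dim=-1)  # cos(candidates, c)
assert torch.allclose(logits_a, logits_b)

# InfoNCE-style loss: the correct class is index 0 (the true latent).
temperature = 0.1                                      # hypothetical value for illustration
loss = F.cross_entropy((logits_a / temperature).unsqueeze(0), torch.tensor([0]))
print(loss.item())
```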

As to why I implemented it this way, I want to say it was because there was a previous loss function and I made the fewest code changes needed to construct this one, but I honestly don't know for sure. If you still find that it doesn't work as I suggest, please get in touch. Sorry for any confusion!

stickOverCarrot commented 3 years ago

@kostasde Thank you for your answer! I'll try running the code as you suggest.