graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers
https://www.reddit.com/r/MachineLearning/comments/amfinl/project_nlptutoral_repository_who_is_studying/
MIT License
14.07k stars · 3.91k forks

Added CBOW method to Torch file #14

Closed · us closed this 5 years ago

us commented 5 years ago

Word2Vec has two training methods, Skip-gram and CBOW (Continuous Bag of Words), and I added the CBOW method.

us commented 5 years ago

and changed the file name to Word2Vec-Torch-SkipGram(Softmax).py

graykode commented 5 years ago

@us Could you tell me how this differs from my Skip-gram code?

graykode commented 5 years ago

@us OK, I will change the file name, and then I will wait for the CBOW code! Thanks

graykode commented 5 years ago

@us See my commit https://github.com/graykode/nlp-tutorial/commit/6a2a47a1146c6f53effaa44072db155bdc1338ae I will close this PR, thanks!

us commented 5 years ago

I just swapped the positions of w and target. CBOW learns to predict the target word from its context, while the Skip-gram model is designed to predict the context from the target word.

I changed it here:

for w in context:
    skip_grams.append([w, target])

https://stackoverflow.com/questions/38287772/cbow-v-s-skip-gram-why-invert-context-and-target-words
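To make the swap concrete, here is a minimal sketch (not the repository's exact code) contrasting how training pairs are built for the two methods on a toy sentence. The names `window`, `target`, and `context` are illustrative assumptions:

```python
# Toy corpus and a context window of 1 word on each side (illustrative values).
sentence = ["the", "quick", "brown", "fox", "jumps"]
window = 1

skip_gram_pairs = []  # (input = target word, label = context word)
cbow_pairs = []       # (input = context word, label = target word)

for i, target in enumerate(sentence):
    # Words to the left and right of the current position.
    context = sentence[max(0, i - window):i] + sentence[i + 1:i + 1 + window]
    for w in context:
        skip_gram_pairs.append([target, w])  # Skip-gram: predict context from target
        cbow_pairs.append([w, target])       # CBOW-style: predict target from context

print(skip_gram_pairs[:2])  # [['the', 'quick'], ['quick', 'the']]
print(cbow_pairs[:2])       # [['quick', 'the'], ['the', 'quick']]
```

Note that this per-word swap is an approximation: the original CBOW objective feeds the *average* of all context-word embeddings into a single prediction of the target, rather than one (context word, target) pair at a time.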