graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers
https://www.reddit.com/r/MachineLearning/comments/amfinl/project_nlptutoral_repository_who_is_studying/
MIT License

A question about seq2seq-torch.py at line 43 #24

Closed · MowanHu closed 5 years ago

MowanHu commented 5 years ago

Hi, I'm an NLP rookie and I want to ask you a question. Your code extracts the input (context) words within a fixed window (around line 43), and word_sequence is a list built from a list of sentences, so some words may take their neighbour words from different sentences. Does this approach harm the result?

Also, my training result does not seem very good, even though I didn't change the code. [screenshot of training output]

If you see this issue, please answer when you have free time. My English is poor, but I still want to express my gratitude to you.

graykode commented 5 years ago

Hello. It would be more helpful if you added a link to the exact code line, such as https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py#L10

MowanHu commented 5 years ago

> Hello. It would be more helpful if you added a link to the exact code line, such as https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py#L10

Sorry, I wrote the wrong title. The question is actually about https://github.com/graykode/nlp-tutorial/blob/master/1-2.Word2Vec/Word2Vec-Skipgram-Torch(Softmax).py#L44

And "word sequence" is a sentences list , some words may extract their neighbour words form different sentences, so, is this way harm to the result? In my view, ngram should operate in the same sentence, not different sentences.

graykode commented 5 years ago

Yes, you are right, but I don't care about it in this example.

MowanHu commented 5 years ago

> Yes, you are right, but I don't care about it in this example.

Thank you, I understand what you mean.