attardi / deepnl

Deep Learning for Natural Language Processing
GNU General Public License v3.0

Error increasing with new epochs #59

Open rudra0713 opened 5 years ago

rudra0713 commented 5 years ago

Hi, I have been trying to do sentiment analysis on Twitter data using SSWE. However, as each epoch passes, the error value keeps increasing and the accuracy always remains 0. Can someone explain why this is happening?

The following is a sample of error:

Epoch: 39, pairs: 900000, sent: 406, avg. error: 1794.635
Epoch: 39, pairs: 910000, sent: 919, avg. error: 1811.219
40 epochs Examples: 914760 Error: 124025225.271232 Accuracy: 0.000000 11924 corrections skipped
Epoch: 40, pairs: 920000, sent: 259, avg. error: 1841.875
Epoch: 40, pairs: 930000, sent: 772, avg. error: 1851.828
41 epochs Examples: 937629 Error: 126510924.248558 Accuracy: 0.000000 11738 corrections skipped
Epoch: 41, pairs: 940000, sent: 117, avg. error: 1883.111
Epoch: 41, pairs: 950000, sent: 619, avg. error: 1891.424
Epoch: 41, pairs: 960000, sent: 1141, avg. error: 1913.427
42 epochs Examples: 960498 Error: 129020667.139668 Accuracy: 0.000000 11861 corrections skipped
Epoch: 42, pairs: 970000, sent: 474, avg. error: 1932.033
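For context, an average error that grows monotonically across epochs (rather than fluctuating) is the classic signature of a divergent optimizer, most commonly a learning rate above the stability limit for the data scale. A minimal, hypothetical sketch (not deepnl's code) reproducing the symptom on a toy quadratic:

```python
# Gradient descent on f(w) = w**2 with a step size above the stability
# limit (lr > 1 for this function): every update overshoots the minimum,
# so the error grows each epoch instead of shrinking -- the same symptom
# as the diverging avg. error in the log above.
def loss(w):
    return w * w

w, lr = 1.0, 1.1
errors = []
for epoch in range(6):
    grad = 2 * w
    w -= lr * grad          # each step multiplies w by (1 - 2*lr) = -1.2
    errors.append(loss(w))

print(errors)  # approximately 1.44, 2.07, 2.99, 4.30, 6.19, 8.92
```

If deepnl exposes a learning-rate option for `dl-sentiwords.py`, lowering it would be the first thing to try; checking that the input vectors are on a sensible scale would be the second.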

I am running SSWE with the command `python dl-sentiwords.py training_twitter.tsv --vocab twitter_words.txt --vectors twitter_word_vectors.txt --variant word2vec`. The TSV file contains actual tweets in this format: 619971047195045888 1 negative "if these runway renovations at the airport prevent me from seeing taylor swift on monday, bad blood will have a new meaning."

The words file contains the individual vocabulary words, and the vectors file contains a word2vec embedding for each word.
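For reference, a quick sanity check of the TSV layout described above. The column semantics here are assumptions from the single example in this thread (tweet id, an auxiliary integer whose meaning is not documented, polarity label, quoted tweet text), not deepnl's documented format:

```python
import csv
import io

# One line in the format shown above, tab-separated.
sample = (
    '619971047195045888\t1\tnegative\t'
    '"if these runway renovations at the airport prevent me from seeing '
    'taylor swift on monday, bad blood will have a new meaning."\n'
)

# csv.reader with a tab delimiter strips the surrounding quotes from the
# text field; commas inside the tweet are harmless since they are not
# the delimiter.
rows = list(csv.reader(io.StringIO(sample), delimiter="\t"))
tweet_id, aux, label, text = rows[0]

assert tweet_id.isdigit()
assert label in {"positive", "negative", "neutral"}
print(label, "->", text[:30])
```

Running a check like this over the whole training file would at least rule out malformed rows (missing tabs, unbalanced quotes) as the cause of the diverging error.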

Any help will be highly appreciated.

ravi03071991 commented 5 years ago

Hi,

Facing the same issue here. Did you find a solution?

Regards, D. Ravi Theja.

samoliver360 commented 5 years ago

> Hi, I have been trying to do sentiment analysis on Twitter data using SSWE. However, as each epoch passes, the error value keeps increasing and the accuracy always remains 0. […]

Hello, rudra0713. Is everything OK? Can you share with us the step-by-step commands and arguments you used to run SSWE? It is not very clear in the documentation.

samoliver360 commented 5 years ago

> Facing the same issue here. Did you find a solution?

Hello, D. Ravi. Did you have any success with this implementation? If so, could you clarify the step-by-step usage for me? I would be very grateful. Regards, Samir (Master's student).

ravi03071991 commented 5 years ago

Hi @samoliver360,

Sorry, I could not solve the issue.

Regards, D. Ravi Theja.