AzizCode92 opened this issue 7 years ago
The issue is solved: I was working on a very large TSV file, so I split it into parts and the error went away.
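For reference, here is a rough sketch of one way to do such a split; the split_tsv helper, the chunk size, and the file names are illustrative assumptions, not the exact script I used:

```python
# Sketch only: split a large TSV into smaller parts so each part can be
# passed to dl-sentiwords.py separately. Chunk size and names are examples.
def split_tsv(path, lines_per_part=1_000_000):
    out = None
    with open(path, encoding="utf-8") as src:
        for i, line in enumerate(src):
            if i % lines_per_part == 0:
                if out is not None:
                    out.close()
                # zero-padded part index keeps the files in order when globbed
                out = open(f"{path}.part{i // lines_per_part:03d}.tsv",
                           "w", encoding="utf-8")
            out.write(line)
    if out is not None:
        out.close()

if __name__ == "__main__":
    split_tsv("trained1.tsv")  # hypothetical input file name
```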
Hi! Sorry to bother you; I saw that you recently used this library.
I am training the sentiment-specific embeddings. At the end of each epoch, I get a message like this:
23 epochs Examples: 7818897 Error: 326588146461.176880 Accuracy: 0.000000 23589 corrections skipped
The accuracy always remains zero, no matter the number of epochs. Is that OK? Did you get the same accuracy?
Thank you! :)
Hi! I have seen Mr. Attardi's comment about the meaning of both the accuracy and the error, and he said:
Don't worry about those numbers. You should get usable embeddings anyway.
Thank you!! I have just found the comment you referred to: https://github.com/attardi/deepnl/issues/32
@AzizCode92 I got the same issue; I'm using a big file. Sorry, but dl-sentiwords.py takes one file (according to the example), so how did you manage to input several files? Thanks.
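One possible approach is a sketch like the following: run the script once per part, reusing the same --vocab and --vectors files. This assumes dl-sentiwords.py reloads and updates the embeddings in --vectors on each run, so please verify that against the deepnl documentation first.

```python
# Sketch only: invoke dl-sentiwords.py once per TSV part with the same
# vocabulary and vectors files. The part file names match the split sketch
# above and are hypothetical.
import glob
import subprocess

for part in sorted(glob.glob("trained1.tsv.part*.tsv")):
    subprocess.run(
        ["dl-sentiwords.py", part,
         "--vocab", "words.txt",
         "--vectors", "vectors.txt"],
        check=True,
    )
```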
Hello! Have you successfully installed deepnl? Where did you install it: Windows, Linux, or Mac? I have some problems on Windows. Can you help me?
Hi all, when I use
dl-sentiwords.py trained1.tsv --vocab words.txt --vectors vectors.txt
I got this error. trained1.tsv is a file with the following format:
I obtained the TSV file by transforming a huge dataset of tweets into a TSV and applying some transformations to the columns so that it fits the format mentioned above. For further details, here is my code: https://github.com/AzizCode92/text_mining_project/blob/master/csv_tsv.py
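Since the exact column layout is not shown above, here is only a rough, hypothetical sketch of such a CSV-to-TSV conversion; the input file name and column names are placeholders, and the real logic lives in the linked csv_tsv.py:

```python
# Sketch only: convert a CSV of tweets into a tab-separated file.
# "polarity" and "text" are hypothetical column names; adapt them to the
# format your dl-sentiwords.py run expects.
import csv

with open("tweets.csv", newline="", encoding="utf-8") as src, \
     open("trained1.tsv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.writer(dst, delimiter="\t")
    for row in reader:
        # Keep only the columns needed for training, in the required order.
        writer.writerow([row["polarity"], row["text"]])
```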