Closed darentsia closed 4 years ago
../sent2vec/fasttext sent2vec -input data/preprocessed_docs/result_0.txt -output my_model_s2v -minCount 10 -dim 700 -epoch 7 -lr 0.3 -wordNgrams 2 -loss ns -neg 10 -thread 48 -t 0.00005 -dropoutK 4 -bucket 10000

Read 227M words
Number of words: 374566
Number of labels: 0
Progress: 19.0% words/sec/thread: 59 lr: 0.242889 loss: 20.702034 eta: 125h56m
Segmentation fault
I trained with these parameters, but the training process ended with the message:
"Segmentation fault"
Can you please explain why?
Machine parameters: 48 cores, 126 GB RAM, 1.5 TB of free disk space.
Size of the input file: 1.5 GB
Hi, I think you should try lowering the learning rate. Sometimes training with a high learning rate might end up blowing up the gradients.
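For example, you could retry the same invocation with a much smaller learning rate (the 0.05 value below is just an illustrative choice, not a tuned recommendation; all other flags are kept from your original command):

```shell
../sent2vec/fasttext sent2vec -input data/preprocessed_docs/result_0.txt \
    -output my_model_s2v -minCount 10 -dim 700 -epoch 7 \
    -lr 0.05 \
    -wordNgrams 2 -loss ns -neg 10 -thread 48 -t 0.00005 \
    -dropoutK 4 -bucket 10000
```

If the crash persists even at a low learning rate, it is worth watching memory usage during training, since a 700-dimensional model over 374k words plus n-gram buckets is sizable.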