hoi-nx closed this issue 5 years ago
Did you try using the python wrapper instead?
I used the 16 GB model (700-dim, trained on English Wikipedia), but it is not working. Can you help me?
Did you link the models correctly?
MODEL_WIKI_UNIGRAMS = os.path.abspath("./sent2vec_wiki_unigrams")
MODEL_WIKI_BIGRAMS = os.path.abspath("./sent2vec_wiki_bigrams")
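One quick way to rule out a bad path is to check that the model file actually exists before loading it. A minimal sketch, assuming the same path variable as above (note `os.path.abspath` does not verify existence):

```python
import os

# Hypothetical path; substitute the real location of the 16 GB model file.
MODEL_WIKI_BIGRAMS = os.path.abspath("./sent2vec_wiki_bigrams")

# os.path.abspath only normalizes the string; verify the file explicitly
# so a missing or misnamed model fails loudly instead of silently
# producing empty (0,) embeddings later.
if not os.path.isfile(MODEL_WIKI_BIGRAMS):
    print(f"Model file missing: {MODEL_WIKI_BIGRAMS}")
```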
I checked it and it was completely correct (wiki_bigrams, 16 GB).
Did you try to use the Python wrapper? I'll try to see this weekend if I can fix the notebook, but the Cython wrapper should be the default solution unless you're on Windows or Mac
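For reference, the Python wrapper usage looks roughly like this. This is a sketch based on the API names in the epfml/sent2vec README (`Sent2vecModel`, `load_model`, `embed_sentences`); the model filename is a placeholder:

```python
# Hedged sketch of the sent2vec Python wrapper; the wrapper must be
# built and installed from the repo, and the .bin filename is assumed.
try:
    import sent2vec

    model = sent2vec.Sent2vecModel()
    model.load_model("./sent2vec_wiki_bigrams.bin")
    embeddings = model.embed_sentences(["once upon a time ."])
    # A non-empty result should have shape (n_sentences, embedding_dim),
    # e.g. (1, 700) for the 700-dim Wikipedia bigrams model.
    print(embeddings.shape)
except ImportError:
    print("sent2vec wrapper not installed; build it from the repo first")
```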
I had a similar error. The embeddings created were empty.
Traceback (most recent call last):
File "main.py", line 145, in
I had the same empty-embedding error.
I ran get_sentence_embeddings_from_pre-trained_models, but the resulting my_embeddings.shape is (0,).