pdrm83 / sent2vec

How to encode sentences in a high-dimensional vector space, a.k.a., sentence embedding.
MIT License

how to use the pre-trained model in word2vec #4

Closed ezraguo-dev closed 3 years ago

ezraguo-dev commented 3 years ago

Dear friend,

I'm a newcomer to NLP, and many thanks for your work, which saves me a lot of time. However, in the word2vec sample on the GitHub page, I cannot find where PRETRAINED_VECTORS_PATH is defined. Could you explain it in more detail?

Many thanks,

Ezra

pdrm83 commented 3 years ago

You must first download a pre-trained word-embedding model, for example from the GloVe project. PRETRAINED_VECTORS_PATH is the path of the corresponding file on your computer. Cheers
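For reference, a minimal sketch of what such a loader might look like. The file name `glove.6B.300d.txt` is only an assumption (one of the files distributed by the GloVe project); `load_glove_vectors` is a hypothetical helper, not part of the sent2vec API, that parses the plain-text GloVe format (one word per line, followed by its vector components):

```python
# Hypothetical path -- point this at the GloVe text file you downloaded
# and extracted, e.g. "glove.6B.300d.txt". The name here is an assumption.
PRETRAINED_VECTORS_PATH = "glove.6B.300d.txt"

def load_glove_vectors(path):
    """Parse a GloVe-style text file into a dict of word -> list of floats.

    Each line has the form: <word> <float> <float> ... <float>
    """
    vectors = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = [float(x) for x in parts[1:]]
    return vectors
```

Note that GloVe files lack the header line used by the original word2vec text format; libraries such as gensim therefore need the file converted (or, in gensim 4.x, loaded with `no_header=True`).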

pdrm83 commented 3 years ago

Did that response help you? If so, please close the issue. Please also feel free to contribute to this open-source project.

ezraguo-dev commented 3 years ago

Many thanks for your suggestion; your response helped me a lot.

rky71992 commented 3 years ago

Hello, I have seen your project and it's very interesting. I am trying to implement it and am running into the same issue as mentioned above. For PRETRAINED_VECTORS_PATH I downloaded the pre-trained models from GloVe (Wiki + Gigaword). It's a zip file containing 4 txt files. I have tried the zip file path as well as the txt file path, but it still shows an error.

Could you explain what I am doing wrong? Thanks
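A likely cause is that the loader expects a plain-text vectors file, not the zip archive itself, so the archive has to be extracted first and PRETRAINED_VECTORS_PATH must point at one of the `.txt` files inside. A minimal sketch, assuming a GloVe-style zip archive; `extract_glove` and the `glove.6B.zip` filename are illustrative, not part of the project:

```python
import os
import zipfile

def extract_glove(archive_path, target_dir):
    """Unpack a GloVe zip archive and return the extracted .txt file paths."""
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(target_dir)
    return [
        os.path.join(target_dir, name)
        for name in sorted(os.listdir(target_dir))
        if name.endswith(".txt")
    ]

# Hypothetical usage: PRETRAINED_VECTORS_PATH should then be one of the
# returned .txt paths, not the path of the zip file itself.
# txt_files = extract_glove("glove.6B.zip", "glove")
# PRETRAINED_VECTORS_PATH = txt_files[0]
```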