Closed: ericlingit closed this issue 4 years ago.
That section of the gensim documentation refers to continuing training with gensim specifically. I didn't train the embeddings with gensim but with GloVe, so it doesn't apply here. If you want to continue training the embeddings (this is usually called "finetuning"), you can load them in a deep learning framework like PyTorch. Note that `nn.Embedding.from_pretrained` freezes the weights by default, so pass `freeze=False` to keep them trainable:
>>> from torch import nn, tensor
>>> from bpemb import BPEmb
>>> bpemb_en = BPEmb(lang="en", vs=100000, dim=100)
>>> # freeze=False keeps the embedding weights trainable for finetuning
>>> emb_layer = nn.Embedding.from_pretrained(tensor(bpemb_en.vectors), freeze=False)
>>> emb_layer
Embedding(100000, 100)
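To sketch what finetuning then looks like, here is a minimal, self-contained example. It uses a random tensor as a stand-in for the pretrained vectors (so it runs without downloading BPEmb), and a dummy loss standing in for a real task; one optimizer step moves the embedding rows away from their pretrained values.

```python
import torch
from torch import nn

# Stand-in for pretrained vectors (e.g. bpemb_en.vectors); shape (vocab, dim).
pretrained = torch.randn(100, 16)
original = pretrained.clone()  # snapshot for comparison after training

# freeze=False makes the embedding weights trainable.
emb = nn.Embedding.from_pretrained(pretrained, freeze=False)
opt = torch.optim.SGD(emb.parameters(), lr=0.1)

ids = torch.tensor([1, 2, 3])
loss = emb(ids).pow(2).mean()  # dummy loss standing in for a real objective
loss.backward()
opt.step()

# Only the looked-up rows receive gradients, so rows 1-3 have moved
# while untouched rows (e.g. row 0) keep their pretrained values.
changed = not torch.allclose(emb.weight[1:4], original[1:4])
print(changed)
```

A detail worth knowing: `nn.Embedding` gradients are sparse in the sense that only the rows actually looked up in the batch are updated, so rarely seen subwords stay close to their pretrained values.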
Is it possible to continue training with your pre-trained models?
This page states that:
and gensim's documentation mentions that:
I'm assuming the answer is no? Please correct me if I'm wrong.