Andras7 / word2vec-pytorch

Extremely simple and fast word2vec implementation with Negative Sampling + Sub-sampling

fixed issue of passing empty parameter list to optimizer #9

Open · prschoenfelder opened 3 years ago

prschoenfelder commented 3 years ago

Hi @Andras7, first I want to thank you for providing this code; it really is a big help. I ran into the following error when trying to train the model:

```
Traceback (most recent call last):
  File "/path/to/test.py", line 8, in <module>
    w2v_trainer.train()
  File "/path/to/venv/lib/python3.8/site-packages/word2vec/trainer.py", line 37, in train
    optimizer = optim.SparseAdam(self.skip_gram_model.parameters(), lr=self.initial_lr)
  File "/path/to/venv/lib/python3.8/site-packages/torch/optim/sparse_adam.py", line 49, in __init__
    super(SparseAdam, self).__init__(params, defaults)
  File "/path/to/venv/lib/python3.8/site-packages/torch/optim/optimizer.py", line 47, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list
```

So I went ahead and wrapped `self.skip_gram_model.parameters()` in `list()`, which fixed it. Might this be a version issue? I ran it with python=3.8, torch=1.7.1.
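For context, here is a minimal sketch of the failure mode and the workaround; the `nn.Embedding` module and learning rate are illustrative stand-ins, not the repo's actual skip-gram model:

```python
import torch.nn as nn
import torch.optim as optim

# Illustrative stand-in for the skip-gram model: any module whose
# parameters() call returns a generator.
model = nn.Embedding(100, 10, sparse=True)

# In torch 1.7.1, SparseAdam.__init__ iterates over `params` to check
# for sparse entries before handing them to the base Optimizer. If
# `params` is a generator, that pre-check exhausts it, so the base
# Optimizer sees an empty iterable and raises
# "optimizer got an empty parameter list".
# Materializing the generator with list() sidesteps this:
optimizer = optim.SparseAdam(list(model.parameters()), lr=0.025)
```

If that is indeed the cause, the `list()` wrapper should also be harmless on newer torch versions, since a list passes through the same pre-check unchanged.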

It works for me now; I hope this is fine with you. Feel free to accept the PR.