Closed: gveni closed this issue 5 years ago
I got that problem also.
On Mar 7, 2019, at 5:28 PM, gveni notifications@github.com wrote:
Now when I try to run the actual lda2vec process, I get the following error:
```
freqs=freqs)  # Python list of shape (vocab_size,). Frequencies of each token, same order as embed matrix mappings.

TypeError: __init__() got an unexpected keyword argument 'load_embeds'
```
Any help is appreciated. Thanks!
Whoops, thought I replied yesterday! That's what I get for being on mobile. The arguments changed in that function 2 days ago (which is one of the reasons I added a warning in the readme). `load_embeds` is no longer a parameter of the `__init__` function.
There were previously 2 redundant parameters: `load_embeds` and `pretrained_embeddings`. These have been merged into a single parameter, `w_in`, which is `None` by default. If you pass your numpy array of pretrained embeddings to the `w_in` parameter, it will do the same thing as `load_embeds=True, pretrained_embeddings=embedding_matrix`.
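To illustrate the merge (a minimal toy sketch, not the real Lda2vec class or its full signature), the new behavior amounts to one optional parameter that replaces the old flag/value pair:

```python
import numpy as np

class ModelSketch:
    """Toy sketch of the parameter merge: w_in replaces the old
    load_embeds + pretrained_embeddings pair (hypothetical class,
    not the repo's actual model)."""

    def __init__(self, vocab_size, embed_size, w_in=None):
        if w_in is None:
            # No pretrained embeddings passed: random init, as before
            self.embed = np.random.uniform(-1, 1, (vocab_size, embed_size))
            self.pretrained = False
        else:
            # Equivalent of load_embeds=True, pretrained_embeddings=w_in
            self.embed = np.asarray(w_in)
            self.pretrained = True

# Passing an array uses the pretrained vectors; omitting it trains from scratch
embedding_matrix = np.zeros((100, 50))
m_pretrained = ModelSketch(100, 50, w_in=embedding_matrix)
m_scratch = ModelSketch(100, 50)
```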
Also, I'm open to changing the `w_in` parameter name to `pretrained_embeddings` if that's less confusing. What do you think @dbl001 ?
Sounds good!
Cool! I'll be updating more stuff tonight. I'll try to get that readme updated too. I'll leave this issue open until then. Thanks guys!
Thanks @nateraw. For now, should I replace the `load_embeds` and `pretrained_embeddings` parameters of the model function with `w_in=None` to resolve this error?
Or this:
```python
embedding_matrix = P.load_glove("glove.6B.300d.txt")
m = model(num_docs, vocab_size, num_topics=num_topics,
          embedding_size=embed_size, restore=False,
          w_in=embedding_matrix, freqs=freqs)
```
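In case it helps to see what the loading step involves: a GloVe text file is just one `token v1 v2 ... vN` line per word, so a loader along these lines (a hypothetical sketch, not the repo's actual `P.load_glove` implementation) builds the matrix in the same order as your token-to-id mapping:

```python
import numpy as np

def load_glove_sketch(path, vocab):
    """Hypothetical GloVe loader sketch (not the repo's P.load_glove).
    Returns a (len(vocab), embed_dim) matrix whose rows follow the
    order of `vocab`; unknown tokens get zero vectors."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.array(parts[1:], dtype=np.float32)
    dim = len(next(iter(vectors.values())))
    return np.stack([vectors.get(tok, np.zeros(dim, dtype=np.float32))
                     for tok in vocab])
```

The important detail is the row order: row `i` of the matrix must correspond to token id `i` in the vocabulary, or the embeddings will be scrambled.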
@gveni what @dbl001 just commented is correct!
Let me download the embedding matrix and see how this goes. Thanks @dbl001 and @nateraw.
Links to download the embeddings should be in the readme (but they aren't). I'll add an issue.
FYI, you meant 'glove.6B.50d.txt' and not 'glove.6B.300d.txt'; otherwise it throws a ValueError. Thanks
Actually, it depends on which GloVe embeddings you downloaded. I have the 300-dimension ones; you must be using the 50-dimension version.
When you download the glove.6B zip file, you get the 50d, 100d, 200d, and 300d versions.
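Since the file name encodes the dimension (e.g. glove.6B.50d.txt is 50-dimensional), a quick sanity check before constructing the model fails with a clearer message than the downstream ValueError. This is a hypothetical helper, not part of the repo:

```python
import numpy as np

def check_embed_dim(embedding_matrix, embed_size):
    """Hypothetical sketch: fail early if the loaded GloVe matrix's
    dimension doesn't match the embed_size the model expects."""
    actual = np.asarray(embedding_matrix).shape[1]
    if actual != embed_size:
        raise ValueError(
            f"Embedding matrix has dimension {actual}, but embed_size is "
            f"{embed_size}; download the matching glove.6B.{embed_size}d.txt")
    return True
```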