Closed: atTC2 closed this issue 6 years ago
The model was created to be loaded via gensim. Nonetheless, you can simply dump the model to a portable format with https://github.com/idio/wiki2vec/blob/master/resources/gensim/convert_model.py:
python convert_model.py /something/your/en.model
Thank you - DL4J is now happy to load the model!
Now just to get the RAM to load it - do you have any recommendations on size?
No idea about memory usage in DeepLearning4j.
No worries, if I get it to work I'll add it to the thread in case it helps others (but can confirm it needs > 15GiB).
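For anyone else sizing their heap: a rough lower bound on the RAM a loaded word2vec model needs is one float per dimension per vocabulary entry, plus per-word string overhead. A minimal sketch (the vocabulary size, dimensionality, and per-word overhead below are illustrative assumptions, not en.model's actual figures):

```python
# Rough lower-bound estimate of a word2vec model's in-memory size:
# vectors (vocab * dims * bytes per float) plus per-word string overhead.
# All concrete numbers here are illustrative assumptions.

def estimate_model_ram_gib(vocab_size, dims, bytes_per_float=4.0,
                           avg_word_bytes=60):
    """Estimate loaded size in GiB: vector data + per-word string overhead."""
    vector_bytes = vocab_size * dims * bytes_per_float
    word_bytes = vocab_size * avg_word_bytes  # keys, object headers, etc.
    return (vector_bytes + word_bytes) / 2**30

# e.g. an assumed 4M-word vocabulary at 1000 dimensions:
print(f"{estimate_model_ram_gib(4_000_000, 1000):.1f} GiB")
```

Note this is only the model data itself; on the JVM the parsing and object overhead mean the actual heap (`-Xmx`) should be set comfortably above the estimate, which is consistent with the >15GiB observed above.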
Thanks for the help!
I'm trying to load en.model with deeplearning4j's Word2Vec implementation. The following code is used:
but unfortunately this exception is thrown:
Running your Python 'quick start' example works fine, so I'm unsure where the problem lies: with me for not loading the model correctly in some way, or with DL4J (in which case I apologise for raising an issue here).
Has this issue been seen before? Do you know whether this is the correct way to load your data with dl4j's implementation? Thank you for any help.