epfml / sent2vec

General purpose unsupervised sentence representations

terminate called after throwing an instance of 'std::bad_alloc' #98

Closed pbeneteau closed 4 years ago

pbeneteau commented 4 years ago

Hi,

I managed to load the wiki_bigrams.bin pre-trained model on my personal computer (16 GB RAM) without any problem. However, when I try to load it on a server with 8 GB RAM, I get this error every time:

    terminate called after throwing an instance of 'std::bad_alloc'
      what():  std::bad_alloc
    Aborted

Here is my code:

    def __init__(self, fasttext_model):
        # Loads the entire .bin model into RAM; this is the call that
        # aborts with std::bad_alloc when memory runs out.
        self.model = sent2vec.Sent2vecModel()
        self.model.load_model(fasttext_model)

I read that this error appears when there is no more RAM available. Any ideas on how to fix this issue?
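A quick way to confirm the memory diagnosis is to check how much RAM is free right before calling load_model. A minimal sketch, assuming the third-party psutil package is installed and using an illustrative model path:

    import psutil
    import sent2vec

    avail_gb = psutil.virtual_memory().available / 1024**3
    print(f"Available RAM before loading: {avail_gb:.1f} GB")

    model = sent2vec.Sent2vecModel()
    # As the log above shows, the process aborts here when the allocation fails;
    # the error is thrown on the C++ side, so it cannot be caught in Python.
    model.load_model("wiki_bigrams.bin")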

pbeneteau commented 4 years ago

Does it mean I need a computer with 64 GB of RAM to open a 64 GB model?

guptaprkhr commented 4 years ago

Hi, yes. The model needs to be loaded completely into RAM.
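As a rough rule of thumb, the size of the .bin file on disk is a lower bound on the RAM the loaded model will need. A minimal sanity check along those lines, again assuming psutil is installed and an illustrative path:

    import os
    import psutil

    model_path = "wiki_bigrams.bin"  # illustrative path
    needed_gb = os.path.getsize(model_path) / 1024**3
    avail_gb = psutil.virtual_memory().available / 1024**3
    if needed_gb > avail_gb:
        print(f"Model is {needed_gb:.1f} GB on disk but only {avail_gb:.1f} GB of RAM is free")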

pbeneteau commented 4 years ago

Yeah, that is what I understood too. I managed to create swap memory in order to open the file even though I don't have enough RAM. However, it is a lot slower (loading takes 1-2 minutes), so this solution is not viable.