Closed lfoppiano closed 2 years ago
That does seem quite large. It looks like glove just uses as much memory as it can get. Based on that error line, I think it read the whole file but then couldn't allocate W. How about pruning some number of the lines or some of the words? I don't know how to pick the best ones to prune, though.
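For anyone wanting to try the pruning route: GloVe's shuffled cooccurrence file is, as far as I can tell, a stream of fixed-size binary records (two `int32` word indices plus a `float64` count — the `CREC` struct in `cooccur.c`; verify against your build). Since word indices follow frequency rank in `vocab.txt`, dropping records whose indices exceed a cutoff keeps only the most frequent words. A minimal sketch, assuming that record layout:

```python
import struct

# Assumed record layout from GloVe's CREC struct: int word1, int word2, double val.
REC = struct.Struct("iid")  # 16 bytes per record

def prune_cooccurrences(data: bytes, max_word_index: int) -> bytes:
    """Keep only records where both word indices are <= max_word_index.

    Word indices are 1-based and ordered by descending frequency, so a
    cutoff of e.g. 400000 keeps the 400k most frequent words.
    """
    kept = []
    for offset in range(0, len(data), REC.size):
        w1, w2, val = REC.unpack_from(data, offset)
        if w1 <= max_word_index and w2 <= max_word_index:
            kept.append(data[offset:offset + REC.size])
    return b"".join(kept)

# Tiny demonstration with synthetic records.
records = b"".join(REC.pack(w1, w2, v) for w1, w2, v in
                   [(1, 2, 3.0), (1, 500, 1.0), (400, 2, 2.5), (3, 4, 7.0)])
pruned = prune_cooccurrences(records, max_word_index=100)
print(len(pruned) // REC.size)  # -> 2 (only the pairs within the cutoff survive)
```

For a 1.2 TB file you would of course stream it in chunks rather than load it into memory, but the filtering logic is the same.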
After several attempts, I reached the part of `demo.sh` where it trains and dumps the model. Unfortunately, I get the following error:

The `glove` command does not have the `-memory` option, so I wonder whether this is because my shuffled cooccurrence file is 1.2 TB... Any clue or suggestion is welcome.

Thanks in advance