Engineerumair opened this issue 5 years ago
I have the same problem. Has it been solved?
No, it is not solved yet
Could it be that this is not a bug, and it simply needs more memory?
A workaround: run the random walks yourself and save them to a local file, then use fastText to get the vectors. A sketch of this is below.
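A minimal sketch of that workaround, assuming a NetworkX graph named G_fb as in the snippet further down. It streams plain uniform random walks (i.e. p = q = 1, not the biased node2vec walks) to disk one walk per line, then trains embeddings by reading that file back with gensim, so the whole corpus never has to fit in memory. The helper name write_random_walks, the file name walks.txt, and the stand-in graph are illustrative only; the commenter mentions fastText, and gensim's FastText class accepts the same corpus_file argument if you prefer it over Word2Vec.

import random

import networkx as nx
from gensim.models import Word2Vec

def write_random_walks(graph, path, num_walks=10, walk_length=80):
    """Write one space-separated walk per line so walks never accumulate in RAM."""
    nodes = list(graph.nodes())  # assumes node IDs contain no whitespace
    with open(path, "w") as f:
        for _ in range(num_walks):
            random.shuffle(nodes)
            for start in nodes:
                walk = [start]
                while len(walk) < walk_length:
                    neighbors = list(graph.neighbors(walk[-1]))
                    if not neighbors:
                        break
                    walk.append(random.choice(neighbors))
                f.write(" ".join(map(str, walk)) + "\n")

# G_fb is the graph from the question; a small random graph stands in here
# so the sketch runs standalone.
G_fb = nx.fast_gnp_random_graph(1_000, 0.01)

write_random_walks(G_fb, "walks.txt")

# gensim streams the walks from disk; parameter names (vector_size, etc.)
# assume gensim 4.x. sg=1 selects skip-gram, as node2vec does internally.
model = Word2Vec(corpus_file="walks.txt", vector_size=128, window=10,
                 min_count=1, sg=1, workers=4)
model.wv.save_word2vec_format("embeddings.txt")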
The following call takes almost all of the memory:
node2vec = Node2Vec(G_fb, dimensions=emb_size, walk_length=80, num_walks=10, workers=4)
I have 12 GB of memory and the graph has 1.6 million nodes. The precomputation uses up all of the memory after processing only about 0.1 million nodes, and then I get a memory error.
Here is the error:
...site-packages\node2vec\node2vec.py in __init__(self, graph, dimensions, walk_length, num_walks, p, q, weight_key, workers, sampling_strategy, quiet, temp_folder)
     68         self.require = "sharedmem"
     69
---> 70         self._precompute_probabilities()
     71         self.walks = self._generate_walks()
     72

...site-packages\node2vec\node2vec.py in _precompute_probabilities(self)
    117                 if current_node not in first_travel_done:
    118                     first_travel_weights.append(self.graph[current_node][destination].get(self.weight_key, 1))
--> 119                 d_neighbors.append(destination)
    120
    121             # Normalize
MemoryError:
Is there a way to process a subset of the nodes, save the resulting embeddings to a file, then process more nodes and append those to the saved file, and so on?
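One way to get that batching effect, continuing the sketch above (same assumed G_fb, random walker, and walks.txt): generate walks for one slice of start nodes at a time, append them to a single corpus file, and train one model at the end. Appending separately trained embedding files together generally does not work, because independently trained skip-gram models do not share a coordinate space, so accumulating walks on disk and training once is the safer route. The batch_size value is arbitrary.

def append_walks_for_batch(graph, start_nodes, path, num_walks=10, walk_length=80):
    """Same uniform walker as above, but only for start_nodes, in append mode."""
    with open(path, "a") as f:  # append mode keeps walks from earlier batches
        for _ in range(num_walks):
            for start in start_nodes:
                walk = [start]
                while len(walk) < walk_length:
                    neighbors = list(graph.neighbors(walk[-1]))
                    if not neighbors:
                        break
                    walk.append(random.choice(neighbors))
                f.write(" ".join(map(str, walk)) + "\n")

all_nodes = list(G_fb.nodes())
batch_size = 100_000  # arbitrary; pick whatever fits comfortably in memory
for i in range(0, len(all_nodes), batch_size):
    append_walks_for_batch(G_fb, all_nodes[i:i + batch_size], "walks.txt")

# The node ID list itself is small even for 1.6 million nodes; the walks,
# which are the bulk of the data, go straight to disk. A single training
# pass then streams the finished file as before.
model = Word2Vec(corpus_file="walks.txt", vector_size=128, window=10,
                 min_count=1, sg=1, workers=4)
model.wv.save_word2vec_format("embeddings.txt")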