eliorc / node2vec

Implementation of the node2vec algorithm.
MIT License
1.2k stars 245 forks

There is a problem with parallel jobs, even though my computer can do parallel calculations #3

Closed Sandy4321 closed 6 years ago

Sandy4321 commented 6 years ago

For example, when I use this repo, https://github.com/HKUST-KnowComp/MNE, parallel jobs do run. Here is its output:

C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
We are loading data from: e:\graphs ML\code\Scalable_Multiplex_Network_Embedding_MNE_27may_removed_LINE_Cpp_code\MNE-master\data\Vickers-Chan-7thGraders_multiplex.edges
Finish loading data
finish building the graph
2018-05-27 08:24:31,539 : WARNING : Slow version of MNE is being used
2018-05-27 08:24:31,539 : INFO : collecting all words and their counts
2018-05-27 08:24:31,539 : INFO : PROGRESS: at sentence #0, processed 0 words, keeping 0 word types
2018-05-27 08:24:31,542 : INFO : collected 29 word types from a corpus of 5800 raw words and 580 sentences
2018-05-27 08:24:31,542 : INFO : Loading a fresh vocabulary
2018-05-27 08:24:31,542 : INFO : min_count=0 retains 29 unique words (100% of original 29, drops 0)
2018-05-27 08:24:31,542 : INFO : min_count=0 leaves 5800 word corpus (100% of original 5800, drops 0)
2018-05-27 08:24:31,543 : INFO : deleting the raw counts dictionary of 29 items
2018-05-27 08:24:31,546 : INFO : sample=0.001 downsamples 29 most-common words
2018-05-27 08:24:31,546 : INFO : downsampling leaves estimated 1133 word corpus (19.5% of prior 5800)
2018-05-27 08:24:31,546 : INFO : estimated required memory for 29 words and 200 dimensions: 60900 bytes
2018-05-27 08:24:31,546 : INFO : resetting layer weights
E:\graphs ML\code\Scalable_Multiplex_Network_Embedding_MNE_27may_removed_LINE_Cpp_code\MNE-master\MNE.py:655: UserWarning: C extension not loaded for Word2Vec, training will be slow. Install a C compiler and reinstall gensim for fast training.
  warnings.warn("C extension not loaded for Word2Vec, training will be slow. "
2018-05-27 08:24:31,549 : INFO : training model with 4 workers on 29 vocabulary and 200 features, using sg=1 hs=0 sample=0.001 negative=5 window=5
2018-05-27 08:24:31,550 : INFO : expecting 580 sentences, matching count from corpus used for vocabulary survey
2018-05-27 08:24:35,637 : INFO : PROGRESS: at 1.72% examples, 477 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:24:39,745 : INFO : PROGRESS: at 8.62% examples, 1189 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:24:43,835 : INFO : PROGRESS: at 15.52% examples, 1445 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:24:48,022 : INFO : PROGRESS: at 22.41% examples, 1559 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:24:52,222 : INFO : PROGRESS: at 29.31% examples, 1613 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:24:56,323 : INFO : PROGRESS: at 36.21% examples, 1664 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:24:57,571 : INFO : PROGRESS: at 41.38% examples, 1813 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:00,408 : INFO : PROGRESS: at 43.10% examples, 1701 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:01,824 : INFO : PROGRESS: at 48.28% examples, 1813 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:04,289 : INFO : PROGRESS: at 50.00% examples, 1735 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:05,993 : INFO : PROGRESS: at 55.17% examples, 1817 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:08,556 : INFO : PROGRESS: at 56.90% examples, 1745 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:10,071 : INFO : PROGRESS: at 62.07% examples, 1828 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:12,396 : INFO : PROGRESS: at 63.79% examples, 1770 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:14,412 : INFO : PROGRESS: at 68.97% examples, 1825 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:16,363 : INFO : PROGRESS: at 70.69% examples, 1789 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:18,620 : INFO : PROGRESS: at 75.86% examples, 1828 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:20,163 : INFO : PROGRESS: at 77.59% examples, 1809 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:21,434 : INFO : PROGRESS: at 81.03% examples, 1839 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:22,846 : INFO : PROGRESS: at 82.76% examples, 1827 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:24,182 : INFO : PROGRESS: at 84.48% examples, 1817 words/s, in_qsize 7, out_qsize 0
2018-05-27 08:25:25,422 : INFO : PROGRESS: at 86.21% examples, 1812 words/s, in_qsize 8, out_qsize 0
2018-05-27 08:25:26,625 : INFO : PROGRESS: at 89.66% examples, 1843 words/s, in_qsize 6, out_qsize 0
2018-05-27 08:25:28,240 : INFO : PROGRESS: at 91.38% examples, 1824 words/s, in_qsize 5, out_qsize 0
2018-05-27 08:25:29,541 : INFO : PROGRESS: at 93.10% examples, 1818 words/s, in_qsize 4, out_qsize 0
2018-05-27 08:25:29,748 : INFO : worker thread finished; awaiting finish of 3 more threads
2018-05-27 08:25:30,427 : INFO : worker thread finished; awaiting finish of 2 more threads
2018-05-27 08:25:31,174 : INFO : PROGRESS: at 98.28% examples, 1866 words/s, in_qsize 1, out_qsize 1
2018-05-27 08:25:31,174 : INFO : worker thread finished; awaiting finish of 1 more threads
2018-05-27 08:25:31,362 : INFO : worker thread finished; awaiting finish of 0 more threads
2018-05-27 08:25:31,362 : INFO : training on 580000 raw words (113232 effective words) took 59.8s, 1894 effective words/s
2018-05-27 08:25:31,512 : WARNING : Slow version of MNE is being used
2018-05-27 08:25:31,512 : INFO : collecting all words and their counts
2018-05-27 08:25:31,512 : INFO : PROGRESS: at sentence #0, processed 0 words, keeping 0 word types
2018-05-27 08:25:31,512 : INFO : collected 29 word types from a corpus of 5800 raw words and 580 sentences
2018-05-27 08:25:31,512 : INFO : Loading a fresh vocabulary
2018-05-27 08:25:31,512 : INFO : min_count=0 retains 29 unique words (100% of original 29, drops 0)
2018-05-27 08:25:31,512 : INFO : min_count=0 leaves 5800 word corpus (100% of original 5800, drops 0)
2018-05-27 08:25:31,512 : INFO : deleting the raw counts dictionary of 29 items
2018-05-27 08:25:31,512 : INFO : sample=0.001 downsamples 29 most-common words
2018-05-27 08:25:31,512 : INFO : downsampling leaves estimated 1137 word corpus (19.6% of prior 5800)
2018-05-27 08:25:31,512 : INFO : estimated required memory for 29 words and 200 dimensions: 60900 bytes
2018-05-27 08:25:31,512 : INFO : resetting layer weights
2018-05-27 08:25:31,528 : INFO : training model with 4 workers on 29 vocabulary and 200 features, using sg=1 hs=0 sample=0.001 negative=5 window=5
2018-05-27 08:25:31,528 : INFO : expecting 580 sentences, matching count from corpus used for vocabulary survey
2018-05-27 08:25:35,376 : INFO : PROGRESS: at 17.24% examples, 488 words/s, in_qsize 5, out_qsize 0
2018-05-27 08:25:35,714 : INFO : worker thread finished; awaiting finish of 3 more threads
2018-05-27 08:25:35,830 : INFO : worker thread finished; awaiting finish of 2 more threads
2018-05-27 08:25:37,110 : INFO : PROGRESS: at 82.76% examples, 1694 words/s, in_qsize 1, out_qsize 1
2018-05-27 08:25:37,112 : INFO : worker thread finished; awaiting finish of 1 more threads
2018-05-27 08:25:37,177 : INFO : worker thread finished; awaiting finish of 0 more threads
2018-05-27 08:25:37,177 : INFO : training on 58000 raw words (11320 effective words) took 5.6s, 2016 effective words/s
2018-05-27 08:25:37,177 : WARNING : under 10 jobs per worker: consider setting a smaller `batch_words' for smoother alpha decay
2018-05-27 08:25:37,277 : WARNING : Slow version of MNE is being used
2018-05-27 08:25:37,277 : INFO : collecting all words and their counts
2018-05-27 08:25:37,277 : INFO : PROGRESS: at sentence #0, processed 0 words, keeping 0 word types
2018-05-27 08:25:37,277 : INFO : collected 29 word types from a corpus of 5800 raw words and 580 sentences
2018-05-27 08:25:37,277 : INFO : Loading a fresh vocabulary
2018-05-27 08:25:37,277 : INFO : min_count=0 retains 29 unique words (100% of original 29, drops 0)
2018-05-27 08:25:37,277 : INFO : min_count=0 leaves 5800 word corpus (100% of original 5800, drops 0)
2018-05-27 08:25:37,277 : INFO : deleting the raw counts dictionary of 29 items
2018-05-27 08:25:37,277 : INFO : sample=0.001 downsamples 29 most-common words
2018-05-27 08:25:37,277 : INFO : downsampling leaves estimated 1091 word corpus (18.8% of prior 5800)
2018-05-27 08:25:37,277 : INFO : estimated required memory for 29 words and 200 dimensions: 60900 bytes
2018-05-27 08:25:37,277 : INFO : resetting layer weights
2018-05-27 08:25:37,277 : INFO : training model with 4 workers on 29 vocabulary and 200 features, using sg=1 hs=0 sample=0.001 negative=5 window=5
2018-05-27 08:25:37,277 : INFO : expecting 580 sentences, matching count from corpus used for vocabulary survey
eliorc commented 6 years ago

I have tested the parallel part on different machines and they all work.

The code you are supplying here might not be using the joblib implementation of parallel execution; that was the part creating the exception in your previous issue.

I have tested the code on the example with 4 workers and everything works fine.

Since I can't replicate your problem, I can't solve it. Verify that parallel code USING JOBLIB runs fine on the same interpreter you are using to run node2vec.

If that works, please supply a program that runs the working code together with the failing node2vec code on the same interpreter.

Sandy4321 commented 6 years ago

this code runs on one core; the full script is reposted with proper formatting in the next comment.

Sandy4321 commented 6 years ago
import networkx as nx
from node2vec import Node2Vec
#https://github.com/eliorc/node2vec

'''
https://github.com/eliorc/node2vec
previous
Python3 implementation of the node2vec algorithm Aditya Grover, Jure Leskovec and Vid Kocijan.
node2vec: Scalable Feature Learning for Networks. A. Grover, J. Leskovec. 
ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2016.
https://github.com/aditya-grover/node2vec
https://github.com/snap-stanford/snap/tree/master/examples/node2vec
https://snap.stanford.edu/node2vec/#code
https://snap.stanford.edu/node2vec/

better code:
https://github.com/HKUST-KnowComp/MNE
'''

# FILES
EMBEDDING_FILENAME = './embeddings.emb'
EMBEDDING_MODEL_FILENAME = './embeddings.model'

# Create a graph
graph = nx.fast_gnp_random_graph(n=100, p=0.5)

# Precompute probabilities and generate walks
# original node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
# change number of workers to 1 instead of 4
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=1)  # May25

# Embed
model = node2vec.fit(window=10, min_count=1, batch_words=4)  # Any keywords acceptable by gensim.Word2Vec can be passed; `dimensions` and `workers` are automatically passed (from the Node2Vec constructor)

# Look for most similar nodes
model.wv.most_similar('2')  # Output node names are always strings

# Save embeddings for later use
model.wv.save_word2vec_format(EMBEDDING_FILENAME)

# Save model for later use
model.save(EMBEDDING_MODEL_FILENAME)
q=1
Sandy4321 commented 6 years ago

typical output is

C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")

Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]
Computing transition probabilities:   1%|          | 1/100 [00:00<00:10,  9.24it/s]
Computing transition probabilities:   3%|▎         | 3/100 [00:00<00:08, 11.29it/s]
Computing transition probabilities:   5%|▌         | 5/100 [00:00<00:07, 12.49it/s]
Computing transition probabilities:   7%|▋         | 7/100 [00:00<00:07, 12.65it/s]
Computing transition probabilities:   9%|▉         | 9/100 [00:00<00:06, 13.03it/s]
Computing transition probabilities:  11%|█         | 11/100 [00:00<00:06, 12.93it/s]
Computing transition probabilities:  13%|█▎        | 13/100 [00:00<00:06, 13.29it/s]
Computing transition probabilities:  15%|█▌        | 15/100 [00:01<00:06, 13.52it/s]
Computing transition probabilities:  17%|█▋        | 17/100 [00:01<00:06, 13.62it/s]
Computing transition probabilities:  19%|█▉        | 19/100 [00:01<00:05, 13.81it/s]
Computing transition probabilities:  21%|██        | 21/100 [00:01<00:05, 13.98it/s]
Computing transition probabilities:  23%|██▎       | 23/100 [00:01<00:05, 13.94it/s]
Computing transition probabilities:  25%|██▌       | 25/100 [00:01<00:05, 14.00it/s]
Computing transition probabilities:  27%|██▋       | 27/100 [00:01<00:05, 14.03it/s]
Computing transition probabilities:  29%|██▉       | 29/100 [00:02<00:05, 14.02it/s]
Computing transition probabilities:  31%|███       | 31/100 [00:02<00:04, 14.01it/s]
Computing transition probabilities:  33%|███▎      | 33/100 [00:02<00:04, 14.14it/s]
Computing transition probabilities:  35%|███▌      | 35/100 [00:02<00:04, 14.16it/s]
Computing transition probabilities:  37%|███▋      | 37/100 [00:02<00:04, 14.27it/s]
Computing transition probabilities:  39%|███▉      | 39/100 [00:02<00:04, 14.27it/s]
Computing transition probabilities:  41%|████      | 41/100 [00:02<00:04, 14.35it/s]
Computing transition probabilities:  43%|████▎     | 43/100 [00:02<00:03, 14.46it/s]
Computing transition probabilities:  45%|████▌     | 45/100 [00:03<00:03, 14.50it/s]
Computing transition probabilities:  47%|████▋     | 47/100 [00:03<00:03, 14.50it/s]
Computing transition probabilities:  49%|████▉     | 49/100 [00:03<00:03, 14.49it/s]
Computing transition probabilities:  51%|█████     | 51/100 [00:03<00:03, 14.55it/s]
Computing transition probabilities:  53%|█████▎    | 53/100 [00:03<00:03, 14.58it/s]
Computing transition probabilities:  55%|█████▌    | 55/100 [00:03<00:03, 14.60it/s]
Computing transition probabilities:  57%|█████▋    | 57/100 [00:03<00:02, 14.62it/s]
Computing transition probabilities:  59%|█████▉    | 59/100 [00:04<00:02, 14.64it/s]
Computing transition probabilities:  61%|██████    | 61/100 [00:04<00:02, 14.56it/s]
Computing transition probabilities:  63%|██████▎   | 63/100 [00:04<00:02, 14.60it/s]
Computing transition probabilities:  65%|██████▌   | 65/100 [00:04<00:02, 14.67it/s]
Computing transition probabilities:  67%|██████▋   | 67/100 [00:04<00:02, 14.66it/s]
Computing transition probabilities:  69%|██████▉   | 69/100 [00:04<00:02, 14.68it/s]
Computing transition probabilities:  71%|███████   | 71/100 [00:04<00:01, 14.72it/s]
Computing transition probabilities:  73%|███████▎  | 73/100 [00:04<00:01, 14.74it/s]
Computing transition probabilities:  75%|███████▌  | 75/100 [00:05<00:01, 14.73it/s]
Computing transition probabilities:  77%|███████▋  | 77/100 [00:05<00:01, 14.75it/s]
Computing transition probabilities:  79%|███████▉  | 79/100 [00:05<00:01, 14.78it/s]
Computing transition probabilities:  81%|████████  | 81/100 [00:05<00:01, 14.78it/s]
Computing transition probabilities:  83%|████████▎ | 83/100 [00:05<00:01, 14.76it/s]
Computing transition probabilities:  85%|████████▌ | 85/100 [00:05<00:01, 14.75it/s]
Computing transition probabilities:  87%|████████▋ | 87/100 [00:05<00:00, 14.74it/s]
Computing transition probabilities:  89%|████████▉ | 89/100 [00:06<00:00, 14.73it/s]
Computing transition probabilities:  91%|█████████ | 91/100 [00:06<00:00, 14.68it/s]
Computing transition probabilities:  93%|█████████▎| 93/100 [00:06<00:00, 14.71it/s]
Computing transition probabilities:  95%|█████████▌| 95/100 [00:06<00:00, 14.72it/s]
Computing transition probabilities:  97%|█████████▋| 97/100 [00:06<00:00, 14.67it/s]
Computing transition probabilities:  99%|█████████▉| 99/100 [00:06<00:00, 14.72it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:06<00:00, 14.73it/s]
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")

Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")

Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")

Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")

Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]
Computing transition probabilities:   7%|▋         | 7/100 [00:00<00:01, 64.84it/s]
Computing transition probabilities:   7%|▋         | 7/100 [00:00<00:01, 62.42it/s]
Computing transition probabilities:   6%|▌         | 6/100 [00:00<00:01, 49.13it/s]
Computing transition probabilities:   7%|▋         | 7/100 [00:00<00:01, 59.00it/s]
Computing transition probabilities:  15%|█▌        | 15/100 [00:00<00:01, 68.23it/s]
Computing transition probabilities:  13%|█▎        | 13/100 [00:00<00:01, 57.14it/s]
Computing transition probabilities:  12%|█▏        | 12/100 [00:00<00:01, 52.55it/s]
Computing transition probabilities:  15%|█▌        | 15/100 [00:00<00:01, 64.67it/s]
Computing transition probabilities:  19%|█▉        | 19/100 [00:00<00:01, 57.88it/s]
Computing transition probabilities:  23%|██▎       | 23/100 [00:00<00:01, 68.94it/s]
Computing transition probabilities:  19%|█▉        | 19/100 [00:00<00:01, 54.41it/s]
Computing transition probabilities:  21%|██        | 21/100 [00:00<00:01, 61.72it/s]
Computing transition probabilities:  25%|██▌       | 25/100 [00:00<00:01, 57.72it/s]
Computing transition probabilities:  29%|██▉       | 29/100 [00:00<00:01, 66.53it/s]
Computing transition probabilities:  25%|██▌       | 25/100 [00:00<00:01, 55.38it/s]
Computing transition probabilities:  27%|██▋       | 27/100 [00:00<00:01, 60.20it/s]
Computing transition probabilities:  32%|███▏      | 32/100 [00:00<00:01, 59.00it/s]
Computing transition probabilities:  36%|███▌      | 36/100 [00:00<00:00, 65.13it/s]
Computing transition probabilities:  31%|███       | 31/100 [00:00<00:01, 55.73it/s]
Computing transition probabilities:  33%|███▎      | 33/100 [00:00<00:01, 60.08it/s]
Computing transition probabilities:  38%|███▊      | 38/100 [00:00<00:01, 57.91it/s]
Computing transition probabilities:  37%|███▋      | 37/100 [00:00<00:01, 56.36it/s]
Computing transition probabilities:  42%|████▏     | 42/100 [00:00<00:00, 63.97it/s]
Computing transition probabilities:  39%|███▉      | 39/100 [00:00<00:01, 59.04it/s]
Computing transition probabilities:  49%|████▉     | 49/100 [00:00<00:00, 63.81it/s]
Computing transition probabilities:  45%|████▌     | 45/100 [00:00<00:00, 57.81it/s]
Computing transition probabilities:  44%|████▍     | 44/100 [00:00<00:00, 56.48it/s]
Computing transition probabilities:  46%|████▌     | 46/100 [00:00<00:00, 58.83it/s]
Computing transition probabilities:  56%|█████▌    | 56/100 [00:00<00:00, 63.99it/s]
Computing transition probabilities:  51%|█████     | 51/100 [00:00<00:00, 57.51it/s]
Computing transition probabilities:  52%|█████▏    | 52/100 [00:00<00:00, 57.86it/s]
Computing transition probabilities:  53%|█████▎    | 53/100 [00:00<00:00, 59.67it/s]
Computing transition probabilities:  63%|██████▎   | 63/100 [00:00<00:00, 64.59it/s]
Computing transition probabilities:  59%|█████▉    | 59/100 [00:01<00:00, 58.71it/s]
Computing transition probabilities:  61%|██████    | 61/100 [00:00<00:00, 61.65it/s]
Computing transition probabilities:  57%|█████▋    | 57/100 [00:01<00:00, 56.45it/s]
Computing transition probabilities:  70%|███████   | 70/100 [00:01<00:00, 64.21it/s]
Computing transition probabilities:  67%|██████▋   | 67/100 [00:01<00:00, 60.51it/s]
Computing transition probabilities:  63%|██████▎   | 63/100 [00:01<00:00, 56.08it/s]
Computing transition probabilities:  68%|██████▊   | 68/100 [00:01<00:00, 61.49it/s]
Computing transition probabilities:  77%|███████▋  | 77/100 [00:01<00:00, 64.65it/s]
Computing transition probabilities:  74%|███████▍  | 74/100 [00:01<00:00, 61.23it/s]
Computing transition probabilities:  71%|███████   | 71/100 [00:01<00:00, 57.97it/s]
Computing transition probabilities:  75%|███████▌  | 75/100 [00:01<00:00, 61.60it/s]
Computing transition probabilities:  81%|████████  | 81/100 [00:01<00:00, 61.72it/s]
Computing transition probabilities:  84%|████████▍ | 84/100 [00:01<00:00, 63.77it/s]
Computing transition probabilities:  78%|███████▊  | 78/100 [00:01<00:00, 58.10it/s]
Computing transition probabilities:  82%|████████▏ | 82/100 [00:01<00:00, 61.70it/s]
Computing transition probabilities:  88%|████████▊ | 88/100 [00:01<00:00, 61.75it/s]
Computing transition probabilities:  92%|█████████▏| 92/100 [00:01<00:00, 64.51it/s]
Computing transition probabilities:  89%|████████▉ | 89/100 [00:01<00:00, 62.06it/s]
Computing transition probabilities:  85%|████████▌ | 85/100 [00:01<00:00, 58.08it/s]
Computing transition probabilities:  99%|█████████▉| 99/100 [00:01<00:00, 63.96it/s]
Computing transition probabilities:  95%|█████████▌| 95/100 [00:01<00:00, 61.23it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 63.65it/s]
Computing transition probabilities:  91%|█████████ | 91/100 [00:01<00:00, 57.86it/s]
Computing transition probabilities:  96%|█████████▌| 96/100 [00:01<00:00, 61.68it/s]
Computing transition probabilities:  98%|█████████▊| 98/100 [00:01<00:00, 58.56it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 61.64it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 60.86it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 58.72it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
    _fixup_main_from_name(data['init_main_from_name'])
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
    alter_sys=True)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
    node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
    self.walks = self._generate_walks()
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
    in enumerate(num_walks_lists, 1))
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
    n_jobs = self._initialize_backend()
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
    **self._backend_args)
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
    '[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information

(The same traceback is raised in each of the four spawned worker processes, with their output interleaved.)

C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")

Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 56.94it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
    _fixup_main_from_name(data['init_main_from_name'])
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
    alter_sys=True)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
    node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
    self.walks = self._generate_walks()
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
    in enumerate(num_walks_lists, 1))
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
    n_jobs = self._initialize_backend()
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
    **self._backend_args)
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
    '[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information

eliorc commented 6 years ago

If it runs on one core but not on many, it seems the problem is with your system and joblib's Parallel.

Look at this SO question and answer, try what it suggests, and let me know if it works:

https://stackoverflow.com/questions/35452694/python-joblib-parallel-on-windows-not-working-even-if-name-main?utm_medium=organic&utm_source=google_rich_qa&utm_campaign=google_rich_qa
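Concretely, the guard joblib is asking for looks like this. A minimal, repo-independent sketch using only the standard library (`square` and `main` are made-up names for illustration): on Windows there is no fork, so each worker re-imports the script, and any top-level code not behind the guard runs again in every child. The same reasoning applies to the `Node2Vec(...)` call in your script — it has to live under `if __name__ == '__main__'`.

```python
from multiprocessing import Pool

def square(x):
    # Runs in a worker process; must be importable at module level.
    return x * x

def main():
    # The parallel work itself.
    with Pool(processes=2) as pool:
        return pool.map(square, range(5))

if __name__ == '__main__':
    # Without this guard, each spawned worker re-executes the Pool
    # creation when it re-imports the script, which is exactly the
    # ImportError joblib raises in your log.
    print(main())
```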

Sandy4321 commented 6 years ago

Thanks for the quick answer. I ran it as recommended in the link you sent me ("Under Windows it runs OK if it is run from a script, like python script_with_your_code.py"), so I ran it like this: E:\graphs ML\code\node2vec-master\node2vec-master>python node2vec_example_may23_2018.py — unfortunately the error is the same:

Microsoft Windows [Version 10.0.16299.431]
(c) 2017 Microsoft Corporation. All rights reserved.

E:\graphs ML\code\node2vec-master\node2vec-master>python node2vec_example_may23_2018.py
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 93.25it/s]
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 75.33it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
    node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
    self.walks = self._generate_walks()
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
    in enumerate(num_walks_lists, 1))
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
    n_jobs = self._initialize_backend()
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
    **self._backend_args)
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
    '[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 74.04it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
Computing transition probabilities:  90%|████████████████████████████████████████▌    | 90/100 [00:01<00:00, 66.90it/s]exec(code, run_globals)
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
    node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
    self.walks = self._generate_walks()
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
    in enumerate(num_walks_lists, 1))
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
    n_jobs = self._initialize_backend()
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
    **self._backend_args)
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
    '[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 70.85it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
    node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
    self.walks = self._generate_walks()
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
    in enumerate(num_walks_lists, 1))
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
    n_jobs = self._initialize_backend()
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
    **self._backend_args)
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
    '[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 66.90it/s]
Computing transition probabilities:  43%|███████████████████▎                         | 43/100 [00:01<00:02, 25.51it/s]
forrtl: error (200): program aborting due to control-C event
Image              PC                Routine            Line        Source
libifcoremd.dll    00007FFE720594C4  Unknown               Unknown  Unknown
KERNELBASE.dll     00007FFEB9157EDD  Unknown               Unknown  Unknown
KERNEL32.DLL       00007FFEBA301FE4  Unknown               Unknown  Unknown
ntdll.dll          00007FFEBCC1F061  Unknown               Unknown  Unknown

E:\graphs ML\code\node2vec-master\node2vec-master>
eliorc commented 6 years ago

Nothing I can do about it. This library uses joblib's Parallel function; if that doesn't work, I can't fix it, since it's not my library.

Sandy4321 commented 6 years ago

I see, thank you for the quick answer. Then there is only one way: update your code so it can run on a Windows machine. For you it is a minor change: just move the main calculation into a function. As described in your link https://stackoverflow.com/questions/35452694/python-joblib-parallel-on-windows-not-working-even-if-name-main?utm_medium=organic&utm_source=google_rich_qa&utm_campaign=google_rich_qa it works on Windows if the main calculation is put into a function, like this:

File 1, the calculation:

#fun.py
from math import sqrt

def f(x):
    return sqrt(x)

File 2, which uses joblib:

#fun_use_joblib.py
from joblib import Parallel, delayed
from fun import f

if __name__ == '__main__':
    a = Parallel(n_jobs=2)(delayed(f)(i) for i in range(10))
    print(a)

The output is correct:

Microsoft Windows [Version 10.0.16299.431]
(c) 2017 Microsoft Corporation. All rights reserved.

E:\graphs ML\code\node2vec-master\node2vec-master>python fun_use_joblib.py
[0.0, 1.0, 1.4142135623730951, 1.7320508075688772, 2.0, 2.23606797749979, 2.449489742783178, 2.6457513110645907, 2.8284271247461903, 3.0]

E:\graphs ML\code\node2vec-master\node2vec-master>
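For context, the same guard pattern applies to any multiprocessing-based parallelism on Windows, not just joblib: Windows has no `fork()`, so each worker process re-imports the script, and without the `if __name__ == '__main__'` guard the pool creation re-runs in every worker. A minimal standard-library-only sketch of the pattern (file and function names are illustrative, not from the repo):

```python
# fun_pool.py -- Windows-safe parallelism using only the standard library.
from math import sqrt
from multiprocessing import Pool

def f(x):
    # Worker function must be importable at module level so that spawned
    # processes can find it after re-importing this file.
    return sqrt(x)

def main():
    # Pool creation lives in a function that only the parent calls.
    with Pool(processes=2) as pool:
        return pool.map(f, range(10))

if __name__ == '__main__':
    # The guard is False in spawned workers (their __name__ is
    # "__mp_main__"), which prevents infinite process recursion.
    print(main())
```

Run as `python fun_pool.py`; the square roots are computed once, instead of triggering the recursive-spawn error seen in the log above.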