Closed. Sandy4321 closed this issue 6 years ago.
I have tested the parallel part on different machines and they all work.
The code you are supplying here might not be using the joblib implementation of parallel execution; that was the part that raised the exception in your previous issue.
I have tested the code on the example with 4 workers and everything works fine.
Since I can't replicate your problem, I can't solve it. Verify that parallel code USING JOBLIB runs fine on the same interpreter you are using to run node2vec.
If that works, please supply a program that runs the working code together with the failing node2vec code on the same interpreter.
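A minimal joblib check along those lines might look like this (a sketch; `square` is just a placeholder workload, not part of node2vec):

```python
from joblib import Parallel, delayed

def square(x):
    return x * x

if __name__ == '__main__':
    # If this raises the same ImportError on your interpreter, the problem
    # is in the environment/joblib setup, not in node2vec itself.
    results = Parallel(n_jobs=2)(delayed(square)(i) for i in range(5))
    print(results)  # expected: [0, 1, 4, 9, 16]
```

If this script succeeds on the same interpreter but the node2vec script still fails, attach both so the difference can be compared.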
This code runs on one core:
import networkx as nx
from node2vec import Node2Vec
#https://github.com/eliorc/node2vec
'''
https://github.com/eliorc/node2vec
Python3 implementation of the node2vec algorithm, by Aditya Grover, Jure Leskovec and Vid Kocijan.
node2vec: Scalable Feature Learning for Networks. A. Grover, J. Leskovec.
ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2016.
https://github.com/aditya-grover/node2vec
https://github.com/snap-stanford/snap/tree/master/examples/node2vec
https://snap.stanford.edu/node2vec/#code
https://snap.stanford.edu/node2vec/
better code:
https://github.com/HKUST-KnowComp/MNE
'''
# FILES
EMBEDDING_FILENAME = './embeddings.emb'
EMBEDDING_MODEL_FILENAME = './embeddings.model'
# Create a graph
graph = nx.fast_gnp_random_graph(n=100, p=0.5)
# Precompute probabilities and generate walks
# original: node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4)
# changed number of workers from 4 to 1
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=1)  # May25: workers=1 instead of 4
# Embed
model = node2vec.fit(window=10, min_count=1, batch_words=4)  # Any keywords accepted by gensim.Word2Vec can be passed; `dimensions` and `workers` are automatically passed (from the Node2Vec constructor)
# Look for most similar nodes
model.wv.most_similar('2') # Output node names are always strings
# Save embeddings for later use
model.wv.save_word2vec_format(EMBEDDING_FILENAME)
# Save model for later use
model.save(EMBEDDING_MODEL_FILENAME)
q=1
Typical output (from the workers=4 run) is:
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities:   0%|          | 0/100 [00:00<?, ?it/s]
...
Computing transition probabilities: 100%|██████████| 100/100 [00:06<00:00, 14.73it/s]
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
  warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
(the warning and the progress bar then repeat, interleaved, once per spawned worker process)
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 63.65it/s]
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
Computing transition probabilities: 91%|█████████ | 91/100 [00:01<00:00, 57.86it/s]
Computing transition probabilities: 96%|█████████▌| 96/100 [00:01<00:00, 61.68it/s] exitcode = _main(fd)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
_fixup_main_from_name(data['init_main_from_name'])
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
alter_sys=True)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 61.64it/s]
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
_fixup_main_from_name(data['init_main_from_name'])
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
alter_sys=True)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 60.86it/s]
return _run_module_code(code, init_globals, run_name, mod_spec)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
Traceback (most recent call last):
File "<string>", line 1, in <module>
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
exitcode = _main(fd)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
_fixup_main_from_name(data['init_main_from_name'])
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
alter_sys=True)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
Computing transition probabilities: 98%|█████████▊| 98/100 [00:01<00:00, 58.56it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 58.72it/s]
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
_fixup_main_from_name(data['init_main_from_name'])
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
alter_sys=True)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
self.walks = self._generate_walks()
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
self.walks = self._generate_walks() self.walks = self._generate_walks()
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
self.walks = self._generate_walks()
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
in enumerate(num_walks_lists, 1))in enumerate(num_walks_lists, 1))
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
in enumerate(num_walks_lists, 1))
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
in enumerate(num_walks_lists, 1))
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
n_jobs = self._initialize_backend()n_jobs = self._initialize_backend()
n_jobs = self._initialize_backend() File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
n_jobs = self._initialize_backend() File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
**self._backend_args)**self._backend_args)
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
**self._backend_args)**self._backend_args)
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
'[joblib] Attempting to do parallel computing '
ImportError'[joblib] Attempting to do parallel computing ':
ImportError[joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information:
[joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
'[joblib] Attempting to do parallel computing ''[joblib] Attempting to do parallel computing '
ImportErrorImportError: : [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information[joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 0%| | 0/100 [00:00<?, ?it/s]C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 0%| | 0/100 [00:00<?, ?it/s]C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 0%| | 0/100 [00:00<?, ?it/s]C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 0%| | 0/100 [00:00<?, ?it/s]
Computing transition probabilities: 7%|▋ | 7/100 [00:00<00:01, 66.19it/s]
Computing transition probabilities: 6%|▌ | 6/100 [00:00<00:01, 59.64it/s]
Computing transition probabilities: 5%|▌ | 5/100 [00:00<00:02, 44.00it/s]
Computing transition probabilities: 5%|▌ | 5/100 [00:00<00:01, 47.73it/s]
Computing transition probabilities: 14%|█▍ | 14/100 [00:00<00:01, 64.53it/s]
Computing transition probabilities: 12%|█▏ | 12/100 [00:00<00:01, 59.47it/s]
Computing transition probabilities: 12%|█▏ | 12/100 [00:00<00:01, 53.14it/s]
Computing transition probabilities: 10%|█ | 10/100 [00:00<00:01, 48.36it/s]
Computing transition probabilities: 20%|██ | 20/100 [00:00<00:01, 59.96it/s]
Computing transition probabilities: 18%|█▊ | 18/100 [00:00<00:01, 56.89it/s]
Computing transition probabilities: 18%|█▊ | 18/100 [00:00<00:01, 51.88it/s]
Computing transition probabilities: 17%|█▋ | 17/100 [00:00<00:01, 51.22it/s]
Computing transition probabilities: 24%|██▍ | 24/100 [00:00<00:01, 54.25it/s]
Computing transition probabilities: 23%|██▎ | 23/100 [00:00<00:01, 53.59it/s]
Computing transition probabilities: 23%|██▎ | 23/100 [00:00<00:01, 51.20it/s]
Computing transition probabilities: 21%|██ | 21/100 [00:00<00:01, 46.49it/s]
Computing transition probabilities: 29%|██▉ | 29/100 [00:00<00:01, 52.19it/s]
Computing transition probabilities: 28%|██▊ | 28/100 [00:00<00:01, 51.66it/s]
Computing transition probabilities: 28%|██▊ | 28/100 [00:00<00:01, 49.64it/s]
Computing transition probabilities: 26%|██▌ | 26/100 [00:00<00:01, 46.43it/s]
Computing transition probabilities: 34%|███▍ | 34/100 [00:00<00:01, 50.56it/s]
Computing transition probabilities: 33%|███▎ | 33/100 [00:00<00:01, 49.53it/s]
Computing transition probabilities: 33%|███▎ | 33/100 [00:00<00:01, 49.98it/s]
Computing transition probabilities: 31%|███ | 31/100 [00:00<00:01, 46.38it/s]
Computing transition probabilities: 39%|███▉ | 39/100 [00:00<00:01, 50.44it/s]
Computing transition probabilities: 39%|███▉ | 39/100 [00:00<00:01, 50.38it/s]
Computing transition probabilities: 39%|███▉ | 39/100 [00:00<00:01, 50.98it/s]
Computing transition probabilities: 38%|███▊ | 38/100 [00:00<00:01, 48.46it/s]
Computing transition probabilities: 45%|████▌ | 45/100 [00:00<00:01, 50.45it/s]
Computing transition probabilities: 44%|████▍ | 44/100 [00:00<00:01, 50.82it/s]
Computing transition probabilities: 45%|████▌ | 45/100 [00:00<00:01, 50.91it/s]
Computing transition probabilities: 43%|████▎ | 43/100 [00:00<00:01, 48.57it/s]
Computing transition probabilities: 50%|█████ | 50/100 [00:00<00:00, 50.39it/s]
Computing transition probabilities: 49%|████▉ | 49/100 [00:00<00:01, 49.69it/s]
Computing transition probabilities: 50%|█████ | 50/100 [00:01<00:01, 49.05it/s]
Computing transition probabilities: 48%|████▊ | 48/100 [00:01<00:01, 47.77it/s]
Computing transition probabilities: 57%|█████▋ | 57/100 [00:01<00:00, 51.73it/s]
Computing transition probabilities: 57%|█████▋ | 57/100 [00:01<00:00, 50.77it/s]
Computing transition probabilities: 55%|█████▌ | 55/100 [00:01<00:00, 50.09it/s]
Computing transition probabilities: 55%|█████▌ | 55/100 [00:01<00:00, 48.97it/s]
Computing transition probabilities: 65%|██████▌ | 65/100 [00:01<00:00, 53.64it/s]
Computing transition probabilities: 62%|██████▏ | 62/100 [00:01<00:00, 51.19it/s]
Computing transition probabilities: 63%|██████▎ | 63/100 [00:01<00:00, 50.68it/s]
Computing transition probabilities: 61%|██████ | 61/100 [00:01<00:00, 49.40it/s]
Computing transition probabilities: 68%|██████▊ | 68/100 [00:01<00:00, 51.69it/s]
Computing transition probabilities: 72%|███████▏ | 72/100 [00:01<00:00, 53.87it/s]
Computing transition probabilities: 69%|██████▉ | 69/100 [00:01<00:00, 50.74it/s]
Computing transition probabilities: 67%|██████▋ | 67/100 [00:01<00:00, 50.09it/s]
Computing transition probabilities: 74%|███████▍ | 74/100 [00:01<00:00, 52.04it/s]
Computing transition probabilities: 80%|████████ | 80/100 [00:01<00:00, 55.23it/s]
Computing transition probabilities: 75%|███████▌ | 75/100 [00:01<00:00, 50.53it/s]
Computing transition probabilities: 74%|███████▍ | 74/100 [00:01<00:00, 51.43it/s]
Computing transition probabilities: 80%|████████ | 80/100 [00:01<00:00, 52.14it/s]
Computing transition probabilities: 87%|████████▋ | 87/100 [00:01<00:00, 56.08it/s]
Computing transition probabilities: 81%|████████ | 81/100 [00:01<00:00, 52.30it/s]
Computing transition probabilities: 81%|████████ | 81/100 [00:01<00:00, 50.72it/s]
Computing transition probabilities: 87%|████████▋ | 87/100 [00:01<00:00, 53.05it/s]
Computing transition probabilities: 94%|█████████▍| 94/100 [00:01<00:00, 56.92it/s]
Computing transition probabilities: 87%|████████▋ | 87/100 [00:01<00:00, 51.18it/s]
Computing transition probabilities: 89%|████████▉ | 89/100 [00:01<00:00, 53.42it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 56.94it/s]
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
_fixup_main_from_name(data['init_main_from_name'])
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
alter_sys=True)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
self.walks = self._generate_walks()
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
in enumerate(num_walks_lists, 1))
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
n_jobs = self._initialize_backend()
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
**self._backend_args)
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
'[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
Computing transition probabilities: 93%|█████████▎| 93/100 [00:01<00:00, 52.83it/s]
Computing transition probabilities: 93%|█████████▎| 93/100 [00:01<00:00, 51.54it/s]
Computing transition probabilities: 96%|█████████▌| 96/100 [00:01<00:00, 53.97it/s]
Computing transition probabilities: 100%|██████████| 100/100 [00:01<00:00, 54.38it/s]
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 223, in prepare
_fixup_main_from_name(data['init_main_from_name'])
File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 249, in _fixup_main_from_name
alter_sys=True)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
self.walks = self._generate_walks()
File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
in enumerate(num_walks_lists, 1))
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
n_jobs = self._initialize_backend()
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
**self._backend_args)
File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
'[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
If it runs on one core but fails on many, it seems the problem is with your system's interaction with joblib's Parallel.
Look at this SO question and answer, try to implement what it suggests, and if it works let me know.
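For reference, the guard that joblib's error message asks for can be sketched with the stdlib multiprocessing module (a minimal sketch, not code from this repo; the Node2Vec construction and fit would go inside `main()` the same way). On Windows, child processes are started with the "spawn" method, which re-imports the script, so anything that launches workers must only run in the parent:

```python
import multiprocessing as mp

def square(x):
    # work executed inside each worker process
    return x * x

def main():
    # anything that starts worker processes must run only in the parent;
    # in the node2vec case, Node2Vec(..., workers=4) would go here
    with mp.Pool(processes=2) as pool:
        return pool.map(square, range(5))

if __name__ == '__main__':
    # this block is skipped when spawned workers re-import the script,
    # which is exactly what prevents the ImportError above
    print(main())
```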
Thanks for the quick answer. I ran it as recommended in the link you sent ("Under Windows it runs OK if it is run from a script, like python script_with_your_code.py"), i.e.:

E:\graphs ML\code\node2vec-master\node2vec-master>python node2vec_example_may23_2018.py

Unfortunately the error is the same:
Microsoft Windows [Version 10.0.16299.431]
(c) 2017 Microsoft Corporation. All rights reserved.
E:\graphs ML\code\node2vec-master\node2vec-master>python node2vec_example_may23_2018.py
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 93.25it/s]
Computing transition probabilities: 100%|████████████████████████████████████████████| 100/100 [00:01<00:00, 75.33it/s]
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\sndr\Anaconda3\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\sndr\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec_example_may23_2018.py", line 31, in <module>
    node2vec = Node2Vec(graph, dimensions=64, walk_length=30, num_walks=200, workers=4) # original
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 121, in __init__
    self.walks = self._generate_walks()
  File "E:\graphs ML\code\node2vec-master\node2vec-master\node2vec\node2vec.py", line 204, in _generate_walks
    in enumerate(num_walks_lists, 1))
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 749, in __call__
    n_jobs = self._initialize_backend()
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\parallel.py", line 547, in _initialize_backend
    **self._backend_args)
  File "C:\Users\sndr\Anaconda3\lib\site-packages\joblib\_parallel_backends.py", line 305, in configure
    '[joblib] Attempting to do parallel computing '
ImportError: [joblib] Attempting to do parallel computing without protecting your import on a system that does not support forking. To use parallel-computing in a script, you must protect your main loop using "if __name__ == '__main__'". Please see the joblib documentation on Parallel for more information
[the same traceback is printed by each of the four spawned worker processes, interleaved with the progress bars and with each other]
C:\Users\sndr\Anaconda3\lib\site-packages\gensim\utils.py:1197: UserWarning: detected Windows; aliasing chunkize to chunkize_serial
warnings.warn("detected Windows; aliasing chunkize to chunkize_serial")
Computing transition probabilities: 43%|███████████████████▎ | 43/100 [00:01<00:02, 25.51it/s]forrtl: error (200): program aborting due to control-C event
Image PC Routine Line Source
libifcoremd.dll 00007FFE720594C4 Unknown Unknown Unknown
KERNELBASE.dll 00007FFEB9157EDD Unknown Unknown Unknown
KERNEL32.DLL 00007FFEBA301FE4 Unknown Unknown Unknown
ntdll.dll 00007FFEBCC1F061 Unknown Unknown Unknown
E:\graphs ML\code\node2vec-master\node2vec-master>
There is nothing I can do about it. This library uses joblib's Parallel function; if that doesn't work, I can't fix it, since joblib is not my library.
I see, thank you for the quick answer. Then the only way is to update your code so that it can run on a Windows machine. For you it is a minor change: just move the main computation into a function. As described in your link https://stackoverflow.com/questions/35452694/python-joblib-parallel-on-windows-not-working-even-if-name-main?utm_medium=organic&utm_source=google_rich_qa&utm_campaign=google_rich_qa it works on Windows if the computation is put into a function, like this:
File 1, with the computation:
# fun.py
from math import sqrt

def f(x):
    return sqrt(x)
File 2, using joblib:
# fun_use_joblib.py
from joblib import Parallel, delayed
from fun import f

if __name__ == '__main__':
    a = Parallel(n_jobs=2)(delayed(f)(i) for i in range(10))
    print(a)
The output is correct:
Microsoft Windows [Version 10.0.16299.431]
(c) 2017 Microsoft Corporation. All rights reserved.
E:\graphs ML\code\node2vec-master\node2vec-master>python fun_use_joblib.py
[0.0, 1.0, 1.4142135623730951, 1.7320508075688772, 2.0, 2.23606797749979, 2.449489742783178, 2.6457513110645907, 2.8284271247461903, 3.0]
E:\graphs ML\code\node2vec-master\node2vec-master>
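The same guard pattern is what the joblib error message asks for. On Windows, worker processes are started with spawn, which re-imports the main script, so any code that launches workers at module top level starts workers again recursively. Here is a minimal runnable sketch of the guard using only the standard-library multiprocessing module (which spawns workers the same way on Windows); the names square and run_parallel are illustrative, not part of node2vec or joblib:

```python
# guard_demo.py
# Demonstrates why parallel work must be protected by an
# if __name__ == '__main__' guard on Windows (spawn start method).
from multiprocessing import Pool

def square(x):
    # The worker function must be importable from the module,
    # not defined inside the guarded block.
    return x * x

def run_parallel():
    # Pool creation lives inside a function, so importing this
    # module from a spawned child has no side effects.
    with Pool(processes=2) as pool:
        return pool.map(square, range(5))

if __name__ == '__main__':
    # Only the original parent process reaches this block;
    # re-imported children skip it, avoiding infinite spawning.
    print(run_parallel())  # -> [0, 1, 4, 9, 16]
```

By the same reasoning, wrapping the Node2Vec(...) constructor call (which internally calls joblib's Parallel) inside an if __name__ == '__main__' block in the user's script should avoid the ImportError above when workers > 1 on Windows.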
For example, when I use this repo https://github.com/HKUST-KnowComp/MNE, the parallel jobs do run. Here is its output, for example: