saggu opened this issue 2 years ago
On Pyrallel's side, once `task_done()` and `join()` have been called, no further data should be added (by calling `add_task()`).
```python
pp = ParallelProcessor(...)
pp.start()

for i in range(100):
    pp.add_task(i)
    # this makes the exception
    if i == 20:
        pp.task_done()
        pp.join()

pp.task_done()
pp.join()
```
Running the above code throws:
```
Traceback (most recent call last):
  File "t1.py", line 53, in <module>
    pp.task_done()
  File ".../pyrallel/pyrallel/parallel_processor.py", line 413, in task_done
    self.mapper_queues[i].put((ParallelProcessor.CMD_STOP,))
  File ".../envs/py38/lib/python3.8/site-packages/multiprocess/queues.py", line 85, in put
    raise ValueError(f"Queue {self!r} is closed")
ValueError: Queue <multiprocess.queues.Queue object at 0x101fc45b0> is closed
```
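The traceback suggests the underlying cause: `task_done()` closes the mapper queues, so any later `put()` — whether triggered by a second `task_done()` or a further `add_task()` — lands on a closed queue. The same behavior can be reproduced standalone with the standard library's `multiprocessing.Queue` (a sketch of the failure mode, not Pyrallel's actual internals):

```python
from multiprocessing import Queue

q = Queue()
q.put("task")   # putting on an open queue works
q.close()       # analogous to the first task_done(): queue is closed

try:
    q.put("late task")  # analogous to calling add_task()/task_done() again
except ValueError as e:
    # Python 3.8+ raises: "Queue <...> is closed"
    print(f"put after close failed: {e}")
```

Guarding `add_task()` and `task_done()` with a "collector closed" flag (and raising a clearer error, or making repeated `task_done()` a no-op) would avoid surfacing this low-level `ValueError` to callers.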