Cell output:
Exception in thread Thread-50:
Traceback (most recent call last):
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\threading.py", line 916, in _bootstrap_inner
    self.run()
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\site-packages\tensorflow\python\keras\utils\data_utils.py", line 748, in _run
    with closing(self.executor_fn(_SHARED_SEQUENCES)) as executor:
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\site-packages\tensorflow\python\keras\utils\data_utils.py", line 727, in pool_fn
    initargs=(seqs, None, get_worker_id_queue()))
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\pool.py", line 174, in __init__
    self._repopulate_pool()
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\pool.py", line 239, in _repopulate_pool
    w.start()
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: can't pickle _thread.lock objects
Jupyter console log:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "G:\ProgramData\Anaconda3\envs\tensorf\lib\multiprocessing\spawn.py", line 115, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input
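For context (an interpretation, not something stated in the logs): on Windows, multiprocessing uses the "spawn" start method, which pickles every object handed to a worker process before sending it to the child. A Keras Sequence or generator that directly or indirectly holds a thread lock therefore cannot be serialized, which is the TypeError in the cell output; the child process then receives no data, which is the EOFError in the console log. The underlying failure can be reproduced in plain Python, independent of Keras:

```python
import pickle
import threading

# The "spawn" start method pickles worker arguments. A raw thread lock
# is not picklable, so serialization fails with the same TypeError as
# in the traceback above (exact wording varies across Python versions).
try:
    pickle.dumps(threading.Lock())
except TypeError as err:
    print(f"TypeError: {err}")
```

In practice this error commonly appears when training is started with `use_multiprocessing=True` on Windows; assuming that is the case here, switching to `use_multiprocessing=False` (thread-based workers) or making the Sequence fully picklable avoids it.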