Hi, I'm trying to run the test on Windows and I hit an error caused by multiprocessing, at line 28 of tf_io_pipline_fast_tools.py.
Referring to this link, I added `if __name__ == '__main__':` in tf_io_pipline_fast_tools.py and the error disappeared. Is this the right way to solve the problem?
tf_io_pipline_fast_tools.py, line 28:

```python
if __name__ == '__main__':  # Add this line
    CFG = global_config.cfg
    _SAMPLE_INFO_QUEUE = Manager().Queue()
    _SENTINEL = ("", [])
```
Error Message
```
(crnntf) D:\Proj\CRNN_Tensorflow>python tools/test_shadownet.py --image_path data/test_images/img_136.jpg --weights_path crnn_synth90k/shadownet.ckpt-80000 --char_dict_path data/char_dict/char_dict_en.json --ord_map_dict_path data/char_dict/ord_map_en.json
2021-08-24 12:28:00.572316: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudart64_100.dll
2021-08-24 12:28:08.785860: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library cudart64_100.dll
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\spawn.py", line 106, in spawn_main
exitcode = _main(fd)
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\spawn.py", line 115, in _main
prepare(preparation_data)
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\spawn.py", line 226, in prepare
_fixup_main_from_path(data['init_main_from_path'])
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\spawn.py", line 278, in _fixup_main_from_path
run_name="__mp_main__")
File "D:\Anaconda3\envs\crnntf\lib\runpy.py", line 263, in run_path
pkg_name=pkg_name, script_name=fname)
File "D:\Anaconda3\envs\crnntf\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "D:\Anaconda3\envs\crnntf\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "D:\Proj\CRNN_Tensorflow\tools\test_shadownet.py", line 23, in <module>
from data_provider import tf_io_pipline_fast_tools
File "D:\Proj\CRNN_Tensorflow\data_provider\tf_io_pipline_fast_tools.py", line 28, in <module>
_SAMPLE_INFO_QUEUE = Manager().Queue()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\context.py", line 55, in Manager
m.start()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\managers.py", line 479, in start
self._process.start()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\process.py", line 105, in start
self._popen = self._Popen(self)
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\context.py", line 313, in _Popen
return Popen(process_obj)
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\popen_spawn_win32.py", line 34, in __init__
prep_data = spawn.get_preparation_data(process_obj._name)
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\spawn.py", line 144, in get_preparation_data
_check_not_importing_main()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\spawn.py", line 137, in _check_not_importing_main
is not going to be frozen to produce an executable.''')
RuntimeError:
An attempt has been made to start a new process before the
current process has finished its bootstrapping phase.
This probably means that you are not using fork to start your
child processes and you have forgotten to use the proper idiom
in the main module:
if __name__ == '__main__':
freeze_support()
...
The "freeze_support()" line can be omitted if the program
is not going to be frozen to produce an executable.
Traceback (most recent call last):
File "tools/test_shadownet.py", line 23, in <module>
from data_provider import tf_io_pipline_fast_tools
File "D:\Proj\CRNN_Tensorflow\data_provider\tf_io_pipline_fast_tools.py", line 28, in <module>
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\context.py", line 55, in Manager
m.start()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\managers.py", line 483, in start
self._address = reader.recv()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\connection.py", line 250, in recv
buf = self._recv_bytes()
File "D:\Anaconda3\envs\crnntf\lib\multiprocessing\connection.py", line 306, in _recv_bytes
[ov.event], False, INFINITE)
KeyboardInterrupt
```