clovaai / deep-text-recognition-benchmark

Text recognition (optical character recognition) with deep learning methods, ICCV 2019
Apache License 2.0

TypeError: cannot pickle 'Environment' object #321

Open · GMXela opened this issue 2 years ago

GMXela commented 2 years ago

Hello,

I'm getting this error and I really don't know what to do. I'm new to coding, so I really need help! :-P

Here is the full output from my terminal (thanks for helping me):

```
PS C:\Users\guit_\PycharmProjects\Text Recognition\deep-text-recognition-benchmark-master> python train.py --train_data data_lmdb_release/training --valid_data data_lmdb_release/validation --select_data MJ-ST --batch_ratio 0.5-0.5 --Transformation None --FeatureExtraction ResNet --SequenceModeling BiLSTM --Prediction CTC
Filtering the images containing characters which are not in opt.character
Filtering the images whose label is longer than opt.batch_max_length
dataset_root: data_lmdb_release/training
opt.select_data: ['MJ', 'ST']
opt.batch_ratio: ['0.5', '0.5']
dataset_root: data_lmdb_release/training  dataset: MJ
sub-directory: /MJ\MJ_test   num samples: 891924
sub-directory: /MJ\MJ_train  num samples: 7224586
sub-directory: /MJ\MJ_valid  num samples: 802731
num total samples of MJ: 8919241 x 1.0 (total_data_usage_ratio) = 8919241
num samples of MJ per batch: 192 x 0.5 (batch_ratio) = 96
Traceback (most recent call last):
  File "train.py", line 317, in <module>
    train(opt)
  File "train.py", line 31, in train
    train_dataset = Batch_Balanced_Dataset(opt)
  File "C:\Users\guit_\PycharmProjects\Text Recognition\deep-text-recognition-benchmark-master\dataset.py", line 69, in __init__
    self.dataloader_iter_list.append(iter(_data_loader))
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\site-packages\torch\utils\data\dataloader.py", line 359, in __iter__
    return self._get_iterator()
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\site-packages\torch\utils\data\dataloader.py", line 305, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\site-packages\torch\utils\data\dataloader.py", line 918, in __init__
    w.start()
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\context.py", line 327, in _Popen
    return Popen(process_obj)
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\popen_spawn_win32.py", line 93, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: cannot pickle 'Environment' object
PS C:\Users\guit_\PycharmProjects\Text Recognition\deep-text-recognition-benchmark-master> Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Users\guit_\.conda\envs\Benchmark_env\lib\multiprocessing\spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
EOFError: Ran out of input
```

AmirHosseinCV commented 2 years ago

I have the same problem.

AmirHosseinCV commented 2 years ago

That seems to be related to PyTorch. Adding `--workers 0` fixed the problem for me. Check this issue.
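For context, a minimal sketch of why this seems to happen on Windows, assuming `dataset.py` opens its lmdb environment in the dataset's `__init__` (as the traceback suggests): with `--workers` greater than 0, PyTorch starts DataLoader worker processes with the spawn method, which pickles the dataset object, and an open lmdb `Environment` handle cannot be pickled. With `--workers 0` everything stays in the main process, so nothing gets pickled. The lmdb path below is just a placeholder:

```python
import pickle
import lmdb  # pip install lmdb

# Placeholder path, for illustration only; any opened lmdb database behaves the same.
env = lmdb.open("data_lmdb_release/training/MJ/MJ_train",
                readonly=True, lock=False, readahead=False, meminit=False)

try:
    # This is effectively what spawning a DataLoader worker does with the dataset object.
    pickle.dumps(env)
except TypeError as e:
    print(e)  # cannot pickle 'Environment' object
```

So the original training command should run once `--workers 0` is appended to it.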

Chen-chang-yu commented 7 months ago

> That seems to be related to PyTorch. Adding `--workers 0` fixed the problem for me. Check this issue.

Thanks, it works!