switchablenorms / DeepFashion_Try_On

Official code for "Towards Photo-Realistic Virtual Try-On by Adaptively Generating↔Preserving Image Content", CVPR'20. https://arxiv.org/abs/2003.05863

Broken pipe while running inference #66

Closed: piggybox closed this issue 3 years ago

piggybox commented 3 years ago

I followed the instructions in the README and ran test.py in the inference folder, and got a broken pipe error:

------------ Options -------------
batchSize: 1
beta1: 0.5
checkpoints_dir: ./checkpoints
continue_train: False
data_type: 32
dataroot: ../Data_preprocessing/
debug: False
display_freq: 100
display_winsize: 512
fineSize: 512
gpu_ids: [0]
input_nc: 3
isTrain: True
label_nc: 20
lambda_feat: 10.0
loadSize: 512
load_pretrain: ./checkpoints/label2city
lr: 0.0002
max_dataset_size: inf
model: pix2pixHD
nThreads: 2
n_blocks_global: 4
n_blocks_local: 3
n_downsample_global: 4
n_layers_D: 3
n_local_enhancers: 1
name: label2city
ndf: 64
netG: global
ngf: 64
niter: 100
niter_decay: 100
niter_fix_global: 0
no_flip: False
no_ganFeat_loss: False
no_html: False
no_lsgan: False
no_vgg_loss: False
norm: instance
num_D: 2
output_nc: 3
phase: test
pool_size: 0
print_freq: 100
resize_or_crop: scale_width
save_epoch_freq: 10
save_latest_freq: 1000
serial_batches: False
tf_log: False
use_dropout: False
verbose: False
which_epoch: latest
-------------- End ----------------
CustomDatasetDataLoader
dataset [AlignedDataset] was created
../Data_preprocessing/test_label label
../Data_preprocessing/test_label label
../Data_preprocessing/test_img img
../Data_preprocessing/test_img img
../Data_preprocessing/test_edge edge
../Data_preprocessing/test_edge edge
../Data_preprocessing/test_mask mask
../Data_preprocessing/test_mask mask
../Data_preprocessing/test_colormask colormask
../Data_preprocessing/test_colormask colormask
../Data_preprocessing/test_color color
../Data_preprocessing/test_color color
# Inference images = 2032
latest_net_U.pth
latest_net_G1.pth
latest_net_G2.pth
latest_net_G.pth
------------ Options -------------
batchSize: 1
beta1: 0.5
checkpoints_dir: ./checkpoints
continue_train: False
data_type: 32
dataroot: ../Data_preprocessing/
debug: False
display_freq: 100
display_winsize: 512
fineSize: 512
gpu_ids: [0]
input_nc: 3
isTrain: True
label_nc: 20
lambda_feat: 10.0
loadSize: 512
load_pretrain: ./checkpoints/label2city
lr: 0.0002
max_dataset_size: inf
model: pix2pixHD
nThreads: 2
n_blocks_global: 4
n_blocks_local: 3
n_downsample_global: 4
n_layers_D: 3
n_local_enhancers: 1
name: label2city
ndf: 64
netG: global
ngf: 64
niter: 100
niter_decay: 100
niter_fix_global: 0
no_flip: False
no_ganFeat_loss: False
no_html: False
no_lsgan: False
no_vgg_loss: False
norm: instance
num_D: 2
output_nc: 3
phase: test
pool_size: 0
print_freq: 100
resize_or_crop: scale_width
save_epoch_freq: 10
save_latest_freq: 1000
serial_batches: False
tf_log: False
use_dropout: False
verbose: False
which_epoch: latest
-------------- End ----------------
CustomDatasetDataLoader
dataset [AlignedDataset] was created
../Data_preprocessing/test_label label
../Data_preprocessing/test_label label
../Data_preprocessing/test_img img
../Data_preprocessing/test_img img
../Data_preprocessing/test_edge edge
../Data_preprocessing/test_edge edge
../Data_preprocessing/test_mask mask
../Data_preprocessing/test_mask mask
../Data_preprocessing/test_colormask colormask
../Data_preprocessing/test_colormask colormask
../Data_preprocessing/test_color color
../Data_preprocessing/test_color color
# Inference images = 2032
latest_net_U.pth
latest_net_G1.pth
latest_net_G2.pth
latest_net_G.pth
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\spawn.py", line 105, in spawn_main
    exitcode = _main(fd)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\spawn.py", line 114, in _main
    prepare(preparation_data)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\spawn.py", line 225, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
    run_name="__mp_main__")
  File "C:\Users\piggybox\Anaconda3\lib\runpy.py", line 263, in run_path
    pkg_name=pkg_name, script_name=fname)
  File "C:\Users\piggybox\Anaconda3\lib\runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "C:\Users\piggybox\Anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "D:\Projects\DeepFashion_Try_On\ACGPN_inference\test.py", line 119, in <module>
    for i, data in enumerate(dataset, start=epoch_iter):
  File "C:\Users\piggybox\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 352, in __iter__
    return self._get_iterator()
  File "C:\Users\piggybox\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 294, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\piggybox\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 801, in __init__
    w.start()
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 46, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\spawn.py", line 143, in get_preparation_data
    _check_not_importing_main()
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\spawn.py", line 136, in _check_not_importing_main
    is not going to be frozen to produce an executable.''')
RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

Traceback (most recent call last):
  File ".\test.py", line 119, in <module>
    for i, data in enumerate(dataset, start=epoch_iter):
  File "C:\Users\piggybox\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 352, in __iter__
    return self._get_iterator()
  File "C:\Users\piggybox\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 294, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\piggybox\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 801, in __init__
    w.start()
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Users\piggybox\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe
Can anyone shed some light on this?

piggybox commented 3 years ago

OK, problem solved. It turns out to be a Windows-specific issue caused by not guarding the entry point with the Python idiom if __name__ == '__main__':

https://pytorch.org/docs/stable/notes/windows.html#multiprocessing-error-without-if-clause-protection
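
For anyone who hits the same error: on Windows, PyTorch's DataLoader starts its worker processes with multiprocessing's spawn method, which re-imports the main script in every worker. Any module-level code that builds the DataLoader and iterates over it then runs again inside the child, the child aborts with the RuntimeError above, and the parent fails with BrokenPipeError when it tries to hand data to the dead worker. Below is a minimal sketch of the guard, not the repo's actual test.py; run_inference and the TensorDataset stand-in are made up for illustration.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def run_inference():
    # Stand-in for the AlignedDataset / CustomDatasetDataLoader used by ACGPN.
    dataset = TensorDataset(torch.zeros(8, 3, 256, 192))
    loader = DataLoader(dataset, batch_size=1, num_workers=2)
    for i, data in enumerate(loader):
        pass  # the model forward pass and image saving would go here

if __name__ == '__main__':
    # Only the launching process enters this block; the spawned workers
    # merely import the module, so the loading loop is not run twice.
    run_inference()
```

If the repo's CustomDatasetDataLoader passes opt.nThreads through to the DataLoader's num_workers (as pix2pixHD-style loaders typically do), running with --nThreads 0 should also sidestep the problem, since no worker processes are spawned at all.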