nagadomi / nunif

Misc; latest version of waifu2x; 2D video to stereo 3D video conversion
MIT License

F:\1\envs\nunif\lib\site-packages\torch\onnx\symbolic_helper.py:1457: UserWarning: ONNX export mode is set to TrainingMode.EVAL, but operator 'dropout' is set to train=True. Exporting with train=True. warnings.warn( #40

Closed. My12123 closed this issue 1 year ago.

My12123 commented 1 year ago

python -m waifu2x.export_onnx -i waifu2x/pretrained_models -o waifu2x/onnx_models

(nunif) F:\nunif>python -m waifu2x.export_onnx -i waifu2x/pretrained_models -o waifu2x/onnx_models
2023-06-05 14:26:59,760:nunif: [    INFO] cunet
F:\1\envs\nunif\lib\site-packages\torch\onnx\_internal\jit_utils.py:258: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ..\torch\csrc\jit\passes\onnx\constant_fold.cpp:181.)
  _C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
F:\1\envs\nunif\lib\site-packages\torch\onnx\utils.py:687: UserWarning: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function. (Triggered internally at ..\torch\csrc\jit\passes\onnx\shape_type_inference.cpp:1888.)
  _C._jit_pass_onnx_graph_shape_type_inference(
F:\1\envs\nunif\lib\site-packages\torch\onnx\utils.py:687: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ..\torch\csrc\jit\passes\onnx\constant_fold.cpp:181.)
  _C._jit_pass_onnx_graph_shape_type_inference(
F:\1\envs\nunif\lib\site-packages\torch\onnx\utils.py:1178: UserWarning: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function. (Triggered internally at ..\torch\csrc\jit\passes\onnx\shape_type_inference.cpp:1888.)
  _C._jit_pass_onnx_graph_shape_type_inference(
F:\1\envs\nunif\lib\site-packages\torch\onnx\utils.py:1178: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ..\torch\csrc\jit\passes\onnx\constant_fold.cpp:181.)
  _C._jit_pass_onnx_graph_shape_type_inference(
2023-06-05 14:27:04,106:nunif: [    INFO] upcunet
2023-06-05 14:27:13,981:nunif: [    INFO] swin_unet
F:\nunif\waifu2x\models\swin_unet.py:169: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  assert x2.shape[2] % 12 == 0 and x2.shape[2] % 16 == 0
F:\1\envs\nunif\lib\site-packages\torch\onnx\symbolic_helper.py:1457: UserWarning: ONNX export mode is set to TrainingMode.EVAL, but operator 'dropout' is set to train=True. Exporting with train=True.
  warnings.warn(
nagadomi commented 1 year ago

It is a bug in torchvision and it has already been fixed. Update the torchvision module to 0.15 or later.

For conda env,

conda update torchvision

For pip env,

pip3 install --upgrade torchvision

Also, I have noted this issue in the source code comments. https://github.com/nagadomi/nunif/blob/70b31c118b7d375616374c78f7cc799e620b60dd/waifu2x/export_onnx.py#L3
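
After updating, a minimal check to confirm the fix took effect before re-running the export (my own sketch, not part of nunif; the version-string parsing is an assumption):

# Sketch: verify that torchvision is 0.15 or later.
import torchvision

version = torchvision.__version__.split("+")[0]   # e.g. "0.15.2+cu118" -> "0.15.2"
major, minor = (int(part) for part in version.split(".")[:2])
assert (major, minor) >= (0, 15), f"torchvision {torchvision.__version__} is too old"
print(f"torchvision {torchvision.__version__} is new enough")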

My12123 commented 1 year ago

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
torchaudio 0.13.1+cu116 requires torch==1.13.1, but you have torch 2.0.1 which is incompatible.
torchtext 0.14.1 requires torch==1.13.1, but you have torch 2.0.1 which is incompatible.
Successfully installed torch-2.0.1 torchvision-0.15.2

My12123 commented 1 year ago

Installation command I tried:

pip install torch==1.13.1+cu116 torchvision==0.15.0+cu116 torchaudio==0.13.1 torchtext --extra-index-url https://download.pytorch.org/whl/cu116

(nunif) F:\nunif>pip install torch==1.13.1+cu116 torchvision==0.15.0+cu116 torchaudio==0.13.1 torchtext --extra-index-url https://download.pytorch.org/whl/cu116
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu116
Collecting torch==1.13.1+cu116
  Using cached https://download.pytorch.org/whl/cu116/torch-1.13.1%2Bcu116-cp310-cp310-win_amd64.whl (2433.8 MB)
ERROR: Could not find a version that satisfies the requirement torchvision==0.15.0+cu116 (from versions: 0.1.6, 0.1.7, 0.1.8, 0.1.9, 0.2.0, 0.2.1, 0.2.2, 0.2.2.post2, 0.2.2.post3, 0.12.0, 0.13.0, 0.13.0+cu116, 0.13.1, 0.13.1+cu116, 0.14.0, 0.14.0+cu116, 0.14.1, 0.14.1+cu116, 0.15.0, 0.15.1, 0.15.2)
ERROR: No matching distribution found for torchvision==0.15.0+cu116
My12123 commented 1 year ago
(nunif) F:\nunif>pip install torch==1.13.1+cu116 torchvision==0.15.0 torchaudio==0.13.1 torchtext --extra-index-url https://download.pytorch.org/whl/cu116
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu116
Collecting torch==1.13.1+cu116
  Using cached https://download.pytorch.org/whl/cu116/torch-1.13.1%2Bcu116-cp310-cp310-win_amd64.whl (2433.8 MB)
Collecting torchvision==0.15.0
  Downloading torchvision-0.15.0-cp310-cp310-win_amd64.whl (1.2 MB)
     ---------------------------------------- 1.2/1.2 MB 751.2 kB/s eta 0:00:00
Collecting torchaudio==0.13.1
  Using cached https://download.pytorch.org/whl/cu116/torchaudio-0.13.1%2Bcu116-cp310-cp310-win_amd64.whl (2.3 MB)
Collecting torchtext
  Downloading https://download.pytorch.org/whl/torchtext-0.15.2-cp310-cp310-win_amd64.whl (1.9 MB)
     ---------------------------------------- 1.9/1.9 MB 3.0 MB/s eta 0:00:00
Requirement already satisfied: typing-extensions in f:\1\envs\nunif\lib\site-packages (from torch==1.13.1+cu116) (4.6.3)
Requirement already satisfied: numpy in f:\1\envs\nunif\lib\site-packages (from torchvision==0.15.0) (1.24.3)
Requirement already satisfied: requests in f:\1\envs\nunif\lib\site-packages (from torchvision==0.15.0) (2.31.0)
INFO: pip is looking at multiple versions of torchvision to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install torch==1.13.1+cu116 and torchvision==0.15.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested torch==1.13.1+cu116
    torchvision 0.15.0 depends on torch==2.0.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
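
For reference, torch and torchvision are released as matched pairs (torchvision 0.15.2 pairs with torch 2.0.1, the build the later logs show as 2.0.1+cu118). A hedged example of a consistent CUDA 11.8 install; the exact builds may differ:

pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 --extra-index-url https://download.pytorch.org/whl/cu118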
My12123 commented 1 year ago
(nunif) F:\nunif>python -m waifu2x.export_onnx -i waifu2x/pretrained_models -o waifu2x/onnx_models
2023-06-05 16:12:21,311:nunif: [    INFO] cunet
F:\1\envs\nunif\lib\site-packages\torch\onnx\_internal\jit_utils.py:306: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ..\torch\csrc\jit\passes\onnx\constant_fold.cpp:181.)
  _C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
F:\1\envs\nunif\lib\site-packages\torch\onnx\utils.py:689: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ..\torch\csrc\jit\passes\onnx\constant_fold.cpp:181.)
  _C._jit_pass_onnx_graph_shape_type_inference(
F:\1\envs\nunif\lib\site-packages\torch\onnx\utils.py:1186: UserWarning: Constant folding - Only steps=1 can be constant folded for opset >= 10 onnx::Slice op. Constant folding not applied. (Triggered internally at ..\torch\csrc\jit\passes\onnx\constant_fold.cpp:181.)
  _C._jit_pass_onnx_graph_shape_type_inference(
============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

2023-06-05 16:12:25,585:nunif: [    INFO] upcunet
============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

2023-06-05 16:12:35,817:nunif: [    INFO] swin_unet
F:\nunif\waifu2x\models\swin_unet.py:169: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  assert x2.shape[2] % 12 == 0 and x2.shape[2] % 16 == 0
============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================
nagadomi commented 1 year ago

The other warnings are not harmful; that is normal output.
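
If the warning noise is distracting, a minimal sketch for silencing it during export (my own suggestion, not something nunif does by default; the module filter is an assumption based on the torch\onnx paths in the log):

# Sketch: suppress the benign UserWarnings emitted by torch.onnx during export.
import warnings
warnings.filterwarnings("ignore", category=UserWarning, module="torch.onnx")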

My12123 commented 1 year ago
2023-06-05 19:32:59,379:nunif: [ WARNING] patch_resize_antialias: No Resize node: waifu2x/onnx_models\swin_unet\photo\noise3_scale4x.onnx: name=None
============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

============= Diagnostic Run torch.onnx.export version 2.0.1+cu118 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

2023-06-05 19:34:14,260:nunif: [ WARNING] patch_resize_antialias: No Resize node: waifu2x/onnx_models\swin_unet\photo\scale4x.onnx: name=None

No Resize node: waifu2x/onnx_models\swin_unet\art_scan\noise2_scale4x.onnx: name=None

Why is / used first and then \ in these paths?

nagadomi commented 1 year ago

No Resize node

It is due to some experimental code. Ignore it; it is not a problem.

nagadomi commented 1 year ago

Why is / used first and then \ in these paths?

Because you specified -o waifu2x/onnx_models. The / comes from the path you passed on the command line, and the \ is added when the remaining path components are joined with the Windows path separator.
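
A small illustration of where the mixed separators come from (assuming the output path is built with os.path.join, which uses the backslash separator on Windows):

# Sketch: "/" comes from the -o argument as typed; "\" is added by os.path.join on Windows.
import os

out_dir = "waifu2x/onnx_models"  # value passed via -o, forward slash preserved as typed
path = os.path.join(out_dir, "swin_unet", "art_scan", "noise2_scale4x.onnx")
print(path)  # on Windows: waifu2x/onnx_models\swin_unet\art_scan\noise2_scale4x.onnx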