ohayonguy / PMRF

Official implementation of Posterior-Mean Rectified Flow: Towards Minimum MSE Photo-Realistic Image Restoration
https://pmrf-ml.github.io
MIT License

stack expects each tensor to be equal size #13

Closed xalteropsx closed 1 month ago

xalteropsx commented 1 month ago
R:\core\PMRF>py inference.py --ckpt_path ohayonguy/PMRF_blind_face_image_restoration --ckpt_path_is_huggingface --lq_data_path input --output_dir result --batch_size 64 --num_flow_steps 25
Z:\software\python11\Lib\site-packages\torchmetrics\utilities\prints.py:36: UserWarning: Metric `InceptionScore` will save all extracted features in buffer. For large datasets this may lead to large memory footprint.
  warnings.warn(*args, **kwargs)
Compiled model
  0%|          | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "R:\core\PMRF\inference.py", line 87, in <module>
    main(parser.parse_args())
  File "R:\core\PMRF\inference.py", line 57, in main
    for batch in tqdm(dl):
  File "Z:\software\python11\Lib\site-packages\tqdm\std.py", line 1178, in __iter__
    for obj in iterable:
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\dataloader.py", line 629, in __next__
    data = self._next_data()
           ^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\dataloader.py", line 672, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\_utils\fetch.py", line 54, in fetch
    return self.collate_fn(data)
           ^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\_utils\collate.py", line 316, in default_collate
    return collate(batch, collate_fn_map=default_collate_fn_map)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\_utils\collate.py", line 154, in collate
    clone.update({key: collate([d[key] for d in batch], collate_fn_map=collate_fn_map) for key in elem})
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\_utils\collate.py", line 154, in <dictcomp>
    clone.update({key: collate([d[key] for d in batch], collate_fn_map=collate_fn_map) for key in elem})
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\_utils\collate.py", line 141, in collate
    return collate_fn_map[elem_type](batch, collate_fn_map=collate_fn_map)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\data\_utils\collate.py", line 213, in collate_tensor_fn
    return torch.stack(batch, 0, out=out)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: stack expects each tensor to be equal size, but got [3, 140, 140] at entry 0 and [3, 147, 146] at entry 1

R:\core\PMRF>
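The failure above comes from PyTorch's default collate function, which builds each batch with torch.stack and therefore needs every tensor in the batch to have the same shape; mixing a [3, 140, 140] crop with a [3, 147, 146] one fails. A minimal sketch of one workaround is to resize every low-quality image in the input folder to a single size before running inference. The 512x512 target is assumed only because the follow-up comment below reports that uniformly sized 512x512 faces work; Pillow and the in-place overwrite are likewise assumptions, not part of the repository:

# Hedged sketch: make all images in ./input the same size so that
# DataLoader's default_collate can stack them into one batch.
from pathlib import Path
from PIL import Image

for p in Path("input").iterdir():
    if p.suffix.lower() not in {".png", ".jpg", ".jpeg"}:
        continue
    img = Image.open(p).convert("RGB")
    if img.size != (512, 512):
        img.resize((512, 512), Image.LANCZOS).save(p)  # overwrites in place; keep backups

Running with --batch_size 1 would also sidestep the collate error, since each batch then holds a single tensor and no stacking across different sizes happens.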
xalteropsx commented 1 month ago

@chachasammy There is still a big issue: natten can't be installed on Windows, so I'm going to skip this library for now.

I downloaded the 512x512 faces instead of the 140x140 ones and inference works now, but natten is still required. I'll skip it for now; maybe I'll try on Linux, but on Windows it looks like a dead end.

Building wheels for collected packages: natten
  Building wheel for natten (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [118 lines of output]
      Building NATTEN for CPU ONLY.
      Number of workers: 4
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-311
      creating build\lib.win-amd64-cpython-311\natten
      copying src\natten\context.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\flops.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\functional.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\na1d.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\na2d.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\na3d.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\natten1d.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\natten2d.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\natten3d.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\nested.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\ops.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\types.py -> build\lib.win-amd64-cpython-311\natten
      copying src\natten\__init__.py -> build\lib.win-amd64-cpython-311\natten
      creating build\lib.win-amd64-cpython-311\natten\utils
      copying src\natten/utils\checks.py -> build\lib.win-amd64-cpython-311\natten/utils
      copying src\natten/utils\log.py -> build\lib.win-amd64-cpython-311\natten/utils
      copying src\natten/utils\misc.py -> build\lib.win-amd64-cpython-311\natten/utils
      copying src\natten/utils\tensor.py -> build\lib.win-amd64-cpython-311\natten/utils
      copying src\natten/utils\testing.py -> build\lib.win-amd64-cpython-311\natten/utils
      copying src\natten/utils\__init__.py -> build\lib.win-amd64-cpython-311\natten/utils
      creating build\lib.win-amd64-cpython-311\natten\autotuner
      copying src\natten/autotuner\fna_backward.py -> build\lib.win-amd64-cpython-311\natten/autotuner
      copying src\natten/autotuner\fna_forward.py -> build\lib.win-amd64-cpython-311\natten/autotuner
      copying src\natten/autotuner\misc.py -> build\lib.win-amd64-cpython-311\natten/autotuner
      copying src\natten/autotuner\__init__.py -> build\lib.win-amd64-cpython-311\natten/autotuner
      creating build\lib.win-amd64-cpython-311\natten\autotuner\configs
      copying src\natten/autotuner/configs\fna_backward_128x128.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      copying src\natten/autotuner/configs\fna_backward_128x64.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      copying src\natten/autotuner/configs\fna_backward_64x64.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      copying src\natten/autotuner/configs\fna_forward_32x128.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      copying src\natten/autotuner/configs\fna_forward_64x128.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      copying src\natten/autotuner/configs\fna_forward_64x64.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      copying src\natten/autotuner/configs\__init__.py -> build\lib.win-amd64-cpython-311\natten/autotuner/configs
      running build_ext
      -- The CXX compiler identification is Clang 17.0.0 with GNU-like command-line
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: X:/amdgraphic/amd2025/hip/5.7/bin/clang++.exe - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      Python: Z:\software\python11\python.exe
      CUDA_TOOLKIT_ROOT_DIR not found or specified
      -- Could NOT find CUDA (missing: CUDA_TOOLKIT_ROOT_DIR CUDA_NVCC_EXECUTABLE CUDA_INCLUDE_DIRS CUDA_CUDART_LIBRARY)
      CMake Warning at Z:/software/python11/Lib/site-packages/torch/share/cmake/Caffe2/public/cuda.cmake:31 (message):
        Caffe2: CUDA cannot be found.  Depending on whether you are building Caffe2
        or a Caffe2 dependent library, the next warning / error will give you more
        info.
      Call Stack (most recent call first):
        Z:/software/python11/Lib/site-packages/torch/share/cmake/Caffe2/Caffe2Config.cmake:87 (include)
        Z:/software/python11/Lib/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
        CMakeLists.txt:56 (find_package)

      CMake Error at Z:/software/python11/Lib/site-packages/torch/share/cmake/Caffe2/Caffe2Config.cmake:91 (message):
        Your installed Caffe2 version uses CUDA but I cannot find the CUDA
        libraries.  Please set the proper CUDA prefixes and / or install CUDA.
      Call Stack (most recent call first):
        Z:/software/python11/Lib/site-packages/torch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
        CMakeLists.txt:56 (find_package)

      -- Configuring incomplete, errors occurred!
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "C:\Users\haide\AppData\Local\Temp\pip-install-33f0jv7u\natten_6fc0f93b01d045ae9738c49d0ac242e6\setup.py", line 243, in <module>
          setup(
        File "Z:\software\python11\Lib\site-packages\setuptools\__init__.py", line 103, in setup
          return distutils.core.setup(**attrs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\core.py", line 184, in setup
          return run_commands(dist)
                 ^^^^^^^^^^^^^^^^^^
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\core.py", line 200, in run_commands
          dist.run_commands()
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\dist.py", line 969, in run_commands
          self.run_command(cmd)
        File "Z:\software\python11\Lib\site-packages\setuptools\dist.py", line 976, in run_command
          super().run_command(command)
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
          cmd_obj.run()
        File "Z:\software\python11\Lib\site-packages\setuptools\command\bdist_wheel.py", line 373, in run
          self.run_command("build")
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\cmd.py", line 316, in run_command
          self.distribution.run_command(command)
        File "Z:\software\python11\Lib\site-packages\setuptools\dist.py", line 976, in run_command
          super().run_command(command)
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
          cmd_obj.run()
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\command\build.py", line 132, in run
          self.run_command(cmd_name)
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\cmd.py", line 316, in run_command
          self.distribution.run_command(command)
        File "Z:\software\python11\Lib\site-packages\setuptools\dist.py", line 976, in run_command
          super().run_command(command)
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\dist.py", line 988, in run_command
          cmd_obj.run()
        File "Z:\software\python11\Lib\site-packages\setuptools\command\build_ext.py", line 93, in run
          _build_ext.run(self)
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\command\build_ext.py", line 359, in run
          self.build_extensions()
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\command\build_ext.py", line 479, in build_extensions
          self._build_extensions_serial()
        File "Z:\software\python11\Lib\site-packages\setuptools\_distutils\command\build_ext.py", line 505, in _build_extensions_serial
          self.build_extension(ext)
        File "C:\Users\haide\AppData\Local\Temp\pip-install-33f0jv7u\natten_6fc0f93b01d045ae9738c49d0ac242e6\setup.py", line 219, in build_extension
          subprocess.check_call(
        File "Z:\software\python11\Lib\subprocess.py", line 413, in check_call
          raise CalledProcessError(retcode, cmd)
      subprocess.CalledProcessError: Command '['cmake', 'C:\\Users\\haide\\AppData\\Local\\Temp\\pip-install-33f0jv7u\\natten_6fc0f93b01d045ae9738c49d0ac242e6\\csrc', '-DPYTHON_PATH=Z:\\software\\python11\\python.exe', '-DOUTPUT_FILE_NAME=natten\\libnatten.cp311-win_amd64', '-DNATTEN_CUDA_ARCH_LIST=', '-DNATTEN_IS_WINDOWS=1', '-DNATTEN_IS_MAC=0', '-DIS_LIBTORCH_BUILT_WITH_CXX11_ABI=0', '-DNATTEN_WITH_AVX=1', '-DPY_LIB_DIR=Z:\\software\\python11\\libs', '-G Ninja', '-DCMAKE_BUILD_TYPE=Release']' returned non-zero exit status 1.
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for natten
  Running setup.py clean for natten
Failed to build natten
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (natten)
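The build log above says "Building NATTEN for CPU ONLY" and CMake then aborts because the installed torch was built against CUDA while no CUDA toolkit can be found on the machine. A small, hedged diagnostic for checking what the local PyTorch build actually provides before retrying the natten build (standard torch attributes only; a ROCm/ZLUDA setup like the one mentioned later in the thread may report these differently):

# Hedged diagnostic: which accelerator support does the installed torch have?
import torch

print("torch version:", torch.__version__)        # e.g. 2.3.0+cpu vs 2.3.0+cu121
print("CUDA available:", torch.cuda.is_available())
print("torch.version.cuda:", torch.version.cuda)  # None on CPU-only or ROCm builds
print("torch.version.hip:", torch.version.hip)    # set on ROCm builds, else None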
ohayonguy commented 1 month ago

Did you check https://github.com/ohayonguy/PMRF/issues/8?

xalteropsx commented 1 month ago
R:\core\PMRF>py -m pip show natten
WARNING: Ignoring invalid distribution ~nnxruntime (Z:\software\python11\Lib\site-packages)
Name: natten
Version: 0.17.2.dev0
Summary: Neighborhood Attention Extension.
Home-page: https://github.com/SHI-Labs/NATTEN
Author: Ali Hassani
Author-email:
License:
Location: Z:\software\python11\Lib\site-packages
Requires: packaging, torch
Required-by:

R:\core\PMRF>py inference.py --ckpt_path ohayonguy/PMRF_blind_face_image_restoration --ckpt_path_is_huggingface --lq_data_path input --output_dir result --batch_size 64 --num_flow_steps 25
Z:\software\python11\Lib\site-packages\torchmetrics\utilities\prints.py:36: UserWarning: Metric `InceptionScore` will save all extracted features in buffer. For large datasets this may lead to large memory footprint.
  warnings.warn(*args, **kwargs)
Compiled model
  0%|          | 0/47 [00:00<?, ?it/s]
W1021 21:33:32.334000 19740 torch\_dynamo\convert_frame.py:987] WON'T CONVERT rms_norm R:\core\PMRF\arch\hourglass\image_transformer_v2.py line 99
.....................
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]   File "Z:\software\python11\Lib\site-packages\torch\_inductor\scheduler.py", line 2523, in get_backend
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]     self.backends[device] = self.create_backend(device)
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]   File "Z:\software\python11\Lib\site-packages\torch\_inductor\scheduler.py", line 2515, in create_backend
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]     raise RuntimeError(
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987] torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987] RuntimeError: Cannot find a working triton installation. More information on installing Triton can be found at https://github.com/openai/triton
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987] Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
W1021 21:33:32.679000 19740 torch\_dynamo\convert_frame.py:987]
  0%|          | 0/47 [00:05<?, ?it/s]
Traceback (most recent call last):
  File "R:\core\PMRF\inference.py", line 87, in <module>
    main(parser.parse_args())
  File "R:\core\PMRF\inference.py", line 60, in main
    estimate = model.generate_reconstructions(dummy_x, y, None, args.num_flow_steps, torch.device("cpu"))[0]
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\lightning_models\mmse_rectified_flow.py", line 215, in generate_reconstructions
    v_t_next = self(x_t=x_t_next, t=t_one * num_t, y=y).to(x_t_next.dtype)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\lightning_models\mmse_rectified_flow.py", line 136, in forward
    return self.forward_flow(x_t, t, y)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\lightning_models\mmse_rectified_flow.py", line 132, in forward_flow
    return self.model(x_t, t)
           ^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\arch\hourglass\image_transformer_v2.py", line 755, in forward
    x = down_level(x, pos, cond)
        ^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\arch\hourglass\image_transformer_v2.py", line 548, in forward
    x = layer(x, *args, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\arch\hourglass\image_transformer_v2.py", line 517, in forward
    x = checkpoint(self.self_attn, x, pos, cond)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\arch\hourglass\image_transformer_v2.py", line 50, in checkpoint
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "Z:\software\python11\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "R:\core\PMRF\arch\hourglass\image_transformer_v2.py", line 422, in forward
    raise ModuleNotFoundError("natten is required for neighborhood attention")
ModuleNotFoundError: natten is required for neighborhood attention
ohayonguy commented 1 month ago

It seems that natten is not installed on your system

xalteropsx commented 1 month ago

> It seems that natten is not installed on your system

natten's has_fused_na doesn't seem to work with AMD's handicapped ZLUDA support. Thanks for trying to help; I may check again later on Linux instead of Windows.
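For anyone who lands here with the same setup, a small, hedged check of whether natten imports at all and whether its fused neighborhood-attention path is usable. has_cuda and has_fused_na are part of NATTEN's Python API in recent releases, but the exact helpers can vary between versions:

# Hedged sketch: verify the natten install that `pip show natten` reported above.
try:
    import natten
    print("natten version:", natten.__version__)
    print("CUDA kernels available:", natten.has_cuda())
    print("fused neighborhood attention:", natten.has_fused_na())
except (ImportError, AttributeError) as e:
    # ImportError: the package or its compiled libnatten extension is missing.
    # AttributeError: a natten version without these helper functions.
    print("natten check failed:", e)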