kijai / ComfyUI-SUPIR

SUPIR upscaling wrapper for ComfyUI

v2 nodes - "Input and output sizes should be greater than 0" #67

Closed cdb-boop closed 6 months ago

cdb-boop commented 6 months ago

Input image is 496x624.

(attached workflow screenshot: failed-dimension-workflow)

- Platform: Windows
- Python version: 3.10.13 | packaged by Anaconda, Inc. | (main, Sep 11 2023, 13:24:38) [MSC v.1916 64 bit (AMD64)]

...

Total VRAM 12288 MB, total RAM 65292 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3060 : cudaMallocAsync
VAE dtype: torch.bfloat16
Using pytorch cross attention

...

Torch version: 2.1.2+cu121

...

Diffusion using bf16
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
Attempting to load SUPIR model: [C:\ComfyUI\models\checkpoints\supir\SUPIR-v0Q-001.ckpt]
Loaded state_dict from [C:\ComfyUI\models\checkpoints\supir\SUPIR-v0Q-001.ckpt]
Attempting to load SDXL model: [C:\ComfyUI\models\xl\juggernautXL_v9Rundiffusionphoto2.safetensors]
Loaded state_dict from [C:\ComfyUI\models\xl\juggernautXL_v9Rundiffusionphoto2.safetensors]
Loading first clip model from SDXL checkpoint
Loading second clip model from SDXL checkpoint
torch.Size([2, 512, 16, 3])
len(tiles):  2
Encoder using bf16
!!! Exception during processing !!!
Traceback (most recent call last):
  File "C:\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "C:\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "C:\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "C:\ComfyUI\custom_nodes\ComfyUI-SUPIR\nodes_v2.py", line 296, in process
    image = F.interpolate(image, size=(H, W), mode="bicubic")
  File "C:\anaconda3\envs\comfyui\lib\site-packages\torch\nn\functional.py", line 4028, in interpolate
    return torch._C._nn.upsample_bicubic2d(input, output_size, align_corners, scale_factors)
RuntimeError: Input and output sizes should be greater than 0, but got input (H: 512, W: 16) output (H: 512, W: 0)

Prompt executed in 16.56 seconds
kijai commented 6 months ago

The image needs to be encoded again after the first stage. Refer to the example workflow in the examples folder of the node.
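For context, the `RuntimeError` in the traceback is raised by `F.interpolate` whenever the computed target size contains a zero, which is what happens here when the first-stage output is passed onward without being re-encoded. A minimal standalone reproduction (the tensor shape mirrors the `input (H: 512, W: 16)` from the log; it is illustrative, not the node's actual tensor):

```python
import torch
import torch.nn.functional as F

# N, C, H, W layout matching the log's reported input (H: 512, W: 16)
x = torch.randn(1, 3, 512, 16)

# A zero in the target size triggers the same error as in the traceback:
# "Input and output sizes should be greater than 0 ..."
try:
    F.interpolate(x, size=(512, 0), mode="bicubic")
    raised = False
except RuntimeError as err:
    raised = True
    print(err)
```

In the node, the zero width comes from upstream: the fix is the workflow change described above (re-encode the image after the first stage), not a change to the interpolation call itself.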