liuguandu / RC-LUT


Hello author, there is a problem in the code here; I couldn't get it to run #2

Open guanbrra opened 1 year ago

guanbrra commented 1 year ago

https://github.com/liuguandu/RC-LUT/blame/812a767dc7c47ed1fefd703282da8ac6345f0f17/common/network.py#L200 Here I set B to 1, so x_dense is a tensor of shape (1, 1, 48, 48), i.e. B=1, C=1, H=48, W=48. After x = F.pad(x, [0, 1, 0, 1], mode='replicate') (common/network.py, line 199), x has shape (4, 1, 49, 49). After x = F.unfold(x, 2) (common/network.py, line 200), x has shape (4, 4, 2304), which does not match x = x.view(B, C, 2*2, H*W) -> shape '[1, 1, 4, 2304]'.
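For reference, the shape mismatch above can be reproduced with plain shape arithmetic, without running the network. This is a minimal sketch (the helper function unfold_shape is illustrative, not part of the repo) assuming stride 1 and no extra padding inside unfold, which matches F.unfold(x, 2):

```python
# F.pad(x, [0, 1, 0, 1]) adds one column on the right and one row at the
# bottom; F.unfold(x, k) then yields shape (N, C * k * k, L), where L is
# the number of sliding-window positions.

def unfold_shape(n, c, h, w, k):
    """Output shape of F.unfold on an (n, c, h, w) tensor, kernel k, stride 1."""
    num_positions = (h - k + 1) * (w - k + 1)
    return (n, c * k * k, num_positions)

# A (1, 1, 48, 48) input padded to (1, 1, 49, 49), then unfolded with k=2:
print(unfold_shape(1, 1, 49, 49, 2))   # (1, 4, 2304) -> view(1, 1, 4, 2304) works

# But the tensor reaching unfold has batch 4, so the result is (4, 4, 2304),
# which cannot be reshaped to [1, 1, 4, 2304] -- hence the reported error.
print(unfold_shape(4, 1, 49, 49, 2))   # (4, 4, 2304)
```

The arithmetic shows the view() call itself is consistent with a k=2 unfold; the mismatch comes from the batch dimension being 4 instead of B=1 when the reshape runs.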

Note-Liu commented 9 months ago

When stage is 2, I also encountered this issue.

hezhiyang2000 commented 9 months ago

The code in the repository differs in places from the architecture in the paper. I asked the author about this and was told that a corrected version will be uploaded soon. Besides, this error is caused by common/network.py#L144 and common/network.py#L148, where the third parameter should be 1, not 4. There are still other errors in the training code, and the transfer code and inference code are not available yet. So I advise you to pause following this work until the corrected version appears.

SuperKenVery commented 9 months ago

What's more, I think the --stages 1 in 5x57x79x9MLP_combined.sh is also wrong. With this, only 3 RC_Modules are created (but there are 6 in the figure), and I think only the right half of the network is built.

I added this (plus some other code):

class RC_Module(nn.Module):
    def __init__(self, in_channels, out_channels, out_dim, kernel_size=4, stride=1, padding=0, dilation=1, bias=True, mlp_field=7):
        super(RC_Module, self).__init__()
        print(f"RC_Module: {in_channels}-{out_channels}-{out_dim}")

And got this:

SRNets: modes=['s', 'd', 'y'], stages=1
SRNets: mode s SxN
SRNet: mode=SxN
MuLUTUnit: stage=2
RC_Module: 1-64-1
SRNets: mode d DxN
SRNet: mode=DxN
MuLUTUnit: stage=2
RC_Module: 1-64-1
SRNets: mode y YxN
SRNet: mode=YxN
MuLUTUnit: stage=2
RC_Module: 1-64-1
HR image cache from: ../data/DIV2K/cache_hr.npy
LR image cache from: ../data/DIV2K/cache_lr_x4.npy

There are only 3 RC_Modules.
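The count in the log is consistent with one RC_Module being built per (stage, mode) pair. A hypothetical sketch of that relationship (the function name and structure are illustrative, not the repo's actual construction code):

```python
# Assumed relationship: the network builds one RC_Module for every
# combination of stage and sampling mode ('s', 'd', 'y').

def num_rc_modules(stages, modes):
    """Illustrative count of RC_Modules for a given --stages and mode list."""
    return stages * len(modes)

modes = ['s', 'd', 'y']

print(num_rc_modules(1, modes))  # 3 -> matches the log with --stages 1
print(num_rc_modules(2, modes))  # 6 -> matches the 6 modules in the figure
```

If this assumption holds, setting --stages 2 rather than 1 would produce the 6 RC_Modules shown in the paper's diagram.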