Holmes-Alan / ABPN

Attention based Back Projection Network (ABPN) for image ultra-resolution in ICCV2019

About model structure #3

Open Beta233 opened 4 years ago

Beta233 commented 4 years ago

Nice work. However, the model structure is not clear from the paper downloaded from arXiv. According to the code in main_4x.py, the model actually used is ABPNv5, which differs considerably from the model described in the paper. In particular, 'weight_up' and 'weight_down' are not mentioned in the paper, yet they introduce many skip connections between the BP stages. Could the author explain this discrepancy?
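For reference, my reading of those weighted skips is roughly the following minimal sketch (the layer shapes and the 1x1-conv weighting are guesses for illustration, not the actual code in ABPNv5):

```python
# Minimal sketch of a back projection stage with learned skip weights.
# The names weight_up / weight_down and the 1x1-conv weighting are
# assumptions for illustration only, not the actual layers in ABPNv5.
import torch.nn as nn

class WeightedBPStage(nn.Module):
    def __init__(self, channels=32, scale=4):
        super().__init__()
        k, s, p = 8, scale, 2  # DBPN-style kernel/stride/padding for 4x
        self.up1 = nn.ConvTranspose2d(channels, channels, k, s, p)
        self.down = nn.Conv2d(channels, channels, k, s, p)
        self.up2 = nn.ConvTranspose2d(channels, channels, k, s, p)
        # hypothetical learned weights on the skip connections
        self.weight_up = nn.Conv2d(channels, channels, 1)
        self.weight_down = nn.Conv2d(channels, channels, 1)

    def forward(self, lr_feat):
        hr0 = self.up1(lr_feat)        # up-projection to HR space
        lr0 = self.down(hr0)           # back-projection to LR space
        hr1 = self.up2(lr0 - lr_feat)  # project the LR residual back up
        # weighted skip connections combine the two HR estimates
        return self.weight_up(hr0) + self.weight_down(hr1)
```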

Holmes-Alan commented 4 years ago

Thanks for your comment. The inner structure of the back projection block is modified from our previous work, HBPN (http://openaccess.thecvf.com/content_CVPRW_2019/html/NTIRE/Liu_Hierarchical_Back_Projection_Network_for_Image_Super-Resolution_CVPRW_2019_paper.html). We did not go into its details in this paper, so if you are interested, please check the HBPN paper for more information. We introduce the SAB across the hierarchical structure of multiple BP blocks because we want to combine features globally for super-resolution. Within each block, the back projection process is a complete cycle of residual feature updating. The skip connections are still important and cannot be replaced by attention.
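For anyone mapping this reply onto code, a simplified sketch of that global combination, assuming attention weights are computed from the concatenated BP block outputs (an illustration of the idea, not the exact SAB implementation in this repo):

```python
# Simplified sketch of combining multiple BP block outputs with spatial
# attention plus a plain skip connection. The structure is an assumption
# for illustration; see the HBPN/ABPN papers for the actual SAB design.
import torch
import torch.nn as nn

class GlobalFeatureFusion(nn.Module):
    def __init__(self, channels=32, n_blocks=3):
        super().__init__()
        self.squeeze = nn.Conv2d(n_blocks * channels, channels, 1)
        self.attention = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Sigmoid(),  # per-pixel attention map over the fused features
        )

    def forward(self, block_feats):
        # block_feats: list of (N, C, H, W) tensors from successive BP blocks
        fused = self.squeeze(torch.cat(block_feats, dim=1))
        # attention reweights the fused features, while the plain skip
        # connection is kept, since attention does not replace it
        return fused + fused * self.attention(fused)
```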