ChaofWang / AWSRN

PyTorch code for our paper "Lightweight Image Super-Resolution with Adaptive Weighted Learning Network"

can't download benchmark dataset results #2

Closed: oldie77 closed this issue 5 years ago

oldie77 commented 5 years ago

Thank you for publishing your paper and code!

I'm trying to download your Set5 etc. results, but I can't seem to manage it. Chrome/Google Translate cannot handle the Baidu website, so I'm trying to make the download work "blind", since I don't understand the language.

Whenever I try to download your results, I get a BaiduNetdisk_6.7.3.exe download instead. I definitely don't want to run an EXE on my PC, for obvious reasons. I just want to download the images. Is that possible somehow?

Thanks!

ChaofWang commented 5 years ago

Hello, that is the Baidu Cloud client installer; it is added by Baidu, not by me. For convenience, I will re-upload the data to Google Drive.

oldie77 commented 5 years ago

That would be great, thank you very much! :)

(It's not just about convenience. Installing some unknown EXE seems risky from an anti-virus standpoint, and I also don't want to slow down my computer with additional software installed.)

BTW, did you also experiment with WDSR-B? In my own tests, WDSR-B seemed to be slightly more efficient than WDSR-A.
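
To make sure we're talking about the same thing, here is roughly the structural difference I mean, as a minimal PyTorch sketch. This is not code from either repo, and the feature widths and expansion ratios are just illustrative assumptions:

```python
import torch.nn as nn


class WDSRABlock(nn.Module):
    """WDSR-A style residual block: wide 3x3 conv -> ReLU -> 3x3 conv back down."""
    def __init__(self, n_feats=32, expansion=4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(n_feats, n_feats * expansion, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(n_feats * expansion, n_feats, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)


class WDSRBBlock(nn.Module):
    """WDSR-B style block: 1x1 expand -> ReLU -> 1x1 reduce -> 3x3,
    which allows a wider expansion for a similar parameter budget."""
    def __init__(self, n_feats=32, expansion=6, low_rank_ratio=0.8):
        super().__init__()
        mid = int(n_feats * low_rank_ratio)
        self.body = nn.Sequential(
            nn.Conv2d(n_feats, n_feats * expansion, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(n_feats * expansion, mid, 1),
            nn.Conv2d(mid, n_feats, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)
```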

Also, considering how well your "AWSRN-SD" model worked for you, have you tried doing "AWSRN-MD" and "AWSRN-D"?

I'm experimenting with all this myself just now, but training is soooo slow. Takes an eternity to try countless variations of depth and network structures etc...

ChaofWang commented 5 years ago

Yes, you are right. Although Baidu Cloud gives me more than 4 TB of space for free, I don't like having to install a client to download data. Here is the Google Drive link.

Although WDSR-B seems to be more effective in its paper, I did not use WDSR-B. The main reason for using WDSR-A is the flexibility to modify the input and output dimensions of each block, which affects the overall parameter count of AWSRN. Of course, WDSR-B could achieve the same purpose, but that does not affect what I wanted to show in my paper.

I have tried training "AWSRN-MD" and "AWSRN-D" at scale x2, and their improvements are not as large as what "AWSRN-SD" achieves, but they do reduce the parameter count: "AWSRN-MD" has only 979K parameters and "AWSRN-D" only 1295K at x2. I even tried an AWSRN with 10M parameters, which can match the result of the 22M-parameter RDN, but since I accidentally deleted that model, it is not listed here. And as you said, training all of these models at every scale is too slow, and I don't have enough GPUs to handle it.
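
(If you want to verify parameter counts like these yourself, a quick way in PyTorch is something like the snippet below; `model` is just a placeholder for whichever network you instantiate, not a specific class from this repo.)

```python
import torch.nn as nn


def count_parameters(model: nn.Module) -> int:
    """Total number of trainable parameters, e.g. to compare
    AWSRN-MD (~979K) against AWSRN-D (~1295K) at x2."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)


# Usage, assuming `model` is any instantiated nn.Module:
# print(f"{count_parameters(model) / 1e3:.0f}K parameters")
```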

oldie77 commented 5 years ago

Thank you!