huawei-noah / Pretrained-IPT

Apache License 2.0

Fine-tune Dataset for SR task #12

Open Guanyu-Lin opened 3 years ago

Guanyu-Lin commented 3 years ago

Thank you for sharing this great repo!

Excuse me, I am not clear about the fine-tuning dataset you used for the SR task. Perhaps it is not mentioned in the paper or the code repo?

I would appreciate it if you could help me. Thank you.

HantingChen commented 3 years ago

We use DIV2K to fine-tune for the SR task.
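For readers unfamiliar with the fine-tuning setup, a minimal sketch of an SR fine-tuning step is shown below. This is not the repo's actual API: `TinySRModel` is a hypothetical stand-in for the pretrained IPT, and the random tensors stand in for DIV2K LR/HR patch pairs; only the general recipe (pretrained weights, small learning rate, L1 loss) is assumed.

```python
import torch
import torch.nn as nn

class TinySRModel(nn.Module):
    """Hypothetical stand-in for a pretrained SR model (not the repo's IPT class)."""
    def __init__(self, scale=2):
        super().__init__()
        # Predict scale*scale sub-pixel planes per channel, then rearrange.
        self.body = nn.Conv2d(3, 3 * scale * scale, kernel_size=3, padding=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x):
        return self.shuffle(self.body(x))

def finetune_step(model, optimizer, lr_patch, hr_patch):
    """One fine-tuning step with the L1 loss commonly used for SR."""
    model.train()
    optimizer.zero_grad()
    sr = model(lr_patch)
    loss = nn.functional.l1_loss(sr, hr_patch)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    model = TinySRModel(scale=2)
    # Small learning rate, since we start from pretrained weights.
    opt = torch.optim.Adam(model.parameters(), lr=2e-5)
    # Random tensors in place of real DIV2K LR/HR patch pairs.
    lr_patch = torch.rand(4, 3, 48, 48)
    hr_patch = torch.rand(4, 3, 96, 96)
    loss = finetune_step(model, opt, lr_patch, hr_patch)
    print(f"L1 loss after one step: {loss:.4f}")
```

In a real run the random patches would be replaced by a DIV2K dataloader that crops aligned LR/HR patch pairs, and the stand-in model by the ImageNet-pretrained checkpoint.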

laulampaul commented 3 years ago

I want to ask: in Fig. 6 of the paper, what are the task, dataset, and setting when you compare IPT with other CNNs?

HantingChen commented 3 years ago

> I want to ask in Fig. 6 of the paper, what is the task, dataset and setting when you compare the IPT with other CNNs.

We compared them on the Set5 dataset for 2× SR.
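SR comparisons on benchmarks such as Set5 and Urban100 are conventionally reported as PSNR; a minimal implementation is sketched below. Note this is a generic metric, not the paper's evaluation script, and papers often compute it on the Y channel only, which is not shown here.

```python
import numpy as np

def psnr(sr, hr, max_val=255.0):
    """Peak signal-to-noise ratio between a super-resolved and a ground-truth image."""
    mse = np.mean((sr.astype(np.float64) - hr.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: an image off by one gray level everywhere (MSE = 1).
hr = np.full((32, 32), 128.0)
sr = hr + 1.0
print(round(psnr(sr, hr), 2))  # → 48.13
```

With MSE fixed at 1, the value reduces to 20·log10(255) ≈ 48.13 dB, which is why small per-pixel errors still yield high PSNR on easy benchmarks like Set5.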

laulampaul commented 3 years ago

It seems that Fig. 6 in the arXiv v1 and the CVPR 2021 versions are different.

HantingChen commented 3 years ago

> It seems that the Fig. 6 in arxiv v1 and CVPR21 are different.

Sorry for the mistake. We compared them on Set5 for 2× SR in arXiv v1, but on Urban100 for 2× SR in the CVPR 2021 version (because the differences on Set5 are too small).

cmhungsteve commented 3 years ago

How long does fine-tuning usually take (from the ImageNet-pretrained model to the DIV2K-finetuned model)? How many GPU days, and on which GPUs?