MasterScrat opened this issue 5 years ago
Hi @MasterScrat Yes, you need to train/fine-tune specific networks for other scale factors like x2 or x3.
Alternatively, we could first upsample by x4 and then downsample with a simple bicubic or bilinear method, but I am not sure the final results will be OK.
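A minimal sketch of that workaround, assuming the x4 model's output is a PyTorch tensor (the function name and signature here are illustrative, not from the ESRGAN code):

```python
import torch
import torch.nn.functional as F

def rescale_sr_output(sr: torch.Tensor, lr_hw: tuple, target_scale: int) -> torch.Tensor:
    """Downsample the output of a x4 SR network to an effective x2/x3.

    sr:    (N, C, 4*H, 4*W) tensor produced by the x4 network.
    lr_hw: (H, W) spatial size of the original low-resolution input.
    """
    h, w = lr_hw
    # Bicubic downsampling from x4 to the desired effective scale.
    return F.interpolate(sr, size=(h * target_scale, w * target_scale),
                         mode="bicubic", align_corners=False)
```

For example, a 100x80 input run through the x4 network gives 400x320; downsampling to `target_scale=2` yields an effective x2 result of 200x160.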
I see. How much time and hardware resources were necessary to train RRDB_PSNR_x4?
@MasterScrat I used a Titan XP and it took about one week. Really slow...
You may speed it up with multi-GPU training.
@xinntao Is it possible to use scale x1? I want to train the network to remove JPEG compression artifacts without upscaling.
@DeltaDesignRus Though it is possible to use scale x1 (by removing the upsampling layers), I recommend using models designed for compression artifact removal. There may be better architectures for that task, such as a U-Net, which would also speed things up.
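To illustrate the x1 idea, here is a hedged sketch of a network with no upsampling stage, so the output keeps the input resolution. The class name, depth, and layer choices are purely illustrative assumptions, not the actual ESRGAN/RRDB architecture:

```python
import torch
import torch.nn as nn

class ArtifactRemovalNet(nn.Module):
    """Hypothetical x1 restoration network: a plain conv trunk with a
    global residual connection, and no upsampling layers."""

    def __init__(self, channels: int = 64, num_blocks: int = 4):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        body = []
        for _ in range(num_blocks):
            body += [nn.Conv2d(channels, channels, 3, padding=1),
                     nn.ReLU(inplace=True)]
        self.body = nn.Sequential(*body)
        self.tail = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Predict a residual correction; spatial size is unchanged (x1).
        return x + self.tail(self.body(self.head(x)))
```

The global residual makes the network learn only the correction to the compressed input, which is a common design choice for restoration tasks.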
Is there any easy way to tune the `upscale=4` factor? Would that need a specifically trained network?