sanghyun-son / EDSR-PyTorch

PyTorch version of the paper 'Enhanced Deep Residual Networks for Single Image Super-Resolution' (CVPRW 2017)
MIT License

Request for skip_batch functionality #18

Closed muneebaadil closed 6 years ago

muneebaadil commented 6 years ago

Hi, there was a skip_batch functionality in the Lua version that skipped a batch during training if its error was above a certain threshold. Although it wasn't used frequently for smaller models, it was extremely useful when training bigger models (such as the final version of EDSR); it led to more stable training.

Any chance you'd be adding that?

sanghyun-son commented 6 years ago

Hello.

I added a skip_batch function to our training script. You can enable it by setting --skip_threshold [threshold]. Please check it!
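For reference, a fixed-threshold skip can be reduced to a single check before backpropagation. This is only a hedged sketch of the idea behind the --skip_threshold option; the function name and the convention that a non-positive threshold disables skipping are illustrative assumptions, not the repository's exact code.

```python
def should_skip_fixed(loss_value, skip_threshold):
    """Return True if this batch should be skipped.

    loss_value:     scalar training loss of the current batch (a float,
                    e.g. obtained via loss.item() in PyTorch)
    skip_threshold: absolute loss cutoff; <= 0 disables skipping
                    (illustrative convention, not the repo's actual flag semantics)
    """
    return skip_threshold > 0 and loss_value > skip_threshold
```

In a training loop, the check would run after computing the loss and before calling backward(), so a diverging batch never contributes a gradient update.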

Thank you.

muneebaadil commented 6 years ago

Apologies if I wasn't able to get the point across. I meant a threshold relative to the previous batch's error, as done here.

I tried implementing it myself, but I couldn't find the variable holding the mean loss of the previous batch.
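The relative variant being requested skips a batch when its loss spikes compared to the history of previous batches, rather than comparing against a fixed cutoff. A minimal pure-Python sketch of that idea follows; the class name, the `ratio` parameter, and the use of a running mean as the reference are all assumptions for illustration, not the repository's implementation.

```python
class BatchSkipper:
    """Skip a batch when its loss exceeds `ratio` times the running mean
    of previously accepted batch losses (hypothetical helper)."""

    def __init__(self, ratio=2.0):
        self.ratio = ratio
        self.mean = None   # running mean of accepted batch losses
        self.count = 0     # number of accepted batches

    def should_skip(self, loss_value):
        # Skip only once we have history and the loss spikes relative to it.
        if self.mean is not None and loss_value > self.ratio * self.mean:
            return True
        # Otherwise accept the batch and update the running mean.
        self.count += 1
        if self.mean is None:
            self.mean = loss_value
        else:
            self.mean += (loss_value - self.mean) / self.count
        return False
```

Inside the training loop, `should_skip(loss.item())` would guard the `loss.backward()` / `optimizer.step()` pair, so an outlier batch leaves both the weights and the loss history untouched.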

sanghyun-son commented 6 years ago

Oh, sorry for the misunderstanding.

Actually, our first skip_batch function looked like that, so I unconsciously implemented a very simple version.

I will include the appropriate version soon.

Thank you.