sanghyun-son / EDSR-PyTorch

PyTorch version of the paper 'Enhanced Deep Residual Networks for Single Image Super-Resolution' (CVPRW 2017)
MIT License

I don't understand why the repeat number is linked to the batch size. #268

Open guhuozhengling opened 4 years ago

guhuozhengling commented 4 years ago

Hello, thank you for sharing your work. I don't understand why the repeat number is linked to the batch size. In `data/srdata.py`, line 59:

```python
if train:
    n_patches = args.batch_size * args.test_every
    n_images = len(args.data_train) * len(self.images_hr)
    if n_images == 0:
        self.repeat = 0
    else:
        self.repeat = max(n_patches // n_images, 1)
```

This means that if I increase the batch size to use the GPU more efficiently, my dataset also gets longer. Do I need to change this?
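To see the effect numerically, here is a minimal standalone sketch of that arithmetic, assuming the usual DIV2K setup of 800 training images and the default `--test_every 1000` (both values are my assumptions, not taken from this issue; and, as far as I can tell, the training `__len__` in `srdata.py` returns `len(self.images_hr) * self.repeat`):

```python
def compute_repeat(batch_size, test_every=1000, n_images=800):
    """Mirror of the snippet above: one 'epoch' should supply
    batch_size * test_every training patches before evaluation."""
    n_patches = batch_size * test_every
    if n_images == 0:
        return 0
    return max(n_patches // n_images, 1)

# repeat (and with it the reported dataset length) grows with batch_size
for bs in (16, 32, 64):
    r = compute_repeat(bs)
    print(f"batch_size={bs:3d} -> repeat={r:3d}, len(dataset)={800 * r}")
# batch_size= 16 -> repeat= 20, len(dataset)=16000
# batch_size= 32 -> repeat= 40, len(dataset)=32000
# batch_size= 64 -> repeat= 80, len(dataset)=64000
```

So `--test_every` fixes the number of batches per training "epoch" (i.e. between evaluations), and `repeat` stretches the dataset so one epoch supplies exactly `batch_size * test_every` patches, which appears to be why it scales with the batch size.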

Doreenqiuyue commented 3 years ago

I have the same problem. Do you understand it now?

guhuozhengling commented 3 years ago

> I have the same problem. Do you understand it now?

I still don't understand it, but I modified this in my code: I made the batch_size and the repeat number independent of each other.
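For anyone who wants to do the same, here is a rough sketch of one way to decouple the two; the `--repeat` option is hypothetical and not part of the original `option.py`:

```python
# option.py (hypothetical extra argument, added to the existing argparse parser)
parser.add_argument('--repeat', type=int, default=20,
                    help='number of times to repeat the training set per epoch')

# data/srdata.py, replacing the batch_size-based computation shown above
if train:
    if len(self.images_hr) == 0:
        self.repeat = 0
    else:
        # fixed repeat count, independent of --batch_size and --test_every
        self.repeat = args.repeat
```

With a change along these lines, increasing `--batch_size` only affects GPU utilisation, while the epoch length is controlled separately by `--repeat`.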