**Closed** — PalgunaGopireddy closed this issue 5 months ago
Thanks. I have the following doubt: does the dataloader yield multiple different patches from the same image in an epoch, or the same patch of an image multiple times in an epoch?

I see in the code that a random patch of an image is created by the `sar_dataset.RandomCropPil` transform inside `sar_dataset.PlainImageFolder`. Then `trainsetiters` is used as

```python
trainset = torch.utils.data.ConcatDataset([trainset] * trainsetiters)
```

i.e. the dataset is repeated `trainsetiters` times, and the dataloader is then prepared with `torch.utils.data.DataLoader`.

Let's say `trainsetiters = 640`. Does this dataloader give a different patch of the same image, extracted by `sar_dataset.RandomCropPil`, on each of the 640 passes, or is the same patch of the image repeated 640 times?
`RandomCropPil` produces a different crop each time, so you obtain multiple different patches from the same image in an epoch.
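The reason this works is that the random crop runs inside `__getitem__`, not at dataset construction, so every access through the concatenated dataset re-draws the crop origin. A minimal sketch in plain Python (no torch; `RandomCropDataset` is a hypothetical stand-in for `PlainImageFolder` with a `RandomCropPil`-style transform, and a list of `(dataset, index)` pairs plays the role of `ConcatDataset`):

```python
import random

class RandomCropDataset:
    """Stand-in for a folder dataset whose transform crops randomly on every access."""
    def __init__(self, images, crop_size):
        self.images = images          # each image: a 2-D list of pixel values
        self.crop_size = crop_size

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        img = self.images[idx]
        h, w = len(img), len(img[0])
        # a fresh random crop origin is drawn on EVERY access
        top = random.randint(0, h - self.crop_size)
        left = random.randint(0, w - self.crop_size)
        return [row[left:left + self.crop_size]
                for row in img[top:top + self.crop_size]]

# torch.utils.data.ConcatDataset([trainset] * trainsetiters) chains
# trainsetiters views of the SAME underlying dataset object.
image = [[r * 100 + c for c in range(100)] for r in range(100)]
trainset = RandomCropDataset([image], crop_size=8)
trainsetiters = 640
concat = [(trainset, i) for _ in range(trainsetiters) for i in range(len(trainset))]

# Iterating over the concatenation re-runs the random crop each time,
# so the 640 samples are (almost surely) many different patches.
patches = [ds[i] for ds, i in concat]
distinct = {patch[0][0] for patch in patches}  # top-left pixel identifies the crop origin
print(len(concat), "samples,", len(distinct), "distinct crop origins")
```

If the crop were applied once at construction time instead, `distinct` would contain a single origin and every epoch would see the same patch 640 times.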
Thank you very much. Before closing the issue, I just want to ask one more thing: is this the only place in the code where the number of epochs is decided, or is it mentioned anywhere else to run for only 50 epochs?
```python
def learning_rate_decay(self, epoch):
    if epoch < 30:
        return 1
    elif epoch < 50:
        return 0.1
    else:
        return 0
```
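For what it's worth, this schedule effectively caps training at 50 epochs: the multiplier drops to 0 from epoch 50 onward, so any later epochs would perform no weight updates. A standalone sketch (the `base_lr` value is hypothetical, for illustration only):

```python
def learning_rate_decay(epoch):
    # same schedule as the method above, without the class context
    if epoch < 30:
        return 1      # full learning rate for epochs 0-29
    elif epoch < 50:
        return 0.1    # learning rate scaled by 0.1 for epochs 30-49
    else:
        return 0      # epoch 50 onward: effective lr is 0, weights frozen

base_lr = 2e-4  # hypothetical base learning rate, not from the repo
for epoch in (0, 29, 30, 49, 50, 100):
    print(epoch, base_lr * learning_rate_decay(epoch))
```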
`trainsetiters` indicates the number of times an image is analyzed in an epoch; it is used to extract multiple patches from the same image in an epoch.