LT1st opened this issue 6 months ago
I've just double-checked the dataset, and it's correct.
```python
from torch.utils.data import DataLoader

# Iterate a few batches and print tensor shapes as a sanity check.
dataloader = DataLoader(dataset, batch_size=4, shuffle=True, num_workers=2)
for i, x in enumerate(dataloader):
    print(f'Batch {i}:')
    print(x['image'].shape, x['depth'].shape)
```
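The shapes alone will not catch bad values, though. Here is a minimal sketch for scanning the same dataloader for NaN/Inf entries, assuming the `image` and `depth` keys from the snippet above:

```python
import torch

# Scan every batch for NaN or Inf values in inputs and targets.
# Assumes the dataloader from above, yielding dicts with 'image' and 'depth'.
for i, x in enumerate(dataloader):
    for key in ('image', 'depth'):
        t = x[key]
        if torch.isnan(t).any() or torch.isinf(t).any():
            print(f'Batch {i}: {key} contains NaN/Inf values')
```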
Reducing the learning rate or increasing the weight decay might fix this, but either change could also hurt final performance.
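Gradient clipping is another common way to keep a few bad updates from pushing the weights to NaN without touching the learning rate. A rough sketch of where it would go in a generic PyTorch training step; the model, optimizer, and criterion names here are placeholders, not from this repo:

```python
import torch

# Hypothetical training step: clip gradients before the optimizer update
# so a single large gradient cannot push the weights to NaN.
def train_step(model, optimizer, criterion, batch):
    optimizer.zero_grad()
    pred = model(batch['image'])
    loss = criterion(pred, batch['depth'])
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    return loss.item()
```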
I am using your framework for an image translation task.
The loss was fine at the very beginning, but became NaN at epoch 006. Have you solved this problem before?
The log information:
The output is:
Something seems to be wrong, but the inputs are correct at the beginning.
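To pin down which operation first produces the NaN instead of only seeing it in the epoch log, PyTorch's anomaly detection plus an explicit finiteness check on the loss can help. A minimal sketch, assuming a standard training loop; `check_finite` is a hypothetical helper, not part of the framework:

```python
import torch

# Make the backward pass raise an error at the operation that first produced
# a NaN/Inf, with a traceback pointing to the corresponding forward op.
torch.autograd.set_detect_anomaly(True)

def check_finite(loss, epoch, step):
    # Hypothetical helper: call right after computing the loss in the loop.
    if not torch.isfinite(loss):
        raise RuntimeError(f'Non-finite loss at epoch {epoch}, step {step}: {loss.item()}')
```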