Shah-imran opened this issue 1 year ago
When I introduce random noise to the input, `loss.item()` decreases instead of increasing. Part of the training code is included here.
```python
for epoch in range(epochs):
    # model.train()
    epoch_loss = 0
    epoch_step = 0
    with tqdm(total=n_train, desc=f'Epoch {epoch + 1}/{epochs}', unit='img', ncols=50) as pbar:
        for i, batch in enumerate(train_loader):
            global_step += 1
            epoch_step += 1
            images = batch[0]
            bboxes = batch[1]
            paths = batch[2]
            images = images.to(device=device, dtype=torch.float32)
            images.requires_grad = True
            bboxes = bboxes.to(device=device)

            bboxes_pred = model(images)
            loss, loss_xy, loss_wh, loss_obj, loss_cls, loss_l2 = criterion(bboxes_pred, bboxes)
            print("Without Noise - ", loss.item(), loss_xy.item(), loss_wh.item(),
                  loss_obj.item(), loss_cls.item(), loss_l2.item())

            grad = torch.autograd.grad(loss, [images], create_graph=True,
                                       retain_graph=True, allow_unused=True)[0]
            # loss.backward()

            noise = torch.randn(images.size())
            imgs = images.detach().cpu() + noise
            imgs = imgs.to(device=device, dtype=torch.float32)

            bboxes_pred = model(imgs)
            loss, loss_xy, loss_wh, loss_obj, loss_cls, loss_l2 = criterion(bboxes_pred, bboxes)
            print("With Noise - ", loss.item(), loss_xy.item(), loss_wh.item(),
                  loss_obj.item(), loss_cls.item(), loss_l2.item())
```
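As an aside, the snippet computes an input gradient with `torch.autograd.grad` but never uses it. If the intent was an FGSM-style perturbation, a minimal sketch could look like the following. This is a hypothetical illustration, not the original code: `epsilon` is an assumed step size, and the linear model and squared loss are stand-ins for the detector and its criterion.

```python
import torch

# Hypothetical FGSM-style perturbation: step in the direction of the sign of
# the input gradient, which locally *increases* the loss. `epsilon` is an
# assumed step size, not a value from the original code.
def fgsm_perturb(images: torch.Tensor, grad: torch.Tensor, epsilon: float = 0.03) -> torch.Tensor:
    return images + epsilon * grad.sign()

# Tiny stand-in model and data to show the flow end to end.
model = torch.nn.Linear(4, 1)
images = torch.rand(2, 4, requires_grad=True)
loss = model(images).pow(2).mean()
grad = torch.autograd.grad(loss, [images])[0]   # gradient w.r.t. the input
adv = fgsm_perturb(images.detach(), grad)
assert adv.shape == images.shape
```

A perturbation built this way is correlated with the loss surface, unlike independent Gaussian noise, so it is the variant one would expect to raise the loss most reliably.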
Output -

```
Without Noise -  20.36496353149414 0.0 0.0 20.36496353149414 0.0 6.040985584259033
With Noise -  1.169505000114441 0.0 0.0 1.169505000114441 0.0 0.05301150307059288
--------- looping again
Without Noise -  25.24418830871582 0.0 0.0 25.24418830871582 0.0 7.514528274536133
With Noise -  1.8068726062774658 0.0 0.0 1.8068726062774658 0.0 0.12245824933052063
--------- looping again
Without Noise -  1.1534618139266968 0.0 0.0 1.1534618139266968 0.0 0.2820863425731659
With Noise -  0.8495630025863647 0.0 0.0 0.8495630025863647 0.0 0.034061141312122345
--------- looping again
Without Noise -  2754.517333984375 3.337832450866699 2730.53369140625 18.649364471435547 1.9963617324829102 5465.08642578125
With Noise -  2763.16845703125 2.5691161155700684 2741.94189453125 13.955108642578125 4.702309608459473 5486.26513671875
```
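One detail worth noting in the code: `torch.randn(images.size())` draws unit-variance noise on the CPU, which is then added to a detached copy of the images and moved back to the device. A minimal sketch of building a scaled perturbation directly on the input's own device follows; `sigma` is a hypothetical noise scale, not a value from the original code.

```python
import torch

# Sketch under assumptions: `sigma` is a hypothetical scale. torch.randn_like
# creates the noise on the same device and with the same dtype as the input,
# avoiding the detach-to-CPU round trip in the original snippet.
def add_gaussian_noise(images: torch.Tensor, sigma: float = 0.1) -> torch.Tensor:
    noise = torch.randn_like(images) * sigma
    return images + noise

images = torch.rand(2, 3, 8, 8)      # fake batch normalised to [0, 1]
noisy = add_gaussian_noise(images)
assert noisy.shape == images.shape
assert noisy.device == images.device
```

Unscaled `torch.randn` noise has standard deviation 1, which is large relative to images normalised to [0, 1], so scaling (or clamping) the perturbation is usually part of such an experiment.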