senguptaumd / Background-Matting

Background Matting: The World is Your Green Screen
https://grail.cs.washington.edu/projects/background-matting/

Question: Why was the GAN loss not used? Typo? #51

Open hejm37 opened 4 years ago

hejm37 commented 4 years ago

Hi, thanks for your work.

I found that in compose_image_withshift() in functions.py, image_sh is wrapped in torch.autograd.Variable(), which detaches it from the preceding computation graph and prevents gradients from loss_ganG from back-propagating to the generator. So I was wondering whether loss_ganG is effectively unused?
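For reference, here is a minimal sketch (not repo code) of what I mean: wrapping an intermediate result in torch.autograd.Variable() appears to drop its grad_fn, so nothing upstream of the wrap can receive gradients.

import torch

x = torch.randn(3, requires_grad=True)
y = x * 2                             # y is part of the graph (grad_fn=MulBackward0)
wrapped = torch.autograd.Variable(y)  # the legacy wrapper used in compose_image_withshift

print(y.grad_fn)        # <MulBackward0 ...>, so a loss on y can reach x
print(wrapped.grad_fn)  # None here would confirm the graph is cut at the wrap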

Could I change it to:

def compose_image_withshift(alpha_pred, fg_pred, bg, seg):
    image_sh = torch.zeros(fg_pred.shape).cuda()

    for t in range(0, fg_pred.shape[0]):
        # bounding box of the segmented subject
        al_tmp = to_image(seg[t, ...]).squeeze(2)
        where = np.array(np.where((al_tmp > 0.1).astype(np.float32)))
        x1, y1 = np.amin(where, axis=1)
        x2, y2 = np.amax(where, axis=1)

        # select a horizontal shift that keeps the subject in frame
        # (n positive indicates a shift to the right)
        n = np.random.randint(-(y1 - 10), al_tmp.shape[1] - y2 - 10)
        alpha_pred_sh = torch.cat((alpha_pred[t, :, :, -n:], alpha_pred[t, :, :, :-n]), dim=2)
        fg_pred_sh = torch.cat((fg_pred[t, :, :, -n:], fg_pred[t, :, :, :-n]), dim=2)

        # map alpha from [-1, 1] to [0, 1] before compositing
        alpha_pred_sh = (alpha_pred_sh + 1) / 2

        # composite the shifted foreground over the background
        image_sh[t, ...] = fg_pred_sh * alpha_pred_sh + (1 - alpha_pred_sh) * bg[t, ...]

    # return torch.autograd.Variable(image_sh.cuda())
    return image_sh
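With the wrapper removed, a quick hypothetical sanity check in the training loop (variable names assumed) would be that the composed image still carries a grad_fn, so loss_ganG can back-propagate to the generator:

# hypothetical check, assuming alpha_pred and fg_pred come from the generator with requires_grad
image_sh = compose_image_withshift(alpha_pred, fg_pred, bg, seg)
assert image_sh.grad_fn is not None  # the composite stays in the graph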
mozpp commented 4 years ago

will torch.autograd.Variable() cut the back-propagation?

mozpp commented 4 years ago

> will torch.autograd.Variable() cut the back-propagation?

It will.
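To make that concrete, a small sketch (assumed behavior, not repo code): the wrapped tensor comes back without grad history, while back-propagating through the unwrapped path populates the upstream gradient.

import torch

x = torch.randn(3, requires_grad=True)

wrapped = torch.autograd.Variable(x * 2)
print(wrapped.requires_grad)   # expected False: no graph reaches x from here

unwrapped = x * 2
unwrapped.sum().backward()
print(x.grad)                  # gradients arrive only on the unwrapped path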