Closed: elahehraisi closed this issue 7 years ago
Maybe the model is underfitting, or there's something wrong with the training procedure. We use GitHub issues only for bug reports and feature requests, not for general help. If you have any questions, please ask them on our forums; we can't help you debug your model here. There are lots of things that can make training unstable, from data loading to exploding/vanishing gradients and numerical instability.
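Since exploding/vanishing gradients come up here: one quick diagnostic (a sketch, not from this thread; `grad_norms` is a hypothetical helper) is to print per-parameter gradient norms after `backward()`:

```python
import torch
import torch.nn as nn

# Hypothetical diagnostic helper: after backward(), collect per-parameter
# gradient norms to spot exploding or vanishing gradients.
def grad_norms(model):
    return {name: p.grad.norm().item()
            for name, p in model.named_parameters() if p.grad is not None}

model = nn.Linear(4, 1)          # toy stand-in model
x = torch.randn(8, 4)
loss = model(x).pow(2).sum()
loss.backward()
print(grad_norms(model))         # tiny norms -> vanishing; huge norms -> exploding
```

Norms near zero suggest vanishing gradients; norms growing across iterations suggest exploding gradients.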
Hi, I have created a simple model consisting of two 1-layer neural networks competing with each other, so I have my own loss function based on those networks' outputs. It is very similar to a GAN. The problem is that even for a very simple test case, the loss is not decreasing. For now I am using a non-stochastic optimizer to eliminate randomness. Here is the pseudocode with explanation:

```python
n1_model = Net1(Dimension_in_n1, Dimension_out)  # 1-layer nn with sigmoid
n2_model = Net2(Dimension_in_n2, Dimension_out)  # 1-layer nn with sigmoid

n1_optimizer = torch.optim.LBFGS(n1_model.parameters(), lr=0.01, max_iter=50)
n2_optimizer = torch.optim.LBFGS(n2_model.parameters(), lr=0.01, max_iter=50)

for t in range(iter):
    x_n1 = Variable(torch.from_numpy(...))  # load input of nn1 in batch size
    x_n2 = Variable(torch.from_numpy(...))  # load input of nn2 in batch size
```
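One thing worth checking with this setup: `torch.optim.LBFGS.step()` must be passed a closure that re-evaluates the loss and calls `backward()`, because LBFGS re-evaluates the function during its line search. A minimal sketch of the pattern, with assumed shapes, plain `Linear` layers standing in for `Net1`/`Net2`, and a single optimizer over both networks (the two-optimizer version in the pseudocode would need one closure each):

```python
import torch

# Stand-ins for Net1/Net2 (shapes are assumptions for this sketch).
n1_model = torch.nn.Linear(3, 2)
n2_model = torch.nn.Linear(5, 2)
params = list(n1_model.parameters()) + list(n2_model.parameters())
optimizer = torch.optim.LBFGS(params, lr=0.01, max_iter=50)

x_n1 = torch.randn(10, 3)  # stand-in batches
x_n2 = torch.randn(10, 5)

def closure():
    # LBFGS calls this repeatedly; it must zero grads, recompute the loss,
    # and backpropagate before returning the loss.
    optimizer.zero_grad()
    out1 = torch.sigmoid(n1_model(x_n1))
    out2 = torch.sigmoid(n2_model(x_n2))
    loss = (out1 - out2).pow(2).sum()
    loss.backward()
    return loss

for t in range(5):
    loss = optimizer.step(closure)  # note: step() takes the closure
```

Calling `step()` without a closure (as one would with SGD or Adam) does not work for LBFGS.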
and here is the definition of my loss function:
```python
def my_loss_function(n1_output, n2_output, n1_param, n2_param):
    sm = torch.pow(n1_output - n2_output, 2)
    reg = torch.norm(n1_param, 2) + torch.norm(n2_param, 2)
    y = torch.sum(sm) + 1 * reg
    return y
```
When I plot the loss function, it oscillates; I expect it to decrease during training.
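Worth noting for this kind of setup: if the two networks are genuinely competing (one maximizing what the other minimizes, as in a GAN), a monotonically decreasing loss is not expected; oscillation is the normal signature of a min-max game. If instead both networks are meant to minimize the same objective jointly, the loss should trend down. A toy sketch of the joint case (shapes, seed, and plain SGD are assumptions for the sketch; the loss mirrors the squared-difference-plus-L2 form above):

```python
import torch

torch.manual_seed(0)

# Toy stand-ins for Net1/Net2: 1-layer nets with sigmoid, assumed shapes.
n1 = torch.nn.Sequential(torch.nn.Linear(3, 2), torch.nn.Sigmoid())
n2 = torch.nn.Sequential(torch.nn.Linear(5, 2), torch.nn.Sigmoid())
x1, x2 = torch.randn(16, 3), torch.randn(16, 5)

def my_loss(o1, o2, params):
    reg = sum(p.norm(2) for p in params)      # L2 penalty on all parameters
    return (o1 - o2).pow(2).sum() + 1.0 * reg

params = list(n1.parameters()) + list(n2.parameters())
opt = torch.optim.SGD(params, lr=0.05)

losses = []
for t in range(200):
    opt.zero_grad()
    loss = my_loss(n1(x1), n2(x2), params)
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(losses[0], losses[-1])  # joint minimization: loss should trend down
```

When both networks minimize the same objective, the curve should slope downward; if the curve still oscillates, that points at the adversarial structure (or the optimizer) rather than the loss itself.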