I created the following model:
```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

class Net(torch.nn.Module):
    def __init__(self, n_feature, n_output):
        super(Net, self).__init__()
        self.l1 = torch.nn.Linear(n_feature, n_hidden)
        self.l2 = torch.nn.Linear(n_hidden, n_hidden)
        self.l3 = torch.nn.Linear(n_hidden, n_output)

    def forward(self, x):
        h1 = self.l1(x)
        h2 = torch.sigmoid(h1)
        h3 = self.l2(h2)
        h4 = torch.sigmoid(h3)
        h5 = self.l3(h4)
        h6 = F.log_softmax(h5, dim=0)
        return h6

# n_hidden, cols, n_output, lr, and device are defined earlier
net = Net(cols, n_output).to(device)
optimizer = optim.Adam(net.parameters(), lr=lr)
criterion = nn.CrossEntropyLoss()
```
And I got the attached output. Would you please tell me why the LogSoftmaxBackward node appears to be duplicated?
`nn.CrossEntropyLoss` applies `log_softmax` internally, and your `forward` applies it again via `F.log_softmax`, so the operation shows up twice in the autograd graph. Return the raw logits from `forward` and let the loss handle the `log_softmax`, or keep it in `forward` and switch to `nn.NLLLoss`.
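Here is a minimal sketch of the first option, keeping your layer structure but with hypothetical sizes so it runs standalone (I also pass `n_hidden` as a constructor argument, since it is not defined inside the class). Note that if you do keep `log_softmax` in `forward`, you will also want `dim=1` for batched input of shape `(N, C)`, so the softmax runs over classes rather than over the batch:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super().__init__()
        self.l1 = nn.Linear(n_feature, n_hidden)
        self.l2 = nn.Linear(n_hidden, n_hidden)
        self.l3 = nn.Linear(n_hidden, n_output)

    def forward(self, x):
        h = torch.sigmoid(self.l1(x))
        h = torch.sigmoid(self.l2(h))
        return self.l3(h)  # raw logits; no log_softmax here

# hypothetical sizes and data, just to make the sketch runnable
net = Net(4, 16, 3)
criterion = nn.CrossEntropyLoss()  # applies log_softmax + NLLLoss internally
x = torch.randn(8, 4)
target = torch.randint(0, 3, (8,))
loss = criterion(net(x), target)
loss.backward()
```

With this version, LogSoftmaxBackward should appear only once in the graph.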