szagoruyko / pytorchviz

A small package to create visualizations of PyTorch execution graphs
MIT License

LogSoftmaxBackward node seems to be duplicated? #47

Closed · makaishi2 closed this issue 3 years ago

makaishi2 commented 3 years ago

I created the following model:

import torch
import torch.nn.functional as F
from torch import nn, optim

class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        self.l1 = torch.nn.Linear(n_feature, n_hidden)
        self.l2 = torch.nn.Linear(n_hidden, n_hidden)
        self.l3 = torch.nn.Linear(n_hidden, n_output)

    def forward(self, x):
        h1 = self.l1(x)
        h2 = torch.sigmoid(h1)
        h3 = self.l2(h2)
        h4 = torch.sigmoid(h3)
        h5 = self.l3(h4)
        h6 = F.log_softmax(h5, dim=0)
        return h6

net = Net(cols, n_hidden, n_output).to(device)
optimizer = optim.Adam(net.parameters(), lr=lr)
criterion = nn.CrossEntropyLoss()
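(For reference, a graph like the attached one can be produced with torchviz's make_dot on the loss; a minimal sketch, where the dummy batch and target tensors are illustrative assumptions rather than code from the actual script:)

from torchviz import make_dot

x = torch.randn(10, cols, device=device)                    # dummy batch: 10 samples, `cols` features
out = net(x)                                                # forward pass records the autograd graph
target = torch.zeros(10, dtype=torch.long, device=device)   # dummy class labels
loss = criterion(out, target)
make_dot(loss, params=dict(net.named_parameters()))         # returns a graphviz Digraph of the graph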

And I got the attached output. Would you please tell me why the LogSoftmaxBackward node appears to be duplicated?

[Screenshot 2021-01-18 21:42:33: the rendered execution graph, showing two LogSoftmaxBackward nodes]

szagoruyko commented 3 years ago

nn.CrossEntropyLoss applies log_softmax internally, and you apply it again in forward, so it appears twice in the graph.
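Two equivalent ways to avoid the duplication (a sketch, not from this thread):

# Option 1: return raw logits and let nn.CrossEntropyLoss apply
# log_softmax internally (it combines log_softmax and NLLLoss):
def forward(self, x):
    h = torch.sigmoid(self.l1(x))
    h = torch.sigmoid(self.l2(h))
    return self.l3(h)               # raw logits, no log_softmax here

criterion = nn.CrossEntropyLoss()

# Option 2: keep F.log_softmax in forward and pair it with nn.NLLLoss,
# which expects log-probabilities as its input:
criterion = nn.NLLLoss()

Both options compute the same loss, and with either one LogSoftmaxBackward appears only once in the graph.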