waleedka / hiddenlayer

Neural network graphs and training metrics for PyTorch, Tensorflow, and Keras.
MIT License

How exactly should the second parameter look? #88

Open povolann opened 3 years ago

povolann commented 3 years ago

Hello, I wanted to use hiddenlayer, but I am not sure about the second parameter (torch.zeros([1, 1, 512, 512]).to(device)): how exactly should it look? I think the last three numbers are the channels and the image size, but what exactly is the first number? So far I have implemented it like this:

import os
import torch
import hiddenlayer as hl
from torchsummary import summary  # assumed: summary() here comes from torchsummary, matching the output below

summary(net, (1, 512, 512))
# Build HiddenLayer graph
hl_graph = hl.build_graph(net, torch.zeros([1, 1, 512, 512]).to(device))
# Use a different color theme
hl_graph.theme = hl.graph.THEMES["blue"].copy()  # Two options: basic and blue
hl_graph.save(path=os.path.join(dirname, outputDir), format="png")
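
(For context on the shape: as far as I understand, hl.build_graph just runs the model with that tensor as a dummy input to trace it, so it follows PyTorch's usual NCHW layout and the first number is the batch size. A minimal sketch naming the dimensions, with the values from the snippet above; net and device are defined elsewhere in my code:

batch_size, channels, height, width = 1, 1, 512, 512  # NCHW: batch comes first
dummy_input = torch.zeros([batch_size, channels, height, width]).to(device)
hl_graph = hl.build_graph(net, dummy_input)
)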

But I'm getting this error:

Unsupported: ONNX export of Pad in opset 9. The sizes of the padding must be constant. Please try opset version 11.
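
(A hedged workaround sketch, not necessarily how hiddenlayer drives the export internally: one way to check whether the model exports cleanly once the opset is raised to 11, as the error suggests, is a plain torch.onnx.export call. The filename "unet_check.onnx" is just a placeholder; whether this also fixes the hiddenlayer graph build depends on how the library sets the opset internally.

import torch

# Standalone check: export the model at opset 11 with the same dummy input.
torch.onnx.export(
    net,
    torch.zeros([1, 1, 512, 512]).to(device),
    "unet_check.onnx",
    opset_version=11,
)
)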

The output from summary looks OK, though:


----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 512, 512]             640
       BatchNorm2d-2         [-1, 64, 512, 512]             128
              ReLU-3         [-1, 64, 512, 512]               0
            Conv2d-4         [-1, 64, 512, 512]          36,928
       BatchNorm2d-5         [-1, 64, 512, 512]             128
              ReLU-6         [-1, 64, 512, 512]               0
        DoubleConv-7         [-1, 64, 512, 512]               0
         MaxPool2d-8         [-1, 64, 256, 256]               0
            Conv2d-9        [-1, 128, 256, 256]          73,856
      BatchNorm2d-10        [-1, 128, 256, 256]             256
             ReLU-11        [-1, 128, 256, 256]               0
           Conv2d-12        [-1, 128, 256, 256]         147,584
      BatchNorm2d-13        [-1, 128, 256, 256]             256
             ReLU-14        [-1, 128, 256, 256]               0
       DoubleConv-15        [-1, 128, 256, 256]               0
             Down-16        [-1, 128, 256, 256]               0
        MaxPool2d-17        [-1, 128, 128, 128]               0
           Conv2d-18        [-1, 256, 128, 128]         295,168
      BatchNorm2d-19        [-1, 256, 128, 128]             512
             ReLU-20        [-1, 256, 128, 128]               0
           Conv2d-21        [-1, 256, 128, 128]         590,080
      BatchNorm2d-22        [-1, 256, 128, 128]             512
             ReLU-23        [-1, 256, 128, 128]               0
       DoubleConv-24        [-1, 256, 128, 128]               0
             Down-25        [-1, 256, 128, 128]               0
        MaxPool2d-26          [-1, 256, 64, 64]               0
           Conv2d-27          [-1, 512, 64, 64]       1,180,160
      BatchNorm2d-28          [-1, 512, 64, 64]           1,024
             ReLU-29          [-1, 512, 64, 64]               0
           Conv2d-30          [-1, 512, 64, 64]       2,359,808
      BatchNorm2d-31          [-1, 512, 64, 64]           1,024
             ReLU-32          [-1, 512, 64, 64]               0
       DoubleConv-33          [-1, 512, 64, 64]               0
             Down-34          [-1, 512, 64, 64]               0
        MaxPool2d-35          [-1, 512, 32, 32]               0
           Conv2d-36          [-1, 512, 32, 32]       2,359,808
      BatchNorm2d-37          [-1, 512, 32, 32]           1,024
             ReLU-38          [-1, 512, 32, 32]               0
           Conv2d-39          [-1, 512, 32, 32]       2,359,808
      BatchNorm2d-40          [-1, 512, 32, 32]           1,024
             ReLU-41          [-1, 512, 32, 32]               0
       DoubleConv-42          [-1, 512, 32, 32]               0
             Down-43          [-1, 512, 32, 32]               0
         Upsample-44          [-1, 512, 64, 64]               0
           Conv2d-45          [-1, 512, 64, 64]       4,719,104
      BatchNorm2d-46          [-1, 512, 64, 64]           1,024
             ReLU-47          [-1, 512, 64, 64]               0
           Conv2d-48          [-1, 256, 64, 64]       1,179,904
      BatchNorm2d-49          [-1, 256, 64, 64]             512
             ReLU-50          [-1, 256, 64, 64]               0
       DoubleConv-51          [-1, 256, 64, 64]               0
               Up-52          [-1, 256, 64, 64]               0
         Upsample-53        [-1, 256, 128, 128]               0
           Conv2d-54        [-1, 256, 128, 128]       1,179,904
      BatchNorm2d-55        [-1, 256, 128, 128]             512
             ReLU-56        [-1, 256, 128, 128]               0
           Conv2d-57        [-1, 128, 128, 128]         295,040
      BatchNorm2d-58        [-1, 128, 128, 128]             256
             ReLU-59        [-1, 128, 128, 128]               0
       DoubleConv-60        [-1, 128, 128, 128]               0
               Up-61        [-1, 128, 128, 128]               0
         Upsample-62        [-1, 128, 256, 256]               0
           Conv2d-63        [-1, 128, 256, 256]         295,040
      BatchNorm2d-64        [-1, 128, 256, 256]             256
             ReLU-65        [-1, 128, 256, 256]               0
           Conv2d-66         [-1, 64, 256, 256]          73,792
      BatchNorm2d-67         [-1, 64, 256, 256]             128
             ReLU-68         [-1, 64, 256, 256]               0
       DoubleConv-69         [-1, 64, 256, 256]               0
               Up-70         [-1, 64, 256, 256]               0
         Upsample-71         [-1, 64, 512, 512]               0
           Conv2d-72         [-1, 64, 512, 512]          73,792
      BatchNorm2d-73         [-1, 64, 512, 512]             128
             ReLU-74         [-1, 64, 512, 512]               0
           Conv2d-75         [-1, 64, 512, 512]          36,928
      BatchNorm2d-76         [-1, 64, 512, 512]             128
             ReLU-77         [-1, 64, 512, 512]               0
       DoubleConv-78         [-1, 64, 512, 512]               0
               Up-79         [-1, 64, 512, 512]               0
           Conv2d-80          [-1, 1, 512, 512]              65
          OutConv-81          [-1, 1, 512, 512]               0
================================================================
Total params: 17,266,241
Trainable params: 17,266,241
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 1.00
Forward/backward pass size (MB): 3768.00
Params size (MB): 65.87
Estimated Total Size (MB): 3834.87
----------------------------------------------------------------
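
(A quick arithmetic sanity check on those totals, my own numbers rather than part of the tool's output: 17,266,241 float32 parameters at 4 bytes each come to about 65.87 MB, and a 1×1×512×512 float32 input is exactly 1.00 MB, which matches the summary.

params = 17_266_241
print(params * 4 / 2**20)              # ~65.87 MB of float32 parameters
print(1 * 1 * 512 * 512 * 4 / 2**20)   # 1.00 MB for the float32 input
)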