cedrickchee / capsule-net-pytorch

[NO MAINTENANCE INTENDED] A PyTorch implementation of CapsNet architecture in the NIPS 2017 paper "Dynamic Routing Between Capsules".

Activation functions on FC decoder layers #12

Closed. ygorcanalli closed this issue 6 years ago.

ygorcanalli commented 6 years ago

First, thanks for this very good code!

The original paper proposes a decoder with three fully connected layers:

FC+ReLU -> FC+ReLU -> FC+Sigmoid.

But your decoder code seems to do FC -> FC -> FC -> ReLU -> Sigmoid:

self.fc1 = nn.Linear(num_classes * output_unit_size, fc1_output_size) # input dim 10 * 16.
self.fc2 = nn.Linear(fc1_output_size, fc2_output_size)
self.fc3 = nn.Linear(fc2_output_size, self.fc3_output_size)
# Activation functions
self.relu = nn.ReLU(inplace=True)
self.sigmoid = nn.Sigmoid()

Wouldn't it be correct this way?

self.fc1 = nn.Linear(num_classes * output_unit_size, fc1_output_size) # input dim 10 * 16.
self.relu = nn.ReLU(inplace=True)
self.fc2 = nn.Linear(fc1_output_size, fc2_output_size)
self.relu = nn.ReLU(inplace=True)
self.fc3 = nn.Linear(fc2_output_size, self.fc3_output_size)
self.sigmoid = nn.Sigmoid()
ygorcanalli commented 6 years ago

Sorry, dumb comment. I just realized the activations are applied in the forward pass, so the order in which the modules are defined in __init__ doesn't matter.

cedrickchee commented 6 years ago

No worries 🙂
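For reference, here is a minimal sketch of how the decoder composes FC+ReLU -> FC+ReLU -> FC+Sigmoid inside forward(). The layer sizes (512, 1024, 784) follow the paper's MNIST reconstruction setup and are assumptions for illustration, not necessarily this repo's exact values:

import torch
import torch.nn as nn

class Decoder(nn.Module):
    """Reconstruction decoder: FC+ReLU -> FC+ReLU -> FC+Sigmoid.

    Sizes (512, 1024, 784) are the paper's MNIST values, used here
    as illustrative assumptions.
    """

    def __init__(self, num_classes=10, output_unit_size=16,
                 fc1_output_size=512, fc2_output_size=1024,
                 fc3_output_size=784):
        super().__init__()
        self.fc1 = nn.Linear(num_classes * output_unit_size, fc1_output_size)
        self.fc2 = nn.Linear(fc1_output_size, fc2_output_size)
        self.fc3 = nn.Linear(fc2_output_size, fc3_output_size)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        # The activation order lives here, not in __init__:
        x = self.relu(self.fc1(x))        # FC + ReLU
        x = self.relu(self.fc2(x))        # FC + ReLU
        return self.sigmoid(self.fc3(x))  # FC + Sigmoid

decoder = Decoder()
reconstruction = decoder(torch.randn(2, 10 * 16))  # shape (2, 784), values in (0, 1)

Note that a single nn.ReLU holds no parameters, so it can be reused after both fc1 and fc2; assigning self.relu twice, as in the snippet proposed above, just rebinds the same attribute and has no effect on the network.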