MrGiovanni / ModelsGenesis

[MICCAI 2019 Young Scientist Award] [MEDIA 2020 Best Paper Award] Models Genesis

bug? #27

Closed linchundan88 closed 4 years ago

linchundan88 commented 4 years ago

In the demo under "ModelsGenesis/tree/master/pytorch":

self.out_glb_avg_pool = F.avg_pool3d(self.base_out, kernel_size=self.base_out.size()[2:]).view(self.base_out.size( [0],-1)

view(self.base_out.size( [0],-1) ???

linchundan88 commented 4 years ago

keras code:

x = GlobalAveragePooling3D()(x)
x = Dense(1024, activation='relu')(x)

Why does the pytorch version use two dense layers?

self.dense_1 = nn.Linear(512, 1024, bias=True)
self.dense_2 = nn.Linear(1024, n_class, bias=True)

linchundan88 commented 4 years ago

view(self.base_out.size( [0],-1) should be view(self.base_out.size(0), -1)?
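A minimal sketch of the suggested fix, assuming a hypothetical encoder output of shape (batch, channels, D, H, W) with 512 channels: `avg_pool3d` with the spatial dims as the kernel performs global average pooling, and the corrected `view(x.size(0), -1)` flattens the result to (batch, channels).

```python
import torch
import torch.nn.functional as F

# Hypothetical feature map standing in for self.base_out:
# (batch=2, channels=512, D=4, H=6, W=6).
base_out = torch.randn(2, 512, 4, 6, 6)

# Global average pooling: kernel covers the full spatial extent.
pooled = F.avg_pool3d(base_out, kernel_size=base_out.size()[2:])

# Corrected flatten: size(0) is the batch dimension, -1 collapses the rest.
flat = pooled.view(base_out.size(0), -1)

print(flat.shape)  # torch.Size([2, 512])
```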

MRJasonP commented 4 years ago

> keras code:
>
> x = GlobalAveragePooling3D()(x)
> x = Dense(1024, activation='relu')(x)
>
> Why does the pytorch version use two dense layers?
>
> self.dense_1 = nn.Linear(512, 1024, bias=True)
> self.dense_2 = nn.Linear(1024, n_class, bias=True)

The Keras code also uses two dense layers between the output of the GlobalAveragePooling3D layer and the final output:

x = GlobalAveragePooling3D()(x)
x = Dense(1024, activation='relu')(x)
output = Dense(num_class, activation=activate)(x)
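To see the correspondence, here is a hedged sketch of the equivalent PyTorch head: global average pooling, then the same two linear layers quoted in the thread. The input shape and `n_class = 2` are illustrative assumptions, not values from the repository.

```python
import torch
import torch.nn as nn

n_class = 2  # hypothetical number of output classes

# The two dense layers quoted in the thread, mirroring
# Dense(1024, relu) -> Dense(n_class) in the Keras model.
dense_1 = nn.Linear(512, 1024, bias=True)
dense_2 = nn.Linear(1024, n_class, bias=True)

# Assumed encoder output: (batch=2, channels=512, D=4, H=6, W=6).
features = torch.randn(2, 512, 4, 6, 6)

pooled = features.mean(dim=(2, 3, 4))   # GlobalAveragePooling3D
hidden = torch.relu(dense_1(pooled))    # Dense(1024, activation='relu')
logits = dense_2(hidden)                # Dense(n_class)

print(logits.shape)  # torch.Size([2, 2])
```

So the PyTorch and Keras heads have the same structure; only the pooling-plus-flatten step is written out explicitly in PyTorch.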

> view(self.base_out.size( [0],-1) should be view(self.base_out.size(0), -1)?

It's actually missing the other half of the parenthesis. It should be .view(self.classification_feature.size()[0], -1). Thank you for pointing it out; it's been fixed now.

linchundan88 commented 4 years ago

Yes, I overlooked the second dense layer in your Keras sample code.