dpujol04 opened 2 years ago
Hello!
While checking the PyTorch code of DenseDepth, I found that line 17 of the `forward` method of the `UpSample` class appears to be incorrect: https://github.com/ialhashim/DenseDepth/blob/master/PyTorch/model.py#L17
The line of code is:

```python
return self.leakyreluB( self.convB( self.convA( torch.cat([up_x, concat_with], dim=1) ) ) )
```

But it should be:

```python
return self.leakyreluB( self.convB( self.leakyreluA( self.convA( torch.cat([up_x, concat_with], dim=1) ) ) ) )
```
The first leaky ReLU (`self.leakyreluA`) is defined but never applied, although it seems like it should be. Is this a bug, or is it intentional? And if intentional, why?
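To make the difference concrete, here is a minimal sketch of such a decoder upsampling block (channel sizes, kernel sizes, and the LeakyReLU slope are hypothetical; only the layer names follow the snippet above). Without `leakyreluA`, `convA` and `convB` compose into a single linear map before the one nonlinearity, so the block loses a layer of nonlinearity:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UpSample(nn.Module):
    """Sketch of a decoder upsampling block with a skip connection.
    Hypothetical hyperparameters; layer names match the issue's snippet."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.convA = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.leakyreluA = nn.LeakyReLU(0.2)  # the activation the issue says is skipped
        self.convB = nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)
        self.leakyreluB = nn.LeakyReLU(0.2)

    def forward(self, x, concat_with):
        # Upsample to the skip feature map's spatial size, then concatenate
        # along the channel dimension.
        up_x = F.interpolate(x, size=concat_with.shape[2:],
                             mode='bilinear', align_corners=True)
        cat = torch.cat([up_x, concat_with], dim=1)
        # Repository version: leakyreluA is never applied, so convA and convB
        # stack as two linear convolutions with a single nonlinearity at the end:
        # return self.leakyreluB(self.convB(self.convA(cat)))
        # Proposed version: an activation after each convolution.
        return self.leakyreluB(self.convB(self.leakyreluA(self.convA(cat))))
```

For example, with an 8-channel input and an 8-channel skip map, the block would take `in_channels=16` and produce a feature map at the skip connection's resolution.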
Thank you, David