octree-nn / ocnn-pytorch

Octree-based 3D Convolutional Neural Networks
MIT License

HRNet Overfitting and Dropout #17

Closed harryseely closed 1 year ago

harryseely commented 1 year ago

Hello,

I noticed that the OCNN HRNet is more prone to overfitting than the LeNet implementation, as can be seen in the attached learning curve. I am assuming this is partly due to the lack of dropout layers in the network. The dropout section of the HRNet code is commented out in the new release:

```python
# self.header = torch.nn.Sequential(
#     ocnn.modules.FcBnRelu(512, 256),
#     torch.nn.Dropout(p=0.5),
#     torch.nn.Linear(256, out_channels))
```

Should I uncomment this code to enable dropout? Why is it commented out?

Thanks!

(attached: learning curve)

wang-ps commented 1 year ago

Yes, HRNet has many more parameters than LeNet, so it is more prone to overfitting. Dropout could be added to alleviate the overfitting issue.

By the way, judging from the error curve, I think it is better to increase the number of training epochs. On ModelNet40, the network is trained for 200 epochs.
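The commented-out header above can be re-enabled roughly as follows. This is a minimal sketch, not the library's exact code: `ocnn.modules.FcBnRelu` is replaced here by its plain `torch.nn` equivalents (Linear + BatchNorm + ReLU) so the snippet runs without `ocnn` installed, and `out_channels = 40` is an assumption matching ModelNet40's class count.

```python
import torch

out_channels = 40  # assumption: ModelNet40 has 40 classes; adjust to your dataset

# Classification header with dropout re-enabled, mirroring the
# commented-out lines (FcBnRelu expanded into plain torch modules).
header = torch.nn.Sequential(
    torch.nn.Linear(512, 256),
    torch.nn.BatchNorm1d(256),
    torch.nn.ReLU(inplace=True),
    torch.nn.Dropout(p=0.5),  # active in train(), identity in eval()
    torch.nn.Linear(256, out_channels))

x = torch.randn(8, 512)       # dummy batch of pooled octree features
header.train()
logits = header(x)
print(logits.shape)           # expect a (8, 40) tensor of class logits
```

Note that `torch.nn.Dropout` only randomizes activations in training mode; calling `header.eval()` at validation time disables it automatically, so adding it costs nothing at inference.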

harryseely commented 1 year ago

Ok great, I will add it. I was just wondering if there was a specific reason it was commented out?

wang-ps commented 1 year ago

In my experiments, I found that dropout did not improve the validation accuracy for HRNet.