Closed harryseely closed 1 year ago
Yes, HRNet has far more parameters than LeNet, so it is more prone to overfitting. Dropout could be added to alleviate the overfitting issue.
By the way, judging from the error curve, I think it would be better to increase the number of training epochs. On ModelNet40, the network is trained for 200 epochs.
Ok great, I will add it. I was just wondering if there was a specific reason it was commented out?
In my experiments, I found that dropout did not improve the validation accuracy for HRNet.
Hello,
I noticed that the OCNN HRNet is more prone to overfitting than the LeNet implementation, as can be seen in the attached learning curve. I am assuming this is partly due to the lack of dropout layers in the network. The dropout section of code in HRNet is commented out in the new release:
```python
# self.header = torch.nn.Sequential(
#   ocnn.modules.FcBnRelu(512, 256),
```
Should I uncomment this code to initiate dropout? Why is it commented out?
Thanks!
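For reference, a minimal runnable sketch of what a restored header with dropout might look like. This is an assumption-laden approximation, not the actual ocnn code: `FcBnRelu` is approximated here with `Linear + BatchNorm1d + ReLU`, and the dropout rate `p=0.5` and output size `nout=40` (ModelNet40 classes) are guesses.

```python
import torch

def make_header(in_channels=512, hidden=256, nout=40):
    """Hypothetical classification header with dropout re-enabled.

    FcBnRelu is approximated by Linear + BatchNorm1d + ReLU;
    the dropout rate and layer sizes are assumptions, not the
    actual ocnn HRNet configuration.
    """
    return torch.nn.Sequential(
        torch.nn.Linear(in_channels, hidden),
        torch.nn.BatchNorm1d(hidden),
        torch.nn.ReLU(inplace=True),
        torch.nn.Dropout(p=0.5),          # active only in train mode
        torch.nn.Linear(hidden, nout))

header = make_header()
x = torch.randn(8, 512)                   # batch of 8 pooled features
y = header(x)
print(y.shape)                            # torch.Size([8, 40])
```

Note that `torch.nn.Dropout` is automatically disabled when the model is switched to `eval()` mode, so it only affects training, not validation or inference.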