thangvubk / SoftGroup

[CVPR 2022 Oral] SoftGroup for Instance Segmentation on 3D Point Clouds
MIT License

BatchNorm not training #196

Open LeonardoHoltz opened 7 months ago

LeonardoHoltz commented 7 months ago

I understand the customization of fixed modules defined in the configuration files and its purpose. But I don't understand why the nn.BatchNorm1d modules are always kept in evaluation mode, especially in the train() function in softgroup.py. Since they are never trained, their scale and shift parameters stay at the default values of 1 and 0. Is there a reason why they are not trained by default? If I remove the eval() restriction on these modules, what kind of changes could it cause to the segmentation results?
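For context, the pattern being asked about can be sketched like this (a minimal, hypothetical example, not the actual SoftGroup code): a model overrides `train()` so that after the usual recursive mode switch, all BatchNorm layers are forced back into eval mode. In eval mode a BatchNorm layer normalizes with its stored running statistics instead of the current batch statistics, and its running mean/variance are no longer updated; combined with frozen parameters, its affine scale and shift never move from their initial 1 and 0.

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """Toy model illustrating the BatchNorm-freezing pattern (not SoftGroup itself)."""

    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(8, 8)
        self.bn = nn.BatchNorm1d(8)

    def train(self, mode=True):
        # The standard train() flips every submodule into training mode...
        super().train(mode)
        # ...then BatchNorm layers are forced back to eval mode, so they
        # keep using their running statistics and stop updating them.
        for mod in self.modules():
            if isinstance(mod, nn.BatchNorm1d):
                mod.eval()
        return self

net = TinyNet().train()
print(net.linear.training, net.bn.training)  # the Linear trains, the BatchNorm does not
```

This is commonly done when fine-tuning from a pretrained checkpoint with small batch sizes: noisy per-batch statistics can destabilize training, so the pretrained running statistics are kept fixed. Removing the `eval()` override would make BatchNorm recompute statistics from each batch and update its running buffers, which can shift the feature distributions the pretrained weights expect.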