haosulab / ManiSkill2-Learn

Apache License 2.0

Pointnet++ integration #24

Open ErikKrauter opened 7 months ago

ErikKrauter commented 7 months ago

The `__init__.py` file of the "modules" package tries to import the pn2_modules file. However, that script does not exist. Are there implementations of Pointnet++ for ManiSkill2-Learn available?

If there are none available, I would like to ask for some guidance on how to correctly integrate the Pointnet++ architecture into the ManiSkill2-Learn framework.

I have implemented a Pointnet++ module based on this repository. However, I am not sure how to integrate it into the ManiSkill2-Learn framework so that it correctly parallelizes over multiple GPUs using DDP. All I did was create a thin wrapper around the original implementation from the repository mentioned above. The wrapper inherits from ExtendedModule:

# Import paths for BACKBONES and ExtendedModule are assumed from where
# ManiSkill2-Learn defines its registry and module utilities.
from maniskill2_learn.networks.builder import BACKBONES
from maniskill2_learn.utils.torch import ExtendedModule
from pointnet2.models.pointnet2_ssg_cls import PointNet2ClassificationSSG

@BACKBONES.register_module(name='PointNet2')
class PointNet2(ExtendedModule):
    def __init__(self, hparams):
        super().__init__()
        print("CONSTRUCTING POINTNET++")
        print(hparams)
        # Construct the class that was actually imported (the original
        # snippet imported PointNet2ClassificationSSG but instantiated
        # PointNet2SemSegSSG).
        self.model = PointNet2ClassificationSSG(hparams)

    def forward(self, pointcloud):
        return self.model(pointcloud)

During the construction of the RL agent, the Pointnet++ backbone is built through the BACKBONES registry. The underlying network components (MLPs, ConvMLPs, activation functions, etc.) are not built through the registry and do not inherit from ExtendedModule. I am not sure whether this approach is compatible with how ManiSkill2-Learn's multi-GPU parallelization works.
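For context, the registry pattern described above works roughly like the following minimal sketch (the class and method names here are illustrative, not ManiSkill2-Learn's actual implementation, which lives in its own utility modules):

```python
# Minimal sketch of a config-driven registry, assuming the common
# register-by-decorator / build-from-dict pattern.
class Registry:
    def __init__(self, name):
        self.name = name
        self._modules = {}

    def register_module(self, name=None):
        # Decorator that records a class under a lookup key.
        def _register(cls):
            self._modules[name or cls.__name__] = cls
            return cls
        return _register

    def build(self, cfg):
        # Pop the 'type' key and forward the rest as constructor kwargs.
        cfg = dict(cfg)
        cls = self._modules[cfg.pop("type")]
        return cls(**cfg)


BACKBONES = Registry("backbone")


@BACKBONES.register_module(name="PointNet2")
class PointNet2:
    def __init__(self, hparams):
        self.hparams = hparams


# The agent builder would construct the backbone from a config dict:
backbone = BACKBONES.build({"type": "PointNet2", "hparams": {"use_xyz": True}})
print(type(backbone).__name__)  # PointNet2
```

Only the top-level wrapper needs to be registered; the registry just resolves the config's `type` field to a class, so the submodules it constructs internally never pass through the registry.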

xuanlinli17 commented 7 months ago

I believe it's fine to do so. Each GPU receives its own copy of the agent network. We already handle, e.g., BatchNorm to SyncBatchNorm conversion, as long as the module inherits from the BatchNorm class.