machengcheng2016 closed this issue 4 years ago
Hi @machengcheng2016, yes, you are right; this is indeed a mistake.
Two negatives make a positive, I guess? The code still runs correctly...
Yes, because for the Chamfer loss the roles of the two point clouds can be exchanged, so swapping the arguments still gives the correct result.
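For reference, here is a minimal sketch of a symmetric Chamfer loss that illustrates why the swap is harmless. It assumes the loss sums both directional terms, and the `batch_pairwise_dist` below is a stand-in built on `torch.cdist`, not necessarily this repo's exact implementation:

```python
import torch
import torch.nn as nn

class ChamferLoss(nn.Module):
    # Sketch only: assumes squared-Euclidean pairwise distances and that
    # both directional terms are summed, which makes the loss symmetric.
    def batch_pairwise_dist(self, x, y):
        # x: (B, N, 3), y: (B, M, 3) -> squared distances of shape (B, N, M)
        return torch.cdist(x, y, p=2) ** 2

    def forward(self, preds, gts):
        P = self.batch_pairwise_dist(gts, preds)   # rows: gt points, cols: pred points
        gt_to_pred = P.min(dim=2).values.mean()    # each gt point to its nearest prediction
        pred_to_gt = P.min(dim=1).values.mean()    # each prediction to its nearest gt point
        return gt_to_pred + pred_to_gt

# Swapping the arguments gives the same result, since both directions are summed:
torch.manual_seed(0)
a, b = torch.rand(2, 128, 3), torch.rand(2, 256, 3)
loss = ChamferLoss()
print(loss(a, b).item(), loss(b, a).item())  # the two values match (up to floating-point rounding)
```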
ChamferLoss is defined as:

    class ChamferLoss(nn.Module):
        def forward(self, preds, gts):
            P = self.batch_pairwise_dist(gts, preds)

But when using ChamferLoss, the calling script is:

    output = self.model(pts)
    loss = self.model.get_loss(pts, output)

which seems to pass the two point clouds in the opposite order.