Open forvd opened 4 years ago
Is it possible that the entire batch is empty? This would happen if the input points don't fall in the cube [0, self.spatialSize-1]^3.
Can you print out the input that generates the error, please?
my input to self.sparseModel is [coords, features], with shapes torch.Size([12400, 3]) and torch.Size([12400, 4]):

```
[tensor([[ 12, 120, 223],
         [ 12, 120, 221],
         [ 12, 120, 220],
         ...,
         [  4, 111, 124],
         [  5, 115, 125],
         [  5, 116, 124]], device='cuda:0', dtype=torch.int32),
 tensor([[30.9434,  0.0519,  1.2550,  0.2350],
         [30.4116,  0.1957,  1.2379,  0.2700],
         [30.2993,  0.2905,  1.2339,  0.1400],
         ...,
         [ 1.3888, -2.4004, -1.1622,  0.0000],
         [ 1.6063, -1.2806, -0.8258,  0.0000],
         [ 1.4836, -1.0410, -0.7130,  0.0000]], device='cuda:0')]
```
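To see whether any of these points fall outside the valid cube, a minimal bounds check helps. This is a numpy sketch with hypothetical values (`spatial_size` stands in for `self.spatialSize`, and the coordinates are made up for illustration); the same logic applies directly to the torch tensors above:

```python
import numpy as np

# Hypothetical values standing in for the real data above.
spatial_size = 256                       # assumed placeholder for self.spatialSize
coords = np.array([[12, 120, 223],
                   [4, 111, 124],
                   [5, 300, 124]])       # last row deliberately out of range

# A point is valid only if every coordinate lies in [0, spatial_size - 1].
in_bounds = ((coords >= 0) & (coords <= spatial_size - 1)).all(axis=1)
print(in_bounds)          # the out-of-range row is flagged False
print(coords[in_bounds])  # keep only the valid points (filter features the same way)
```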
I had the same error, and @btgraham's comment works for me. After arranging the coords within [0, spatialSize]^3, loss.backward() works OK. It would be even better if there were some instructions/explanations on this spatialSize.
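For reference, "arranging the coords" can be as simple as shifting them to start at zero and clipping anything left over into the cube. A minimal numpy sketch under assumed values (`spatial_size` is a hypothetical stand-in for the InputLayer's spatialSize):

```python
import numpy as np

spatial_size = 256                               # assumed: the InputLayer's spatialSize
coords = np.array([[-3, 120, 223],
                   [4, 111, 300],
                   [5, 115, 125]])               # made-up coords, some out of range

coords = coords - coords.min(axis=0)             # shift so every axis starts at 0
coords = np.clip(coords, 0, spatial_size - 1)    # clamp anything still out of range
print(coords)                                    # now entirely inside [0, spatial_size - 1]^3
```

Whether shifting/clipping or filtering is appropriate depends on the data; clipping moves stray points onto the cube boundary rather than discarding them.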
I've tried to use SparseConv for point cloud segmentation and got some errors. After some steps, I got this weird error. I use spconv.VoxelGeneratorV2 to generate the voxels and coors. Here is my code:
```python
class Network(nn.Module):
    def __init__(self, num_classes=20, use_norm=True, cfg=None, use_xyz=True):
        super().__init__()
        self.backbone = SCN_UNet(cfg)
        self.fc = nn.Sequential(
            pt_utils.Conv1d(32, 128, bn=True, bias=False),
            nn.Dropout(0.5),
            pt_utils.Conv1d(128, num_classes, activation=None))


class SCN_UNet(nn.Module):
    def __init__(self, config):
        nn.Module.__init__(self)
        self.dimension = config.Backbone.dimension
        self.spatialSize = config.Backbone.spatialSize
        self.numFeatures = config.Backbone.numFeatures
        self.reps = config.Backbone.reps
        self.nPlanes = config.Backbone.nPlanes
        self.sparseModel = scn.Sequential().add(
            scn.InputLayer(self.dimension, self.spatialSize, mode=3)).add(
            scn.SubmanifoldConvolution(self.dimension, 4, self.numFeatures, 3, False)).add(
            scn.UNet(self.dimension, self.reps, self.nPlanes,
                     residual_blocks=False, downsample=[2, 2])).add(
            scn.BatchNormReLU(self.numFeatures)).add(
            scn.OutputLayer(self.dimension))
        self.linear = nn.Linear(config.Backbone.numFeatures, 20)
```
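Given the discussion above, one thing worth checking is that `config.Backbone.spatialSize` actually covers the coordinate range the voxel generator produces: the InputLayer drops points outside [0, spatialSize - 1]^3, so spatialSize must exceed the largest coordinate on every axis. A quick numpy sanity-check sketch (all coordinate values hypothetical):

```python
import numpy as np

# Hypothetical voxel coordinates, like those printed earlier in this thread.
coords = np.array([[12, 120, 223],
                   [4, 111, 124],
                   [5, 116, 124]])

# spatialSize must be at least max coordinate + 1 on every axis,
# otherwise some points are silently dropped and a batch can end up empty.
required = int(coords.max()) + 1
print("minimum spatialSize:", required)
```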