Someone1Wu opened this issue 2 years ago
The main idea here is to change the `self.stage4` module to the PointLSTM module. You can compare their usage here [L66-L74 vs. L76-L82] to implement this.
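As a rough, hypothetical sketch of that structural difference (stand-in modules, not the repo's actual code): a conv-based stage consumes `(B, C, T, N)` tensors directly, while an LSTM-style stage needs the time axis exposed and returns a tuple that has to be unpacked at the call site, which is why the replacement involves a `permute` and indexing into the output.

```python
import torch
import torch.nn as nn

class ConvStage(nn.Module):
    """Stand-in for a conv-based stage operating on (B, C, T, N)."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.net = nn.Conv2d(c_in, c_out, 1)

    def forward(self, x):
        return self.net(x)  # (B, C_out, T, N)

class LSTMStage(nn.Module):
    """Stand-in for an LSTM-style stage: expects (B, T, C, N) and
    returns (outputs, state), so the call site must unpack a tuple."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.cell = nn.LSTM(c_in, c_out, batch_first=True)

    def forward(self, x):
        b, t, c, n = x.shape
        # fold the point axis into the batch so the LSTM runs over time
        seq = x.permute(0, 3, 1, 2).reshape(b * n, t, c)
        out, state = self.cell(seq)
        out = out.reshape(b, n, t, -1).permute(0, 2, 3, 1)  # (B, T, C_out, N)
        return out, state

x = torch.randn(2, 16, 8, 32)                    # (B, C, T, N)
fea_conv = ConvStage(16, 24)(x)                  # conv stage: call directly
fea_lstm, _ = LSTMStage(16, 24)(x.permute(0, 2, 1, 3))  # lstm stage: permute + unpack
fea_lstm = fea_lstm.permute(0, 2, 1, 3)          # back to (B, C, T, N)
```

Both paths end with the same `(2, 24, 8, 32)` shape; the difference is only in how the time axis and the returned tuple are handled.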
First of all, thank you for your reply. I tried to change the code of stage 3 and stage 4 to the following:
```python
in_dims = fea2.shape[1] * 2 - 4
pts_num //= self.downsample[1]
ret_group_array3 = self.group.st_group_points(fea2, 3, [0, 1, 2], self.knn[2], 3)
ret_array3, inputs, ind = self.select_ind(ret_group_array3, inputs,
                                          batchsize, in_dims, timestep, pts_num)
fea3 = self.pool3(self.stage3(ret_array3)).view(batchsize, -1, timestep, pts_num)
fea3 = torch.cat((inputs, fea3), dim=1)

# stage 4: inter-frame, late
in_dims = fea3.shape[1] * 2 - 4
pts_num //= self.downsample[2]
output = self.lstm(fea3.permute(0, 2, 1, 3))  # replaced self.stage4 with PointLSTM
fea4 = output[0][0].squeeze(-1).permute(0, 2, 1, 3)
ret_group_array4 = self.group.st_group_points(fea3, 3, [0, 1, 2], self.knn[3], 3)
ret_array4, inputs, ind = self.select_ind(ret_group_array4, inputs,
                                          batchsize, in_dims, timestep, pts_num)
fea4 = fea4.gather(-1, ind.unsqueeze(1).expand(-1, fea4.shape[1], -1, -1))
```
Unfortunately, it raised the following error:

```
RuntimeError: Sizes of tensors must match except in dimension 2. Got 64 and 32 (The offending index is 0)
```
I don't know where it went wrong.
You can check whether the number of points and the channel dimension are consistent.
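To illustrate that check: `torch.cat` requires every dimension except the concatenation axis to match, so a point count that was halved by a downsample step while the other tensor kept its original count produces exactly this kind of error. The shapes below are hypothetical, chosen only to reproduce a 64-vs-32 mismatch like the one reported.

```python
import torch

# torch.cat requires all dims except the concat dim to agree.
a = torch.zeros(2, 4, 8, 64)        # (batch, channels, timestep, pts_num)
b = torch.zeros(2, 6, 8, 32)        # pts_num already halved by a downsample step

def check_cat(x, y, dim=1):
    """Report the first axis (other than `dim`) where the shapes disagree."""
    for d in range(x.dim()):
        if d != dim and x.shape[d] != y.shape[d]:
            raise ValueError(f"dim {d}: {x.shape[d]} vs {y.shape[d]}")

try:
    check_cat(a, b, dim=1)          # fails on the point axis (64 vs 32)
except ValueError as e:
    print("mismatch:", e)

b_fixed = torch.zeros(2, 6, 8, 64)  # matching point count
check_cat(a, b_fixed, dim=1)        # passes
fused = torch.cat((a, b_fixed), dim=1)
print(fused.shape)                  # torch.Size([2, 10, 8, 64])
```

Printing the shapes of `inputs`, `fea3`, and `fea4` right before each `torch.cat` and `gather` call is usually the fastest way to find which downsample or pooling step left the point counts out of sync.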
How should the code be modified if the PointLSTM layer is deployed in stage 4?