Hi dMaSIF team and all developers, I ran into the following problem while reproducing the code. During preprocessing of the training dataset, it appears that all surface points are deleted in the step that removes points trapped inside the protein, which leaves an empty tensor further downstream. Looking forward to your replies!
Preprocessing training dataset
0%| | 0/2958 [00:00<?, ?it/s]
Traceback (most recent call last):
File "main_training.py", line 55, in <module>
train_dataset = iterate_surface_precompute(train_loader, net, args)
File "/hy-tmp/dMaSIF-master/data_iteration.py", line 427, in iterate_surface_precompute
P1, P2 = process(args, protein_pair, net)
File "/hy-tmp/dMaSIF-master/data_iteration.py", line 126, in process
net.preprocess_surface(P1)
File "/hy-tmp/dMaSIF-master/model.py", line 457, in preprocess_surface
P["xyz"], P["normals"], P["batch"] = atoms_to_points_normals(
File "/hy-tmp/dMaSIF-master/geometry_processing.py", line 305, in atoms_to_points_normals
points, batch_points = subsample(z, batch_z, scale=resolution)
File "/hy-tmp/dMaSIF-master/geometry_processing.py", line 126, in subsample
batch_size = torch.max(batch).item() + 1 # Typically, =32
RuntimeError: operation does not have an identity.
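For what it's worth, the RuntimeError itself seems to come from calling torch.max on an empty batch tensor: if every candidate point is discarded by the interior-point cleaning step, subsample() receives zero points and the max reduction has no identity element. Below is a minimal sketch based only on the traceback above; the assert_nonempty helper is hypothetical and not part of dMaSIF.

```python
import torch

# Minimal reproduction of the failure mode: reducing an empty tensor with
# torch.max() raises a RuntimeError because the reduction has no identity
# element (the exact wording depends on the PyTorch version).
empty_batch = torch.empty(0, dtype=torch.int64)
try:
    batch_size = torch.max(empty_batch).item() + 1  # mirrors geometry_processing.py line 126
except RuntimeError as err:
    print(f"RuntimeError: {err}")

# Hypothetical guard (not part of dMaSIF): fail with a clearer message when a
# structure loses all of its surface points before subsample() is called.
def assert_nonempty(batch: torch.Tensor, name: str = "<unknown structure>") -> None:
    if batch.numel() == 0:
        raise ValueError(
            f"All candidate surface points were removed for {name}; "
            "inspect the atom coordinates or the interior-point cleaning step."
        )
```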