DeformableFriends / NeuralTracking

Official implementation for the NeurIPS 2020 paper Neural Non-Rigid Tracking.
https://www.niessnerlab.org/projects/bozic2020nnrt.html
MIT License

Is there source code for generating the graphs? #6

Closed: Algomorph closed this issue 3 years ago

Algomorph commented 3 years ago

Hi, gods of 3D reconstruction!

I send good wishes and gratitude from your fans at UMD, Maryland, U.S. Big THANK YOU for posting your code! A few questions:

  1. Does the repository include the routines for generating the graph data? (I noticed some things for node processing in csrc/cpu/graph_proc.cpp, e.g. sample_nodes, but no actual usages besides imports in model.py and dataset.py. Are those it, or is there something else?)

  2. If the answer to (1) is 'yes', could you provide a quick example on how these can be used to generate the graphs?

  3. If the answer to (1) is 'no', can you provide any pointers to existing code that can be used to generate the graphs?

pablopalafox commented 3 years ago

hey @Algomorph! thanks a lot for your interest!

So we definitely have in mind to also release the part for generating the graph data. As you point out, csrc contains some of the routines we use for generating the data, but the scripts that drive them are indeed not released yet. We'll probably release them in a few weeks, after a deadline we have :) I'll keep you posted

BaldrLector commented 3 years ago

Hi @pablopalafox, thanks for your reply. I'm also interested in how the graph is constructed. After reading graph_proc.h and graph_proc.cpp, I understand that most of what's needed is already provided. However, there are still some hyper-parameters and thresholds that are not clear to me. Could you provide their values and briefly explain how I should choose them? The hyper-parameters and thresholds are as follows:

  1. nIterations, minNeighbors in erode_mesh
  2. nodeCoverage in sample_nodes
  3. nMaxNeighbors, maxInfluence in compute_edges_geodesic
  4. nMaxNeighbors in compute_edges_euclidean
  5. neighborhoodDepth, nodeCoverage in compute_pixel_anchors_geodesic
  6. nodeCoverage in compute_pixel_anchors_euclidean
  7. edgeThreshold, maxPointToNodeDistance, maxDepth in construct_regular_graph

Algomorph commented 3 years ago

@BaldrLector: It seems like nodeCoverage corresponds to sigma in Appendix A of the publication, and the value is set to 0.05 m.

@pablopalafox: Confirmation of this and the missing values would be appreciated.
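
For context, if nodeCoverage is indeed the σ from Appendix A, then the skinning weight of a point v with respect to node g_i should be the usual Gaussian of their distance (my reading of the paper, not confirmed), normalized over the point's anchors:

```latex
w_i(\mathbf{v}) = \exp\!\left(-\frac{\lVert \mathbf{v} - \mathbf{g}_i \rVert_2^2}{2\sigma^2}\right),
\qquad \sigma = \text{nodeCoverage} = 0.05\ \mathrm{m}
```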

For sample_nodes, it would also be useful to know whether erosion is used at all (the last boolean parameter).

Also, it seems like you use either compute_edges_geodesic or compute_edges_euclidean for computing node relationships over the motion graph. Am I correct in thinking that compute_edges_geodesic yields better results? (Quote from the manuscript: "Edges E (green lines) are computed between nodes based on geodesic connectivity among the latter.")

I would add to the list:

  1. radius in filter_depth if that is actually used to pre-process depth.
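
For anyone else trying to piece this together, below is a minimal numpy sketch of my mental model of the node sampling and edge construction steps. To be clear, this is not the repository's code: the greedy coverage sampling and k-NN edges are just the standard technique as I understand it, and all values besides nodeCoverage are placeholders.

```python
# My mental model of node sampling and Euclidean edge construction;
# NOT the repository's implementation, and the placeholder values are mine.
import numpy as np
from scipy.spatial import cKDTree

NODE_COVERAGE = 0.05  # sigma from Appendix A, in meters

def sample_nodes(vertices, node_coverage=NODE_COVERAGE):
    """Greedily pick nodes so every vertex lies within node_coverage of one."""
    tree = cKDTree(vertices)
    uncovered = np.ones(len(vertices), dtype=bool)
    node_indices = []
    for i in range(len(vertices)):
        if uncovered[i]:
            node_indices.append(i)
            # Mark everything inside this node's coverage radius as covered.
            uncovered[tree.query_ball_point(vertices[i], node_coverage)] = False
    return np.asarray(node_indices)

def compute_edges_euclidean(node_positions, n_max_neighbors=8):
    """Connect each node to its n_max_neighbors nearest nodes (Euclidean).

    The geodesic variant would instead walk the mesh surface, which is why
    compute_edges_geodesic needs faceIndices and this one does not.
    """
    tree = cKDTree(node_positions)
    _, neighbors = tree.query(node_positions, k=n_max_neighbors + 1)
    return neighbors[:, 1:]  # column 0 is each node itself
```
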
BaldrLector commented 3 years ago

@Algomorph Thanks for your reply. I think compute_edges_geodesic yields better results, while compute_edges_euclidean runs faster and does not need faceIndices.

Besides, graph_edges_weights and graph_clusters are still missing; I can't find a way to compute them in graph_proc.
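
My current guess, and it is purely a guess on my part, is that graph_edges_weights are the same Gaussian-of-distance weights normalized per node, and graph_clusters are just the connected components of the edge graph. Something like:

```python
# Purely my guess at how these two arrays could be computed; not from the repo.
# Assumes graph_edges has shape (num_nodes, k) with every entry a valid index.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def guess_edge_weights(node_positions, graph_edges, node_coverage=0.05):
    """Gaussian weight per edge from its length, normalized per source node."""
    lengths = np.linalg.norm(
        node_positions[:, None, :] - node_positions[graph_edges], axis=-1)
    weights = np.exp(-lengths**2 / (2.0 * node_coverage**2))
    return weights / np.maximum(weights.sum(axis=1, keepdims=True), 1e-8)

def guess_clusters(num_nodes, graph_edges):
    """Cluster id per node = connected component of the undirected edge graph."""
    rows = np.repeat(np.arange(num_nodes), graph_edges.shape[1])
    cols = graph_edges.ravel()
    adjacency = csr_matrix(
        (np.ones(len(rows)), (rows, cols)), shape=(num_nodes, num_nodes))
    _, labels = connected_components(adjacency, directed=False)
    return labels
```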

AljazBozic commented 3 years ago

Hi @Algomorph @BaldrLector! Thank you for raising the issue; I'm sorry it took so long to address it.

I just added a create_graph_data.py file, with the adapted C++ methods as well, and example data from the train set. You can now try out the generation on two frames, and also visualize the results using example_viz.py.

Let me know if you encounter any problems. Be aware that the code had to be adapted a little because of different raw data preprocessing, so the generated graphs might not be exactly the same as the provided files, but they should result in the same performance, since the logic is exactly the same.

Algomorph commented 3 years ago

@AljazBozic, thank you for posting the code for graph generation.

I'm noticing that you are now pre-loading a scene flow image in order to compute the mesh, like so:
https://github.com/DeformableFriends/NeuralTracking/blob/8d6a3458d872cc7bc8473b6efe060198f7a2a734/create_graph_data.py#L85
https://github.com/DeformableFriends/NeuralTracking/blob/8d6a3458d872cc7bc8473b6efe060198f7a2a734/create_graph_data.py#L182-L189

How is the scene flow image obtained for this purpose? Is it just the properly formatted output of a PWC-Net prediction, pre-run separately from the differentiable optimizer?

AljazBozic commented 3 years ago

Hi @Algomorph! The code is actually intended to generate training data for the NeuralTracking network, so it is assumed that ground-truth scene flow is given (from the DeepDeform dataset). But if you want to generate graphs at test time, you would need to adapt it slightly to generate the mesh without a scene flow image. You then also cannot generate ground-truth node deformations, but that is what you want to compute with the deformation graph in any case; you don't really have ground truth at test time. Let me know if there is any confusion about it.
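
To make "adapt it slightly" a bit more concrete: the depth-only mesh generation boils down to back-projecting the masked depth map with the camera intrinsics and triangulating adjacent pixels, discarding triangles that span depth discontinuities. A minimal sketch, where fx, fy, cx, cy stand in for your intrinsics and max_edge is an illustrative threshold rather than the value we use:

```python
# Minimal depth-only meshing sketch (no scene flow image needed).
# Threshold and structure are illustrative, not the released settings.
import numpy as np

def compute_mesh_from_depth_only(depth, mask, fx, fy, cx, cy, max_edge=0.05):
    """Back-project masked depth pixels and triangulate adjacent pixels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    points = np.stack(((u - cx) * z / fx, (v - cy) * z / fy, z), axis=-1)

    valid = mask & (z > 0)
    vertex_id = -np.ones((h, w), dtype=np.int64)
    vertex_id[valid] = np.arange(valid.sum())
    vertices = points[valid]

    faces = []
    for y in range(h - 1):
        for x in range(w - 1):
            ids = [vertex_id[y, x], vertex_id[y, x + 1],
                   vertex_id[y + 1, x], vertex_id[y + 1, x + 1]]
            if min(ids) < 0:
                continue  # skip quads touching invalid pixels
            p = [points[y, x], points[y, x + 1],
                 points[y + 1, x], points[y + 1, x + 1]]
            # Reject triangles with long edges (likely depth discontinuities).
            if max(np.linalg.norm(p[0] - p[1]), np.linalg.norm(p[0] - p[2]),
                   np.linalg.norm(p[1] - p[2])) < max_edge:
                faces.append([ids[0], ids[1], ids[2]])
            if max(np.linalg.norm(p[3] - p[1]), np.linalg.norm(p[3] - p[2]),
                   np.linalg.norm(p[1] - p[2])) < max_edge:
                faces.append([ids[1], ids[3], ids[2]])
    return vertices, np.asarray(faces)
```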

Algomorph commented 3 years ago

@AljazBozic, thank you for getting back to me so promptly!

My bad, I must have only looked at the test sequences in the data when I was looking for the .sflow files. I see them now. However, I am trying to conjure up a complete DynamicFusion-like pipeline, piece by piece, so I'll definitely need to come up with the replacement you're talking about (vertex_pixels is the current holdup, but I'll see if I can figure that one out on my own).

The remaining problems I'll be facing have more to do with #2, so I suppose this issue can be closed. @BaldrLector, please feel free to reopen if there's something still missing on your end.

BaldrLector commented 3 years ago

Hi, thanks to both @AljazBozic and @Algomorph for the code sharing and kind suggestions. I just read the code, which solved some of my problems. Meanwhile, vertex_pixels is not utilized in my DynamicFusion-like pipeline; I replaced compute_pixel_anchors_geodesic with compute_pixel_anchors_euclidean (roughly as in the sketch below). I have almost finished the reconstruction part, and once I finish it, I will post an update at #2.
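
For reference, the Euclidean anchoring I use amounts to the following (my own code, not the repository's; 4 anchors per point, with the nodeCoverage value confirmed above):

```python
# My own Euclidean anchoring sketch, not the repository's implementation.
import numpy as np
from scipy.spatial import cKDTree

def compute_point_anchors_euclidean(points, node_positions, n_anchors=4,
                                    node_coverage=0.05):
    """Anchor each 3D point to its nearest nodes, with normalized Gaussian weights."""
    tree = cKDTree(node_positions)
    distances, anchors = tree.query(points, k=n_anchors)
    weights = np.exp(-distances**2 / (2.0 * node_coverage**2))
    weights /= np.maximum(weights.sum(axis=1, keepdims=True), 1e-8)
    return anchors, weights
```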

BadourAlBahar commented 3 years ago

Hi, thank you for the great work.

How would I generate the graphs at test time? All I have is the RGB-D frames and an object mask. You said to "adapt it slightly" to do that. Would you please explain how to adapt the code?

Thanks!

Algomorph commented 3 years ago

@BadourAlBahar, you can check out the alternative version of the compute_mesh_from_depth function in my fork:
https://github.com/Algomorph/NeuralTracking/blob/380acc3871572765179f2c2087fa4f6324e3ea40/csrc/cpu/image_proc.cpp#L431-L572
Also, take a look at how it's used in my adaptation of create_graph_data.py:
https://github.com/Algomorph/NeuralTracking/blob/380acc3871572765179f2c2087fa4f6324e3ea40/create_graph_data.py#L121-L135

vertex_pixels is just the inverse mapping from mesh vertices to the coordinates of the pixels used to generate them.
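
In other words, if you back-project the depth map yourself, you get it essentially for free by recording the pixel coordinates alongside each vertex. A quick sketch (again mine, not the repo's):

```python
# Sketch: record the (x, y) pixel that produced each vertex while
# back-projecting a masked depth map; not the repository's implementation.
import numpy as np

def back_project_with_vertex_pixels(depth, mask, fx, fy, cx, cy):
    """Return back-projected vertices plus their source pixel coordinates."""
    ys, xs = np.nonzero(mask & (depth > 0))
    z = depth[ys, xs]
    vertices = np.stack(((xs - cx) * z / fx, (ys - cy) * z / fy, z), axis=-1)
    vertex_pixels = np.stack((xs, ys), axis=-1)  # inverse map: vertex -> pixel
    return vertices, vertex_pixels
```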