Closed: ihmc3jn09hk closed this issue 3 years ago
Hi @ihmc3jn09hk
Sorry for the late reply. What is the size of your meshes (number of vertices and faces)?
@pvnieo Thanks for the reply. As suggested on the main page, I used the original FAUST dataset, where the provided *.ply files have around 110k vertices each.
I noticed that using --no-dec could reduce the memory needed, but then the output will lose the eigendecomposition.
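(If needed, I suppose the eigendecomposition could be recomputed separately afterwards. A minimal SciPy sketch, assuming the cotangent stiffness matrix W and the lumped mass matrix A are available as sparse matrices; names and the number of eigenpairs are illustrative:)

```python
import scipy.sparse.linalg as sla

def lb_eigendecomposition(W, A, k=128):
    """Truncated Laplace-Beltrami eigendecomposition.

    W : (n, n) sparse cotangent stiffness matrix
    A : (n, n) sparse (lumped) mass matrix
    Solves the generalized problem W phi = lambda A phi for the k smallest
    eigenvalues, using shift-invert around zero to make eigsh converge quickly.
    """
    evals, evecs = sla.eigsh(W, k=k, M=A, sigma=-0.01)
    return evals, evecs
```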
Moreover, could you please share more (maybe an eval.py?) on how to perform evaluation, i.e. how to find the point-to-point correspondence between 2 input meshes after training?
P.S. I would like to use shot.cpp directly, without the Python interface, for performance. However, the SHOT compute() part in PCL crashes all the time. I will share more information on the modified code.
Hi @ihmc3jn09hk I think the memory usage is so big because of the computation of the geodesic distance matrix (which has size 110k x 110k ≈ 12 billion entries, and will not fit in memory).
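(A quick back-of-the-envelope check, assuming a dense floating-point matrix; numbers are approximate:)

```python
# Rough memory estimate for a dense geodesic distance matrix on original FAUST.
n = 110_000                    # vertices in an original FAUST scan
entries = n * n                # ~1.2e10 pairwise distances
print(f"entries : {entries:.2e}")
print(f"float32 : {entries * 4 / 1024**3:.0f} GiB")   # ~45 GiB
print(f"float64 : {entries * 8 / 1024**3:.0f} GiB")   # ~90 GiB
```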
If you really want to use the original version of FAUST, you should disable the computation of the geodesic matrix, or use a remeshed version of FAUST with fewer vertices (such as the one provided here).
For the SHOT descriptor, the Python computation is pretty fast; you can find installation instructions here.
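Usage is roughly along these lines (a sketch assuming a pyshot-style binding exposing a get_descriptors function; check the linked README for the exact signature and sensible parameter values):

```python
import numpy as np
import pyshot  # Python binding for the SHOT descriptor (see link above)

# v: (n, 3) float array of vertices, f: (m, 3) int array of triangle faces
descrs = pyshot.get_descriptors(
    v, f,
    radius=0.1,            # support radius (illustrative value)
    local_rf_radius=0.1,   # radius used for the local reference frame
    min_neighbors=3,
    n_bins=10,
)                          # -> (n, descriptor_dim) array, one descriptor per vertex
```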
For the eval script, I will update the repo in the near future; in the meantime, you can get inspiration from this repo.
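The conversion from a predicted functional map to a point-to-point map is usually just a nearest-neighbour search between the aligned spectral embeddings, something along these lines (array names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def fmap_to_p2p(C, evecs1, evecs2):
    """Recover a point-to-point map from a functional map.

    C      : (k2, k1) functional map matrix from shape 1 to shape 2
    evecs1 : (n1, k1) Laplace-Beltrami eigenvectors of shape 1
    evecs2 : (n2, k2) Laplace-Beltrami eigenvectors of shape 2
    Returns p2p of length n2, where p2p[j] is the index of the vertex on
    shape 1 matched to vertex j of shape 2.
    """
    # Align the spectral embedding of shape 1 with C, then query nearest neighbours.
    tree = cKDTree(evecs1 @ C.T)   # (n1, k2)
    _, p2p = tree.query(evecs2)    # (n2,)
    return p2p
```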
Hope this answers your question!
I see. I will try to use a simplified dataset.
Thank you for the suggestions. I will check them out.
Hi @pvnieo , I tried the one you suggested and it works well, but it requires landmarks, otherwise the matching is not accurate. The SURFNet unsupervised approach deserves a deeper look; I think I should discuss it in the SURFNet repo. Nonetheless, thanks for your help.
Hi @pvnieo , thank you for sharing the implementation. As suggested, the FAUST dataset is used with preprocess.py. The program just stopped and reported that 13x GiB of memory needed to be allocated. Is there any way to reduce the amount required?