countywest closed this issue 3 years ago
Hi. The volumetric geodesic calculation is the most time-consuming step. I used a cluster of nodes to accelerate it. On a single machine, you may consider down-sampling. Take a look at quick_start.py, which has a down-sampling option. You could first try 1K or even 512 vertices, and then use nearest-neighbor lookup to upsample the distance map back to full resolution.
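For anyone else reading: the nearest-neighbor upsampling step could be sketched roughly like below. This is an illustrative sketch, not the actual code in quick_start.py; the function name `upsample_distance_map` and its argument layout are my own assumptions.

```python
import numpy as np

def upsample_distance_map(verts_full, verts_sub, dist_sub):
    """Propagate per-vertex distances computed on a down-sampled mesh
    back to the full-resolution mesh via nearest-neighbor lookup.

    verts_full: (N, 3) full-resolution vertex positions
    verts_sub:  (M, 3) down-sampled vertex positions (M << N)
    dist_sub:   (M, K) distance map computed on the down-sampled mesh
                (e.g. K = number of joints)
    Returns:    (N, K) approximate distance map at full resolution.
    """
    # For each full-resolution vertex, find the index of the closest
    # down-sampled vertex (brute force; fine for a few thousand verts,
    # use a KD-tree for larger meshes).
    diffs = verts_full[:, None, :] - verts_sub[None, :, :]
    idx = np.argmin(np.linalg.norm(diffs, axis=-1), axis=1)
    # Copy each full-resolution vertex's value from its nearest neighbor.
    return dist_sub[idx]
```

Each full-resolution vertex simply inherits the distances of its nearest down-sampled vertex, so the result is piecewise-constant but usually good enough as an approximation of the volumetric geodesic distances.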
Thanks a lot! Processing is much faster with the approximation.
Hello! Thanks for sharing your great work. I want to apply this framework to my dataset, but computing the volumetric geodesic distances is too slow. I used your code (geometric_proc/compute_volumetric_geodesic.py) as is, and it seems to take 10 to 40 minutes per obj. It also requires a lot of RAM, over 100GB for a single obj. The numbers above are not averages, but processing my dataset at this rate seems too slow. My dataset includes some complex shapes with up to 12000 vertices, but most shapes have under 4K vertices, similar to your dataset. Is that processing time normal, or am I missing something?